```r
                                             model model_type      cost z_norm     r ICC_mean perf_mean perf_geometric perf_min     z ICC_mean_llm_inaccurate
                                            <char>     <char>     <num>  <num> <num>    <num>     <num>          <num>    <num> <num>                   <num>
 1:                                  openai/o1-pro  expensive 0.5292717 1.0000 0.673    0.960     0.980          0.980   0.9604 0.815                   0.962
 2:                           perplexity/sonar-pro     online 0.0026756 0.9004 0.645    0.931     0.916          0.915   0.9004 0.766                   0.924
 3:                   openai/gpt-4o-search-preview     online 0.0014057 0.6607 0.577    0.927     0.794          0.783   0.6607 0.658                   0.927
 4:                               perplexity/sonar     online 0.0002792 0.7366 0.599    0.828     0.782          0.781   0.7366 0.692                   0.829
 5:                            openai/o3-mini-high  expensive 0.0122783 0.5018 0.527    0.940     0.721          0.687   0.5018 0.587                   0.933
 6:                    google/gemini-2.0-flash-001    offline 0.0000537 0.4517 0.511    0.902     0.677          0.638   0.4517 0.564                   0.906
 7:          google/gemini-2.5-flash-preview-05-20    offline 0.0000753 0.4527 0.511    0.872     0.663          0.628   0.4527 0.564                   0.863
 8:              openai/gpt-4o-mini-search-preview     online 0.0001118 0.3734 0.484    0.906     0.640          0.582   0.3734 0.529                   0.918
 9:                             openai/gpt-4o-mini    offline 0.0000788 0.3679 0.483    0.879     0.624          0.569   0.3679 0.526                   0.904
10: google/gemini-2.5-flash-preview-05-20:thinking  expensive 0.0023038 0.3695 0.483    0.836     0.603          0.556   0.3695 0.527                   0.835
11:     google/gemini-2.5-flash-lite-preview-06-17    offline 0.0000499 0.2622 0.445    0.802     0.532          0.459   0.2622 0.479                   0.780
12:                    meta-llama/llama-4-maverick    offline 0.0000837 0.2195 0.430    0.756     0.488          0.407   0.2195 0.459                   0.740
13:                           deepseek/deepseek-r1  expensive 0.0003155 0.1828 0.416    0.676     0.430          0.352   0.1828 0.443                   0.669
14:                     google/gemini-flash-1.5-8b    offline 0.0000195 0.0421 0.362    0.857     0.450          0.190   0.0421 0.380                   0.842
15:                            openai/gpt-4.1-nano    offline 0.0000577 0.0350 0.360    0.848     0.442          0.172   0.0350 0.376                   0.865
16:                        mistralai/mistral-small    offline 0.0000898 0.0100 0.350    0.758     0.384          0.087   0.0100 0.365                   0.732
```
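The rows above are listed in decreasing order of `perf_geometric`. As a point of reference, the sketch below shows how such an ordering can be produced with `data.table::setorder()`; the object name `results`, the trimmed column set, and the three example rows (copied from the printout) are illustrative assumptions, not the original analysis code.

```r
# Minimal, self-contained sketch: rebuild a slice of the table above and sort
# it the same way (decreasing perf_geometric). The object name `results` and
# the reduced column set are assumptions for illustration only.
library(data.table)

results <- data.table(
  model          = c("perplexity/sonar", "openai/o1-pro", "perplexity/sonar-pro"),
  model_type     = c("online", "expensive", "online"),
  cost           = c(0.0002792, 0.5292717, 0.0026756),
  perf_geometric = c(0.781, 0.980, 0.915)
)

# Sort descending on the geometric-mean performance, as in the printout
setorder(results, -perf_geometric)
print(results)
```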