| 1 | Gemini 3 Flash (Preview) | 98.9% | $0.0004 | 1.9s | 95% |
| 2 | Stealth: Aurora Alpha | 98.2% | — | 1.7s | 90% |
| 3 | GPT-5 Nano | 99.9% | $0.0010 | 28.2s | 99% |
| 4 | GPT-5 Mini | 98.6% | $0.0043 | 26.2s | 93% |
| 5 | GPT-5.2 | 99.7% | $0.011 | 15.0s | 98% |
| 6 | o4 Mini | 97.9% | $0.0083 | 20.8s | 91% |
| 7 | Minimax M2.5 | 98.4% | $0.0031 | 39.2s | 90% |
| 8 | Claude Opus 4.5 | 93.5% | $0.0052 | 6.8s | 79% |
| 9 | o4 Mini High | 98.2% | $0.011 | 27.6s | 92% |
| 10 | GPT-4.1 | 88.0% | $0.0012 | 2.8s | 71% |
| 11 | Gemini 3 Pro (Preview) | 99.1% | $0.018 | 13.0s | 93% |
| 12 | MoonshotAI: Kimi K2.5 | 99.2% | $0.0086 | 55.1s | 95% |
| 13 | GPT-4o Mini (temp=0) | 86.1% | $0.0001 | 9.2s | 69% |
| 14 | GPT-5.1 | 98.9% | $0.017 | 26.6s | 94% |
| 15 | GPT-4o, May 13th (temp=0) | 87.0% | $0.0025 | 4.6s | 60% |
| 16 | Llama 3.1 70B | 84.3% | $0.0002 | 2.1s | 57% |
| 17 | GPT-4o Mini (temp=1) | 84.8% | $0.0001 | 6.1s | 57% |
| 18 | ByteDance Seed 1.6 Flash | 85.1% | $0.0007 | 13.5s | 55% |
| 19 | Mistral Medium 3.1 | 80.5% | $0.0003 | 4.3s | 52% |
| 20 | GPT-4o, May 13th (temp=1) | 82.0% | $0.0022 | 4.7s | 52% |
| 21 | ByteDance Seed 1.6 | 88.0% | $0.0027 | 31.1s | 58% |
| 22 | GPT-4.1 Mini | 80.3% | $0.0002 | 2.2s | 45% |
| 23 | GPT-4.1 Nano | 77.1% | $0.0001 | 2.4s | 48% |
| 24 | Claude Opus 4.6 | 83.8% | $0.0055 | 7.6s | 53% |
| 25 | Z.AI GLM 4.7 Flash | 91.0% | $0.0018 | 1.3 min | 74% |
| 26 | Claude Opus 4 | 87.7% | $0.015 | 13.2s | 67% |
| 27 | Llama 3.1 Nemotron 70B | 79.8% | $0.0001 | 5.7s | 43% |
| 28 | Llama 3.1 8B | 79.9% | $0.0000 | 910ms | 40% |
| 29 | Z.AI GLM 5 | 98.5% | $0.011 | 1.6 min | 89% |
| 30 | Grok 4 | 82.8% | $0.0072 | 15.0s | 56% |
| 31 | Claude Sonnet 4.5 | 81.0% | $0.0033 | 6.0s | 45% |
| 32 | Claude 3.5 Haiku | 78.9% | $0.0006 | 2.5s | 39% |
| 33 | GPT-5 | 99.6% | $0.031 | 51.9s | 98% |
| 34 | Qwen 2.5 72B | 77.3% | $0.0003 | 16.6s | 44% |
| 35 | Claude 3.7 Sonnet | 77.1% | $0.0032 | 5.1s | 42% |
| 36 | Claude Sonnet 4 | 78.2% | $0.0029 | 5.2s | 39% |
| 37 | Gemini 2.5 Pro | 87.0% | $0.018 | 16.6s | 64% |
| 38 | GPT-4o, Aug. 6th (temp=1) | 74.8% | $0.0015 | 2.4s | 37% |
| 39 | Gemma 3 27B | 73.4% | $0.0000 | 5.4s | 33% |
| 40 | Ministral 3 14B | 66.1% | $0.0001 | 2.0s | 38% |
| 41 | DeepSeek V3 (2025-03-24) | 71.1% | $0.0001 | 6.9s | 30% |
| 42 | Claude 3.5 Sonnet | 72.3% | $0.0028 | 4.6s | 29% |
| 43 | Claude Haiku 4.5 | 67.0% | $0.0009 | 2.9s | 27% |
| 44 | Claude Sonnet 4.6 | 70.5% | $0.0024 | 4.6s | 25% |
| 45 | Grok 4.1 Fast | 69.7% | $0.0006 | 10.7s | 26% |
| 46 | DeepSeek V3 (2024-12-26) | 65.7% | $0.0002 | 6.6s | 26% |
| 47 | GPT-4o, Aug. 6th (temp=0) | 67.5% | $0.0013 | 2.2s | 24% |
| 48 | Mistral Small Creative | 58.0% | $0.0001 | 1.2s | 32% |
| 49 | Writer: Palmyra X5 | 62.8% | $0.0014 | 7.9s | 32% |
| 50 | Gemini 3.1 Pro (Preview) | 100.0% | $0.051 | 43.9s | 100% |
| 51 | Z.AI GLM 4.7 | 95.1% | $0.0065 | 2.5 min | 71% |
| 52 | Gemma 3 12B | 63.0% | $0.0000 | 4.1s | 15% |
| 53 | Mistral Small 3.2 24B | 54.7% | $0.0001 | 2.6s | 22% |
| 54 | Grok 4 Fast | 53.2% | $0.0003 | 4.0s | 22% |
| 55 | Gemma 3 4B | 57.5% | $0.0000 | 1.8s | 15% |
| 56 | Hermes 3 405B | 58.3% | $0.0000 | 11.9s | 18% |
| 57 | Mistral Large 3 | 57.0% | $0.0003 | 4.6s | 16% |
| 58 | Qwen 3.5 Plus (2026-02-15) | 52.4% | $0.0003 | 5.9s | 20% |
| 59 | Gemini 2.5 Flash Lite | 52.6% | $0.0000 | 785ms | 16% |
| 60 | Claude 3 Haiku | 48.4% | $0.0002 | 2.7s | 20% |
| 61 | DeepSeek V3.2 | 60.8% | $0.0003 | 16.5s | 11% |
| 62 | DeepSeek V3.1 | 54.0% | $0.0001 | 9.1s | 15% |
| 63 | Qwen 3.5 397B A17B | 100.0% | $0.025 | 3.0 min | 100% |
| 64 | Ministral 3 3B | 41.8% | $0.0000 | 1.0s | 21% |
| 65 | Cohere Command R+ (Aug. 2024) | 44.9% | $0.0008 | 2.0s | 17% |
| 66 | Arcee AI: Trinity Large (Preview) | 42.9% | $0.0000 | 3.6s | 18% |
| 67 | WizardLM 2 8x22B | 43.8% | $0.0002 | 8.4s | 19% |
| 68 | Gemini 2.5 Flash | 44.0% | $0.0003 | 1.3s | 12% |
| 69 | Ministral 3 8B | 41.9% | $0.0000 | 1.5s | 10% |
| 70 | Hermes 3 70B | 41.9% | $0.0001 | 5.8s | 11% |
| 71 | Z.AI GLM 4.5 | 42.0% | $0.0003 | 5.8s | 8% |
| 72 | Z.AI GLM 4.6 | 58.2% | $0.0034 | 55.9s | 16% |
| 73 | Mistral Large 2 | 37.4% | $0.0007 | 2.9s | 7% |
| 74 | Arcee AI: Trinity Mini | 31.3% | $0.0001 | 3.2s | 12% |
| 75 | Mistral Large | 40.4% | $0.0037 | 5.3s | 8% |
| 76 | Ministral 3B | 26.5% | $0.0000 | 768ms | 15% |
| 77 | Ministral 8B | 28.2% | $0.0000 | 904ms | 7% |
| 78 | DeepSeek-V2 Chat | 33.3% | $0.0001 | 10.5s | 3% |
| 79 | Rocinante 12B | 27.1% | $0.0001 | 8.4s | 9% |
| 80 | Mistral NeMo | 18.6% | $0.0000 | 1.9s | 7% |