| 1 | o4 Mini | 100.0% | $0.0023 | 5.1s | 100% |
| 2 | o4 Mini High | 100.0% | $0.0028 | 6.5s | 100% |
| 3 | GPT-5 Nano | 100.0% | $0.0004 | 13.0s | 100% |
| 4 | Gemini 2.5 Pro | 100.0% | $0.0089 | 6.4s | 100% |
| 5 | Claude Opus 4.6 | 100.0% | $0.012 | 4.1s | 100% |
| 6 | MoonshotAI: Kimi K2.5 | 100.0% | $0.0020 | 13.9s | 100% |
| 7 | Gemini 3 Pro (Preview) | 100.0% | $0.010 | 6.5s | 100% |
| 8 | GPT-5.1 | 98.0% | $0.0017 | 4.1s | 72% |
| 9 | GPT-5 Mini | 98.0% | $0.0010 | 6.9s | 72% |
| 10 | Gemini 3.1 Pro (Preview) | 100.0% | $0.011 | 9.2s | 100% |
| 11 | Grok 4 | 100.0% | $0.011 | 12.0s | 100% |
| 12 | GPT-5.2 | 94.0% | $0.0025 | 2.9s | 53% |
| 13 | Minimax M2.5 | 96.0% | $0.0007 | 9.1s | 61% |
| 14 | Grok 4.1 Fast | 92.0% | $0.0004 | 3.3s | 46% |
| 15 | Stealth: Aurora Alpha | 94.0% | — | 1.4s | 53% |
| 16 | Claude Sonnet 4 | 94.0% | $0.0075 | 4.2s | 53% |
| 17 | Llama 3.1 Nemotron 70B | 92.0% | $0.0006 | 9.2s | 46% |
| 18 | Z.AI GLM 4.7 Flash | 92.0% | $0.0004 | 15.7s | 46% |
| 19 | GPT-4.1 | 84.0% | $0.0022 | 3.5s | 27% |
| 20 | Qwen 3.5 397B A17B | 100.0% | $0.0057 | 34.3s | 100% |
| 21 | DeepSeek V3.2 | 88.0% | $0.0004 | 11.9s | 35% |
| 22 | Claude Opus 4.5 | 90.0% | $0.013 | 3.8s | 40% |
| 23 | Claude Sonnet 4.5 | 86.0% | $0.0074 | 3.4s | 31% |
| 24 | ByteDance Seed 1.6 Flash | 80.0% | $0.0004 | 5.2s | 20% |
| 25 | Claude 3.5 Haiku | 80.0% | $0.0020 | 3.7s | 20% |
| 26 | Z.AI GLM 5 | 90.0% | $0.0024 | 15.8s | 40% |
| 27 | Ministral 3 14B | 76.0% | $0.0003 | 3.0s | 15% |
| 28 | Mistral Large 2 | 82.0% | $0.0044 | 6.2s | 23% |
| 29 | Mistral NeMo | 74.0% | $0.0002 | 2.4s | 12% |
| 30 | Claude Haiku 4.5 | 76.0% | $0.0026 | 2.4s | 15% |
| 31 | Z.AI GLM 4.6 | 96.0% | $0.0019 | 31.1s | 61% |
| 32 | Z.AI GLM 4.7 | 96.0% | $0.0017 | 32.6s | 61% |
| 33 | Ministral 3 8B | 70.0% | $0.0002 | 1.7s | 8% |
| 34 | ByteDance Seed 1.6 | 80.0% | $0.0015 | 13.5s | 20% |
| 35 | GPT-4.1 Mini | 66.0% | $0.0002 | 1.8s | 5% |
| 36 | Mistral Medium 3.1 | 68.0% | $0.0009 | 2.9s | 7% |
| 37 | Qwen 3.5 Plus (2026-02-15) | 78.0% | $0.0016 | 11.7s | 17% |
| 38 | Claude Sonnet 4.6 | 72.0% | $0.0064 | 2.1s | 10% |
| 39 | Claude 3.7 Sonnet | 74.0% | $0.0068 | 4.4s | 12% |
| 40 | DeepSeek V3 (2024-12-26) | 70.0% | $0.0007 | 6.4s | 8% |
| 41 | Arcee AI: Trinity Mini | 64.0% | $0.0001 | 3.4s | 4% |
| 42 | Arcee AI: Trinity Large (Preview) | 64.0% | $0.0000 | 3.8s | 4% |
| 43 | Gemma 3 12B | 64.0% | $0.0001 | 3.8s | 4% |
| 44 | DeepSeek V3 (2025-03-24) | 74.0% | $0.0006 | 12.1s | 12% |
| 45 | Mistral Small Creative | 60.0% | $0.0002 | 2.0s | 2% |
| 46 | Llama 3.1 70B | 66.0% | $0.0011 | 5.8s | 5% |
| 47 | Ministral 3 3B | 58.0% | $0.0002 | 1.2s | 1% |
| 48 | Llama 3.1 8B | 58.0% | $0.0002 | 1.8s | 1% |
| 49 | Z.AI GLM 4.5 | 66.0% | $0.0007 | 7.2s | 5% |
| 50 | Gemini 3 Flash (Preview) | 56.0% | $0.0012 | 1.6s | 1% |
| 51 | DeepSeek-V2 Chat | 64.0% | $0.0003 | 8.0s | 4% |
| 52 | Claude 3 Haiku | 54.0% | $0.0005 | 1.3s | 0% |
| 53 | Ministral 8B | 52.0% | $0.0002 | 1.4s | 0% |
| 54 | Writer: Palmyra X5 | 66.0% | $0.0025 | 9.1s | 5% |
| 55 | Ministral 3B | 50.0% | $0.0001 | 1.2s | 0% |
| 56 | Claude 3.5 Sonnet | 64.0% | $0.0067 | 4.6s | 4% |
| 57 | Gemini 2.5 Flash Lite | 48.0% | $0.0001 | 0.48s | 0% |
| 58 | Gemini 2.5 Flash | 48.0% | $0.0002 | 0.54s | 0% |
| 59 | DeepSeek V3.1 | 64.0% | $0.0004 | 10.9s | 4% |
| 60 | GPT-4o Mini (temp=1) | 46.0% | $0.0002 | 0.92s | 0% |
| 61 | Mistral Large 3 | 50.0% | $0.0010 | 2.6s | 0% |
| 62 | Qwen 2.5 72B | 56.0% | $0.0006 | 6.5s | 1% |
| 63 | Hermes 3 70B | 50.0% | $0.0006 | 3.6s | 0% |
| 64 | GPT-4o, May 13th (temp=1) | 62.0% | $0.0095 | 3.4s | 3% |
| 65 | Mistral Large | 54.0% | $0.0040 | 3.3s | 0% |
| 66 | Gemma 3 4B | 42.0% | $0.0001 | 1.1s | 0% |
| 67 | WizardLM 2 8x22B | 52.0% | $0.0015 | 6.3s | 0% |
| 68 | GPT-4o Mini (temp=0) | 40.0% | $0.0002 | 0.81s | 0% |
| 69 | Gemma 3 27B | 46.0% | $0.0002 | 4.4s | 0% |
| 70 | GPT-4.1 Nano | 40.0% | $0.0001 | 1.3s | 0% |
| 71 | GPT-4o, Aug. 6th (temp=1) | 46.0% | $0.0038 | 1.3s | 0% |
| 72 | Mistral Small 3.2 24B | 42.0% | $0.0002 | 3.0s | 0% |
| 73 | GPT-5 | 58.0% | $0.0042 | 9.0s | 1% |
| 74 | GPT-4o, Aug. 6th (temp=0) | 44.0% | $0.0037 | 1.2s | 0% |
| 75 | Grok 4 Fast | 40.0% | $0.0003 | 2.1s | 0% |
| 76 | GPT-4o, May 13th (temp=0) | 56.0% | $0.0096 | 4.1s | 1% |
| 77 | Cohere Command R+ (Aug. 2024) | 44.0% | $0.0049 | 2.9s | 0% |
| 78 | Rocinante 12B | 52.0% | $0.0004 | 11.8s | 0% |
| 79 | Hermes 3 405B | 44.0% | $0.0000 | 8.1s | 0% |
| 80 | Claude Opus 4 | 84.0% | $0.037 | 9.8s | 27% |