| 1 | GPT-4o Mini (temp=0) | 100.0% | $0.0002 | 3.4s | 100% |
| 2 | Mistral Small 3.2 24B | 100.0% | $0.0001 | 4.1s | 100% |
| 3 | Grok 4 Fast | 100.0% | $0.0003 | 4.2s | 100% |
| 4 | Claude 3 Haiku | 100.0% | $0.0004 | 3.4s | 100% |
| 5 | Gemma 3 12B | 100.0% | $0.0000 | 7.4s | 100% |
| 6 | Ministral 3 14B | 100.0% | $0.0001 | 8.2s | 100% |
| 7 | Gemini 3 Flash (Preview) | 100.0% | $0.0009 | 3.2s | 100% |
| 8 | Z.AI GLM 4.5 | 100.0% | $0.0005 | 5.8s | 100% |
| 9 | Mistral Medium 3.1 | 100.0% | $0.0006 | 6.1s | 100% |
| 10 | Mistral Large 3 | 100.0% | $0.0005 | 7.2s | 100% |
| 11 | Grok 4.1 Fast | 100.0% | $0.0003 | 10.2s | 100% |
| 12 | DeepSeek V3 (2024-12-26) | 100.0% | $0.0004 | 10.1s | 100% |
| 13 | Gemma 3 27B | 100.0% | $0.0001 | 12.5s | 100% |
| 14 | Claude 3.5 Haiku | 100.0% | $0.0012 | 5.0s | 100% |
| 15 | GPT-4o Mini (temp=1) | 100.0% | $0.0002 | 15.5s | 100% |
| 16 | DeepSeek V3 (2025-03-24) | 100.0% | $0.0003 | 14.6s | 100% |
| 17 | DeepSeek-V2 Chat | 100.0% | $0.0001 | 16.1s | 100% |
| 18 | DeepSeek V3.1 | 100.0% | $0.0002 | 15.4s | 100% |
| 19 | Hermes 3 405B | 100.0% | $0.0000 | 17.7s | 100% |
| 20 | GPT-4.1 | 100.0% | $0.0023 | 4.7s | 100% |
| 21 | Mistral Large 2 | 100.0% | $0.0019 | 8.1s | 100% |
| 22 | Writer: Palmyra X5 | 100.0% | $0.0021 | 8.8s | 100% |
| 23 | GPT-4o, Aug. 6th (temp=0) | 100.0% | $0.0030 | 3.8s | 100% |
| 24 | Z.AI GLM 4.7 Flash | 100.0% | $0.0006 | 24.6s | 100% |
| 25 | Claude Sonnet 4 | 100.0% | $0.0044 | 7.2s | 100% |
| 26 | GPT-4o, May 13th (temp=0) | 100.0% | $0.0048 | 5.2s | 100% |
| 27 | Claude Sonnet 4.5 | 100.0% | $0.0045 | 7.3s | 100% |
| 28 | GPT-4o, May 13th (temp=1) | 100.0% | $0.0049 | 5.5s | 100% |
| 29 | Claude 3.5 Sonnet | 100.0% | $0.0049 | 6.2s | 100% |
| 30 | Claude 3.7 Sonnet | 100.0% | $0.0050 | 7.1s | 100% |
| 31 | Llama 3.1 70B | 96.1% | $0.0003 | 3.2s | 76% |
| 32 | Claude Opus 4.5 | 100.0% | $0.0077 | 8.7s | 100% |
| 33 | Llama 3.1 Nemotron 70B | 96.1% | $0.0001 | 9.6s | 76% |
| 34 | Claude Opus 4.6 | 100.0% | $0.0079 | 9.5s | 100% |
| 35 | Z.AI GLM 4.7 | 100.0% | $0.0027 | 51.3s | 100% |
| 36 | Mistral Large | 100.0% | $0.0091 | 8.4s | 100% |
| 37 | Grok 4 | 100.0% | $0.0080 | 18.1s | 100% |
| 38 | Z.AI GLM 4.6 | 100.0% | $0.0033 | 51.6s | 100% |
| 39 | GPT-4o, Aug. 6th (temp=1) | 96.1% | $0.0030 | 3.7s | 76% |
| 40 | Gemini 2.5 Flash Lite | 92.1% | $0.0001 | 1.4s | 69% |
| 41 | MoonshotAI: Kimi K2.5 | 100.0% | $0.0053 | 40.7s | 100% |
| 42 | Z.AI GLM 5 | 100.0% | $0.0049 | 50.1s | 100% |
| 43 | Claude Haiku 4.5 | 92.1% | $0.0016 | 4.0s | 69% |
| 44 | GPT-5.1 | 96.1% | $0.0049 | 12.4s | 76% |
| 45 | Claude Sonnet 4.6 | 92.1% | $0.0046 | 7.7s | 69% |
| 46 | Gemini 2.5 Pro | 100.0% | $0.0150 | 14.6s | 100% |
| 47 | Gemini 3 Pro (Preview) | 100.0% | $0.0170 | 13.7s | 100% |
| 48 | Mistral Small Creative | 83.5% | $0.0001 | 2.2s | 44% |
| 49 | DeepSeek V3.2 | 90.0% | $0.0002 | 13.7s | 40% |
| 50 | Gemini 2.5 Flash | 78.3% | $0.0009 | 2.5s | 38% |
| 51 | Cohere Command R+ (Aug. 2024) | 86.2% | $0.0030 | 7.0s | 39% |
| 52 | GPT-4.1 Mini | 82.7% | $0.0004 | 3.3s | 31% |
| 53 | Mistral NeMo | 75.6% | $0.0000 | 4.0s | 36% |
| 54 | ByteDance Seed 1.6 | 90.0% | $0.0023 | 27.7s | 40% |
| 55 | Claude Opus 4 | 100.0% | $0.0230 | 15.8s | 100% |
| 56 | o4 Mini High | 96.1% | $0.0140 | 34.7s | 76% |
| 57 | GPT-5 Nano | 91.4% | $0.0023 | 58.8s | 48% |
| 58 | Gemini 3.1 Pro (Preview) | 100.0% | $0.0240 | 23.5s | 100% |
| 59 | GPT-5 Mini | 78.2% | $0.0022 | 14.1s | 37% |
| 60 | Qwen 3.5 Plus (2026-02-15) | 81.5% | $0.0008 | 13.3s | 26% |
| 61 | Hermes 3 70B | 73.5% | $0.0001 | 12.7s | 26% |
| 62 | GPT-5.2 | 78.8% | $0.0046 | 9.8s | 31% |
| 63 | GPT-5 | 100.0% | $0.0270 | 51.7s | 100% |
| 64 | GPT-4.1 Nano | 60.2% | $0.0001 | 2.9s | 12% |
| 65 | MiniMax M2.5 | 82.2% | $0.0044 | 72.0s | 38% |
| 66 | ByteDance Seed 1.6 Flash | 52.3% | $0.0003 | 6.9s | 15% |
| 67 | Qwen 2.5 72B | 52.8% | $0.0001 | 7.5s | 7% |
| 68 | o4 Mini | 67.4% | $0.0063 | 15.9s | 14% |
| 69 | Ministral 3 8B | 37.0% | $0.0001 | 2.0s | 12% |
| 70 | Rocinante 12B | 42.5% | $0.0002 | 14.5s | 4% |
| 71 | Gemma 3 4B | 33.8% | $0.0000 | 3.4s | 5% |
| 72 | Arcee AI: Trinity Mini | 33.8% | $0.0001 | 3.9s | 5% |
| 73 | Arcee AI: Trinity Large (Preview) | 37.8% | $0.0000 | 6.8s | 1% |
| 74 | Ministral 3 3B | 29.8% | $0.0000 | 1.5s | 7% |
| 75 | Llama 3.1 8B | 29.9% | $0.0001 | 1.3s | 2% |
| 76 | Ministral 3B | 23.8% | $0.0000 | 1.8s | 0% |
| 77 | Stealth: Aurora Alpha | 50.9% | — | 3.3s | 13% |
| 78 | Ministral 8B | 15.0% | $0.0000 | 2.1s | 0% |
| 79 | WizardLM 2 8x22B | 7.7% | $0.0005 | 12.6s | 0% |
| 80 | Qwen 3.5 397B A17B | 92.1% | $0.0260 | 186.0s | 69% |