| 1 | Gemma 3 4B | 100.0% | $0.0000 | 249ms | 100% |
| 2 | Gemini 2.5 Flash Lite | 100.0% | $0.0000 | 355ms | 100% |
| 3 | Gemma 3 12B | 100.0% | $0.0000 | 414ms | 100% |
| 4 | Gemini 2.5 Flash | 100.0% | $0.0000 | 455ms | 100% |
| 5 | Gemma 3 27B | 100.0% | $0.0000 | 691ms | 100% |
| 6 | Gemini 3 Flash (Preview) | 100.0% | $0.0000 | 831ms | 100% |
| 7 | Stealth: Aurora Alpha | 100.0% | — | 1.5s | 100% |
| 8 | Grok 4 Fast | 100.0% | $0.0001 | 1.5s | 100% |
| 9 | GPT-5 Mini | 100.0% | $0.0003 | 3.1s | 100% |
| 10 | GPT-5.2 | 100.0% | $0.0006 | 2.1s | 100% |
| 11 | GPT-5.1 | 100.0% | $0.0005 | 2.7s | 100% |
| 12 | GPT-4.1 Nano | 98.9% | $0.0000 | 994ms | 79% |
| 13 | Claude 3.5 Haiku | 98.9% | $0.0001 | 1.1s | 79% |
| 14 | GPT-5 Nano | 100.0% | $0.0001 | 5.3s | 100% |
| 15 | o4 Mini High | 100.0% | $0.0007 | 3.0s | 100% |
| 16 | Qwen 3.5 Plus (2026-02-15) | 98.9% | $0.0000 | 2.4s | 79% |
| 17 | MiniMax M2.5 | 100.0% | $0.0004 | 6.5s | 100% |
| 18 | Z.AI GLM 4.7 Flash | 100.0% | $0.0002 | 8.7s | 100% |
| 19 | MoonshotAI: Kimi K2.5 | 100.0% | $0.0008 | 7.1s | 100% |
| 20 | GPT-5 | 100.0% | $0.0018 | 4.3s | 100% |
| 21 | Z.AI GLM 4.5 | 94.4% | $0.0001 | 2.7s | 54% |
| 22 | ByteDance Seed 1.6 Flash | 93.3% | $0.0001 | 1.8s | 50% |
| 23 | Mistral Small Creative | 88.9% | $0.0000 | 354ms | 37% |
| 24 | Ministral 3 14B | 88.9% | $0.0000 | 424ms | 37% |
| 25 | Mistral Medium 3.1 | 88.9% | $0.0000 | 650ms | 37% |
| 26 | Qwen 2.5 72B | 88.9% | $0.0000 | 715ms | 37% |
| 27 | Mistral Large 3 | 88.9% | $0.0000 | 1.0s | 37% |
| 28 | DeepSeek V2 Chat | 88.9% | $0.0000 | 1.5s | 37% |
| 29 | GPT-4.1 Mini | 88.9% | $0.0000 | 1.4s | 37% |
| 30 | Claude Haiku 4.5 | 88.9% | $0.0001 | 1.1s | 37% |
| 31 | Hermes 3 70B | 87.8% | $0.0000 | 742ms | 34% |
| 32 | DeepSeek V3 (2024-12-26) | 87.8% | $0.0000 | 897ms | 34% |
| 33 | GPT-4o, Aug. 6th (temp=1) | 88.9% | $0.0002 | 962ms | 37% |
| 34 | GPT-4o, Aug. 6th (temp=0) | 88.9% | $0.0002 | 966ms | 37% |
| 35 | GPT-4.1 | 88.9% | $0.0002 | 1.2s | 37% |
| 36 | DeepSeek V3.2 | 91.1% | $0.0000 | 3.7s | 43% |
| 37 | Llama 3.1 8B | 85.6% | $0.0000 | 404ms | 30% |
| 38 | Claude Sonnet 4 | 88.9% | $0.0004 | 1.5s | 37% |
| 39 | Claude Sonnet 4.5 | 88.9% | $0.0003 | 1.9s | 37% |
| 40 | Claude 3.7 Sonnet | 88.9% | $0.0004 | 1.7s | 37% |
| 41 | Mistral Large 2 | 85.6% | $0.0002 | 436ms | 30% |
| 42 | Llama 3.1 Nemotron 70B | 84.4% | $0.0000 | 821ms | 28% |
| 43 | Mistral NeMo | 86.7% | $0.0000 | 2.9s | 32% |
| 44 | Gemini 3.1 Pro (Preview) | 100.0% | $0.0032 | 5.9s | 100% |
| 45 | o4 Mini | 100.0% | $0.0006 | 17.7s | 100% |
| 46 | Claude Opus 4.6 | 87.8% | $0.0006 | 2.4s | 34% |
| 47 | Claude Opus 4.5 | 88.9% | $0.0009 | 2.2s | 37% |
| 48 | ByteDance Seed 1.6 | 90.0% | $0.0004 | 5.3s | 40% |
| 49 | Z.AI GLM 4.7 | 100.0% | $0.0008 | 17.4s | 100% |
| 50 | Writer: Palmyra X5 | 88.9% | $0.0002 | 5.2s | 37% |
| 51 | DeepSeek V3 (2025-03-24) | 82.2% | $0.0000 | 1.7s | 24% |
| 52 | Claude 3 Haiku | 87.8% | $0.0000 | 5.4s | 34% |
| 53 | Llama 3.1 70B | 80.0% | $0.0001 | 533ms | 20% |
| 54 | Claude 3.5 Sonnet | 88.9% | $0.0004 | 5.4s | 37% |
| 55 | Gemini 2.5 Pro | 100.0% | $0.0040 | 4.9s | 100% |
| 56 | Mistral Small 3.2 24B | 77.8% | $0.0000 | 505ms | 17% |
| 57 | GPT-4o, May 13th (temp=1) | 88.9% | $0.0004 | 5.9s | 37% |
| 58 | Z.AI GLM 5 | 100.0% | $0.0015 | 15.9s | 100% |
| 59 | GPT-4o Mini (temp=0) | 88.9% | $0.0000 | 7.6s | 37% |
| 60 | GPT-4o, May 13th (temp=0) | 88.9% | $0.0004 | 6.2s | 37% |
| 61 | Arcee AI: Trinity Large (Preview) | 77.8% | $0.0000 | 1.2s | 17% |
| 62 | Gemini 3 Pro (Preview) | 100.0% | $0.0041 | 5.3s | 100% |
| 63 | Grok 4.1 Fast | 81.1% | $0.0001 | 2.6s | 22% |
| 64 | Z.AI GLM 4.6 | 100.0% | $0.0011 | 18.3s | 100% |
| 65 | Hermes 3 405B | 85.6% | $0.0000 | 6.3s | 30% |
| 66 | Ministral 3 3B | 75.6% | $0.0000 | 365ms | 14% |
| 67 | DeepSeek V3.1 | 78.9% | $0.0000 | 2.3s | 18% |
| 68 | Grok 4 | 100.0% | $0.0042 | 6.6s | 100% |
| 69 | Claude Sonnet 4.6 | 77.8% | $0.0004 | 1.1s | 17% |
| 70 | Qwen 3.5 397B A17B | 100.0% | $0.0021 | 15.7s | 100% |
| 71 | Arcee AI: Trinity Mini | 75.6% | $0.0001 | 2.1s | 14% |
| 72 | GPT-4o Mini (temp=1) | 88.9% | $0.0000 | 11.3s | 37% |
| 73 | Ministral 8B | 70.0% | $0.0000 | 307ms | 8% |
| 74 | Ministral 3B | 66.7% | $0.0000 | 297ms | 6% |
| 75 | Ministral 3 8B | 66.7% | $0.0000 | 372ms | 6% |
| 76 | Claude Opus 4 | 88.9% | $0.0017 | 6.2s | 37% |
| 77 | Rocinante 12B | 57.8% | $0.0000 | 2.8s | 1% |
| 78 | Cohere Command R+ (Aug. 2024) | 46.7% | $0.0002 | 446ms | 0% |
| 79 | WizardLM 2 8x22B | 48.9% | $0.0001 | 6.1s | 0% |
| 80 | Mistral Large | 41.1% | $0.0010 | 5.9s | 0% |