Community Benchmarks
Real performance numbers from oMLX users around the world. Submit yours from the app (v0.2.6+).
| # | Chip | RAM | Model | Quant | Ctx (tokens) | PP (prompt processing) tok/s | TG (generation) tok/s | Date (YY-MM-DD) ↓ |
|---|---|---|---|---|---|---|---|---|
| 39021 | M4 (10c) | 24 GB | Qwen3.5-4B | unknown | 1k | 252.9 | 2.4 | 26-03-12 |
| 39022 | M3 Max (30c) | 36 GB | Qwen3.5-27B | 4bit | 4k | 153.5 | 15.2 | 26-03-12 |
| 39023 | M3 Max (30c) | 36 GB | Qwen3.5-27B | 4bit | 1k | 164.3 | 17.5 | 26-03-12 |
| 39024 | M5 (10c) | 32 GB | Qwen3.5-9B | 4bit | 16k | 222.7 | 21.8 | 26-03-12 |
| 39025 | M5 (10c) | 32 GB | Qwen3.5-9B | 4bit | 1k | 235.9 | 26.1 | 26-03-12 |
| 39026 | M3 Max (30c) | 36 GB | LFM2-24B-A2B | 4bit | 4k | 1,436 | 106.1 | 26-03-12 |
| 39027 | M3 Max (30c) | 36 GB | LFM2-24B-A2B | 4bit | 1k | 1,235 | 118.1 | 26-03-12 |
| 39028 | M4 Pro (16c) | 24 GB | Qwen3.5-9B | 8bit | 64k | 276.5 | 15.6 | 26-03-12 |
| 39029 | M4 Pro (16c) | 24 GB | Qwen3.5-9B | 8bit | 32k | 299.2 | 20.3 | 26-03-12 |
| 39030 | M1 (8c) | 16 GB | Llama-3.2-3B-Instruct | 4bit | 4k | 292.3 | 23.8 | 26-03-12 |
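For reference, the PP and TG columns are plain throughput ratios: tokens handled in each phase divided by the wall-clock time that phase took. A minimal sketch of that calculation (the function name and example timings here are illustrative, not oMLX's actual schema or measurements):

```python
# Hypothetical sketch: how tok/s figures like those above are typically
# derived -- tokens processed divided by elapsed wall-clock seconds.

def tokens_per_second(num_tokens: int, elapsed_s: float) -> float:
    """Throughput rounded to one decimal place, matching the table's precision."""
    return round(num_tokens / elapsed_s, 1)

# Example run: a 1024-token prompt prefilled in 4.23 s,
# then 256 tokens generated in 10.75 s.
pp = tokens_per_second(1024, 4.23)   # prompt processing (prefill) speed
tg = tokens_per_second(256, 10.75)   # token generation (decode) speed
print(pp, tg)
```

Note that PP and TG are measured separately because prefill is compute-bound and batched, while decode is memory-bandwidth-bound, which is why PP is usually an order of magnitude faster.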