# Poor Paul's Benchmark Results
Community-submitted LLM inference benchmark data from real consumer, prosumer, and small-business hardware.
Each row is one normalized benchmark result: a specific model × quantization × hardware × configuration combination with measured performance metrics.
Use this to:
- Compare throughput, latency, and power efficiency across GPU models and quantizations
- Study how concurrency and context length scale on different hardware
- Build leaderboards and dashboards, and make data-driven GPU purchasing decisions
- Query via AI: any MCP client connected to `mcp.poorpaul.dev` can answer questions from this data directly
## Files

| File | Description |
|---|---|
| `data/results_*.jsonl` | Raw benchmark submissions, one file per run, appended continuously |
| `llms.txt` | Machine-readable summary for LLM context injection |
## Quick start

```python
import pandas as pd
from datasets import load_dataset

# Stream all rows (recommended for large queries)
ds = load_dataset("paulplee/ppb-results", streaming=True, split="train")
df = pd.DataFrame(iter(ds))

# Filter to throughput rows for a specific GPU
tput = df[
    (df["gpu_name"] == "NVIDIA GeForce RTX 5090")
    & (df["runner_type"] == "llama-bench")
][["model_base", "quant", "n_ctx", "throughput_tok_s"]]
```
Or via DuckDB directly from Hugging Face:

```sql
SELECT gpu_name, model_base, quant, concurrent_users,
       AVG(throughput_tok_s) AS mean_tok_s,
       COUNT(*) AS n
FROM read_json_auto('hf://datasets/paulplee/ppb-results/data/*.jsonl')
WHERE runner_type = 'llama-bench'
GROUP BY ALL
ORDER BY mean_tok_s DESC;
```
## Schema (v0.9.0)

All columns are present on every row. Fields that do not apply to a given runner are null.

### Model identity

| Column | Type | Description |
|---|---|---|
| `run_type` | string | `quantitative`, `qualitative`, or `all` |
| `model` | string | Full model path (e.g. `unsloth/Qwen3.5-9B-GGUF/Qwen3.5-9B-Q8_0.gguf`) |
| `model_base` | string | Base model name without quant suffix (e.g. `Qwen3.5-9B`) |
| `quant` | string | Quantization format (e.g. `Q4_K_M`, `Q8_0`, `BF16`) |
| `model_org` | string\|null | HuggingFace organisation (e.g. `unsloth`); null for local paths |
| `model_repo` | string\|null | Full HF org/repo string; null for local paths |
| `runner_type` | string | Benchmark backend: `llama-bench`, `llama-server`, or `llama-server-loadtest` |
### LLM engine

| Column | Type | Description |
|---|---|---|
| `llm_engine_name` | string\|null | Inference engine (e.g. `llama.cpp`) |
| `llm_engine_version` | string\|null | Engine version with build hash (e.g. `b5063 (58ab80c3)`) |
### Hardware

| Column | Type | Description |
|---|---|---|
| `gpu_name` | string\|null | Primary GPU model name |
| `gpu_vram_gb` | float\|null | Primary GPU VRAM in GB |
| `gpu_driver` | string\|null | GPU driver version |
| `gpu_count` | int | Number of GPUs used |
| `gpu_names` | string\|null | Comma-joined list of all GPU names (multi-GPU runs) |
| `gpu_total_vram_gb` | float\|null | Total VRAM across all GPUs |
| `unified_memory` | bool\|null | `true` for Apple Silicon — GPU and CPU share the same memory pool |
| `gpu_compute_capability` | string\|null | CUDA compute capability (e.g. `12.0` for consumer Blackwell); null for non-CUDA |
| `gpu_pcie_gen` | int\|null | PCIe generation (e.g. 5); null for unified-memory platforms |
| `gpu_pcie_width` | int\|null | PCIe link width in lanes (e.g. 16); null for unified-memory platforms |
| `gpu_power_limit_w` | float\|null | Configured TDP limit in Watts (from NVML); null for non-NVIDIA |
| `backends` | string\|null | Compute backend with version (e.g. `CUDA 13.0`, `Metal`, `CPU`) |
| `cpu_model` | string\|null | CPU model name |
### Benchmark configuration

| Column | Type | Description |
|---|---|---|
| `n_ctx` | int\|null | Context window size in tokens |
| `n_batch` | int\|null | Batch size for prompt processing |
| `split_mode` | string\|null | Multi-GPU split strategy (`layer`, `row`, `none`); null for single-GPU |
| `tensor_split` | string\|null | Per-GPU VRAM weight string (e.g. `"1,1"`); null for single-GPU |
| `concurrent_users` | int\|null | Number of simulated parallel users. For `llama-server-loadtest`, each row is one concurrency level from the measured curve. |
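Because each load-test row is a single point on the concurrency curve, a scaling curve can be rebuilt with a plain groupby. A minimal pandas sketch, reusing the loading pattern from the Quick start; the grouping keys shown are one reasonable choice, not a required convention:

```python
import pandas as pd
from datasets import load_dataset

# Build the flattened results DataFrame (same loading pattern as the Quick start).
ds = load_dataset("paulplee/ppb-results", streaming=True, split="train")
df = pd.DataFrame(iter(ds))

# Load-test rows carry one concurrency level per row, so grouping on
# concurrent_users reconstructs the throughput scaling curve.
curve = (
    df[df["runner_type"] == "llama-server-loadtest"]
    .groupby(["gpu_name", "model_base", "quant", "n_ctx", "concurrent_users"])["throughput_tok_s"]
    .mean()
    .reset_index()
    .sort_values(["gpu_name", "model_base", "quant", "n_ctx", "concurrent_users"])
)
print(curve.head(20))
```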
### Workload

| Column | Type | Description |
|---|---|---|
| `task_type` | string\|null | Workload category (e.g. `text-generation`, `context-rot-niah`) |
| `prompt_dataset` | string\|null | Prompt source (e.g. `sharegpt-v3`); null for `llama-bench` |
| `num_prompts` | int\|null | Prompts sent per run; null for `llama-bench` |
| `n_predict` | int\|null | Max tokens generated per prompt; null for `llama-bench` |
### Performance — throughput

| Column | Type | Description |
|---|---|---|
| `throughput_tok_s` | float\|null | Tokens per second (primary throughput metric) |
| `vram_cliff_tokens` | int\|null | Largest `n_ctx` that loaded without OOM during pre-flight discovery |
### Performance — power

| Column | Type | Description |
|---|---|---|
| `avg_power_w` | float\|null | Average GPU power draw in Watts |
| `max_power_w` | float\|null | Peak GPU power draw in Watts |
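Throughput and average power land on the same row, so a rough energy-efficiency comparison is possible. A minimal sketch, assuming `df` is the DataFrame built in the Quick start; `tok_s_per_w` is a derived metric, not a dataset column:

```python
# df: the flattened results DataFrame from the Quick start snippet.
perf = df.dropna(subset=["throughput_tok_s", "avg_power_w"]).copy()

# Derived efficiency metric: tokens generated per second per watt of GPU power.
perf["tok_s_per_w"] = perf["throughput_tok_s"] / perf["avg_power_w"]

efficiency = (
    perf.groupby(["gpu_name", "model_base", "quant"])["tok_s_per_w"]
        .mean()
        .sort_values(ascending=False)
)
print(efficiency.head(10))
```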
### Performance — thermal

| Column | Type | Description |
|---|---|---|
| `avg_gpu_temp_c` | float\|null | Average GPU temperature (°C) |
| `max_gpu_temp_c` | float\|null | Peak GPU temperature (°C) |
| `avg_cpu_temp_c` | float\|null | Average CPU temperature (°C) |
| `max_cpu_temp_c` | float\|null | Peak CPU temperature (°C) |
| `avg_fan_speed_rpm` | float\|null | Average fan speed (RPM) |
| `max_fan_speed_rpm` | float\|null | Peak fan speed (RPM) |
### Performance — user experience (server runners only)

| Column | Type | Description |
|---|---|---|
| `avg_ttft_ms` | float\|null | Average Time-To-First-Token (ms) |
| `p50_ttft_ms` | float\|null | Median TTFT (ms) |
| `p99_ttft_ms` | float\|null | 99th-percentile TTFT (ms) |
| `avg_itl_ms` | float\|null | Average Inter-Token Latency (ms) |
| `p50_itl_ms` | float\|null | Median ITL (ms) |
| `p99_itl_ms` | float\|null | 99th-percentile ITL (ms) |
### Qualitative evaluation

Populated when `run_type` is `qualitative` or `all`. All null for pure quantitative runs.

| Column | Type | Description |
|---|---|---|
| `context_rot_score` | float\|null | Mean accuracy across all (length × depth) long-context recall cases |
| `context_rot_accuracy_by_length` | string\|null | JSON `{haystack_length: accuracy}` map |
| `context_rot_accuracy_by_depth` | string\|null | JSON `{depth_pct: accuracy}` map |
| `tool_selection_accuracy` | float\|null | Fraction of cases with the correct tool name selected |
| `parameter_accuracy` | float\|null | Fraction of cases with all required arguments matching ground truth |
| `parameter_hallucination_rate` | float\|null | Fraction of cases with invented arguments not in the schema |
| `parse_success_rate` | float\|null | Fraction of cases with parseable tool-call JSON |
| `overall_tool_accuracy` | float\|null | Geometric mean of tool selection accuracy × parameter accuracy |
| `knowledge_accuracy_mean` | float\|null | Mean fraction of factual claims judged consistent with common knowledge |
| `knowledge_accuracy_std` | float\|null | Standard deviation of per-prompt knowledge-accuracy scores |
| `answer_relevancy_mean` | float\|null | Mean judge-rated response relevancy (0–1) |
| `coherence_mean` | float\|null | Mean judge-rated coherence (0–1) |
| `quality_composite_score` | float\|null | Mean of knowledge accuracy, relevancy, and coherence |
| `memory_accuracy` | float\|null | LongMemEval recall accuracy (0–1); null when MT-Bench mode was used |
| `mt_bench_score` | float\|null | MT-Bench score (1–10 scale); null when LongMemEval mode was used |
| `cases_evaluated` | int\|null | Number of evaluation cases that completed |
| `cases_skipped_context` | int\|null | Cases skipped because context exceeded `vram_cliff_tokens` |
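Since `overall_tool_accuracy` is documented as the geometric mean of the two tool-calling scores, it should reduce to the square root of their product. A small sketch for recomputing it after filtering, again assuming `df` from the Quick start:

```python
import numpy as np

# df: the flattened results DataFrame from the Quick start snippet.
tool = df.dropna(subset=["tool_selection_accuracy", "parameter_accuracy"]).copy()

# Two-term geometric mean, matching the column description above.
tool["overall_recomputed"] = np.sqrt(
    tool["tool_selection_accuracy"] * tool["parameter_accuracy"]
)

# This should track the published overall_tool_accuracy values closely.
print(tool[["overall_tool_accuracy", "overall_recomputed"]].head())
```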
### Blob columns

| Column | Type | Description |
|---|---|---|
| `qualitative` | string\|null | Full qualitative result payload as a JSON string |
| `quantitative` | string\|null | Full quantitative result payload as a JSON string |
| `meta` | string\|null | Reproducibility hints (e.g. `quality_prompts_cache_hash`) as a JSON string |
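The blob columns are plain JSON strings, so they can be parsed on demand rather than expanded up front. A minimal sketch, assuming `df` from the Quick start; the inner keys depend on the run and schema version and are not guaranteed here:

```python
import json

# df: the flattened results DataFrame from the Quick start snippet.
qual_rows = df[df["qualitative"].notna()].copy()

# Each blob is a JSON string; inspect the parsed object before relying on
# any particular field, since the payload layout is not part of this schema.
qual_rows["qualitative_obj"] = qual_rows["qualitative"].apply(json.loads)

if len(qual_rows):
    print(sorted(qual_rows["qualitative_obj"].iloc[0].keys()))
```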
### OS / system context

| Column | Type | Description |
|---|---|---|
| `os_system` | string\|null | OS family: `Linux`, `Darwin`, `Windows` |
| `os_release` | string\|null | Kernel / OS release string |
| `os_machine` | string\|null | CPU architecture (e.g. `x86_64`, `arm64`) |
| `os_distro` | string\|null | Distribution name (e.g. `Ubuntu`, `macOS`) |
| `os_distro_version` | string\|null | Distribution version (e.g. `24.04`, `15.5`) |
| `cpu_cores` | int\|null | Number of logical CPU cores |
| `ram_total_gb` | float\|null | Total system RAM in GB |
### Submission metadata

| Column | Type | Description |
|---|---|---|
| `submitter` | string\|null | Optional public display name of the contributor |
| `timestamp` | string\|null | ISO 8601 UTC time the benchmark run produced the row |
| `submitted_at` | string\|null | ISO 8601 UTC time the row was uploaded |
### Provenance and deduplication

| Column | Type | Description |
|---|---|---|
| `schema_version` | string | Schema version at time of flattening (`0.9.0`) |
| `benchmark_version` | string | PPB software version that produced the row |
| `suite_run_id` | string\|null | UUID shared by all rows from the same `ppb` invocation |
| `submission_id` | string\|null | UUID assigned during upload |
| `row_id` | string | UUID uniquely identifying this row |
| `machine_fingerprint` | string | SHA-256 of hardware profile fields (anonymous machine identity) |
| `run_fingerprint` | string | SHA-256 of benchmark configuration + machine fingerprint |
| `result_fingerprint` | string | SHA-256 of run identity + measured metrics — uniquely identifies one result |
| `source_file_sha256` | string\|null | SHA-256 of the source JSONL file |
### Extensibility

| Column | Type | Description |
|---|---|---|
| `tags` | string\|null | Free-form JSON string for arbitrary metadata from the suite TOML |
## Null value guide

Many columns are runner-specific. Expected nulls by runner type:

| Column group | `llama-bench` | `llama-server` | `llama-server-loadtest` |
|---|---|---|---|
| TTFT / ITL metrics | null | populated | populated |
| `prompt_dataset`, `num_prompts`, `n_predict` | null | populated | populated |
| `concurrent_users` | null | populated | populated (one row per level) |
| `gpu_pcie_gen`, `gpu_pcie_width` | null on Apple Silicon | null on Apple Silicon | null on Apple Silicon |
| `unified_memory` | null on NVIDIA | null on NVIDIA | null on NVIDIA |
| Qualitative columns | null | null | null |

Qualitative columns are populated only when `run_type` is `qualitative` or `all`.
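In practice this means latency analyses should restrict themselves to the server runners (or simply drop null metrics) before aggregating. A minimal sketch, assuming `df` from the Quick start:

```python
# df: the flattened results DataFrame from the Quick start snippet.

# TTFT/ITL are only populated for server runners, so select them explicitly...
server = df[df["runner_type"].isin(["llama-server", "llama-server-loadtest"])]

# ...or, equivalently, drop rows where the metric of interest is null.
ttft = server.dropna(subset=["p99_ttft_ms", "concurrent_users"])

p99_by_load = ttft.groupby(["gpu_name", "concurrent_users"])["p99_ttft_ms"].median()
print(p99_by_load.head())
```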
## Deduplication

Use fingerprints to control for duplicates in analysis:

```python
# Exact duplicate rows (same result, same machine, same run)
df.drop_duplicates(subset=["result_fingerprint"], inplace=True)

# Latest run per (gpu, model, quant, n_ctx, concurrent_users) config
latest = (
    df.sort_values("timestamp")
      .drop_duplicates(subset=["run_fingerprint"], keep="last")
)
```
## Ecosystem

| Component | Description |
|---|---|
| Benchmark tool | poor-pauls-benchmark — run benchmarks and contribute results |
| MCP server | ppb-mcp — lets any MCP-compatible LLM client query this dataset directly |
| Analytics | poorpaul.dev/insights — leaderboard and visual analysis |

Connect any MCP client to https://mcp.poorpaul.dev/mcp to query this data conversationally.
## Contributing results

- Clone poor-pauls-benchmark
- Configure `suites/my_gpu.toml` with your hardware and models
- Run: `uv run ppb.py all suites/my_gpu.toml`
- Results are pushed here automatically — no PR required
No hardware contribution is too small. Every GPU tier that's missing from this dataset is a blind spot for the community.
## License
Dataset content: CC BY 4.0 — contributions are attributed to their submitters.
Tooling: MIT — see the benchmark repository.
Third-party evaluation data included in rows:
- BFCL v4 evaluation cases © UC Berkeley, CC BY 4.0
- MT-Bench questions © LMSYS, MIT licence
- ShareGPT prompts under their original licence