LLMxAICC Daily Performance Baselines
2023-10-08
# Preface

update@2025.3.14: For the latest qwen-series models, inference has been validated on 32GB chips with the MindIE framework; performance references are provided below.

update@2024.1.16: Out-of-the-box inference adaptation for the community's most popular MoE models has been completed on AICC (with DeepSeek-MoE-16B as the example); performance references are provided below.

update@2023.10.18: For the current 64GB/32GB chips, the CANN 7.0 + MindSpore 2.2 / PyTorch 2.1 images are recommended (all fully listed in the AICC image catalog); older images may have performance and accuracy issues.

update@2023.10.17: 64GB + CANN 7.0 + PyTorch 2.1 + Transformers 4.32.0 now supports LLMs out of the box (e.g. qwen).

first@2023.10.9: This article presents performance measurements made by the Wuhan AICC operations team on each hardware architecture, for reference.

* The performance figures in this article were reproduced by the AICC operations team and *do not represent a model's best performance on the chip; peak performance depends on the parallelism strategy, software versions, and similar factors, so treat officially published data as authoritative*
* Some performance results are still a work in progress
* For models without an explicitly specified image, select the latest matching MindSpore/PyTorch image from the image catalog
* For models not covered here, contact the Wuhan AICC operations team with feedback

# Reference Materials

* [Image repository](http://mirrors.cn-central-221.ovaijisuan.com): the latest matching images
* [MindFormers repository](https://gitee.com/mindspore/mindformers/tree/dev/research/baichuan2): the MindSpore large-model development toolkit
* [MindFormers manual](https://mindformers.readthedocs.io/zh_CN/latest/): MindFormers manuals and APIs
* [AscendPyTorch](https://gitee.com/ascend/pytorch): the Ascend + PyTorch code repository
* [AscendSpeed](https://gitee.com/ascend/AscendSpeed): the Ascend + PyTorch large-model development toolkit
* [Ascend ModelZoo](https://www.hiascend.com/software/modelzoo/big-models?activeTab=computer-vision): the repository of models already adapted for Ascend

# Training Scenarios

| Model | Chip | Status | Framework | Cards | Performance | Environment | Image | Reference |
| ----- | ---- | ------ | --------- | ----- | ----------- | ----------- | ----- | --------- |
| ChatGLM-6B | 32GB | OK | MindSpore | 1*8 | - | MA/BMS | - | - |
| ChatGLM-6B | 64GB | OK | PyTorch | 1*8 | - | BMS | [HERE](/detail/104.html) | [HERE](/detail/104.html#Chatglm-6B) |
| Baichuan2-7B | 32GB | OK | MindSpore | 1*8 | - | MA/BMS | - | - |
| Baichuan2-13B | 64GB | OK | MindSpore | 1*8 | 563.2 tokens/s/p | MA/BMS | [HERE](/detail/107.html) | [HERE](https://gitee.com/mindspore/mindformers/blob/dev/research/baichuan2/baichuan2.md#64GB-2) |
| Baichuan2-7B | 64GB | OK | PyTorch | 1*8 | - | BMS | [HERE](/detail/104.html) | [HERE](/detail/104.html) |
| Baichuan2-13B | 64GB | OK | PyTorch | 1*8 | 1124.6 tokens/s/p | BMS | [HERE](/detail/105.html) | [HERE](https://gitee.com/ascend/AscendSpeed/blob/master/examples/baichuan/README.md#accuracy-of-the-loss-1) |

# Inference Scenarios

| Model | Chip | Status | Framework | Cards | Performance | Environment | Image | Reference |
| ----- | ---- | ------ | --------- | ----- | ----------- | ----------- | ----- | --------- |
| ChatGLM-6B | 32GB | OK | MindSpore | 1*1 | 22.4 tokens/s/p | MA/BMS | - | - |
| ChatGLM-6B | 32GB | OK | PyTorch | 1*1 | 0.24 tokens/s/p | MA/BMS | [HERE](/detail/69.html) | - |
| ChatGLM2-6B | 64GB | OK | MindSpore | 1*1 | 24.1 tokens/s/p | MA | [HERE](/detail/103.html) | - |
| ChatGLM3-6B | 32GB | OK | MindSpore | 1*1 | 28.86 tokens/s/p | MA | - | - |
| Baichuan2-7B | 32GB | OK | MindSpore | 1*1 | - | MA/BMS | [HERE](/detail/103.html) | - |
| Baichuan2-7B | 64GB | OK | MindSpore | 1*1 | - | MA/BMS | [HERE](/detail/103.html) | - |
| Baichuan2-7B | 64GB | OK | PyTorch | 1*1 | - | MA/BMS | [HERE](/detail/104.html) | [HERE](/detail/104.html#ChatGLM-6B) |
| Baichuan2-13B | 64GB | OK | MindSpore | 1*1 | 14.1 tokens/s/p | MA | [HERE](/detail/107.html) | [HERE](https://gitee.com/mindspore/mindformers/blob/dev/research/baichuan2/baichuan2.md#64GB-3) |
| Baichuan2-13B | 64GB | OK | PyTorch | 1*1 | - | BMS | - | - |
| QWEN-14B | 64GB | OK | PyTorch | 1*1 | - | BMS | [HERE](/detail/106.html) | - |
| QWEN-7B | 64GB | OK | PyTorch | 1*1 | - | BMS | [HERE](/detail/106.html) | - |
| ChatGLM3-6B | 64GB | OK | PyTorch | 1*1 | 16.48 tokens/s/p | BMS | - | - |
| Yi-6B-200K | 64GB | OK | PyTorch | 1*1 | 14.67 tokens/s/p | BMS | [HERE](/detail/106.html) | - |
| DeepSeek-MoE-16B | 64GB | OK | PyTorch | 1*1 | 4.01 tokens/s/p | BMS | [HERE](/detail/106.html) | - |
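A note on units for the tables above: "tokens/s/p" is read here as tokens per second per card (NPU); that reading is an assumption, since the source tables do not define the suffix. Under it, a job's aggregate throughput scales with the card count, as in this minimal sketch:

```python
# Assumption: "tokens/s/p" is per-card (per-NPU) throughput, so a job's
# aggregate throughput is the per-card figure times the number of cards.
def aggregate_throughput(tokens_per_s_per_card: float, cards: int) -> float:
    return tokens_per_s_per_card * cards

# Example: Baichuan2-13B MindSpore training above, 563.2 tokens/s/p on 1*8 cards
print(aggregate_throughput(563.2, 8))  # -> 4505.6 tokens/s for the whole node
```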
# LoRA

| Model | Chip | Status | Framework | Cards | Performance | Environment | Image | Reference |
| ----- | ---- | ------ | --------- | ----- | ----------- | ----------- | ----- | --------- |
| ChatGLM-6B | 32GB | OK | MindSpore | 1*1 | - | MA/BMS | - | - |
| ChatGLM-6B | 32GB | OK | PyTorch | 1*1 | - | MA/BMS | [HERE](/detail/81.html) | - |
| ChatGLM-6B | 64GB | OK | PyTorch | 1*1 | - | BMS | [HERE](/detail/104.html) | [HERE](/detail/104.html#ChatGLM-6B) |
| Baichuan2-7B | 32GB | OK | MindSpore | 1*1 | - | MA/BMS | - | - |

# Model Mirrors

For well-known reasons, pulling models directly from HuggingFace within mainland China may fail due to network restrictions. In our testing, models can instead be pulled from third-party platforms such as ModelScope. Wuhan AICC also hosts internal mirrors of common models, which can be pulled directly with the [obsutil](https://support.huaweicloud.com/utiltg-obs/obs_11_0003.html) tool (a minimal pull sketch follows the table).

| Model | Source | Version | Recommended Framework | Mirror Path (OBS) |
| ----- | ------ | ------- | --------------------- | ----------------- |
| ChatGLM-6B | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/huggingface/ChatGLM/ChatGLM-6B/ |
| ChatGLM2-6B | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/huggingface/ChatGLM/ChatGLM2-6B/ |
| ChatGLM3-6B | ModelScope | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/huggingface/ChatGLM/ChatGLM3-6B/ |
| Yi-6B-200K | ModelScope | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/huggingface/Yi-6B-200K/ |
| baichuan-7B | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/baichuan-7B/ |
| Baichuan-13B-Chat | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/huggingface/Baichuan/Baichuan-13B-Chat/ |
| Baichuan-13B-Base | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/huggingface/Baichuan/Baichuan-13B-Base/ |
| Baichuan2-7B-Base | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/Baichuan2-7B-Base/ |
| Baichuan2-13B-Base | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/Baichuan2-13B-Base/ |
| LLaMA-7B | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/llama-7b-hf/ |
| LLaMA-13B | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/llama-13b-hf/ |
| LLaMA-33B | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/llama-33b-hf/ |
| LLaMA-65B | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/llama-65b-hf/ |
| LLaMA2-7B | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/llama-2-7b-hf/ |
| LLaMA2-13B | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/Llama-2-13b-hf/ |
| LLaMA2-70B-Chat | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/Llama-2-70b-chat-ms/ |
| 鹏城.脑海-7B | OpenI | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/huggingface/PengCheng-7B/ |
| SAM | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/huggingface/SAM/ |
| Qwen-72B | ModelScope | 1.0.4 | PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/Qwen-72B/ |
| QWEN-14B-Chat | ModelScope | 1.0.4 | PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/Qwen-14B/ |
| QWEN-7B-Chat | ModelScope | 1.0.4 | PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/Qwen-7B/ |
| Qwen1.5-72B | ModelScope | 1.0.4 | PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/Qwen1.5-72B/ |
| Vicuna-7B | Huggingface | 1.5 | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/huggingface/vicuna/ |
| Aquila-7B | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/Aquila-7B/ |
| Mixtral-8x7B | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/Mixtral-8x7B-v0.1/ |
| Bloom-7B | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/bloom-7b1/ |
| Internlm-7B | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/internlm-7b/ |
| Meta-Llama-3-8B-Instruct | ModelScope | - | PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/Meta-Llama-3-8B-Instruct/ |
| Meta-Llama-3-70B-Instruct | ModelScope | - | PyTorch | obs://obs-whaicc-fae-public/checkpoint/ModelLink/Meta-Llama-3-70B-Instruct/ |
| Qwen-VL-Chat | Huggingface | - | PyTorch | obs://obs-whaicc-fae-public/checkpoint/huggingface/Qwen/Qwen-VL-Chat/ |
| Qwen1.5-0.5B | Huggingface | - | PyTorch | obs://obs-whaicc-fae-public/checkpoint/huggingface/Qwen/Qwen1.5-0.5B/ |
| Qwen1.5-7B | Huggingface | - | PyTorch | obs://obs-whaicc-fae-public/checkpoint/huggingface/Qwen/Qwen1.5-7B/ |
| gpt2-xl | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/huggingface/gpt/gpt2-xl/ |
| gpt2 | Huggingface | - | MindSpore/PyTorch | obs://obs-whaicc-fae-public/checkpoint/huggingface/gpt/gpt2/ |
| CodeFuse-DeepSeek-33B | ModelScope | - | PyTorch | obs://obs-whaicc-fae-public/checkpoint/huggingface/CodeFuse-DeepSeek-33B/ |
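The promised pull sketch, assuming obsutil is installed and already configured with valid credentials (`obsutil config`); the local target directory below is hypothetical and any OBS path from the table can be substituted:

```python
import subprocess

# Illustrative source/target; swap in any OBS path from the table above.
OBS_PATH = "obs://obs-whaicc-fae-public/checkpoint/huggingface/ChatGLM/ChatGLM3-6B/"
LOCAL_DIR = "/cache/models/ChatGLM3-6B"  # hypothetical local directory

# "obsutil cp -r -f" copies the folder recursively and overwrites existing
# local files without prompting; obsutil must be configured beforehand
# (see the obsutil documentation linked above).
subprocess.run(["obsutil", "cp", OBS_PATH, LOCAL_DIR, "-r", "-f"], check=True)
```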
# qwen-Series Inference Performance

Test environment: chip 32GB; inference framework: MindIE; inference image: [HERE](/detail/163.html). How the throughput columns relate to the other fields is sketched in the note after the table.

| Model | Cards | Batch size | In_seq | Out_seq | Total time (s) | First token time (ms) | Non-first token time (ms) | Non-first token throughput (tokens/s) | Throughput (tokens/s) | Non-first token throughput average (tokens/s) | E2E throughput average (tokens/s) |
| ----- | ----- | ---------- | ------ | ------- | -------------- | --------------------- | ------------------------- | ------------------------------------- | --------------------- | --------------------------------------------- | --------------------------------- |
| qwen2.5_7b | 1 | 1 | 1024 | 1024 | 19.88901 | 115.74 | 19.25 | 51.94805 | 51.48572 | 51.94805 | 51.48572 |
| qwen2.5_7b | 1 | 2 | 1024 | 1024 | 22.16863 | 225.41 | 21.37 | 93.58914 | 92.38281 | 93.58914 | 92.38281 |
| qwen2.5_7b | 1 | 4 | 1024 | 1024 | 24.7084 | 438.2 | 23.64 | 169.2047 | 165.7736 | 169.2047 | 165.7736 |
| qwen2.5_7b | 1 | 8 | 1024 | 1024 | 30.9415 | 878.26 | 29.29 | 273.1308 | 264.7577 | 273.1308 | 264.7577 |
| qwen2.5_7b | 1 | 16 | 1024 | 1024 | 43.28965 | 1792.08 | 40.46 | 395.4523 | 378.4738 | 395.4523 | 378.4738 |
| qwen2.5_7b | 1 | 1 | 2048 | 2048 | 43.2857 | 260.35 | 20.94 | 47.75549 | 47.31355 | 47.75549 | 47.31355 |
| qwen2.5_7b | 1 | 2 | 2048 | 2048 | 47.88171 | 496.78 | 23.08 | 86.65511 | 85.54415 | 86.65511 | 85.54415 |
| qwen2.5_7b | 1 | 4 | 2048 | 2048 | 59.05803 | 999.85 | 28.29 | 141.3927 | 138.711 | 141.3927 | 138.711 |
| qwen2.5_7b | 1 | 8 | 2048 | 2048 | 81.53607 | 2031.56 | 38.76 | 206.3983 | 200.9417 | 206.3983 | 200.9417 |
| qwen2.5_7b | 1 | 16 | 2048 | 2048 | 127.6667 | 4266.57 | 60.18 | 265.8691 | 256.6683 | 265.8691 | 256.6683 |
| qwen2.5_7b | 1 | 1 | 4096 | 4096 | 96.92498 | 635.39 | 23.45 | 42.64392 | 42.25949 | 42.64392 | 42.25949 |
| qwen2.5_7b | 1 | 2 | 4096 | 4096 | 115.7366 | 1242.9 | 27.89 | 71.71029 | 70.78141 | 71.71029 | 70.78141 |
| qwen2.5_7b | 1 | 4 | 4096 | 4096 | 159.5849 | 2513.8 | 38.28 | 104.4932 | 102.6663 | 104.4932 | 102.6663 |
| qwen2.5_7b | 1 | 8 | 4096 | 4096 | 245.2499 | 5421.17 | 58.49 | 136.7755 | 133.6107 | 136.7755 | 133.6107 |
| qwen2.5_7b | 2 | 16 | 4096 | 4096 | 247.5071 | 7188.08 | 58.6 | 273.0375 | 264.7843 | 273.0375 | 264.7843 |
| qwen2.5_7b | 1 | 1 | 8192 | 8192 | 243.223 | 1801.97 | 29.41 | 34.00204 | 33.68102 | 34.00204 | 33.68102 |
| qwen2.5_7b | 1 | 2 | 8192 | 8192 | 324.7464 | 3483.73 | 39.15 | 51.08557 | 50.45167 | 51.08557 | 50.45167 |
| qwen2.5_7b | 1 | 4 | 8192 | 8192 | 503.4496 | 7414.25 | 60.49 | 66.12663 | 65.08695 | 66.12663 | 65.08695 |
| qwen2.5_7b | 2 | 16 | 8192 | 8192 | 845.1011 | 18288.82 | 100.86 | 158.6357 | 155.0962 | 158.6357 | 155.0962 |
| qwen2.5_14b | 1 | 1 | 1024 | 1024 | 38.01625 | 221.63 | 36.86 | 27.12968 | 26.93585 | 27.12968 | 26.93585 |
| qwen2.5_14b | 1 | 2 | 1024 | 1024 | 42.7679 | 423.38 | 41.31 | 48.41443 | 47.88639 | 48.41443 | 47.88639 |
| qwen2.5_14b | 1 | 4 | 1024 | 1024 | 51.12755 | 859.07 | 49 | 81.63265 | 80.11336 | 81.63265 | 80.11336 |
| qwen2.5_14b | 1 | 1 | 2048 | 2048 | 84.29012 | 493.76 | 40.87 | 24.46782 | 24.29703 | 24.46782 | 24.29703 |
| qwen2.5_14b | 1 | 2 | 2048 | 2048 | 98.25019 | 1008.07 | 47.43 | 42.1674 | 41.68949 | 42.1674 | 41.68949 |
| qwen2.5_14b | 2 | 1 | 1024 | 1024 | 72.97134 | 181.91 | 71.08 | 14.06866 | 14.03291 | 14.06866 | 14.03291 |
| qwen2.5_14b | 2 | 2 | 1024 | 1024 | 75.47412 | 340.39 | 73.37 | 27.2591 | 27.13513 | 27.2591 | 27.13513 |
| qwen2.5_14b | 2 | 4 | 1024 | 1024 | 75.86003 | 678.84 | 73.41 | 54.48849 | 53.99418 | 54.48849 | 53.99418 |
| qwen2.5_14b | 2 | 8 | 1024 | 1024 | 76.63994 | 1380.49 | 73.48 | 108.8732 | 106.8894 | 108.8732 | 106.8894 |
| qwen2.5_14b | 2 | 16 | 1024 | 1024 | 79.72219 | 2848.29 | 75.05 | 213.1912 | 205.5137 | 213.1912 | 205.5137 |
| qwen2.5_14b | 2 | 1 | 2048 | 2048 | 156.1528 | 390.64 | 76.02 | 13.15443 | 13.11536 | 13.15443 | 13.11536 |
| qwen2.5_14b | 2 | 2 | 2048 | 2048 | 150.143 | 753.94 | 72.91 | 27.43108 | 27.28065 | 27.43108 | 27.28065 |
| qwen2.5_14b | 2 | 4 | 2048 | 2048 | 150.3928 | 1533.21 | 72.64 | 55.06608 | 54.47069 | 55.06608 | 54.47069 |
| qwen2.5_14b | 2 | 8 | 2048 | 2048 | 160.912 | 3161.11 | 76.98 | 103.9231 | 101.8196 | 103.9231 | 101.8196 |
| qwen2.5_14b | 2 | 16 | 2048 | 2048 | 168.5616 | 6611.9 | 79.02 | 202.4804 | 194.3978 | 202.4804 | 194.3978 |
| qwen2.5_14b | 2 | 1 | 4096 | 4096 | 308.4805 | 931.58 | 75.03 | 13.328 | 13.27799 | 13.328 | 13.27799 |
| qwen2.5_14b | 2 | 2 | 4096 | 4096 | 303.8981 | 1831.77 | 73.69 | 27.14072 | 26.95641 | 27.14072 | 26.95641 |
| qwen2.5_14b | 2 | 4 | 4096 | 4096 | 308.0404 | 3740.4 | 74.24 | 53.87931 | 53.18783 | 53.87931 | 53.18783 |
| qwen2.5_14b | 2 | 8 | 4096 | 4096 | 333.7525 | 7921.79 | 79.49 | 100.6416 | 98.18055 | 100.6416 | 98.18055 |
| qwen2.5_14b | 4 | 16 | 4096 | 4096 | 970.3565 | 10659.72 | 234.25 | 68.30309 | 67.53807 | 68.30309 | 67.53807 |
| qwen2.5_14b | 2 | 1 | 8192 | 8192 | 614.0193 | 2545.8 | 74.58 | 13.40842 | 13.3416 | 13.40842 | 13.3416 |
| qwen2.5_14b | 2 | 2 | 8192 | 8192 | 627.371 | 4925.62 | 75.92 | 26.34352 | 26.11533 | 26.34352 | 26.11533 |
| qwen2.5_14b | 2 | 4 | 8192 | 8192 | 679.0121 | 10179.22 | 81.58 | 49.03163 | 48.25835 | 49.03163 | 48.25835 |
| qwen2.5_14b | 4 | 8 | 8192 | 8192 | 1931.444 | 12755.89 | 234.15 | 34.16613 | 33.9311 | 34.16613 | 33.9311 |
| qwen2.5_14b | 4 | 16 | 8192 | 8192 | 1957.037 | 25215.76 | 235.73 | 67.87426 | 66.97471 | 67.87426 | 66.97471 |
| qwen2.5_32b_w8a8 | 2 | 1 | 1024 | 1024 | 108.0264 | 290.27 | 105.22 | 9.503897 | 9.479162 | 9.503897 | 9.479162 |
| qwen2.5_32b_w8a8 | 2 | 2 | 1024 | 1024 | 112.0608 | 537.36 | 108.92 | 18.3621 | 18.27579 | 18.3621 | 18.27579 |
| qwen2.5_32b_w8a8 | 2 | 4 | 1024 | 1024 | 110.3893 | 1093.17 | 106.73 | 37.47775 | 37.10506 | 37.47775 | 37.10506 |
| qwen2.5_32b_w8a8 | 2 | 8 | 1024 | 1024 | 111.5551 | 2220.2 | 106.78 | 74.9204 | 73.43458 | 74.9204 | 73.43458 |
| qwen2.5_32b_w8a8 | 2 | 16 | 1024 | 1024 | 113.4264 | 4525.57 | 106.35 | 150.4466 | 144.4461 | 150.4466 | 144.4461 |
| qwen2.5_32b_w8a8 | 2 | 1 | 2048 | 2048 | 211.8011 | 596.05 | 103.1 | 9.699321 | 9.669451 | 9.699321 | 9.669451 |
| qwen2.5_32b_w8a8 | 2 | 2 | 2048 | 2048 | 218.1957 | 1198.06 | 105.92 | 18.88218 | 18.77214 | 18.88218 | 18.77214 |
| qwen2.5_32b_w8a8 | 2 | 4 | 2048 | 2048 | 213.0752 | 2431.06 | 102.82 | 38.90294 | 38.44651 | 38.90294 | 38.44651 |
| qwen2.5_32b_w8a8 | 2 | 8 | 2048 | 2048 | 222.6873 | 4897.39 | 106.31 | 75.25162 | 73.57401 | 75.25162 | 73.57401 |
| qwen2.5_32b_w8a8 | 4 | 16 | 2048 | 2048 | 660.9509 | 6705.41 | 319.49 | 50.07981 | 49.57706 | 50.07981 | 49.57706 |
| qwen2.5_32b_w8a8 | 2 | 1 | 4096 | 4096 | 439.3244 | 1428.12 | 106.85 | 9.358914 | 9.323407 | 9.358914 | 9.323407 |
| qwen2.5_32b_w8a8 | 2 | 2 | 4096 | 4096 | 435.0344 | 2821.25 | 105.47 | 18.96274 | 18.83069 | 18.96274 | 18.83069 |
| qwen2.5_32b_w8a8 | 2 | 4 | 4096 | 4096 | 445.9503 | 5698.4 | 107.43 | 37.23355 | 36.73952 | 37.23355 | 36.73952 |
| qwen2.5_32b_w8a8 | 4 | 8 | 4096 | 4096 | 1319.759 | 7355.15 | 320.38 | 24.97035 | 24.82877 | 24.97035 | 24.82877 |
| qwen2.5_32b_w8a8 | 4 | 16 | 4096 | 4096 | 1327.009 | 15386.62 | 320.18 | 49.97189 | 49.38623 | 49.97189 | 49.38623 |
| qwen2.5_32b_w8a8 | 2 | 1 | 8192 | 8192 | 875.1647 | 3768.31 | 106.31 | 9.406453 | 9.360524 | 9.406453 | 9.360524 |
| qwen2.5_32b_w8a8 | 2 | 2 | 8192 | 8192 | 871.3131 | 7299.47 | 105.41 | 18.97353 | 18.8038 | 18.97353 | 18.8038 |
| qwen2.5_32b_w8a8 | 4 | 4 | 8192 | 8192 | 2665.852 | 9045.59 | 324.25 | 12.33616 | 12.29176 | 12.33616 | 12.29176 |
| qwen2.5_32b_w8a8 | 4 | 8 | 8192 | 8192 | 2712.475 | 18420.6 | 328.8 | 24.3309 | 24.16096 | 24.3309 | 24.16096 |
| qwen2.5_32b_w8a8 | 8 | 16 | 8192 | 8192 | 5250.721 | 22628.99 | 638.15 | 25.07248 | 24.96267 | 25.07248 | 24.96267 |
| qwen2_72b_w8a8 | 4 | 1 | 1024 | 1024 | 419.3291 | 679.51 | 409.12 | 2.444271 | 2.441996 | 2.444271 | 2.441996 |
| qwen2_72b_w8a8 | 4 | 2 | 1024 | 1024 | 419.5926 | 670.05 | 409.4 | 4.885198 | 4.880925 | 4.885198 | 4.880925 |
| qwen2_72b_w8a8 | 4 | 4 | 1024 | 1024 | 426.5259 | 1421.11 | 415.43 | 9.628578 | 9.603169 | 9.628578 | 9.603169 |
| qwen2_72b_w8a8 | 4 | 8 | 1024 | 1024 | 406.8124 | 2904.02 | 394.71 | 20.26804 | 20.13704 | 20.26804 | 20.13704 |
| qwen2_72b_w8a8 | 4 | 16 | 1024 | 1024 | 424.4589 | 5775.08 | 409.14 | 39.10642 | 38.59973 | 39.10642 | 38.59973 |
| qwen2_72b_w8a8 | 4 | 1 | 2048 | 2048 | 753.9075 | 712.81 | 367.85 | 2.718499 | 2.716514 | 2.718499 | 2.716514 |
| qwen2_72b_w8a8 | 4 | 2 | 2048 | 2048 | 831.6901 | 1554.5 | 405.43 | 4.933034 | 4.924911 | 4.933034 | 4.924911 |
| qwen2_72b_w8a8 | 4 | 4 | 2048 | 2048 | 733.2746 | 3067.87 | 356.61 | 11.21674 | 11.1718 | 11.21674 | 11.1718 |
| qwen2_72b_w8a8 | 4 | 8 | 2048 | 2048 | 845.7427 | 6160.71 | 410.04 | 19.51029 | 19.37232 | 19.51029 | 19.37232 |
| qwen2_72b_w8a8 | 4 | 16 | 2048 | 2048 | 364.1432 | 15476.23 | 170.26 | 93.97392 | 89.98659 | 93.97392 | 89.98659 |
| qwen2_72b_w8a8 | 4 | 1 | 4096 | 4096 | 1607.69 | 1728.36 | 392.08 | 2.5505 | 2.547756 | 2.5505 | 2.547756 |
| qwen2_72b_w8a8 | 4 | 2 | 4096 | 4096 | 1687.465 | 3512.63 | 411.11 | 4.864878 | 4.854618 | 4.864878 | 4.854618 |
| qwen2_72b_w8a8 | 4 | 4 | 4096 | 4096 | 1613.921 | 7187.2 | 392.26 | 10.19732 | 10.15167 | 10.19732 | 10.15167 |
| qwen2_72b_w8a8 | 4 | 8 | 4096 | 4096 | 1703.813 | 14057.13 | 412.53 | 19.39253 | 19.23216 | 19.39253 | 19.23216 |
| qwen2_72b_w8a8 | 8 | 16 | 4096 | 4096 | 3318.572 | 18898.19 | 805.65 | 19.85974 | 19.74825 | 19.85974 | 19.74825 |
| qwen2_72b_w8a8 | 4 | 1 | 8192 | 8192 | 3344.501 | 4282.38 | 407.69 | 2.452844 | 2.449394 | 2.452844 | 2.449394 |
| qwen2_72b_w8a8 | 4 | 2 | 8192 | 8192 | 3364.605 | 8638.74 | 409.61 | 4.882693 | 4.869517 | 4.882693 | 4.869517 |
| qwen2_72b_w8a8 | 4 | 4 | 8192 | 8192 | 3373.532 | 17247.46 | 409.64 | 9.764671 | 9.713261 | 9.764671 | 9.713261 |
| qwen2_72b_w8a8 | 8 | 8 | 8192 | 8192 | 6583.141 | 22105.45 | 800.89 | 9.988887 | 9.955126 | 9.988887 | 9.955126 |
| qwen2_72b_w8a8 | 8 | 16 | 8192 | 8192 | 6608.551 | 44805.77 | 801.21 | 19.9698 | 19.8337 | 19.9698 | 19.8337 |
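The promised note on the throughput columns: the rows are consistent with E2E throughput = Batchsize × Out_seq / Total time, and non-first-token throughput = Batchsize × 1000 / non-first-token latency (ms), aggregated over the whole batch rather than per card. These relations are inferred from the data, not a documented MindIE formula. A minimal cross-check sketch:

```python
# Relations inferred from the table above (an observation fitted to the data,
# not an officially documented MindIE formula); both are batch-aggregate
# figures, not per-card.
def e2e_throughput(batch_size: int, out_seq: int, total_time_s: float) -> float:
    """Average end-to-end generation rate in tokens/s."""
    return batch_size * out_seq / total_time_s

def decode_throughput(batch_size: int, non_first_token_ms: float) -> float:
    """Steady-state (non-first-token) generation rate in tokens/s."""
    return batch_size * 1000.0 / non_first_token_ms

# First qwen2.5_7b row: bs=1, out_seq=1024, total 19.88901 s, 19.25 ms/token
print(e2e_throughput(1, 1024, 19.88901))  # ~51.486, matches 51.48572 in the table
print(decode_throughput(1, 19.25))        # ~51.948, matches 51.94805 in the table
```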