
Omni-Infer v0.7.0 is now officially available, bringing efficient inference acceleration for ultra-large-scale Mixture-of-Experts (MoE) models.
Core Highlights
- Omni Cache adds support for the MLA (Multi-Head Latent Attention) and GQA (Grouped-Query Attention) architectures
- Introduces a chunked-prefill mixed scheduling mechanism for better compute-graph fusion and memory reuse (a scheduling sketch follows this list)
- Fully compatible with the SGLang interface protocol, improving the flexibility and ease of use of complex inference workflows
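As a rough illustration of the chunked-prefill idea referenced above, the sketch below shows how a mixed scheduler can interleave fixed-size prefill chunks of long prompts with ongoing decode steps under a per-step token budget. This is generic illustrative Python, not Omni-Infer's actual scheduler; the class, the knobs (CHUNK, TOKEN_BUDGET), and the batching policy are assumptions made for the example.

```python
# Minimal sketch of chunked-prefill mixed scheduling (illustrative only;
# names and policy are assumptions, not Omni-Infer's real implementation).
from collections import deque
from dataclasses import dataclass

CHUNK = 512          # prefill tokens scheduled per request per step (assumed knob)
TOKEN_BUDGET = 1024  # total tokens processed in one batch/step (assumed knob)

@dataclass
class Request:
    rid: int
    prompt_len: int
    prefilled: int = 0      # prompt tokens already prefilled
    decoding: bool = False  # True once prefill is complete

def schedule_step(running: deque) -> list[tuple[int, int]]:
    """Build one mixed batch: every decode request contributes one token,
    then the remaining token budget is filled with prefill chunks."""
    batch, budget = [], TOKEN_BUDGET
    # Decode requests are admitted first so generation latency stays stable.
    for r in running:
        if r.decoding and budget > 0:
            batch.append((r.rid, 1))
            budget -= 1
    # Spend whatever budget is left on chunks of pending prefills.
    for r in running:
        if not r.decoding and budget > 0:
            n = min(CHUNK, r.prompt_len - r.prefilled, budget)
            if n > 0:
                batch.append((r.rid, n))
                r.prefilled += n
                budget -= n
            if r.prefilled == r.prompt_len:
                r.decoding = True  # switch this request to decode next step
    return batch

if __name__ == "__main__":
    running = deque([Request(0, 3500), Request(1, 8, 8, True)])
    for step in range(3):
        print(step, schedule_step(running))
```

The intent of such a policy is that decode requests never wait behind an entire long prefill, which is what typically keeps first-token and per-token latency stable under mixed workloads.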
Performance Improvements
- On a 2P8-1D32@A3 hardware configuration in the 3.5K+1K input-length scenario, DeepSeek-R1 reaches a measured 186 QPM with significantly improved time to first token (TTFT)
- On a 2P2-1D4@A3 configuration, openPangu-72B reaches a peak per-card decode throughput of 1560 TPS with excellent average time per output token (TPOT)
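For context on how figures like those above are usually measured, the snippet below is a minimal sketch of computing TTFT, TPOT, QPM, and decode TPS from per-request timestamps. It is generic illustrative Python, not Omni-Infer's benchmarking tooling, and the field and function names are assumptions.

```python
# Minimal sketch of how common serving metrics are derived from per-request
# timing data (generic illustration, not Omni-Infer's benchmark code).
from dataclasses import dataclass

@dataclass
class RequestTrace:
    t_submit: float        # wall-clock time the request was submitted (s)
    t_first_token: float   # wall-clock time the first output token arrived (s)
    t_done: float          # wall-clock time the last output token arrived (s)
    output_tokens: int     # number of generated tokens

def ttft(r: RequestTrace) -> float:
    """Time to first token, in seconds."""
    return r.t_first_token - r.t_submit

def tpot(r: RequestTrace) -> float:
    """Average time per output token after the first one, in seconds."""
    return (r.t_done - r.t_first_token) / max(r.output_tokens - 1, 1)

def qpm(traces: list[RequestTrace]) -> float:
    """Completed queries per minute over the measured window."""
    window = max(r.t_done for r in traces) - min(r.t_submit for r in traces)
    return len(traces) / (window / 60.0)

def decode_tps(traces: list[RequestTrace]) -> float:
    """Aggregate decode throughput in output tokens per second."""
    window = max(r.t_done for r in traces) - min(r.t_first_token for r in traces)
    return sum(r.output_tokens for r in traces) / window

if __name__ == "__main__":
    traces = [RequestTrace(0.0, 0.8, 10.0, 1000),
              RequestTrace(0.5, 1.4, 11.2, 1000)]
    print(f"TTFT={ttft(traces[0]):.2f}s  TPOT={tpot(traces[0])*1000:.1f}ms  "
          f"QPM={qpm(traces):.1f}  TPS={decode_tps(traces):.0f}")
```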
Verified Model List
| Model | Hardware Platform | Precision | Deployment Mode |
|---|---|---|---|
| openPangu-Ultra-MoE-718B | A3 | INT8 | PD-disaggregated |
| openPangu-Ultra-MoE-718B | A2 | INT8 | PD-disaggregated |
| openPangu-72B | A3 | INT8 | PD-disaggregated |
| openPangu-38B | A3 | INT8 | Co-located |
| openPangu-38B | A2 | INT8 | Co-located |
| openPangu-7B | A3 | BF16 | Co-located |
| openPangu-7B | A2 | BF16 | Co-located |
| openPangu-7BVL | A3 | BF16 | Co-located |
| DeepSeek-R1 | A3 | INT8 | PD-disaggregated |
| DeepSeek-R1 | A3 | W4A8C16 | PD-disaggregated |
| DeepSeek-R1 | A3 | BF16 | PD-disaggregated |
| DeepSeek-R1 | A2 | INT8 | PD-disaggregated |
| DeepSeek-V3.1 | A3 | INT8 | PD-disaggregated |
| DeepSeek-V3.2 | A3 | INT8 | PD-disaggregated |
| DeepSeek-OCR | A2 | BF16 | Co-located |
| Qwen2.5-7B | A3 | INT8 | Co-located (TP>=1, DP=1) |
| Qwen2.5-7B | A2 | INT8 | Co-located (TP>=1, DP=1) |
| QwQ | A3 | BF16 | PD-disaggregated |
| QwQ | A2 | BF16 | PD-disaggregated |
| Qwen3-235B | A3 | INT8 | PD-disaggregated |
| Qwen3-235B | A2 | BF16 | PD-disaggregated |
| Qwen3-32B | A3 | BF16 | PD-disaggregated |
| Qwen3-32B | A3 | INT8 | PD-disaggregated |
| Qwen3-30B | A3 | BF16 | PD-disaggregated |
| Kimi-K2 | A3 | W4A8C16 | PD-disaggregated |
| Kimi-K2 Thinking | A3 | W4A8C16 | PD-disaggregated |
| Longcat-flash | A3 | BF16 | PD-disaggregated |
| Ling-1T | A3 | BF16 | PD-disaggregated |
| GPT-OSS120B | A3 | INT8 | PD-disaggregated |
| GPT-OSS120B | A2 | INT8 | PD-disaggregated |
| GPT-OSS20B | A3 | INT8 | PD-disaggregated |
| GPT-OSS20B | A2 | INT8 | PD-disaggregated |
Installation
| Hardware Platform | CPU Architecture | Docker Image (pull command) | Tar Package Name |
|---|---|---|---|
| A3 | arm | docker pull swr.cn-east-4.myhuaweicloud.com/omni/omniinfer-a3-arm:release_v0.7.0-vllm | omni_infer-a3-arm:v0.7.0_vllm |
| A3 | x86 | docker pull swr.cn-east-4.myhuaweicloud.com/omni/omniinfer-a3-x86:release_v0.7.0-vllm | omni_infer-a3-x86:v0.7.0_vllm |
| A2 | arm | docker pull swr.cn-east-4.myhuaweicloud.com/omni/omniinfer-a2-arm:release_v0.7.0-vllm | omni_infer-a2-arm:v0.7.0_vllm |
| A2 | x86 | docker pull swr.cn-east-4.myhuaweicloud.com/omni/omniinfer-a2-x86:release_v0.7.0-vllm | omni_infer-a2-x86:v0.7.0_vllm |
For more details, see: https://www.php.cn/link/9f78e8aa1530b26c85f555017d89e745
