
Omni-Infer v0.8.0 is now officially available, bringing efficient inference optimizations for ultra-large-scale MoE architecture models.
v0.8.0 Core Highlights
✓ Added reinforcement learning (RL) training support for the Pangu72B model
Verified Supported Models
| Model Name | Supported Hardware | Quantization/Precision Format | Deployment Mode |
|---|---|---|---|
| openPangu-Ultra-MoE-718B | A3 | INT8 | PD disaggregation |
| openPangu-Ultra-MoE-718B | A2 | INT8 | PD disaggregation |
| openPangu-72B | A3 | INT8 | PD disaggregation |
| openPangu-38B | A3 | INT8 | Hybrid deployment |
| openPangu-38B | A2 | INT8 | Hybrid deployment |
| openPangu-7B | A3 | BF16 | Hybrid deployment |
| openPangu-7B | A2 | BF16 | Hybrid deployment |
| openPangu-7BVL | A3 | BF16 | Hybrid deployment |
| DeepSeek-R1 | A3 | INT8 | PD disaggregation |
| DeepSeek-R1 | A3 | W4A8C16 | PD disaggregation |
| DeepSeek-R1 | A3 | BF16 | PD disaggregation |
| DeepSeek-R1 | A2 | INT8 | PD disaggregation |
| DeepSeek-V3.1 | A3 | INT8 | PD disaggregation |
| DeepSeek-V3.2 | A3 | INT8 | PD disaggregation |
| DeepSeek-OCR | A2 | BF16 | Hybrid deployment |
| Qwen2.5-7B | A3 | INT8 | Hybrid deployment (TP≥1, DP=1; see the launch sketch after this table) |
| Qwen2.5-7B | A2 | INT8 | Hybrid deployment (TP≥1, DP=1) |
| QwQ | A3 | BF16 | PD disaggregation |
| QwQ | A2 | BF16 | PD disaggregation |
| Qwen3-235B | A3 | INT8 | PD disaggregation |
| Qwen3-235B | A2 | BF16 | PD disaggregation |
| Qwen3-32B | A3 | BF16 | PD disaggregation |
| Qwen3-32B | A3 | INT8 | PD disaggregation |
| Qwen3-30B | A3 | BF16 | PD disaggregation |
| Kimi-K2 | A3 | W4A8C16 | PD disaggregation |
| Kimi-K2 Thinking | A3 | W4A8C16 | PD disaggregation |
| Longcat-flash | A3 | BF16 | PD disaggregation |
| Ling-1T | A3 | BF16 | PD disaggregation |
| GPT-OSS120B | A3 | INT8 | PD disaggregation |
| GPT-OSS120B | A2 | INT8 | PD disaggregation |
| GPT-OSS20B | A3 | INT8 | PD disaggregation |
| GPT-OSS20B | A2 | INT8 | PD disaggregation |
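The deployment-mode column maps directly to parallelism settings at launch time. The sketch below is a hedged illustration only of how the Qwen2.5-7B hybrid-deployment constraint (TP≥1, DP=1) might be expressed through the vLLM-compatible CLI that the Omni-Infer images are built around; the model path and tensor-parallel degree are placeholders, and the exact Omni-Infer launch flags may differ from this, so consult the release notes linked below for the authoritative commands.

```bash
# A minimal sketch, not an authoritative Omni-Infer command: it assumes the
# vLLM-compatible "vllm serve" entry point and a placeholder model path.
# TP >= 1 is satisfied here with --tensor-parallel-size 2; data parallelism
# is left at its default of 1, matching the DP=1 constraint in the table.
vllm serve Qwen/Qwen2.5-7B-Instruct --tensor-parallel-size 2
```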
Installation Resources
| Hardware Platform | CPU Architecture | Docker Image (pull command) | Tar Package Name |
|---|---|---|---|
| A3 | arm | docker pull swr.cn-east-4.myhuaweicloud.com/omni-ci/omniinfer-a3-arm:release_v0.8.0-vllm | omni_infer-a3-arm:v0.8.0_vllm |
| A3 | x86 | docker pull swr.cn-east-4.myhuaweicloud.com/omni-ci/omniinfer-a3-x86:release_v0.8.0-vllm | omni_infer-a3-x86:v0.8.0_vllm |
| A2 | arm | docker pull swr.cn-east-4.myhuaweicloud.com/omni-ci/omniinfer-a2-arm:release_v0.8.0-vllm | omni_infer-a2-arm:v0.8.0_vllm |
| A2 | x86 | docker pull swr.cn-east-4.myhuaweicloud.com/omni-ci/omniinfer-a2-x86:release_v0.8.0-vllm | omni_infer-a2-x86:v0.8.0_vllm |
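As a quick sketch of how these artifacts are used, the commands below pull the A3/arm image from the table and start a container. The NPU device paths and the driver mount are the usual ones for Ascend hosts and are assumptions here, not an official command; adjust them to your machine and see the release notes for the officially supported run procedure.

```bash
# Pull the A3 (arm) image listed in the table above.
docker pull swr.cn-east-4.myhuaweicloud.com/omni-ci/omniinfer-a3-arm:release_v0.8.0-vllm

# Hedged sketch of starting a container: the device mappings and the Ascend
# driver volume are typical defaults and may need adjusting for your host.
docker run -it --name omni-infer --net=host \
    --device /dev/davinci0 \
    --device /dev/davinci_manager \
    --device /dev/devmm_svm \
    --device /dev/hisi_hdc \
    -v /usr/local/Ascend/driver:/usr/local/Ascend/driver \
    swr.cn-east-4.myhuaweicloud.com/omni-ci/omniinfer-a3-arm:release_v0.8.0-vllm /bin/bash
```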
For the full release notes, see: https://www.php.cn/link/168f6cac16567296b81233afac6f127b
Source code and installation package downloads: Get them now










