omniai

zanqingzhang@gmail.com  

Repositories (6)

omni-npu
A vLLM (0.12.0) out-of-tree platform plugin that enables running vLLM on NPU (Ascend/torch_npu).
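For context, vLLM discovers out-of-tree platform plugins like this through the `vllm.platform_plugins` entry-point group: the package exposes a `register()` hook that returns the import path of its `Platform` subclass. The sketch below illustrates the shape of that hook; the module path `omni_npu.platform.NPUPlatform` is a hypothetical example, not necessarily this plugin's actual layout, and real plugins typically probe for the hardware (e.g. `torch_npu`) before returning a path.

```python
# Hypothetical sketch of the entry-point hook a vLLM out-of-tree
# platform plugin exposes. vLLM calls it at startup and imports the
# returned Platform subclass.
def register() -> str:
    # A real plugin would first check that the Ascend NPU stack
    # (torch_npu) is importable and return None otherwise.
    return "omni_npu.platform.NPUPlatform"
```

The hook is wired up in the package metadata, e.g. an `[project.entry-points."vllm.platform_plugins"]` table in `pyproject.toml` pointing at this function.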
omni_infer
Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expandin...
omni-proxy
No description yet
omni-eplb
No description yet
community
Governance, operations, and event materials for the Omni-AI community
omni_infer_1
Forked from omni_infer
Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expandin...
