Wang Jun

@august_ust

Wang Jun has not added a bio yet.

Forks

    Wang Jun/omni-npu_qwen forked from omniai/omni-npu

    A vLLM out-of-tree platform plugin that enables running vLLM on NPU (Ascend/torch_npu); a minimal packaging sketch for this kind of plugin follows the list below.

    Wang Jun/omni_infer forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Yao Yunxiang/omni_infer forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omni_infer_cache forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omni-npu forked from omniai/omni-npu

    A vLLM out-of-tree platform plugin that enables running vLLM on NPU (Ascend/torch_npu).

    Wang Jun/omni_infer_elb forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omni_infer_master forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omni_infer_d2p forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omni_infer_paper forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omni_infer_070 forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omni_infer_600 forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omni_infer_060 forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omni_infer_hzw forked from HeZiwei/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omniinfer_trace forked from jjchen007/omniinfer_trace

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omni_infer_op forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omni_infer_0.4.2 forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omni_infer_omni forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omni_infer_cli forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omni_infer_sgl_dev forked from He Jian/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Wang Jun/omni_infer_sgl forked from 吳航/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.
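As a point of reference for the omni-npu forks above: vLLM discovers out-of-tree platform plugins through the "vllm.platform_plugins" entry-point group. The sketch below is illustrative only and is not taken from the omni-npu source; the package name, module paths, and class name are hypothetical.

    # setup.py for a hypothetical out-of-tree NPU platform plugin (illustrative only)
    from setuptools import setup

    setup(
        name="my_npu_plugin",                      # hypothetical package name
        version="0.1.0",
        packages=["my_npu_plugin"],
        entry_points={
            # vLLM scans this entry-point group at startup to find
            # out-of-tree platforms such as an Ascend/torch_npu backend.
            "vllm.platform_plugins": [
                "my_npu = my_npu_plugin:register_platform",
            ],
        },
    )

    # my_npu_plugin/__init__.py
    def register_platform():
        # Return the fully qualified name of the Platform subclass,
        # or None if the current environment is not supported.
        return "my_npu_plugin.platform.MyNPUPlatform"

Once such a package is installed alongside vLLM, the platform is picked up automatically at startup, which is what lets a plugin like omni-npu target Ascend NPUs without modifying the vLLM tree itself.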
