Best Open Source MoE Libraries
A curated list of the most popular GitHub repositories tagged with MoE (Mixture of Experts). Select any project to visualize its architecture and dive into the codebase using RepoMind's AI engine.
#1 vllm-project/vllm
A high-throughput and memory-efficient inference and serving engine for LLMs
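vLLM can be driven either as an OpenAI-compatible server or directly from Python for offline batch inference. Below is a minimal sketch of the offline path with a MoE checkpoint; the Mixtral model name, GPU count, and sampling settings are illustrative assumptions, not recommendations.

```python
# Minimal offline-inference sketch with vLLM.
# The MoE checkpoint and tensor_parallel_size below are illustrative assumptions.
from vllm import LLM, SamplingParams

# Load a Mixture-of-Experts model; tensor_parallel_size shards it across GPUs.
llm = LLM(model="mistralai/Mixtral-8x7B-Instruct-v0.1", tensor_parallel_size=2)

sampling = SamplingParams(temperature=0.7, max_tokens=128)
outputs = llm.generate(
    ["Explain mixture-of-experts routing in one paragraph."],
    sampling,
)

for out in outputs:
    print(out.outputs[0].text)
```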
#2 hiyouga/LlamaFactory
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
#3 sgl-project/sglang
SGLang is a high-performance serving framework for large language models and multimodal models.
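SGLang is typically deployed as a standalone server exposing an OpenAI-compatible endpoint. The sketch below queries such a server from Python; the model path, port, and the "default" model placeholder are assumptions that may differ per deployment.

```python
# Sketch: querying a running SGLang server through its OpenAI-compatible API.
# Assumes the server was started separately, e.g.:
#   python -m sglang.launch_server --model-path Qwen/Qwen1.5-MoE-A2.7B-Chat --port 30000
# Model name and port are illustrative assumptions.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:30000/v1", api_key="EMPTY")

resp = client.chat.completions.create(
    model="default",  # many SGLang deployments accept "default"; others expect the served model name
    messages=[{"role": "user", "content": "What is an expert router in a MoE layer?"}],
    max_tokens=128,
)
print(resp.choices[0].message.content)
```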
#4 NVIDIA/TensorRT-LLM
TensorRT-LLM provides an easy-to-use Python API for defining Large Language Models (LLMs) and applies state-of-the-art optimizations for efficient inference on NVIDIA GPUs. It also includes components for building Python and C++ runtimes that orchestrate inference execution with high performance.
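Recent TensorRT-LLM releases ship a high-level Python LLM API alongside the lower-level engine-building workflow. The following is a rough sketch assuming that API is available in the installed version; the model name and sampling settings are illustrative, and exact signatures may vary across releases.

```python
# Sketch of TensorRT-LLM's high-level Python LLM API (recent releases).
# The MoE checkpoint and sampling values are illustrative assumptions.
from tensorrt_llm import LLM, SamplingParams

# Builds or loads a TensorRT engine for the model on NVIDIA GPUs.
llm = LLM(model="mistralai/Mixtral-8x7B-Instruct-v0.1")

outputs = llm.generate(
    ["Summarize how MoE layers reduce per-token compute."],
    SamplingParams(temperature=0.8, top_p=0.95),
)
for out in outputs:
    print(out.outputs[0].text)
```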
#5 modelscope/ms-swift
PEFT or full-parameter training (CPT/SFT/DPO/GRPO) for 600+ LLMs (Qwen3, Qwen3-MoE, DeepSeek-R1, GLM4.5, InternLM3, Llama4, ...) and 300+ MLLMs (Qwen3-VL, Qwen3-Omni, InternVL3.5, Ovis2.5, GLM4.5v, Llava, Phi4, ...) (AAAI 2025).