mixture-of-experts

Projects with this topic:

- https://github.com/hiyouga/LLaMA-Factory: Unified Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
- https://github.com/lucidrains/st-moe-pytorch: Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in PyTorch
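
The repositories above implement mixture-of-experts (MoE) layers. As a rough illustration of the core idea behind ST-MoE-style models, here is a minimal NumPy sketch of top-k gated routing: a router scores each input against every expert, and only the k highest-scoring experts process that input, with their outputs combined by the (renormalized) gate weights. This is a simplified sketch, not the API of either repository; all function and variable names here are illustrative.

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, k=2):
    """Minimal top-k MoE forward pass (illustrative sketch, not a real library API).

    x:         (batch, d_in) inputs, one row per token
    gate_w:    (d_in, n_experts) router weights
    expert_ws: list of (d_in, d_out) weight matrices, one linear expert each
    k:         number of experts each token is routed to
    """
    # Router: score every token against every expert, softmax over experts.
    logits = x @ gate_w                                         # (batch, n_experts)
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)

    # Keep only the top-k experts per token (sparse routing).
    topk = np.argsort(probs, axis=-1)[:, -k:]                   # (batch, k)

    out = np.zeros((x.shape[0], expert_ws[0].shape[1]))
    for i in range(x.shape[0]):
        weights = probs[i, topk[i]]
        weights = weights / weights.sum()                       # renormalize over chosen experts
        # Weighted sum of the selected experts' outputs for this token.
        for w, e in zip(weights, topk[i]):
            out[i] += w * (x[i] @ expert_ws[e])
    return out, topk
```

Real implementations batch tokens per expert for efficiency and add an auxiliary load-balancing loss so the router does not collapse onto a few experts; the per-token loop here is only for clarity.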