Mixture of Experts (MoE) Collection
Sometimes I fine-tune models specifically to take on expert roles in an MoE configuration; other times I find interesting models that others have fine-tuned. • 8 items