moe-architecture
1 article tagged with moe-architecture
February 20, 2026
model release
Segmind releases SegMoE, a mixture-of-experts diffusion model for faster image generation
Segmind has released SegMoE, a mixture-of-experts (MoE) diffusion model designed to accelerate image generation while reducing computational overhead. The model applies MoE techniques, traditionally used in large language models, to diffusion architectures, activating only a subset of experts for each inference step instead of running the full network.
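To make the "selective expert activation" idea concrete, here is a minimal sketch of top-k expert routing, the core mechanism MoE layers use: a gate scores every expert, only the k highest-scoring experts actually run, and their outputs are mixed by renormalized gate weights. This is an illustrative toy (the function names, the 4-expert setup, and the scalar "experts" are hypothetical), not Segmind's actual SegMoE implementation.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of gate scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def route(gate_scores, k):
    """Pick the top-k experts and renormalize their gate weights to sum to 1."""
    top = sorted(range(len(gate_scores)),
                 key=lambda i: gate_scores[i], reverse=True)[:k]
    weights = softmax([gate_scores[i] for i in top])
    return list(zip(top, weights))

def moe_forward(x, experts, gate_scores, k=2):
    """Run only the selected experts and mix their outputs by gate weight.
    The savings come from never evaluating the unselected experts."""
    return sum(w * experts[i](x) for i, w in route(gate_scores, k))

# Four toy "experts": each just scales its input by a fixed factor.
experts = [lambda x, s=s: s * x for s in (1.0, 2.0, 3.0, 4.0)]
```

With k=2 and four experts, only half the expert compute runs per input; in a real MoE diffusion model the gate scores would be produced by a learned router from the layer's activations.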