LLM News

Every LLM release, update, and milestone.

research

TSEmbed combines mixture-of-experts with LoRA to scale multimodal embeddings across conflicting tasks

Researchers propose TSEmbed, a multimodal embedding framework that combines Mixture-of-Experts (MoE) with Low-Rank Adaptation (LoRA) to handle task conflicts in universal embedding models. The approach introduces Expert-Aware Negative Sampling (EANS) to improve discriminative power and achieves state-of-the-art results on the Massive Multimodal Embedding Benchmark (MMEB).
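The paper's exact architecture isn't specified in this summary, but the core idea of routing inputs across LoRA experts can be sketched with NumPy. Everything here is illustrative: the dimensions, the `moe_lora_forward` function, and the softmax router are assumptions, not TSEmbed's actual design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (hypothetical, not from the paper):
d, r, n_experts = 16, 4, 3  # hidden dim, LoRA rank, number of experts

# Frozen base projection shared across all tasks.
W = rng.normal(size=(d, d))

# One low-rank LoRA adapter (A @ B) per expert; only these are trained.
A = rng.normal(size=(n_experts, d, r)) * 0.01
B = rng.normal(size=(n_experts, r, d)) * 0.01

# Router mapping an input to expert logits.
W_router = rng.normal(size=(d, n_experts))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_lora_forward(x):
    """Mix LoRA expert deltas on top of the frozen base projection,
    weighted by a learned router -- a minimal MoE-over-LoRA sketch."""
    gates = softmax(x @ W_router)            # (n_experts,) mixture weights
    base = x @ W                             # frozen shared path
    delta = sum(g * (x @ A[i] @ B[i]) for i, g in enumerate(gates))
    return base + delta, gates

x = rng.normal(size=(d,))
y, gates = moe_lora_forward(x)
```

Routing conflicting tasks to different low-rank experts lets each expert specialize without the full fine-tuning cost; how Expert-Aware Negative Sampling selects hard negatives per expert is not detailed in the summary, so it is omitted here.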