LLM News

Every LLM release, update, and milestone.


Meta's NLLB-200 learns universal language structure, study finds

A new study of Meta's NLLB-200 translation model finds that it has learned language-universal conceptual representations rather than merely clustering languages by surface similarity. Drawing on 135 languages and methods from cognitive science, the researchers found that the model's embeddings correlate with linguistic phylogenetic distances (ρ = 0.13, p = 0.020) and preserve semantic relationships across typologically diverse languages.
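The headline number is a Spearman rank correlation between two sets of pairwise language distances: one measured in the model's embedding space, one taken from a linguistic phylogeny. A minimal sketch of that comparison, with toy data and helper names that are illustrative rather than taken from the study:

```python
# Hedged sketch: correlating two lists of pairwise language distances
# with Spearman's rho, as the study does for embedding-space vs.
# phylogenetic distances. Data and names here are made up for illustration.

def spearman_rho(x, y):
    """Spearman rank correlation (no tie correction, for brevity)."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0.0] * len(values)
        for rank, idx in enumerate(order):
            r[idx] = float(rank)
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Toy example: distances between four hypothetical language pairs,
# once in embedding space and once on a phylogenetic tree.
embedding_dist = [0.2, 0.5, 0.9, 0.4]
phylo_dist = [0.1, 0.6, 0.8, 0.3]
print(f"rho = {spearman_rho(embedding_dist, phylo_dist):.2f}")
```

A rho near 1 would mean the model's embedding distances order language pairs the same way the phylogeny does; the study's ρ = 0.13 is a weak but statistically significant alignment.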

2 min read · via arxiv.org