LLM News

Every LLM release, update, and milestone.

research

SiNGER framework improves vision transformer distillation by suppressing high-norm artifacts

Researchers introduce SiNGER (Singular Nullspace-Guided Energy Reallocation), a knowledge distillation framework that improves how Vision Transformer features transfer to smaller student models. The method suppresses the high-norm artifact tokens that degrade representation quality in teacher features while preserving their informative signal.
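To make the underlying idea concrete, here is a minimal sketch of norm-based artifact suppression in teacher features: tokens whose norm far exceeds the typical token norm are scaled back down while their direction is preserved. This is an illustrative simplification, not SiNGER's actual nullspace-guided reallocation; the threshold rule (`k` times the median token norm) and function name are assumptions for the example.

```python
import numpy as np

def suppress_high_norm_tokens(feats, k=3.0):
    """Illustrative artifact suppression (not SiNGER's actual method).

    Scales down any token whose L2 norm exceeds k x the median token
    norm, clipping it to that threshold while keeping its direction.

    feats: (num_tokens, dim) teacher feature map.
    """
    norms = np.linalg.norm(feats, axis=1)              # per-token L2 norms
    thresh = k * np.median(norms)                      # robust outlier cutoff
    scale = np.minimum(1.0, thresh / np.maximum(norms, 1e-12))
    return feats * scale[:, None]                      # directions preserved

# Toy teacher features: one token carries a high-norm artifact.
rng = np.random.default_rng(0)
teacher = rng.normal(size=(8, 4))
teacher[3] *= 50.0                                     # inject an artifact

cleaned = suppress_high_norm_tokens(teacher)
```

The cleaned features would then feed the usual distillation loss (e.g. an L2 match between student and teacher tokens), so the student is no longer pulled toward the artifact directions.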