LLM News

Every LLM release, update, and milestone.

research

ByteFlow Net removes tokenizers, learns adaptive byte compression for language models

Researchers introduce ByteFlow Net, a tokenizer-free language model architecture that learns to segment raw byte streams into semantically meaningful units via a compression-driven objective. The model adapts its internal representation granularity to each input, and in the reported experiments it outperforms both BPE-based Transformers and prior byte-level approaches.
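To make the general idea of compression-driven byte segmentation concrete, here is a toy sketch: place segment boundaries where the next byte is "surprising" (high code length) under a simple bigram model estimated from the stream itself. Everything below — the function name, the bigram model, and the threshold — is an illustrative assumption, not ByteFlow Net's actual learned mechanism.

```python
from collections import Counter
import math

def surprisal_segments(data: bytes, threshold: float = 3.0) -> list[bytes]:
    """Toy compression-driven segmentation (illustrative only, NOT the
    paper's method): cut the byte stream wherever the next byte has high
    surprisal -log2 p(cur | prev) under a smoothed bigram model."""
    unigram = Counter(data)
    bigram = Counter(zip(data, data[1:]))

    segments, start = [], 0
    for i in range(1, len(data)):
        prev, cur = data[i - 1], data[i]
        # Conditional probability with add-one smoothing over 256 byte values.
        p = (bigram[(prev, cur)] + 1) / (unigram[prev] + 256)
        if -math.log2(p) > threshold:  # surprising transition -> new segment
            segments.append(data[start:i])
            start = i
    segments.append(data[start:])
    return segments
```

A learned system would replace the fixed bigram model and hard threshold with trained, differentiable components, but the intuition is the same: boundaries fall where the stream becomes hard to compress, so repetitive spans merge into larger units while novel spans get finer granularity.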