language-model
4 articles tagged with language-model
Researchers release 13B-parameter language model trained exclusively on pre-1931 data
A team of researchers has released Talkie, a 13-billion-parameter language model trained exclusively on digitized English-language texts published before the end of 1930. The training corpus comprises public-domain books, newspapers, scientific journals, patents, and case law; the researchers cite potential applications in studying AI reasoning capabilities and cultural change.
Arcee AI releases Trinity Large Thinking, open-source reasoning model with 262K context window
Arcee AI has released Trinity Large Thinking, an open-source reasoning model featuring a 262,144 token context window. The model is priced at $0.25 per million input tokens and $0.90 per million output tokens, with free access available through OpenRouter for the first five days.
Alibaba releases Qwen3.5-9B, a multimodal 9B parameter model
Alibaba has released Qwen3.5-9B, a 9-billion-parameter multimodal language model capable of processing both images and text. The model is available on Hugging Face under the Apache 2.0 license, with an architecture compatible with the Transformers library.
Inception's Mercury 2 uses diffusion for language reasoning, claims 5x speed over autoregressive models
Inception has released Mercury 2, positioning it as the first diffusion-based language reasoning model. Rather than generating text sequentially, token by token, as standard autoregressive models do, Mercury 2 refines entire passages in parallel, which the company claims yields roughly five times the generation speed of comparable autoregressive models.
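The speed claim comes down to forward passes: autoregressive decoding needs one sequential model pass per token, while diffusion-style decoding runs a small, fixed number of passes that each revise every position at once. This toy Python sketch illustrates that contrast only; the vocabulary, the `TARGET` sequence, and the per-step "fix" probability are invented stand-ins for a real model's predictions, not Inception's actual method.

```python
import random

random.seed(0)

VOCAB = ["the", "cat", "sat", "on", "mat"]
# Stand-in for "the text the model would converge to"; a real model
# predicts tokens from learned distributions, not a fixed answer.
TARGET = ["the", "cat", "sat", "on", "the", "mat"]

def autoregressive_generate(n):
    """Standard decoding: commit to one token at a time, left to right."""
    out = []
    for i in range(n):
        # Each step conditions on the prefix out[:i] and requires its own
        # full forward pass -> n sequential passes for n tokens.
        out.append(TARGET[i])
    return out

def diffusion_generate(n, steps=4):
    """Diffusion-style decoding: start from noise, refine all positions per step."""
    seq = [random.choice(VOCAB) for _ in range(n)]  # fully "noised" draft
    for _ in range(steps):
        # One forward pass scores every position simultaneously; as a toy
        # proxy, each slot snaps to its target value with probability 0.7.
        seq = [TARGET[i] if random.random() < 0.7 else tok
               for i, tok in enumerate(seq)]
    return seq  # total cost: `steps` passes, independent of sequence length n
```

The parallel variant's cost is fixed at `steps` passes however long the passage is, which is where a diffusion decoder's speed advantage over length-proportional autoregressive decoding would come from.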