arcee-ai
4 articles tagged with arcee-ai
Arcee AI releases Trinity-Large-Thinking, open reasoning model matching Claude Opus on agent tasks
Arcee AI has released Trinity-Large-Thinking, a 400-billion-parameter open-weight reasoning model with a mixture-of-experts architecture that activates only 13 billion parameters per token. The model matches Claude Opus 4.6 on agent benchmarks like Tau2 and PinchBench but lags on general reasoning tasks. The company spent approximately $20 million—roughly half its total venture capital—to train the model on 2,048 Nvidia B300 GPUs over 33 days.
Arcee AI releases Trinity-Large-Thinking: 398B sparse MoE model with chain-of-thought reasoning
Arcee AI released Trinity-Large-Thinking, a 398B-parameter sparse Mixture-of-Experts model with approximately 13B active parameters per token, post-trained with extended chain-of-thought reasoning for agentic workflows. The model achieves 94.7% on τ²-Bench, 91.9% on PinchBench, and 98.2% on LiveCodeBench, generating explicit reasoning traces in <think>...</think> blocks before producing responses.
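Since the model emits its reasoning inside `<think>...</think>` blocks before the visible response, downstream code typically needs to separate the trace from the answer. A minimal sketch of that split, assuming a single reasoning block per completion (the function name and format assumption are illustrative, not from Arcee's documentation):

```python
import re

def split_reasoning(completion: str) -> tuple[str, str]:
    """Split a completion into (reasoning trace, final answer).

    Assumes at most one <think>...</think> block precedes the response,
    as described for Trinity-Large-Thinking's output format.
    """
    match = re.search(r"<think>(.*?)</think>", completion, flags=re.DOTALL)
    if match is None:
        # No reasoning block: treat the whole completion as the answer.
        return "", completion.strip()
    reasoning = match.group(1).strip()
    answer = completion[match.end():].strip()
    return reasoning, answer

trace, answer = split_reasoning("<think>2 + 2 = 4</think>The answer is 4.")
# trace  -> "2 + 2 = 4"
# answer -> "The answer is 4."
```

Models that stream tokens would need an incremental variant, but the same delimiter logic applies.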
Arcee releases Trinity Large Thinking, an open-source reasoning model built on $20M budget
Arcee, a 26-person U.S. startup, released Trinity Large Thinking, an open-source reasoning model it claims is the most capable open-weight model ever released by a non-Chinese company. Built on a $20 million budget, the model competes with other top open-source offerings while maintaining Apache 2.0 licensing, positioning it as an alternative to both closed-source Western models and Chinese open-weight alternatives.
Arcee AI releases Trinity Large Thinking, open-source reasoning model with 262K context window
Arcee AI has released Trinity Large Thinking, an open-source reasoning model featuring a 262,144 token context window. The model is priced at $0.25 per million input tokens and $0.90 per million output tokens, with free access available through OpenRouter for the first five days.
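At the listed rates, per-request cost is simple arithmetic over token counts. A quick sketch using the published prices (the helper function and the 4,096-token response size are illustrative assumptions):

```python
# Published rates: $0.25 per million input tokens, $0.90 per million output tokens.
INPUT_RATE = 0.25 / 1_000_000   # dollars per input token
OUTPUT_RATE = 0.90 / 1_000_000  # dollars per output token

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated dollar cost of one request at the listed per-token rates."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# Worst case: a prompt filling the full 262,144-token context window,
# plus a hypothetical 4,096-token response.
cost = request_cost(262_144, 4_096)
print(f"${cost:.4f}")  # roughly $0.0692
```

So even a maximal-context call costs under seven cents, which is the kind of margin the open-weight pricing is competing on.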