ai2

4 articles tagged with ai2

May 8, 2026
model release

Allen Institute releases EMO, 14B-parameter MoE model with selective 12.5% expert use

Allen Institute for AI released EMO, a mixture-of-experts model with 1B active and 14B total parameters, trained on 1 trillion tokens. The model activates 8 experts per token from a pool of 128, and for specific tasks can maintain near-full-model performance while using just 12.5% of its experts.
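The selective-expert mechanism described here is standard top-k mixture-of-experts routing: a router scores all experts per token and only the k highest-scoring ones run. A minimal NumPy sketch of the idea, with hypothetical dimensions and single-matrix "experts" (not EMO's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, n_experts, top_k = 64, 128, 8  # 8-of-128 matches the release; d_model is illustrative

# Router weights and one weight matrix per expert (real experts are small MLPs)
W_router = rng.standard_normal((d_model, n_experts)) * 0.02
W_experts = rng.standard_normal((n_experts, d_model, d_model)) * 0.02

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ W_router                            # (tokens, n_experts) router scores
    top = np.argsort(logits, axis=-1)[:, -top_k:]    # indices of the k best experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = top[t]
        # Softmax over only the selected experts' scores
        w = np.exp(logits[t, sel] - logits[t, sel].max())
        w /= w.sum()
        for weight, e in zip(w, sel):
            out[t] += weight * (x[t] @ W_experts[e])
    return out

tokens = rng.standard_normal((4, d_model))
y = moe_layer(tokens)
print(y.shape)  # (4, 64)
```

Because only 8 of 128 expert weight matrices are touched per token, compute per token scales with active parameters (here 1B) rather than total parameters (14B).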

March 25, 2026
model release

AI2 releases MolmoWeb, open web agent matching proprietary systems with 8B parameters

The Allen Institute for AI has released MolmoWeb, a fully open web agent that operates websites from screenshots alone, without access to page source code. The 8B-parameter model achieves 78.2% success on WebVoyager, nearly matching OpenAI's o3 at 79.3%, and was trained on one of the largest public web task datasets released to date.
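A screenshot-only agent of this kind runs an observe-act loop: capture pixels, ask the model for the next UI action, apply it, repeat. A hedged sketch of that loop with stubbed, hypothetical callbacks (`choose_action` stands in for MolmoWeb's model inference; none of these names come from the release):

```python
from dataclasses import dataclass

@dataclass
class Action:
    kind: str        # "click", "type", or "done"
    x: int = 0
    y: int = 0
    text: str = ""

def choose_action(screenshot: bytes, goal: str) -> Action:
    """Stand-in for the vision-language model: maps a screenshot + goal to one action."""
    # A real agent would run model inference here; stubbed for illustration.
    return Action(kind="done")

def run_agent(goal: str, take_screenshot, apply_action, max_steps: int = 20) -> bool:
    """Observe-act loop: the agent sees only pixels, never the page's HTML or DOM."""
    for _ in range(max_steps):
        shot = take_screenshot()
        action = choose_action(shot, goal)
        if action.kind == "done":
            return True
        apply_action(action)   # e.g. click at (x, y) or type text into the focused field
    return False

# Minimal harness with stubbed browser callbacks
done = run_agent("find pricing page", take_screenshot=lambda: b"", apply_action=lambda a: None)
print(done)  # True
```

The design point is that actions are expressed in screen coordinates, so the same agent works on any site regardless of how its markup is structured.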

March 14, 2026
model release

AI2 releases robotics models trained entirely in simulation, achieving zero-shot real-world transfer

AI2 has released MolmoSpaces and MolmoBot, robotics models trained exclusively in simulation that transfer directly to real robots with no real-world data collection or fine-tuning. The approach eliminates the months of teleoperated demonstrations typically needed to adapt policies to physical hardware. Both systems are open source.

March 11, 2026
research

AI2 uses virtual simulation data to train physical AI robots, reducing real-world data costs

AI2 is developing physical AI systems trained primarily on virtual simulation data rather than expensive real-world demonstrations. The approach, demonstrated through projects like MolmoBot, addresses the historical bottleneck of manually collecting hardware training data.
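The articles don't detail AI2's training recipe, but a common ingredient in zero-shot sim-to-real transfer is domain randomization: varying simulator parameters across episodes so the policy learns cues that also hold on real hardware. A minimal sketch of that idea (parameter names and ranges are illustrative, not AI2's actual setup):

```python
import random

def make_sim_env(seed: int) -> dict:
    """Sample randomized physics and rendering parameters for one simulated episode.
    Ranges are illustrative only."""
    rng = random.Random(seed)
    return {
        "friction": rng.uniform(0.4, 1.2),
        "object_mass_kg": rng.uniform(0.05, 0.5),
        "camera_jitter_px": rng.randint(0, 8),
        "light_intensity": rng.uniform(0.5, 1.5),
    }

# Training across many randomized environments forces the policy to be robust
# to physics it was never exactly tuned for, which is what enables deployment
# on a real robot without collecting real-world demonstrations.
envs = [make_sim_env(seed) for seed in range(1000)]
frictions = [e["friction"] for e in envs]
print(len(envs))  # 1000
```

Each training episode draws a fresh environment, so no single simulator configuration is ever memorized.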