Version History

1.0 (major)

Initial release of ZAYA1-8B, a post-trained reasoning MoE model with 760M active and 8.4B total parameters that achieves frontier-level mathematical performance.
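The gap between active and total parameters comes from the Mixture-of-Experts design: all experts are stored, but only the top-k routed experts run per token. The Python sketch below illustrates that arithmetic; the configuration values (shared parameters, per-expert size, expert count, top-k) are hypothetical and chosen only for illustration, not the published ZAYA1-8B configuration.

```python
def moe_param_counts(dense_params: int, expert_params: int,
                     num_experts: int, top_k: int) -> tuple[int, int]:
    """Return (total, active) parameter counts for a simple MoE model.

    total  = everything stored on disk / in memory (all experts)
    active = what is actually computed per token (only the routed experts)
    """
    total = dense_params + num_experts * expert_params
    active = dense_params + top_k * expert_params
    return total, active

# Hypothetical configuration, picked only to show how an ~8B-total MoE
# can have well under 1B active parameters per token.
total, active = moe_param_counts(
    dense_params=510_000_000,   # shared (non-expert) parameters
    expert_params=123_000_000,  # parameters per expert
    num_experts=64,             # experts stored in the model
    top_k=2,                    # experts routed per token
)
print(f"total ≈ {total / 1e9:.1f}B, active ≈ {active / 1e6:.0f}M")
```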

Benchmark Scores

GPQA: 71.0%
LiveCodeBench: 65.8%
MMLU-Pro: 74.2%

Coverage