Harrier (27B)

Microsoft (United States)
Status: active
Context window: 131K tokens

Version History

v1 (major)

Initial release of the Harrier embedding model. Trained on more than 2 billion examples, including GPT-5 synthetic data. Achieves the top ranking on the MTEB v2 multilingual benchmark and supports a 131K-token context window.

Coverage

model release · Microsoft

Microsoft open-sources Harrier embedding model with 27B parameters, 131K context window

Microsoft's Bing team has open-sourced Harrier, a 27-billion-parameter embedding model that supports over 100 languages and features a 131,072-token context window. The model ranks first on the MTEB v2 multilingual benchmark, outperforming proprietary offerings from OpenAI and Amazon, and is available on Hugging Face under the MIT license.
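Embedding models like this are typically used for semantic search: texts are encoded into vectors, and candidates are ranked by cosine similarity to a query vector. A minimal sketch of that ranking step, using toy vectors in place of real model output (no model-loading API is shown, since none is specified here):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_by_similarity(query_vec, doc_vecs):
    """Return document indices sorted from most to least similar to the query."""
    scores = [cosine_similarity(query_vec, d) for d in doc_vecs]
    return sorted(range(len(doc_vecs)), key=lambda i: scores[i], reverse=True)

# Toy 3-dimensional "embeddings" standing in for real model output.
query = np.array([1.0, 0.0, 0.0])
docs = [
    np.array([0.9, 0.1, 0.0]),  # nearly parallel to the query
    np.array([0.0, 1.0, 0.0]),  # orthogonal to the query
    np.array([0.7, 0.7, 0.0]),  # partially similar
]
print(rank_by_similarity(query, docs))  # → [0, 2, 1]
```

In practice the vectors would come from the model's encoder rather than being hand-written, and a vector index would replace the brute-force scoring loop for large corpora.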
