Alibaba releases Qwen3.5-35B-A3B, a 35B multimodal model with Apache 2.0 license
Alibaba's Qwen team has released Qwen3.5-35B-A3B-Base, a 35-billion-parameter multimodal model for image-text-to-text tasks. The model is available under the Apache 2.0 license and works with major inference endpoints, including Azure deployment options.
Model Details
The model was published on Hugging Face on February 24, 2026 and carries an Apache 2.0 license, allowing both commercial and research use without licensing restrictions. It is tagged as part of the Qwen3.5 MoE (mixture-of-experts) family: rather than running every parameter for every token, an MoE model routes each token through a small subset of expert sub-networks, a form of conditional computation that improves efficiency.
Qwen3.5-35B-A3B-Base accepts multimodal input, processing both images and text to generate text output. The model is compatible with the Transformers library and stores its weights in the SafeTensors format, a security-focused serialization standard that, unlike pickle-based checkpoints, cannot execute code when loaded.
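The security property of SafeTensors comes from its simplicity: a file is just an 8-byte little-endian header length, a JSON header mapping tensor names to dtype, shape, and byte offsets, and then the raw tensor bytes. The toy round-trip below illustrates that layout with the standard library only; it is for illustration, not for loading real checkpoints (use the `safetensors` library for that).

```python
import json
import struct

# Toy illustration of the SafeTensors file layout: an 8-byte little-endian
# header length, a JSON header describing each tensor, then raw tensor bytes.
# Not a full implementation -- real files support many dtypes and metadata.

def save_safetensors(tensors: dict) -> bytes:
    header, buf, offset = {}, b"", 0
    for name, values in tensors.items():
        raw = struct.pack(f"<{len(values)}f", *values)  # float32, little-endian
        header[name] = {"dtype": "F32", "shape": [len(values)],
                        "data_offsets": [offset, offset + len(raw)]}
        buf += raw
        offset += len(raw)
    hjson = json.dumps(header).encode()
    return struct.pack("<Q", len(hjson)) + hjson + buf

def load_safetensors(blob: bytes) -> dict:
    (hlen,) = struct.unpack_from("<Q", blob, 0)   # header length prefix
    header = json.loads(blob[8:8 + hlen])          # JSON metadata, no code
    data = blob[8 + hlen:]
    out = {}
    for name, meta in header.items():
        start, end = meta["data_offsets"]
        n = (end - start) // 4                     # 4 bytes per float32
        out[name] = list(struct.unpack_from(f"<{n}f", data, start))
    return out

weights = {"layer0.bias": [0.5, -1.25, 2.0]}       # exactly float32-representable
assert load_safetensors(save_safetensors(weights)) == weights
```

Because parsing the file only involves reading a length prefix, JSON, and raw bytes, there is no deserialization step that could run attacker-controlled code, which is the weakness of pickle-based weight files.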
Availability and Deployment
The model had recorded 1,937 downloads and 62 likes on Hugging Face as of publication. It works with inference endpoints from major cloud providers, including Azure deployment options, making it accessible for production use cases.
The "Base" suffix marks this as the pretrained foundation model, without instruction tuning or fine-tuning for specific tasks; that optimization is left to end users or downstream applications.
Context
This release continues the momentum of Alibaba's Qwen series in the open-weight model space. The Qwen3.5 line iterates on Qwen3, and the A3B suffix follows Qwen's established MoE naming convention (as in Qwen3-30B-A3B): of the 35 billion total parameters, roughly 3 billion are activated for each token.
The mixture-of-experts architecture typically makes inference cheaper than a dense model of the same total parameter count, because only the experts selected by the router run for each token; exact computational requirements for this model have not yet been published.
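The efficiency argument can be made concrete with a toy router. The sketch below implements top-k expert routing in NumPy; all sizes (8 experts, top-2 routing, 16-dimensional tokens) are illustrative placeholders, not Qwen3.5's actual configuration, and real MoE layers use feed-forward experts rather than single matrices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy MoE layer: 8 experts, but only the top-2 run per token, so most
# parameters sit idle on any given forward pass (conditional computation).
# Sizes are illustrative only, not the model's real configuration.
n_experts, d_model, top_k = 8, 16, 2
router_w = rng.normal(size=(d_model, n_experts))           # router (gating) weights
expert_w = rng.normal(size=(n_experts, d_model, d_model))  # one matrix per expert

def moe_forward(x):
    """x: (d_model,) single token. Returns the (d_model,) layer output."""
    logits = x @ router_w               # score every expert
    top = np.argsort(logits)[-top_k:]   # keep only the top-k expert indices
    gates = np.exp(logits[top])
    gates /= gates.sum()                # softmax over the selected experts
    # Only top_k of the n_experts weight matrices are touched here.
    return sum(g * (x @ expert_w[e]) for g, e in zip(gates, top))

x = rng.normal(size=d_model)
y = moe_forward(x)
active_frac = top_k / n_experts         # fraction of experts doing work
print(f"output shape: {y.shape}, experts active per token: {active_frac:.2f}")
```

In this toy, only a quarter of the expert parameters participate in any one forward pass, which is the sense in which an MoE model's per-token compute tracks its activated parameters rather than its total parameter count.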
What This Means
Alibaba is positioning Qwen3.5-35B-A3B as an open alternative for organizations needing multimodal capabilities at the 35B scale. The Apache 2.0 license removes commercial licensing obstacles, and cloud provider integration lowers the infrastructure barrier. The model joins a competitive field of open multimodal models at 30B+ parameters from Meta, Mistral, and others, each with different architectural choices and trade-offs in performance, efficiency, and licensing.