Alibaba releases Qwen3.5-27B, a 27B multimodal model with Apache 2.0 license
Alibaba Qwen has released Qwen3.5-27B, a 27-billion parameter model capable of processing both images and text. The model is available under an Apache 2.0 open license and is compatible with standard transformer endpoints.
Alibaba's Qwen team has published Qwen3.5-27B, a 27-billion parameter model designed to handle both image and text inputs. The release marks the latest iteration in Alibaba's open-source model lineup.
Model Specifications
Qwen3.5-27B is a multimodal model with an architecture supporting image-text-to-text tasks. The model carries an Apache 2.0 license, making it freely available for both research and commercial use. It is compatible with standard Transformers inference endpoints, and its weights are distributed in the safetensors format.
At 27 billion parameters, the model sits in the mid-range segment: larger than models like Mistral 7B but well below the flagship 70B-class releases. This size targets deployment scenarios where computational resources are constrained but model capability remains a priority.
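To make the resource trade-off concrete, a back-of-envelope calculation of the weights-only memory footprint at common precisions is sketched below. This is illustrative arithmetic, not a published figure from the release; real serving memory also includes activations and KV cache.

```python
# Rough weights-only memory footprint for a 27B-parameter model.
# Illustrative estimate; actual serving requirements are higher.
PARAMS = 27e9  # parameter count

def weight_gib(bytes_per_param: float) -> float:
    """Weights-only footprint in GiB at the given precision."""
    return PARAMS * bytes_per_param / 2**30

fp16_gib = weight_gib(2.0)  # 16-bit weights: ~50 GiB
int8_gib = weight_gib(1.0)  # 8-bit quantized: ~25 GiB
int4_gib = weight_gib(0.5)  # 4-bit quantized: ~13 GiB
```

By this estimate, a 16-bit copy of the weights exceeds a single 48 GB accelerator, while 4-bit quantization brings the model within reach of a single 24 GB consumer GPU, which is the practical appeal of the 27B size class over 70B+ models.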
Capability Profile
Qwen3.5-27B is tagged for conversational and multimodal-understanding tasks, indicating it can sustain dialogue while processing images alongside text prompts. The image-text-to-text classification means the model accepts combined image and text inputs and generates text responses.
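For models in this image-text-to-text category, combined inputs are typically expressed as a structured chat message mixing image and text content parts. The sketch below shows that general payload shape as used by Hugging Face chat templates; the image URL and prompt are placeholder assumptions, and the release itself does not document a specific prompt format.

```python
# Hypothetical sketch of a combined image+text prompt in the
# content-parts chat format common to image-text-to-text models.
# The URL and question are illustrative placeholders.
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/chart.png"},
            {"type": "text", "text": "Summarize the trend shown in this chart."},
        ],
    }
]

# A processor's chat template would render this into model inputs;
# the model then returns an ordinary text completion.
```

The key design point is that images are not embedded in the prompt string itself: they travel as separate content parts, and the processor interleaves them with the text before inference.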
Specific benchmark scores, training data composition, knowledge cutoff date, and maximum context window length have not been disclosed in the initial release metadata.
Availability and Licensing
The model is hosted on Hugging Face and is immediately available for download. The Apache 2.0 license removes legal barriers to commercial deployment, distinguishing this release from many restricted-license models. Support for standard transformer inference frameworks means existing tooling can run the model without custom implementations.
No pricing information or commercial hosting details have been announced.
What This Means
Qwen3.5-27B extends Alibaba's open-source lineup into more direct competition with mid-range multimodal offerings from Mistral, Meta, and others. The 27B parameter count targets developers who need multimodal capability without the compute overhead of 70B+ models, and the Apache 2.0 license removes deployment friction compared to restricted-license alternatives. However, without disclosed benchmarks or performance data, its comparative positioning against other 27-30B multimodal models remains unclear. Organizations evaluating this model should establish baselines on their specific use cases before production deployment.