model release

IBM releases Apache 2.0 Granite 4.1 LLMs in 3B, 8B, and 30B sizes

TL;DR

IBM has released the Granite 4.1 family of language models under the Apache 2.0 license. The models come in 3B, 8B, and 30B parameter sizes. Unsloth has released 21 GGUF-quantized variants of the 3B model, ranging from 1.2GB to 6.34GB.



IBM has released the Granite 4.1 family of language models under the Apache 2.0 license. The models are available in three sizes: 3B, 8B, and 30B parameters.

Model availability and quantization

Unsloth released 21 GGUF-quantized variants of the 3B model on Hugging Face. The quantized files range from 1.2GB to 6.34GB, with the full collection totaling 51.3GB. The GGUF format allows the models to run on consumer hardware with reduced memory requirements.
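The reported file-size range follows roughly from the bits-per-weight of each quantization level. As a rough sketch (the quantization names and bit widths below are typical llama.cpp figures, not published Granite numbers, and real GGUF files carry extra metadata overhead):

```python
# Rough file-size estimate for quantized variants of a 3B-parameter model.
# Bits-per-weight values are approximate llama.cpp averages (assumptions,
# not IBM- or Unsloth-published figures).
PARAMS = 3e9  # 3B parameters

approx_bits_per_weight = {
    "Q2_K": 2.6,     # aggressive 2-bit-family quantization
    "Q4_K_M": 4.8,   # common quality/size middle ground
    "Q8_0": 8.5,     # near-lossless 8-bit
    "F16": 16.0,     # unquantized half precision
}

def estimate_gb(params: float, bits: float) -> float:
    """Estimated file size in GB: parameters * bits / 8 bits-per-byte."""
    return params * bits / 8 / 1e9

for name, bits in approx_bits_per_weight.items():
    print(f"{name}: ~{estimate_gb(PARAMS, bits):.1f} GB")
```

The estimates bracket the 1.2GB-6.34GB range the article reports: roughly 1GB at the low end and 6GB for unquantized F16.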

Training details

Granite team member Yousaf Shah published a detailed description of the training process in "Granite 4.1 LLMs: How They're Built" on the Hugging Face blog. The post covers the technical architecture and training methodology used for the model family.

Model performance

An informal test of the 3B model's SVG-generation capabilities across quantization levels showed inconsistent results. Prompting all 21 quantized variants to "Generate an SVG of a pelican riding a bicycle" revealed no clear correlation between quantization level (file size) and output quality. All variants produced abstract shapes rather than recognizable images, suggesting the model was not specifically trained for visual generation tasks.
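A first-pass check in such a test is whether the model emitted well-formed SVG at all, separate from whether the image is recognizable. A minimal sketch of that check (an assumed evaluation approach, not the harness used in the article's test):

```python
# Sketch: detect and validate an SVG fragment inside raw model output.
import re
import xml.etree.ElementTree as ET

def extract_svg(text: str):
    """Pull the first <svg>...</svg> span out of raw model output."""
    match = re.search(r"<svg\b.*?</svg>", text, re.DOTALL | re.IGNORECASE)
    return match.group(0) if match else None

def is_valid_svg(text: str) -> bool:
    """True if the output contains a parseable (well-formed) SVG fragment."""
    svg = extract_svg(text)
    if svg is None:
        return False
    try:
        ET.fromstring(svg)  # XML well-formedness check only
        return True
    except ET.ParseError:
        return False

sample = 'Sure: <svg xmlns="http://www.w3.org/2000/svg"><circle r="5"/></svg>'
print(is_valid_svg(sample))
```

Well-formedness is a much weaker bar than "looks like a pelican," which is why a test like the one described still needs human judgment of the rendered output.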

What this means

The Apache 2.0 license allows commercial deployment without restrictive terms, positioning Granite 4.1 as an alternative to models released under more limiting licenses. The 21 quantized variants give deployers a wide range of tradeoffs between file size and output quality. The weak SVG results indicate these models are focused on text processing rather than visual tasks, even though they can emit SVG markup. Organizations evaluating Granite 4.1 should test it on their specific use cases rather than assume capabilities from parameter count alone.

Related Articles

model release

IBM releases Granite 4.1-8B with 131K context window and enhanced tool-calling capabilities

IBM has released Granite 4.1-8B, an 8-billion parameter long-context model with a 131,072-token context window. The model achieves 85.37% on HumanEval and 73.84% on MMLU 5-shot, with enhanced tool-calling capabilities reaching 68.27% on BFCL v3. Released under Apache 2.0 license, it supports 12 languages.

model release

IBM Releases Granite 4.1 30B With 131K Context Window and Enhanced Tool-Calling

IBM released Granite 4.1 30B, a 30-billion parameter instruction-following model with a 131,072 token context window. The model scores 80.16 on MMLU 5-shot and 88.41 on HumanEval pass@1, with enhanced tool-calling capabilities following OpenAI's function definition schema.

model release

IBM Releases Granite 4.1 8B with 131K Context Window at $0.05/M Input Tokens

IBM has released Granite 4.1 8B, an 8-billion-parameter decoder-only language model with a 131,072-token context window. The model supports 12 languages and costs $0.05 per million input tokens and $0.10 per million output tokens, available under the Apache 2.0 license.

model release

IBM's Granite 4.1: 8B Dense Model Matches 32B MoE Performance on 15T Tokens

IBM released Granite 4.1, a family of dense decoder-only LLMs (3B, 8B, 30B parameters) trained on approximately 15 trillion tokens using a five-phase pre-training pipeline. The 8B instruct model matches or surpasses the previous Granite 4.0-H-Small (32B-A9B MoE) despite using fewer parameters and a simpler dense architecture. All models support up to 512K context windows and are released under Apache 2.0 license.
