GitHub Copilot adds flexible model switching to Pro tier, launches Max plan on June 1
GitHub is restructuring its Copilot individual subscription tiers effective June 1, 2025. The company is introducing flexible model allotments to Pro and Pro+ plans and launching a new Max tier, responding to user feedback on plan structure.
Plan changes
The update affects three tiers:
Pro and Pro+: Both plans will now include "flex allotments," allowing subscribers to switch between different AI models based on their development needs. GitHub has not disclosed specific details about which models will be available or usage limits.
Max (new tier): GitHub is launching a new Max subscription level, positioned above the current Pro+ tier. Pricing and specific features for Max have not been announced.
Pricing and availability
GitHub has not released pricing information for any of the updated tiers. The company stated the changes are "based on your feedback," suggesting the restructuring responds to user requests for more flexibility in model selection.
The current GitHub Copilot individual plan costs $10 per month or $100 per year. GitHub has not confirmed whether these prices will change with the new tier structure.
Technical details not disclosed
GitHub's announcement does not specify:
- Which AI models will be included in flex allotments
- Token limits or usage caps for model switching
- Specific features distinguishing Max from Pro+
- Whether the base $10/month tier will remain available
- Integration changes for existing users
The announcement appears to be a preliminary disclosure, with more detailed information expected before the June 1 launch.
What this means
GitHub is moving toward a tiered flexibility model now common among AI coding assistants, in which users pay more for access to multiple underlying models. The flex allotment approach suggests GitHub may let developers choose between faster, cheaper models for routine tasks and more capable models for complex problems. However, without disclosed usage limits or model options, it is unclear whether this represents genuine flexibility or a marketing repositioning of existing capabilities. The timing aligns with intensifying competition from Cursor, Windsurf, and other AI coding tools that already offer multi-model support.