product update

Anthropic's Claude Cowork now runs on Amazon Bedrock with consumption-based pricing

TL;DR

Anthropic announced Claude Cowork is now available on Amazon Bedrock, allowing organizations to deploy the desktop AI assistant through their AWS infrastructure with consumption-based pricing. Unlike Claude Enterprise, pricing flows through existing AWS agreements with no per-seat licensing from Anthropic.

Anthropic announced Claude Cowork is now available on Amazon Bedrock, enabling organizations to run the desktop AI assistant through their AWS infrastructure.

Unlike Claude Enterprise, Claude Cowork on Bedrock uses consumption-based pricing through existing AWS billing agreements, with no per-seat licensing from Anthropic. The integration routes all model inference exclusively through Amazon Bedrock in the customer's AWS account.

Deployment and Configuration

Setup requires two steps: users download Claude Desktop, then IT administrators push configuration through device management systems like Jamf, Microsoft Intune, or Group Policy. The configuration specifies the model ID, Amazon Bedrock Inference Profile, authentication method, and organizational policies.

Organizations using LLM gateways can point Claude Desktop at their gateway URL through the same managed configuration.
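Anthropic has not published the configuration schema in this announcement, but a managed configuration pushed through an MDM tool might look like the following sketch. Every key name is illustrative, and the model ID, inference profile ARN, and endpoint URLs are placeholders:

```json
{
  "inference": {
    "provider": "bedrock",
    "modelId": "anthropic.claude-model-id",
    "inferenceProfileArn": "arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/example",
    "region": "us-east-1"
  },
  "auth": {
    "method": "iam"
  },
  "gatewayUrl": null,
  "telemetry": {
    "enabled": false
  },
  "mcp": {
    "allowedEndpoints": ["https://mcp.internal.example.com"]
  }
}
```

An organization routing through an LLM gateway would set `gatewayUrl` to the gateway's endpoint instead of having the client call Amazon Bedrock directly.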

Architecture and Data Flow

Claude Cowork has three outbound paths: model inference to Amazon Bedrock in configured AWS Regions, optional MCP server connections to approved endpoints, and aggregate telemetry to Anthropic (token counts, model ID, error codes, anonymous device ID). Telemetry can be disabled through configuration.

According to AWS, Amazon Bedrock does not store prompts, files, tool inputs or outputs, or model responses, and does not use them to train foundation models.

The integration supports:

  • Authentication through AWS IAM or Amazon Bedrock API keys
  • Network isolation via VPC endpoints
  • Observability through OpenTelemetry export to Amazon CloudWatch
  • Audit through AWS CloudTrail
  • Consolidated AWS billing with granular cost attribution
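As an illustration of the IAM authentication path, a policy along these lines would restrict a user or role to invoking models only through a specific inference profile (the account ID, Region, and profile name are placeholders, and the exact resource scoping an organization needs may differ):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:us-east-1:123456789012:inference-profile/us.anthropic.claude-*"
    }
  ]
}
```

AWS CloudTrail then records each invocation for audit, and scoping usage to per-team inference profiles is one way to get the granular cost attribution in consolidated billing.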

Feature Set and Limitations

Claude Cowork includes projects, artifacts, memory, file upload and export, remote connectors, skills, plugins, and MCP servers. Features requiring Anthropic-hosted inference—including the Chat tab, Computer Use, and the Skills Marketplace—are not available because all inference routes through Amazon Bedrock.

Users can delegate research, document analysis, data processing, and report generation to the desktop application. It can connect to external data sources through MCP servers for access to live documentation, web search, and other tools.
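Claude Desktop conventionally declares MCP servers in a `claude_desktop_config.json` file; a sketch with a hypothetical documentation server (the server name and package are made up) looks like this, though the announcement does not state whether Claude Cowork's managed deployment reuses this exact file:

```json
{
  "mcpServers": {
    "internal-docs": {
      "command": "npx",
      "args": ["-y", "@example/docs-mcp-server"]
    }
  }
}
```

In a managed deployment, administrators would push entries like this through the same device-management channel as the rest of the configuration, limited to approved endpoints.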

Availability

Claude Cowork is available on macOS and Windows in AWS Regions where Claude models are available on Amazon Bedrock. Organizations already running Claude Code through Amazon Bedrock can reuse the same infrastructure setup.

What This Means

This release extends Anthropic's enterprise distribution strategy by offering a third deployment option beyond direct API access and Claude Enterprise. The consumption-based pricing model removes the barrier of per-seat licensing for organizations that want to deploy AI assistants broadly across knowledge workers while maintaining AWS-level security controls. For AWS, this deepens Bedrock's competitive position by offering exclusive access to a packaged desktop application that competes with Microsoft's Copilot and Google's Workspace AI tools, all while keeping data within customers' AWS environments.
