funding

Thinking Machines Lab secures Nvidia compute deal with 1+ gigawatt power allocation

TL;DR

Thinking Machines Lab has secured a multi-year compute deal with Nvidia involving at least 1 gigawatt of processing power, according to the company. The agreement also includes a strategic investment from Nvidia, marking a significant infrastructure commitment for the AI research organization.


Thinking Machines Lab has inked a multi-year compute agreement with Nvidia that allocates at least 1 gigawatt of processing capacity, the company announced. Nvidia is also making a strategic investment as part of the deal.

The agreement represents a substantial infrastructure commitment. A gigawatt measures data-center power capacity rather than compute directly, but at the power draw of current Nvidia accelerators, a sustained 1-gigawatt allocation corresponds to hundreds of thousands of GPUs available for large-scale AI model training and inference workloads.
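As a rough back-of-envelope sketch, the power figure can be converted into an accelerator count. The per-GPU draw and overhead factor below are illustrative assumptions, not numbers from the announcement:

```python
# Rough estimate: how many accelerators 1 GW of facility power could
# support. All figures are illustrative assumptions, not numbers from
# the Thinking Machines Lab / Nvidia announcement.

FACILITY_POWER_W = 1_000_000_000  # 1 gigawatt allocation
GPU_POWER_W = 1_200               # assumed all-in draw per accelerator, incl. host share
PUE = 1.3                         # assumed power usage effectiveness (cooling, overhead)

gpus = FACILITY_POWER_W / (PUE * GPU_POWER_W)
print(f"~{gpus:,.0f} accelerators")  # on the order of several hundred thousand
```

Under these assumptions the allocation works out to roughly 600,000–700,000 accelerators; different hardware generations or overhead factors shift the figure, but not the order of magnitude.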

Deal Structure

While specific financial terms remain undisclosed, the arrangement combines:

  • Compute allocation: At least 1 gigawatt of processing power over multiple years
  • Strategic investment: Capital contribution from Nvidia (amount not disclosed)
  • Multi-year commitment: Duration and scaling terms not specified

What We Don't Know

The announcement lacks critical details:

  • Total deal value
  • Exact compute allocation duration and scaling schedule
  • Whether this covers H100/H200 GPUs or other Nvidia hardware
  • Whether the investment includes board seats or strategic influence
  • Thinking Machines Lab's planned use cases (model development, inference, other)

Context

Thinking Machines Lab is a San Francisco-based AI research organization founded by former OpenAI CTO Mira Murati. This deal positions it alongside other well-capitalized AI labs securing major compute commitments from Nvidia. Similar infrastructure agreements have become standard among frontier AI research groups building or fine-tuning large language models.

What This Means

Thinking Machines Lab now has guaranteed access to substantial compute resources, a prerequisite for developing competitive frontier AI models or running large-scale training operations. For Nvidia, the investment signals confidence in the lab's technical direction while deepening its footprint across the AI ecosystem. The strategic investment component suggests Nvidia sees long-term value in the partnership beyond immediate GPU sales.

Related Articles

Nvidia reportedly planning $30 billion investment in OpenAI

Nvidia is reportedly planning a $30 billion investment in OpenAI, according to Reuters citing sources familiar with the matter. The deal would represent one of the largest funding commitments in the AI sector to date. Terms and timeline have not been officially confirmed by either company.

Nvidia to spend $26B on open-weight AI models, filing reveals

Nvidia will invest $26 billion over the next five years to build open-weight AI models, according to a 2025 financial filing confirmed by executives. The move signals a strategic shift from chipmaker to AI frontier lab, with the company releasing Nemotron 3 Super (128B parameters) and claiming it outperforms GPT-OSS on multiple benchmarks.

Nvidia-backed Nscale raises $2B, hits $14.6B valuation with Sandberg and Clegg joining board

Nvidia-backed British AI infrastructure startup Nscale has raised $2 billion in a new funding round, bringing its valuation to $14.6 billion. The round marks a significant milestone for the infrastructure-focused startup, with Meta's former COO Sheryl Sandberg and Meta's former VP of Global Affairs Nick Clegg joining the board.

OpenAI closes $110B funding round from Amazon, Nvidia, SoftBank at $730B valuation

OpenAI has closed a $110 billion funding round with Amazon committing $50 billion, Nvidia $30 billion, and SoftBank $30 billion. The company is now valued at $730 billion, following a previous $40 billion round in 2025. The funding includes custom model development agreements between OpenAI and Amazon Web Services.
