AI supercomputers are a US-first, China-second phenomenon. And growing rapidly.

AI supercomputers are scaling at an exponential rate, with performance doubling every nine months while power requirements and costs double annually. This unprecedented growth, detailed in a comprehensive study of 500 AI systems from 2019-2025, reveals a dramatic shift toward private ownership of computing resources, with industry now controlling 80% of global AI compute power. Understanding these trends is crucial as we approach a future where leading AI systems could require power equivalent to multiple cities and hardware investments in the hundreds of billions.

The big picture: AI supercomputers have experienced explosive growth in computational performance, increasing 2.5x annually through deploying more numerous and powerful specialized chips.

  • Leading systems that once contained fewer than 10,000 chips now regularly feature more than 100,000, exemplified by xAI’s Colossus with its 200,000 AI chips.
  • This growth is driven by a yearly 1.6x increase in chip quantity combined with a 1.6x annual improvement in performance per chip.
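As a sanity check, the two 1.6x factors above compound to roughly the reported 2.5x annual growth, which in turn implies the roughly nine-month doubling time cited at the top of the article. A minimal back-of-the-envelope sketch using only figures stated above:

```python
import math

chip_count_growth = 1.6   # annual growth in chips per system (from the article)
per_chip_growth = 1.6     # annual growth in performance per chip (from the article)

# Combined annual performance growth: 1.6 * 1.6 = 2.56, consistent
# with the reported ~2.5x per year.
annual_growth = chip_count_growth * per_chip_growth

# Time for performance to double at this compounding rate, in months.
doubling_months = 12 * math.log(2) / math.log(annual_growth)

print(f"annual growth: {annual_growth:.2f}x")
print(f"doubling time: {doubling_months:.1f} months")
```

At exactly 2.56x per year the doubling time works out just under nine months; the article's "every nine months" corresponds to the rounded 2.5x figure.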

Behind the numbers: The expansion of AI supercomputers has created massive energy and financial demands, with power requirements and hardware costs doubling every year.

  • xAI's Colossus, the most powerful AI supercomputer as of March 2025, requires approximately 300 megawatts of power (equivalent to 250,000 households) and cost an estimated $7 billion in hardware alone.
  • Despite growing power demands, computational performance per watt has increased by 1.34x annually, primarily through the adoption of more energy-efficient chips.
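The efficiency figure squares with the headline growth rate: annual power doubling combined with the 1.34x efficiency gain implies performance growth close to the ~2.5x reported earlier. A quick consistency check on the article's numbers:

```python
power_growth = 2.0        # annual power doubling (from the article)
efficiency_growth = 1.34  # annual performance-per-watt improvement (from the article)

# Performance growth implied by power * efficiency:
# 2.0 * 1.34 = 2.68, in the same ballpark as the reported ~2.5x annual growth.
implied_perf_growth = power_growth * efficiency_growth

print(f"implied performance growth: {implied_perf_growth:.2f}x")
```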

Key shift: The landscape of AI supercomputing has transformed from primarily academic and public research to industry dominance in just six years.

  • Industry’s share of global AI compute jumped from 40% in 2019 to 80% in 2025, as private companies rapidly scaled their systems to conduct larger training runs.
  • Leading industry systems grew by 2.7x annually, significantly outpacing the 1.9x annual growth of public sector systems.

Zoom out: The global distribution of AI supercomputing power shows overwhelming American dominance, with significant implications for technological leadership.

  • The United States controls approximately 75% of global AI supercomputer performance in the dataset, with China a distant second at 15%.
  • Traditional supercomputing powers like the UK, Germany, and Japan now play marginal roles in the AI supercomputing landscape.

Implications: If current growth trajectories continue, the scale of future AI systems will test physical and economic boundaries.

  • Projections suggest that by 2030, the largest AI supercomputer could require 9 gigawatts of power and cost hundreds of billions of dollars to build.
  • These unprecedented requirements raise serious questions about the sustainability of AI scaling and who will be able to participate in frontier AI development.
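The 2030 projection follows from simple compounding of figures stated earlier in the article. A sketch assuming annual doubling continues from Colossus's roughly 300 MW and $7 billion in 2025:

```python
base_power_mw = 300    # Colossus's estimated draw, March 2025 (from the article)
base_cost_b = 7        # Colossus's estimated hardware cost in $B (from the article)
years = 5              # 2025 -> 2030
annual_doubling = 2.0  # power and hardware cost both reported to double yearly

projected_power_mw = base_power_mw * annual_doubling ** years
projected_cost_b = base_cost_b * annual_doubling ** years
# 300 MW * 2^5 = 9,600 MW (~9.6 GW), in line with the ~9 GW projection
# $7B * 2^5 = $224B, consistent with "hundreds of billions of dollars"
print(f"2030 power: {projected_power_mw / 1000:.1f} GW, cost: ${projected_cost_b:.0f}B")
```

This is extrapolation, not forecast: it assumes the historical doubling rates hold for five more years, which the article itself flags as a test of physical and economic limits.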
Source: Trends in AI Supercomputers
