Spreading out: Startups build cutting-edge AI models without data centers

A new approach to AI model training could disrupt the centralized power structure that has dominated artificial intelligence development. By distributing computation across ordinary GPUs connected over the internet, two startups have demonstrated an alternative path that could challenge the resource-intensive training methods that currently give tech giants their competitive edge.

The big picture: Researchers from Flower AI and Vana have successfully trained a language model called Collective-1 using GPUs spread across the globe rather than concentrated in data centers.

  • This distributed approach allowed them to incorporate both public and private data sources, including messages from X, Reddit, and Telegram provided by Vana.
  • Though modest at 7 billion parameters, compared with the hundreds of billions in cutting-edge models like those behind ChatGPT, this proof of concept demonstrates a potentially transformative approach to AI development.
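
How this works isn't spelled out in the report, but the general pattern is well known: rather than synchronizing every gradient over a data center's high-speed interconnect, geographically scattered GPUs each train a local copy of the model on their own data and only occasionally merge weights. The PyTorch sketch below illustrates that pattern in the style of federated averaging; the tiny model, random data shards, and hyperparameters are placeholders, and it is not Flower AI's actual training code.

```python
# Illustrative only: periodic weight averaging across simulated workers,
# not Flower AI's actual training stack. Model, data, and hyperparameters
# are placeholders.
import copy

import torch
import torch.nn as nn


def make_model():
    # Tiny stand-in for a language model; a real run would use a transformer.
    return nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 32))


def local_steps(model, shard, steps=5, lr=1e-2):
    # Each worker trains on its own data shard using only local compute.
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    x, y = shard
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    return model.state_dict()


def average_weights(states):
    # The only cross-worker communication: average the parameters, which is
    # far cheaper over the public internet than syncing every gradient step.
    avg = copy.deepcopy(states[0])
    for key in avg:
        avg[key] = torch.stack([s[key].float() for s in states]).mean(dim=0)
    return avg


# Simulate four geographically scattered GPUs, each holding a private shard.
workers = [make_model() for _ in range(4)]
shards = [(torch.randn(16, 32), torch.randn(16, 32)) for _ in range(4)]

global_state = workers[0].state_dict()
for communication_round in range(10):
    states = []
    for model, shard in zip(workers, shards):
        model.load_state_dict(global_state)  # start from the shared weights
        states.append(local_steps(model, shard))
    global_state = average_weights(states)   # merge into the new global model
```

The key property is that the expensive step, backpropagation, stays on each worker's own hardware, and only model weights cross the network once per round, which is what makes training over ordinary internet connections plausible.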

Why this matters: The current AI landscape is dominated by companies with access to massive computing resources in centralized data centers, creating high barriers to entry.

  • Distributed model training could democratize AI development by eliminating the need for organizations to own or rent expensive GPU clusters connected via specialized networking.
  • This shift might enable smaller players to build competitive AI systems without the capital requirements that currently favor tech giants.

What’s next: Flower AI is already scaling up its distributed approach with ambitious plans for larger models.

  • The company is currently training a 30 billion parameter model using conventional data sources.
  • According to Nic Lane, a computer scientist at the University of Cambridge and a cofounder of Flower AI, the team plans to train a 100 billion parameter model later this year, approaching the scale of industry-leading systems.

The bottom line: While still early in development, this distributed training methodology represents a potential inflection point in how AI systems are built, one that could reshape industry power dynamics by lowering the technical and financial barriers to advanced AI development.

