The debate over the feasibility of building gigantic data centers in orbit has been heating up for months. It is Silicon Valley’s new big idea for feeding the insatiable energy appetite of artificial intelligence. Then, as usual, Elon Musk entered the conversation with the subtlety of a hammer.
Elon Musk has joined the chat. After weeks of debate about the feasibility of putting servers in space, Eric Berger, senior space editor at Ars Technica, argued that it will become a more plausible option once the technology exists to assemble satellites in orbit autonomously. That was the moment Musk chose to weigh in.
“It will be enough to scale up the Starlink V3 satellites, which have high-speed laser links,” wrote the CEO of SpaceX. “SpaceX is going to do it,” he added. A statement that probably landed like a blow on the startups riding the momentum of AI in search of financing.
Why the hell do we want servers in space? The idea of moving computing to Earth orbit responds to a very real crisis: AI is an energy monster, and demand for data centers keeps growing. Against this backdrop, space offers two advantages that are impossible on Earth:
- Almost unlimited energy: In a sun-synchronous orbit, solar panels receive sunlight almost continuously (more than 95% of the time).
- Free cooling: Terrestrial data centers consume millions of liters of fresh water for cooling. With a large enough radiator, the vacuum of space can act as “an infinite heatsink at -270°C,” radiating the heat away without wasting a single drop of water (a rough sizing sketch follows this list).
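How big would that radiator have to be? The article does not say, but the Stefan-Boltzmann law gives a rough feel for it. The sketch below is a minimal estimate with assumed values for radiator temperature, emissivity and waste heat; none of these figures come from SpaceX, Starcloud or the article.

```python
# Rough radiator sizing via the Stefan-Boltzmann law:
#   P = epsilon * sigma * A * (T_radiator^4 - T_space^4)
# Every parameter below is an illustrative assumption.

SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W / (m^2 * K^4)
EMISSIVITY = 0.9        # assumed high-emissivity radiator coating
T_RADIATOR = 300.0      # assumed radiator surface temperature, K (~27 °C)
T_SPACE = 3.0           # deep-space background, K (roughly -270 °C)

def radiator_area_m2(waste_heat_watts: float) -> float:
    """Radiating area (m^2) needed to reject the given waste heat."""
    flux = EMISSIVITY * SIGMA * (T_RADIATOR**4 - T_SPACE**4)  # W per m^2
    return waste_heat_watts / flux

if __name__ == "__main__":
    for gigawatts in (0.001, 1.0, 5.0):   # 1 MW, 1 GW, 5 GW of waste heat
        area = radiator_area_m2(gigawatts * 1e9)
        print(f"{gigawatts:>5} GW of waste heat -> ~{area:,.0f} m^2 of radiator")
```

With those assumptions, a 1 GW load needs on the order of a couple of square kilometers of radiating surface, which is why Starcloud-style designs end up kilometers wide even though space itself is, in effect, an unlimited heatsink.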
The new titans of space AI. Musk is not the first to spot the business opportunity. In fact, he arrives at a party where the first contracts are already being handed out. Jeff Bezos predicted during Italian Tech Week that we will see “giant training clusters” for AI in orbit within the next 10 or 20 years.
Eric Schmidt, the former CEO of Google, bought rocket company Relativity Space precisely for this purpose. And Nvidia, the undisputed king of AI hardware, has actively backed startup Starcloud, which plans to launch the first NVIDIA H100 GPU into space this November, with the goal of eventually building a monster 5-gigawatt orbital data center.
Why Musk would win. The vision of Bezos, Schmidt and Starcloud faces two colossal obstacles: the cost of launch and the construction of the servers themselves. By some calculations, a 1 GW data center would require more than 150 launches with current technology. And Starcloud’s plan for a 4-kilometer-wide array is a logistical nightmare.
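The “more than 150 launches” figure is not broken down in the article, but a back-of-envelope estimate shows how such numbers arise. The sketch below assumes roughly 15 kg of solar array, radiator and server hardware per kilowatt and a ~100-tonne payload per launch; all of these values are assumptions chosen only to illustrate the order of magnitude.

```python
# Back-of-envelope launch count for a 1 GW orbital data center.
# All mass figures are assumptions for illustration only; real designs
# and published estimates vary widely.

POWER_KW = 1_000_000             # target IT power: 1 GW expressed in kW
SOLAR_KG_PER_KW = 5.0            # assumed lightweight deployable solar arrays
RADIATOR_KG_PER_KW = 3.0         # assumed radiator mass per kW of waste heat
SERVER_KG_PER_KW = 7.0           # assumed compute + structure mass per kW
PAYLOAD_KG_PER_LAUNCH = 100_000  # ~100 t to low Earth orbit per launch (assumed)

total_kg = POWER_KW * (SOLAR_KG_PER_KW + RADIATOR_KG_PER_KW + SERVER_KG_PER_KW)
launches = total_kg / PAYLOAD_KG_PER_LAUNCH

print(f"Estimated mass to orbit: {total_kg / 1000:,.0f} tonnes")
print(f"Launches at 100 t each:  ~{launches:,.0f}")
```

Even with optimistic mass assumptions, the tonnage lands in a range that demands well over a hundred heavy-lift launches, which is exactly the bottleneck Starship is meant to remove.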
Elon Musk has Starship, the giant rocket on which all of his competitors’ business models depend in order to be profitable. And he does not need to build a new orbital data center; he just has to adapt and scale the one he already has.
10,000 satellites and counting. SpaceX’s Starlink constellation no longer competes with satellite internet; it is going after terrestrial fiber. Musk’s company has already launched 10,000 satellites and is preparing to deploy the new V3 satellites, designed for Starship and equipped with high-speed laser links.
According to SpaceX itself, each Starship launch will add 60 terabits per second of capacity to a network that is already, in practice, a global computing and data mesh. While Starcloud has to book a rocket and assemble 4-kilometer-wide solar and cooling arrays, Musk only needs Starship to finish development so he can keep launching satellites.
