Training models like ChatGPT, Gemini or Claude requires more and more electricity and water, to the point that the energy consumption of AI threatens to exceed that of entire countries. Data centers have become genuine resource sinks. According to estimates by the International Energy Agency, the electricity consumption of data centers could double by 2030, driven by the explosion of generative AI.
Faced with this prospect, technology giants are desperately looking for alternatives. And Google believes it has found something that seems straight out of science fiction: sending its artificial intelligence chips into space.
Conquering space. The company has unveiled Project Suncatcher, an ambitious experiment that sounds like science fiction: placing its TPUs—the chips that power its artificial intelligence—on satellites powered by solar energy. The chosen orbit, sun-synchronous, guarantees almost constant sunlight. In theory, those panels could work 24 hours a day and be up to eight times more productive than the ones we have on Earth.
Google plans to test its technology with two prototype satellites before 2027, in a joint mission with the company Planet. The objective will be to check whether its chips and communication systems can survive the space environment and, above all, whether it is feasible to perform AI calculations in orbit.
The engineering behind the idea. Although it sounds like science fiction, the project has a solid scientific basis. Google proposes to build constellations of small satellites—dozens or even hundreds—that orbit in compact formation at an altitude of about 650 kilometers. Each one would carry Trillium TPU chips on board, connected to each other by optical laser links. Such light beams would allow the satellites to “talk” to each other at speeds of up to tens of terabits per second. That is an essential capability for processing AI tasks in a distributed manner, as a terrestrial data center would do.
The technical challenge is enormous: at these distances, the optical signal weakens quickly. To compensate, the satellites would have to fly just a few hundred meters apart. According to Google’s own study, keeping them so close will require precise maneuvering, but calculations suggest that small orbit adjustments would be enough to keep the formation stable.
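As a rough illustration of why separation matters so much, the sketch below estimates what fraction of a laser beam a neighboring satellite could capture as the distance grows. The wavelength, aperture size and the simple diffraction-limited model are generic assumptions chosen for the example, not details of Google's actual hardware.

```python
# Back-of-the-envelope sketch (not Google's real link budget): how much of a laser
# beam's power a neighbouring satellite can capture as the separation grows.
# Assumed, illustrative parameters: a 1550 nm telecom laser and 10 cm optics.
WAVELENGTH_M = 1.55e-6      # typical telecom laser wavelength (assumption)
APERTURE_M = 0.10           # transmit and receive aperture diameter (assumption)

def captured_fraction(separation_m: float) -> float:
    """Fraction of transmitted power landing on the receive aperture,
    using a simple diffraction-limited divergence model."""
    divergence_rad = 1.22 * WAVELENGTH_M / APERTURE_M        # beam spread angle
    beam_diameter_m = APERTURE_M + 2 * divergence_rad * separation_m
    return min(1.0, (APERTURE_M / beam_diameter_m) ** 2)

for sep in (100, 1_000, 100_000):                            # metres
    print(f"{sep:>7} m separation -> {captured_fraction(sep):.4f} of power captured")

# Approximate output: 0.93 at 100 m, 0.53 at 1 km, under 0.001 at 100 km,
# which is why flying only a few hundred metres apart makes the links workable.
```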
In addition, engineers have already tested the radiation resistance of their chips. In an experiment with a 67 MeV proton beam, Trillium TPUs safely withstood a dose three times higher than they would receive during a five-year mission in low orbit. “They are surprisingly robust for space applications,” the company concludes in its preliminary report.
The great challenge: making it profitable. Beyond the technical problems, the economic challenge is what is in focus. According to calculations cited by The Guardian and Ars Technica, if the launch price falls below $200 per kilogram by the mid-2030s, an orbital data center could be economically comparable to a terrestrial one. The comparison is made in terms of energy cost per kilowatt per year. “Our analysis shows that space data centers are not limited by physics or insurmountable economic barriers,” says the Google team.
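To see how that metric works, here is a minimal back-of-the-envelope sketch. The $200/kg launch price comes from the article; the satellite mass per kilowatt, the mission lifetime and the terrestrial electricity price are assumptions chosen only to illustrate the comparison, not figures from Google's paper.

```python
# Rough, illustrative comparison of the "cost per kilowatt per year" metric.
# All parameters below are assumptions for the sketch, except the $200/kg target
# cited in the article.
LAUNCH_COST_PER_KG = 200.0      # USD/kg, the mid-2030s target mentioned above
MASS_PER_KW = 10.0              # kg of satellite (panels + chips) per kW of power (assumption)
LIFETIME_YEARS = 5.0            # operational life before replacement (assumption)

ELECTRICITY_PRICE_KWH = 0.08    # USD/kWh for a terrestrial data center (assumption)
HOURS_PER_YEAR = 8760

# Orbital side: the launch bill amortised over the power delivered during the mission.
orbital_cost_per_kw_year = LAUNCH_COST_PER_KG * MASS_PER_KW / LIFETIME_YEARS

# Terrestrial side: what a kilowatt running all year costs in electricity alone.
terrestrial_cost_per_kw_year = ELECTRICITY_PRICE_KWH * HOURS_PER_YEAR

print(f"Orbital (launch, amortised): ~${orbital_cost_per_kw_year:.0f} per kW-year")
print(f"Terrestrial (electricity):   ~${terrestrial_cost_per_kw_year:.0f} per kW-year")
# With these assumptions both land in the same few-hundred-dollar range, which is
# the sense in which the two approaches become "economically comparable".
```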
In space, solar energy is practically unlimited. A panel can produce up to eight times more energy than on the Earth’s surface and generate almost continuous electricity. That would eliminate the need for huge batteries or water-based cooling systems, one of the biggest environmental problems in today’s data centers. However, not everything shines in a vacuum. As The Guardian recalls, each launch emits hundreds of tons of CO₂, and astronomers warn that the growing number of satellites “is like looking at the universe through a windshield full of insects.” Furthermore, flying such compact constellations increases the risk of collisions and space debris, an already worrying threat in low orbit.
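The “eight times” figure can be sanity-checked with a quick calculation, shown below. The irradiance values and the terrestrial capacity factor are generic textbook-style assumptions, not numbers from Google.

```python
# Quick sanity check of the "up to eight times more" claim, with assumed typical values.
SOLAR_CONSTANT_KW_M2 = 1.361        # irradiance above the atmosphere
SUNLIT_FRACTION_ORBIT = 1.0         # a dawn-dusk sun-synchronous orbit is almost never in shadow
PEAK_IRRADIANCE_GROUND_KW_M2 = 1.0  # rough midday peak at the surface
CAPACITY_FACTOR_GROUND = 0.17       # night, clouds and sun angle at a good site (assumption)
HOURS_PER_YEAR = 8760

orbit_kwh_per_m2 = SOLAR_CONSTANT_KW_M2 * SUNLIT_FRACTION_ORBIT * HOURS_PER_YEAR
ground_kwh_per_m2 = PEAK_IRRADIANCE_GROUND_KW_M2 * CAPACITY_FACTOR_GROUND * HOURS_PER_YEAR

print(f"In orbit: ~{orbit_kwh_per_m2:,.0f} kWh per m² per year")
print(f"On Earth: ~{ground_kwh_per_m2:,.0f} kWh per m² per year")
print(f"Ratio:    ~{orbit_kwh_per_m2 / ground_kwh_per_m2:.1f}x")  # roughly 8x
```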
A race to conquer the sky. Google’s announcement comes in the midst of a fever for space data centers. It is not the only company looking up. Elon Musk recently claimed that SpaceX plans to scale its Starlink satellite network—already with more than 10,000 units—to create its own data centers in orbit. “It will be enough to scale the Starlink V3 satellites, which have high-speed laser links. SpaceX is going to do it,” Musk wrote on X.
For his part, Jeff Bezos, founder of Amazon and Blue Origin, predicted during Italian Tech Week that we will see “giant AI training clusters” in space in the next 10 to 20 years. In his vision, these centers would be more efficient and sustainable than terrestrial ones: “We will take advantage of solar energy 24 hours a day, without clouds or night cycles.”
Another unexpected actor is Eric Schmidt, former CEO of Google, who bought the rocket company Relativity Space precisely to move in that direction. “Data centers will require tens of additional gigawatts in a few years. Taking them off the Earth may be a necessity, not an option,” Schmidt warned in a hearing before the US Congress.
And Nvidia, the AI chip giant, also wants to try its luck: the startup Starcloud, backed by its Inception program, will launch the first H100 GPU into space this month to test a small orbital cluster. Its ultimate goal: a 5-gigawatt data center orbiting the Earth.
The new battlefield. The Google project is still in the research phase. There are no prototypes in orbit and no guarantees that there will be any soon. But the mere fact that a company of such caliber has published orbital models, radiation calculations and optical communication tests shows that the concept has already moved from the realm of speculation to that of applied engineering.
The project inherits the philosophy of the company’s other moonshots, like Waymo’s self-driving cars or its quantum computers: exploring impossible ideas until they stop being impossible. The future of computing may not be underground or in huge industrial warehouses, but in swarms of satellites shining in the permanent sun of space.
Image | Google
