While Silicon Valley dreams of servers in orbit, Russia prepares a nuclear reactor on lunar soil

Until recently, the space race was about who could get there first. Today the question is different: who will be able to turn on the lights on the Moon? While companies like Google or Nvidia imagine satellites loaded with computers for their artificial intelligence, Russia has put a far more down-to-earth (or down-to-moon) plan on the table: installing a small nuclear power plant on the surface of our satellite.

A reactor by 2036. The Russian space corporation Roscosmos has signed a state contract with the aerospace company NPO Lavochkin to develop a lunar nuclear power plant. According to Reuters, the deadline set in the contract is 2036. The political timetable, however, is far more aggressive: Yury Borisov, head of Roscosmos, has placed the real operational window between 2033 and 2035.

Although official statements sometimes avoid the word "nuclear," the project's participants dispel any doubt: the Kurchatov Institute (a leader in nuclear research in Russia) and Rosatom (the state atomic flagship) are in charge. As Interfax points out, the objective is to power the infrastructure of the International Lunar Research Station (ILRS), a joint project with China that seeks to move from "round trip" missions to a permanent human presence.

Why nuclear? A colony on the Moon faces nights that last 14 Earth days. During that time, frigid temperatures and the lack of light render solar panels useless for keeping astronauts alive or powering life-support systems.

Mikhail Kovalchuk, head of the Kurchatov Institute, explained in an interview with the Russian agency TASS that Russia must "run forward." According to the agency, the country seeks to consolidate its leadership through "Atomic Project 2.0," which includes new-generation reactors and closed-cycle systems. It is not just about science: Russia admits that partners like China and India have learned a great deal from it and are now direct competitors.

Eyes in the sky: preparing the ground.
For the Russian reactor to reach the Moon, Moscow is already preparing the logistics. According to another TASS statement, Russia plans to launch 52 satellites from the Vostochny cosmodrome. Among them, the Aist-2T stands out, capable of creating 3D models of the lunar terrain and monitoring emergency situations. It is the infrastructure needed so that the "lunar atom" does not suffer the same fate as the failed Luna-25 probe in 2023.

The Moscow-Beijing axis: a long-range alliance. This deployment is not a solitary effort. As Interfax details, Russia and China formalized their ambition in May 2024 with a memorandum of cooperation for the joint construction of this nuclear plant. They are not starting from scratch: both countries presented a roadmap in 2021 that includes five joint missions to deploy modules in lunar orbit and on the surface. While Russia brings its historical advantage in space nuclear power, China provides the scientific capacity and resources for the ILRS to be permanently inhabited from 2030.

The board of the new Cold War. Washington has not stood idly by in the face of the Russian-Chinese alliance. NASA has received a clear directive from the current administration: it needs a reactor on the Moon by 2030. "We are in a race with China," said Sean Duffy, Secretary of Transportation, who has led this directive.

Behind this urgency is not only prestige but the control of strategic resources. The Moon is the great deposit of helium-3, an extremely rare isotope that is emerging as the "fuel of the future" for nuclear fusion. The White House's fear is that if the Russia-China alliance arrives first, it will be able to declare "exclusion zones," blocking access to this isotope and to other metals essential to the technology industry. Faced with this threat, the US has raised the power of its nuclear project from the original 40 kW to a minimum of 100 kW.

Infrastructure over prestige.
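The case for a reactor over solar-plus-storage comes down to the arithmetic of the 14-day lunar night. A back-of-envelope sketch, using the 100 kW figure from the US directive and an assumed pack-level battery energy density (the density and the idea of running the full load through the night are illustrative assumptions, not mission specifications):

```python
# Why solar + batteries struggles against a reactor during the lunar night.
# LOAD_KW comes from the 100 kW minimum mentioned in the article;
# BATTERY_WH_PER_KG is an assumed Li-ion pack-level energy density.

LOAD_KW = 100            # continuous load to keep through the night
NIGHT_HOURS = 14 * 24    # one lunar night lasts ~14 Earth days

energy_needed_kwh = LOAD_KW * NIGHT_HOURS      # energy to store per night

BATTERY_WH_PER_KG = 200  # assumption: ~200 Wh/kg at the pack level
battery_mass_t = energy_needed_kwh * 1000 / BATTERY_WH_PER_KG / 1000

print(f"Energy per lunar night: {energy_needed_kwh:,.0f} kWh")   # 33,600 kWh
print(f"Battery mass at {BATTERY_WH_PER_KG} Wh/kg: {battery_mass_t:,.0f} tonnes")
```

Under those assumptions, surviving a single night on batteries alone would mean landing on the order of 168 tonnes of cells, which is why a compact fission reactor looks attractive for a permanent base.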
The space race of the 21st century has ceased to be a question of prestige and has become a question of infrastructure. While Big Tech tries to solve its energy limits with promises of servers in orbit, Russia and China have opted for the pragmatism of a reactor on solid, if lunar, ground.

Image | Freepik

In Xataka | The race to bring data centers to space promises a lot. Physics says otherwise

AI is running out of power on Earth, so Nvidia is betting on servers in space

The energy appetite of data centers is nothing new. Elon Musk predicts a shortage of transformers within two years. Sam Altman believes we will need an energy revolution, such as nuclear fusion, to keep pace. The planet was not prepared for so much energy demand. And that is why Nvidia is funding a possible solution: deploying servers off Earth.

It is not science fiction. It is the business model of several startups that propose building the next hyperscale data centers in Earth orbit and even on the Moon. The idea, which until recently sounded far-fetched, is gaining traction driven mainly by two factors: the insatiable demand of AI and the low-cost launches that Starship promises.

One of the companies leading this idea is Starcloud, backed by the NVIDIA Inception program. And it is so serious that it plans to launch its first satellite, Starcloud-1, in November. On board it will carry the first data-center GPU launched into space: an NVIDIA H100.

The difficult part will come later. Starcloud-1 is a test unit the size of a small refrigerator, but the company's goal is to build a monster: a five-gigawatt orbital data center. Including the solar panels and the enormous radiator, it would measure four kilometers across. Its purpose is the training of large AI models in orbit.

Why in space? As detailed in an extensive white paper, future models like GPT-6 or Llama 5 could require multi-gigawatt clusters, something "simply impossible with the current energy infrastructure" on Earth. In space, there is no such limitation. What's more, according to Starcloud's calculations, server energy costs are 10 times lower in space than on Earth.

The value proposition of space data centers rests precisely on two pillars that are a problem on Earth: energy and cooling.

Solar energy 24/7. On Earth, solar power is intermittent. Panels depend on the day/night cycle, the weather and the atmosphere, which attenuates the radiation. In space, things change.
By placing the data centers in a sun-synchronous "dawn-dusk" orbit, the satellites follow the line that divides day and night on Earth. With the panels illuminated by the Sun almost continuously, the system raises its capacity factor above 95%. "Almost unlimited, low-cost renewable energy," in the words of Starcloud.

And the cooling? How would they dissipate all that heat? Land-based data centers consume millions of liters of fresh water to cool themselves. There is no water in space, but there is something much better: an infinite heat sink at -270°C.

The plan is not to ventilate the servers. The heat generated by the GPUs (such as the H100) will be managed within sealed modules using liquid cooling (direct-to-chip or immersion), like high-performance systems on Earth. The difference is that the hot liquid does not go to an evaporation tower but is pumped to gigantic radiator panels. These panels simply radiate the waste heat into the vacuum of space as infrared radiation. The Starcloud white paper details the calculations using the Stefan-Boltzmann law, estimating that a radiator at 20°C can cleanly dissipate more than 630 watts per square meter. Without using a single drop of water.

Not everything that glitters in space is gold. The pillar that supports this entire concept is the launch of high-capacity reusable rockets, such as SpaceX's Starship. Starcloud's calculations are based on a long-term cost of $30 per kilo put into orbit. But Starship is not ready, and it is certainly far from achieving full and rapid reusability. If that cost does not materialize, the economic viability of the system collapses.

The other big problem is radiation. Commercial GPUs are not designed for space. Cosmic radiation and solar flares can fry the electronics. The solution is shielding, which adds mass and therefore launch cost. Not to mention that maintenance is not possible with current technology.
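The Stefan-Boltzmann figure is easy to sanity-check. A minimal sketch, assuming an ideal black-body surface (emissivity 1) and ignoring heat absorbed from the Sun and Earth: one face of a 20°C panel radiates about 419 W/m², and a flat panel radiating from both faces roughly doubles that, so the white paper's ">630 W/m²" sits plausibly between the one-sided ideal and the two-sided ideal once real emissivity and environmental heating are accounted for. The 5 GW area estimate at the end is purely illustrative.

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiated_w_per_m2(temp_c: float, emissivity: float = 1.0) -> float:
    """Ideal flux radiated by one face of a surface at temp_c (Celsius)."""
    t_k = temp_c + 273.15
    return emissivity * SIGMA * t_k ** 4

one_side = radiated_w_per_m2(20.0)   # ~419 W/m^2 per face at 20°C
both_sides = 2 * one_side            # a flat panel radiates from both faces

print(f"One face at 20°C: {one_side:.0f} W/m^2")
print(f"Both faces:       {both_sides:.0f} W/m^2")

# Illustrative radiator area for the 5 GW cluster at the two-sided flux:
area_km2 = 5e9 / both_sides / 1e6
print(f"Area for 5 GW:    {area_km2:.1f} km^2")
```

That area, on the order of several square kilometers, is consistent with the article's description of a structure four kilometers across.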

14 audio servers to host your music collection and listen to it wherever you want

We bring you a small collection of 14 applications for setting up a music server on your computer. With them, you can configure a server on your main computer and then listen to the music stored on it from other devices through official websites or apps. In short, you can create your own Spotify with the music files you have on your computer. This way, you can choose to buy digital music directly from artists, so they earn more than they do through Spotify, and then manage it without fees and with more privacy, although you can also use your own files after ripping a CD.

And as we always say at Xataka Basics, these are the proposals we have chosen, but perhaps you know others that also deserve to be on the list. If so, we invite you to leave us your proposals in the comments section, so that all readers can benefit from the knowledge of our xatakeros.

Plex. We can only start with this application, because you can also set up your own Spotify with Plex. You have the server application for your computer, and then clients to enjoy movies, music and more on any other device. Plex is very versatile and very well known. It has specific features, such as a dedicated music player, and others that are paid, although you can make a single payment of 120 euros and keep it for life.

Swing Music. An open-source, self-hosted streaming service that stands out for its polished design. It has features such as separating different versions of an album, showing related artists or albums, and browsing your music files. The client works through the browser, so you can use it anywhere. You also get a search engine, duplicate-file handling, and more options on the way. You can install the server on Windows, macOS or GNU/Linux.

koel. This alternative is open source, has a modern design, and is feature-rich.
It's free, although there is also a paid version that allows you to store your files in the cloud or share music with friends. Its modern, colorful design catches the eye, very clean and simplified, but it also stands out for its optimization and speed. All while being a complete player with all the options, including metadata editing and genre categorization. In addition, it integrates with Last.fm, Spotify and YouTube, among others.

Jellyfin. Jellyfin is one of the main Plex alternatives, and in fact it also has a specific music app in development, although it may not work as well yet. In the meantime, you can use its general app to find the music section. Its main characteristic compared to Plex is that it is totally free, and at the moment it has no paid version. In short, it is like setting up your own Spotify and Netflix all in one.

Emby. Emby is another of the great alternatives to Plex, and although it does not have a specific application for listening to music, the complete app has a section for accessing the songs in the folders on your computer that you have chosen. Like Plex, this tool offers most of its options for free, although several others require the paid version.

Navidrome. Free and open source, it lets you stream virtually any audio format from your computer. It has good support for multi-artist compilations and makes use of all the metadata you have been editing in your files. It uses very few resources and can run on macOS, Linux and Windows, and even on a Raspberry Pi. The client is web-based, for use in any browser, although it is also compatible with clients for services such as Subsonic, Madsonic or Airsonic.

mStream. A fairly simple, self-hosted music streaming service that can work on any operating system, with server applications for Windows, macOS, GNU/Linux and Raspberry Pi. There are also clients for listening to your music on Android and iOS, as well as in the browser.
Playback is nearly flawless, with a good sound visualizer and playlist-sharing features. You can also easily drag files to upload them to the server, all as an open-source project.

Funkwhale. This is a community-led project in which there are different pods, or servers created by other users, where music is hosted and you can upload yours. You can also create your own pod, although that is somewhat more complex. In the end, the idea is to be able to enjoy your music or podcasts wherever you want.

Ampache. A veteran but constantly updated project, focused on letting you create your own music streaming service. All with an open-source tool that has been in development since 2001 and that supports third-party clients for mobile, tablet or television. It has a web interface for configuring local and remote file synchronization, all in a single consistent collection. You can then access your music from any device, whether with an app or an HTML5 player.

Gonic. A recent and lightweight alternative, compatible with Subsonic client apps, with simple folder management and support for podcasts. Gonic is especially recommended for those looking for a minimalist, modular and easy-to-install option. It stands out for its scanning speed and for its compatibility with scrobbling services such as ListenBrainz or Last.fm, so you can keep your listening statistics. Perhaps its only drawback is a simpler, more schematic interface, but in exchange it is very easy to use.

Subsonic. This is a powerful streaming service that also turns your computer into a server. It can stream both video and music. It also serves as a base for many third-party clients, so if you don't like its applications you can always turn to others. The price of this server is $1 per month, … Read more

The United Kingdom needs cheaper heating, so it is replacing gas boilers with Raspberry Pi servers

The idea is eccentric, but it makes sense. Electricity prices are sky-high. Gas boilers are condemned to extinction. And the demand for computing capacity does not stop growing. The solution: replace boilers with a cluster of 500 Raspberry Pi boards that generate heat.

Mini-computers in oil. UK Power Networks, the largest electricity distribution network operator in the United Kingdom, is testing the replacement of traditional gas boilers with small data centers the size of a heat pump. They consist of a rack of 500 Raspberry Pi CM4 or CM5 modules submerged in oil. The oil heats up as the computers work, and the heat is then distributed to the home's radiators and hot water.

A distributed cloud. These devices, called "HeatHub," are actually part of Thermify's distributed computing service. The company has completed a pilot test in Wales and now hopes to scale the service to 100,000 installations per year by 2030. Thermify believes low-income families will be interested in HeatHub to relieve their economic burden, reducing the electricity bill and avoiding the cost of installing a heat pump. Thanks to cloud revenue, the company can offer a cheap, low-carbon alternative.

How it works. Inside each HeatHub enclosure, 500 Raspberry Pi modules work non-stop processing workloads for Thermify's cloud-service clients. All this hardware is cooled by immersion, which in this case serves a double function, because it efficiently captures the generated heat for use as heating. The residual heat is transferred to the home's central heating and hot water system, as a "plug and play" substitute for the conventional gas boiler. As for how it affects the Internet connection: so as not to reduce the customer's bandwidth, each unit has a dedicated network connection.

Cheaper bills. Why would someone install someone else's data center at home? For the same reason that telephone antennas are installed on building roofs: money.
In this case, customers pay a fixed monthly fee of 5.60 pounds (about 6.60 euros), which reduces their bills by 40% without losing heating capacity. Beyond individual savings, the Thermify and UKPN proposal makes sense from an environmental point of view: it uses energy twice, taking advantage of heat that traditional data centers usually waste.

Perhaps the greatest obstacle Thermify faces is the competition. Other companies, such as France's Qarnot and Britain's Heata and Deep Green, are already working on similar projects, heating everything from water tanks to public swimming pools.

Images | UKPN, Thermify

In Xataka | The best way to heat the house: we analyze the spending and energy efficiency of heat pumps and heating
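A quick sketch of how 500 compute modules stack up as a heat source. The per-module power draw below is an assumption about a Raspberry Pi CM4/CM5 under sustained load (including power-conversion losses), not a figure published by Thermify:

```python
# HeatHub back-of-envelope: heat delivered by an immersion-cooled rack.
# WATTS_PER_MODULE is an assumption, not a Thermify specification.

MODULES = 500
WATTS_PER_MODULE = 7.0   # assumed sustained draw per CM4/CM5, incl. losses

heat_kw = MODULES * WATTS_PER_MODULE / 1000
print(f"Continuous heat output: {heat_kw:.1f} kW")  # in heat-pump territory

# Heat delivered per month if the rack runs around the clock
monthly_kwh = heat_kw * 30 * 24
print(f"Heat per month: {monthly_kwh:,.0f} kWh")
```

Essentially all electricity consumed by a computer ends up as heat, which is why immersion cooling lets the cluster double as a boiler: at a few kilowatts of continuous output it sits in the same range as the heat pump it is sized against.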

Your servers for AI are not profitable enough

HP's spun-off enterprise division, Hewlett Packard Enterprise, has published its financial results for the first quarter of the year, and the conclusions confirm that it is not going through its best financial moment, which is forcing the company to apply a cost-adjustment plan to compensate for the fall in income. This translates into the dismissal of up to 5% of its global workforce, estimated at more than 61,000 workers according to the company's latest annual report.

Not enough money is coming in. Hewlett Packard Enterprise (HPE) registered sales of 7,854 million dollars in its first fiscal quarter of 2025, exceeding the projected 7,810 million dollars. This represents year-on-year growth of 16% in total quarterly sales, with a net profit of 538 million dollars. However, growth has not met forecasts, and projections point to the need for a shock plan to reduce structural costs, given the expectation that profits will fall even further in the second quarter of the year.

2,500 fewer employees. One of the first decisions announced by the company is a workforce reduction affecting some 2,500 employees around the world. The measure is part of the adjustment plan the company intends to apply from now until fiscal year 2026. The objective of the plan is to save 350 million dollars by 2027, as a spokesperson confirmed to CNBC. The company has not yet said which departments and countries will be affected by the layoffs.

The servers are not selling. Part of the responsibility for the company's loss of revenue falls on the server market. According to Expansión, Antonio Neri, CEO of HPE, attributed the downward forecasts to the difficulties facing its server segment.
"The problems in the server unit that caused lower profits were present both in traditional equipment and in artificial intelligence," Antonio Neri, CEO of HPE, told Bloomberg.

AI leaves little margin. The rise of AI has driven up demand for powerful servers, but their thin profit margins make it a complicated segment, aggravated by the rising cost of Nvidia chips. According to the Hewlett Packard Enterprise CEO, the main challenges for the next quarter include rising operating costs, inventory accumulation, and the need to offer discounts to boost server sales. During the first fiscal quarter, HPE's AI systems registered revenues of 900 million dollars, well below the 1.5 billion dollars recorded the previous quarter.

In Xataka | Zuckerberg dismissed 5% of Meta's workforce "for low performance." His former employees say there is another reason

Image | Wikimedia Commons (Tony Webster)
