Air conditioning consumes much more electricity than data centers

In 2022, air conditioning accounted for 7% of global electricity consumption, and its pressure on electrical infrastructure keeps growing year after year. Data centers, by contrast, will be responsible for about 10% of the increase in energy demand through 2030; air conditioning will contribute a much larger share. The International Energy Agency (IEA) formalized these estimates in November 2024, and they clearly show that the energy consumption of air conditioning is more worrying than that of data centers. The rise of artificial intelligence (AI) has triggered the proliferation of these facilities in the US, China, Japan, Singapore, India, Germany, the Netherlands and Ireland, among other nations, but, in theory, the IEA forecast has taken that into account, as well as the increasingly strict energy regulations that air conditioners must meet. However, as Casey Crownhart, a climate journalist at MIT Technology Review, points out, global warming plays against us.

Cooling is the real monster of global energy demand

In 2016 there were just under 2 billion air conditioning units on the planet. By 2050, however, the IEA estimates there will be about 6 billion. This strong growth is a consequence of the constant rise in cooling degree days driven by global warming. In 2024, cooling degree days were 6% higher than in 2023, and no less than 20% higher than the average of the first two decades of this century. The countries that spend the most energy on air conditioning are precisely the three most populous on the planet: India, China and the US. The impact of this energy demand, however, is global.
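Cooling degree days, the metric cited above, can be computed straightforwardly: for each day, take how far the mean temperature rises above a base threshold and sum those excesses over a period. A minimal sketch in Python, assuming the common 18 °C base convention (agencies differ on the exact base) and made-up temperatures:

```python
def cooling_degree_days(daily_mean_temps_c, base_c=18.0):
    """Sum, over all days, how far each day's mean temperature
    exceeds the base temperature; days below the base add zero."""
    return sum(max(0.0, t - base_c) for t in daily_mean_temps_c)

# A warm week: only the days above the 18 °C base contribute.
week = [16.0, 19.0, 22.5, 25.0, 17.5, 21.0, 28.0]
print(cooling_degree_days(week))  # 1 + 4.5 + 7 + 3 + 10 = 25.5
```

A 6% year-on-year rise in this figure, as reported for 2024, translates directly into more hours of air conditioning demand.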
Besides, it is not only how much electricity air conditioners consume that matters; when that demand occurs is also very relevant. In fact, in the US these devices account for up to 70% of consumption during the hottest time slots. This behavior puts the grid under enormous stress, although the strain eases as the hottest hours give way to the cooler moments of the day. In any case, it is clearly very important to innovate in cooling technology. The expected proliferation of air conditioners over the next few years will require each of them to consume less energy, even if only a little: a minimal improvement across a huge installed base can make a big difference. The good news is that innovations are arriving in cooling technology that invite us to look to the future with reasonable optimism. Desiccant cooling systems use special moisture-absorbing materials to cool spaces more efficiently. And the heat exchangers found not only in air conditioners but also in refrigerators and heat pumps are increasingly efficient. This seems to be the way forward. Let us trust technological development to help us solve this enormous challenge.

Image | Sergei A. More information | MIT Technology Review. In Xataka | Samsung pulls the handbrake: it delays the construction of its cutting-edge factories in the US and South Korea

What it is and how to manage or delete the key data ChatGPT remembers about you and uses in all your chats

Let's explain what ChatGPT's memory is and how it works: a system with which the artificial intelligence remembers several key facts about you. ChatGPT now remembers things you talked about in the past and takes them into account in future conversations. We will start the article by explaining exactly what ChatGPT's memory is, telling you what it is for and giving some examples of how it works. Then we will tell you how to manage the memories stored from your chats so you can erase some or all of them.

What ChatGPT's memory is

ChatGPT is now capable of remembering useful information about you that it extracts from the conversations you have with it. For example, if in a conversation you tell it your musical tastes or where you work, the AI will remember this information and use it in future chats. Each piece of relevant data that ChatGPT keeps about you is a memory, and the chatbot stores these memories to keep that context at hand. The idea is to offer more personalized and relevant answers by taking into account what it knows about you.

For example, something funny happened to me. At one point I talked with ChatGPT about a band I liked a lot; I mentioned it, I think, when asking for an illustration with its name. It remembered this, and a few weeks later, when I asked it to reimagine a photo of mine in the Ghibli style, it drew me wearing that band's shirt.

There are two ways ChatGPT stores memories. First, it can pick up key data from normal conversations when you mention it. But you can also explicitly ask it to remember something, saying for example "remember that my favorite color is black." You have control over ChatGPT's memories: you can delete them individually or reset them all. You can also deactivate this function entirely so that it neither records new memories nor uses the existing ones.
And if you have it activated but want to start a chat that does not take the memory into account, you can do so with the option to start a temporary chat.

How to review and delete memories

To review and manage the memories stored in ChatGPT, you have to enter the settings of this artificial intelligence. You can do it from its official website or from the mobile and desktop applications. Once in the settings, enter the Personalization section. Here, under the custom instructions, you have the options for Memory. You can disable references to stored memories if you do not want them used in your chats, and below you will find the option Manage memories. If you click on Manage memories, a list of all the memories stored in ChatGPT will be shown. There you can read what it remembers in each one, and to the right you will have the option to erase the ones you want. You also have a button to delete everything if you want it to forget all it has recorded.

In Xataka Basics | How to translate a poster, sign or any text with ChatGPT, and even ask it to explain it to you

OpenAI's hypothetical social network does not want to connect people. It wants your data to train its AI

At OpenAI they are not content with being absolute leaders in the artificial intelligence segment. The latest rumors suggest they intend to create a social network that would go beyond ChatGPT. The reason, mind you, is not to compete with Facebook, Instagram or X. At least, not directly.

Social network = data to feed the AI. The move undoubtedly responds to the voracious hunger for new data that AI models have, data that allows them to improve and polish their behavior in different scenarios. A social network would let OpenAI use everything its users post to train its models.

X already discovered that trend. The merger between X and xAI was a clear demonstration of that strategy: suddenly xAI had a perfect system to train its AI model, Grok, with all the posts of X users. And Meta, of course, too. Meta has long been doing the same with Facebook and Instagram (although the EU forced it to slightly change its plans) and also collects data when we use Meta AI in WhatsApp, although you can avoid it. All with the same goal: to have "fresh food" for its artificial intelligence models.

A social network to share images. Apparently the prototype already in development focuses on ChatGPT's ability to generate images. It would therefore be a social network more like an Instagram full of AI-generated images, which we imagine users themselves would create.

Altman already warned. The funny thing is that the rumor comes weeks after Sam Altman himself, CEO of OpenAI, joked about that possibility. When the news appeared that Meta was preparing an independent app for Meta AI, Altman replied: "ok, fine, maybe we'll make a social app."

This does not connect people.
Perhaps they already had the project and it was not a joke. But although Altman suggested that this could allow "revenge" on Facebook, the intention would not be to compete with Facebook at connecting people (the original purpose of that social network, at least) but to collect more and more data for its AI, which is what social networks have also ended up doing.

And what about AGI? The problem with this theoretical project is that it would be partly a distraction for OpenAI. It would certainly provide more data for training its AI models, but it is not clear that scaling is the right path toward OpenAI's ultimate goal: achieving artificial general intelligence, or AGI.

Too many parallel projects. Altman is famous for generating excessive expectations about AGI. However, that ambition is tarnished by the company's latest releases, especially GPT-4.5. The company seems somewhat scattered across gigantic projects like Stargate, the development of its own chips, or the mysterious hardware project in collaboration with Jony Ive. Too many apples in the basket? We will see.

Image | Xataka with ChatGPT. In Xataka | ChatGPT is already the most downloaded app in the world. Its only problem is that it does not know how to make money with it

Synthetic data is not enough, so Apple wants users to help improve its AI: this is how its system will work

Apple Intelligence is here, but its premiere has not been as brilliant as many expected. The company's new proposal still has room for improvement, both in English and in Spanish. Although it has been presented as a clear selling point, its impact among users has been rather discreet. First impressions are mixed and, for now, it has not managed to generate groundbreaking enthusiasm. Cupertino is already moving to reinforce one of the biggest software bets in its recent history. Among the moves in progress, two fronts stand out: the rumored restructuring of the team responsible for Siri, whose improved version has been delayed until 2026, and the creation of new techniques designed to improve its language models without neglecting its approach to privacy.

One step beyond synthetic data

Apple usually trains its models with synthetic data and data labeled by humans, a solution that has been effective up to a point, but that does not always represent the real world and consequently limits how well AI products work. This has led the company headed by Tim Cook to develop a new solution that combines synthetic data with anonymous signals from participating devices. As explained in an article published this week, everything begins with a synthetic message, that is, an email invented by Apple itself with a format that simulates real emails. For example: "Would you like to play tennis tomorrow at 11:30?" From there, several variants are generated that change some elements, such as the sport, the time or the tone, to try different possible structures. These phrases are sent to a subset of devices whose users have agreed to share analytics with Apple. Here there is something key: each iPhone, iPad or Mac takes a handful of real emails and transforms them into local embeddings, that is, mathematical representations that convert each message into a set of numbers reflecting its topic, style and length.
The important thing is that these emails never leave the device. The system compares the synthetic embeddings, which Apple has previously generated, with the embeddings of real emails to see which ones are most similar. This resemblance is reduced to an anonymous signal, a simple "this variant matches better," which is sent to Apple without revealing the original mail or the user's embedding. With this, Apple aims to learn which synthetic variants best reflect real language use, without seeing a single fragment of private content. The idea is that this will help improve Apple Intelligence functions such as email summaries or writing tools.

This approach is based on the same differential privacy techniques that Apple already uses in other features such as Genmoji. In that case, the company collects anonymous signals about which prompts are most popular, such as "a dinosaur with a hat," to improve results without recording which user made which request. The idea is simple but very interesting: improving without using users' actual data allows Apple to maintain the privacy stance the company has defended for so many years. This new technique will begin to be implemented in the next betas of iOS 18.5, iPadOS 18.5 and macOS 15.5. Note that it only applies to those who have activated the option to share analytics in the privacy settings. So, if you do not want to be part of this system, you can deactivate it at any time: just go to Settings > Privacy & Security > Analytics & Improvements and turn off "Share iPhone Analytics."

Images | Apple | Appshunter. In Xataka | Public administrations turn the tables: MiDNI promises to be another success alongside MiDGT and Mi Carpeta Ciudadana
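The matching step described above can be illustrated with a toy sketch: the device keeps its email embeddings local, scores each synthetic variant by its closest match, and reports back only an index. This is an illustrative reconstruction, not Apple's implementation; the function names are invented, the vectors are toy 2-D examples, and the differential-privacy noise Apple adds to the aggregated signal is omitted:

```python
import math

def cosine_similarity(a, b):
    """Angle-based similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def best_matching_variant(variant_embeddings, local_email_embeddings):
    """Score each synthetic variant by its closest on-device email and
    return only the winning index; the emails never leave this function
    (standing in for the device)."""
    scores = [
        max(cosine_similarity(v, e) for e in local_email_embeddings)
        for v in variant_embeddings
    ]
    return scores.index(max(scores))

# Two synthetic variants vs. one on-device email embedding.
variants = [[1.0, 0.0], [0.0, 1.0]]
emails = [[0.9, 0.1]]
print(best_matching_variant(variants, emails))  # 0: the first variant matches best
```

The key property is that only the index (plus noise, in Apple's real system) travels to the server, never the email content or its embedding.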

This nuclear reactor is different from all the others. It has been expressly designed for data centers

The proliferation of large data centers for artificial intelligence (AI) poses a very serious energy problem. So much so, in fact, that the US Department of Energy is considering the possibility that companies with large data centers dedicated to training AI models install in their vicinity a small nuclear power plant capable of meeting their energy needs. This strategy would also reinforce the US bet on energy sources that do not emit greenhouse gases. What is not yet clear is what investment the technology companies will assume and what subsidies the government will contribute. For the moment, some of the big technology firms have already invested in nuclear energy, although not necessarily in fission. Microsoft, for example, has an agreement with Helion Energy to obtain energy in the future from its nuclear fusion reactors.

An extra-modular reactor adapted to the needs of data centers

The image on the cover of this article is a rendering made by the American company Aalo Atomics of its extra-modular Aalo Pod reactor. This machine has a lot in common with the SMR (Small Modular Reactor) compact reactors we have discussed in other articles, but, according to its creators, it differs from them in something very important: its modularity is even greater. This peculiarity, they say, is precisely what makes it suitable for data centers. The fourth-generation nuclear fission Aalo Pod has another quality that, on paper, is very attractive: its enormous flexibility. Again according to its designers, it can work completely independently of the electricity grid, coupled to it, or even in hybrid mode. In this way, data center owners can use whichever strategy best fits their needs, balancing the electricity produced by the reactor with what the existing electrical infrastructure can supply. Sounds good.
Each Aalo Pod incorporates five Aalo-1 micro-reactors paired with a single electricity-generating turbine

The prototype Aalo Atomics has unveiled is capable of delivering 50 MWe (electric megawatts), but its modularity allows the machine to be scaled up to deliver several hundred MWe, and even thousands. The image we have is a rendering, but it lets us intuit the architecture of this reactor. Interestingly, it looks more like a linear particle accelerator than a conventional nuclear fission reactor. An interesting note: each Aalo Pod incorporates five Aalo-1 micro-reactors paired with a single electricity-generating turbine.

This is not all, however. The heart of the Aalo Atomics strategy is developing a production technology that allows the modules of each Aalo Pod to be manufactured industrially and on an assembly line, as if they were cars, or jet turbines for airplanes. According to the company, this approach will allow it to install its reactor alongside data centers in less time, occupying less space and for less money than a conventional SMR reactor would cost. In addition, again according to Aalo Atomics, each micro-reactor can be refueled at any time without stopping the others it is paired with, and it is sodium-cooled, so a nearby water source is not necessary. On paper, this company's promises look good. Now the important thing is to turn everything it has announced into a final product that lives up to expectations. Data centers keep proliferating. And they don't rest.

Image | Aalo Atomics. More information | Aalo Atomics. In Xataka | Nuclear fission has been waiting for a type of fuel to take off. And it is already within its grasp

Those responsible for the Robinson list deny having been hacked. The data of more than 600,000 people is at stake

Hackmanac, an account that monitors hacks, published a worrying announcement a few hours ago: the Robinson list has supposedly been hacked in Spain. According to the data from the hack, the personal data of 614,197 people would have been exposed, including full names, postal addresses, national ID numbers, telephone numbers, dates of birth and email addresses. These data represent a real treasure for cybercriminals who use phishing to try to deceive users: with this information they can craft personalized messages and even attempt to impersonate identities in more targeted attacks.

Our colleagues at Xataka Móvil indicate that they have contacted Adigital, the agency that manages the Robinson list. Those responsible deny that this hack has occurred and have issued the following statement: "We have carefully reviewed the available information with our technical and legal team and we can conclude with total certainty that there has been no hacking or illicit access to our systems. In any case, we will remain attentive to developments and make our knowledge and tools available to the Spanish Data Protection Agency."

The Robinson list is an advertising exclusion service that any Spanish citizen can join for free. Its objective is to prevent advertising from being sent to registered users, whether to their postal addresses, by email or by mobile messages. A theft of the data on the list could cause precisely the opposite: those users could become direct victims of future phishing attacks. It remains to be seen whether the information published by Hackmanac and other outlets is true or not, since the authenticity of the leak is yet to be confirmed, but the organization claims to maintain the most important repository in the world of "verified, successful and publicly known cyberattacks."

In Xataka | The Robinson list works. And the companies that ignore it already know the punishment: 10,000 euros

There is an open source alternative to Notion and Google Docs. And its value lies in where it stores your data

France and Germany have joined forces to create something unexpected: Docs, collaborative open source software that seeks to reduce dependence on US services and, incidentally, guarantee control over sensitive information.

Why it matters. For European public companies and administrations, keeping sensitive data away from American servers is not only an economic issue, but a strategic priority. US legislation allows its authorities to access data stored by national companies, even if the servers are physically in Europe. This is a problem for confidential government information, sensitive documents or personal data protected by the GDPR. In addition, European governments want to reduce their dependence on this type of platform.

In detail. Docs is not just a service very similar to others that already exist, such as Notion or Google Docs: it incorporates functions specifically designed to meet the requirements of public institutions. The project is part of 'La Suite numérique,' an initiative that also includes Visio, an alternative to Zoom or Google Meet. The platform offers:

- Real-time collaborative editing with offline support.
- Export in several formats (.odt, .doc, .pdf).
- Granular access control to guarantee security.
- Customizable templates for official documents.
- AI functions to generate, summarize, correct and translate content.

The code is available on GitHub under the MIT license, which allows not only its free use but also its adaptation by companies that want to offer services based on it.

And now what. The adoption of Docs and other similar tools will depend on the level of institutional support and the creation of viable ecosystems around these technologies. Its success will depend not only on its technical quality, but also on distribution, ease of implementation and integration with existing systems.

Featured image | Suite Numérique, Xataka. In Xataka | The Notion template business: an ocean of black and white avatars

I have swapped Apple's Calendar for Notion Calendar. Now my data and my time live under the same roof

Notion introduced its calendar application, Notion Calendar, at the beginning of 2024. I was very interested, but the limitation to Google calendars left me out: most of my events live in iCloud calendars. Ten days ago Notion finally announced iCloud integration, and there I was, as a Notion user of many years, to see what this proposal was like and whether it was worth making the leap from Apple's native application.

It's not the features, it's the unification

My first impression was surprisingly positive. The application is fast, very fast. On my MacBook Pro M1 Pro, almost four years old, the speed with which it opens and the fluidity when navigating between days, weeks or months is impressive. It contrasts remarkably with Apple's Calendar which, without being slow, does not have this feeling of lightness and agility.

The monthly view interface in all its glory. Image: Xataka.

The interface is minimalist without feeling empty. It maintains that characteristic Notion air with its soft grays and whites, but retains its own personality. Visually it is much more pleasant than other calendars such as Google Calendar, which prioritizes functionality over aesthetics. If you are one of us and you are looking to optimize every second in front of the screen, Notion Calendar's keyboard shortcuts are a gift: "S" to share availability, "C" to create an event, "T" to go to the current day, "W" to activate the weekly view, "M" for the monthly one. It is an approach clearly designed for users who prefer not to take their hands off the keyboard.

One function that has won me over is availability management. Before, when someone wanted to meet with me, I had to manually send my free slots or use tools such as Calendly that, for some reason, always made me feel a bit of an impostor. With Notion Calendar, I select the available time blocks, generate a link on the spot, and voilà: one of those little day-to-day frictions that disappear.
The availability function. The white slots (not the gray striped ones) are the ones we set as available. By sharing the link that appears on the right we can offer another person our free slots for a meeting. Like a built-in Calendly. Image: Xataka.

Time zone management is also remarkable. For those of us who work with people in different parts of the world, or when we are organizing a trip, being able to see our calendar with multiple time zones at once is a great help. And here Notion does it much more intuitively than Apple's Calendar. Nevertheless, where Notion Calendar flexes its muscle is in its integration with Notion, in its way of unifying information and time. It works as an independent calendar for those who do not use Notion, but that integration is the key to this proposal: building an ecosystem, not independent islands.

The strategy behind the calendar

What we are seeing with Notion Calendar is not just a new calendar application. It is one more piece on a board where Notion is building a complete productivity ecosystem. First came the main application for notes and databases, now the calendar, and not long ago Notion Mail also arrived, for the moment on a waiting list.

What was previously a product, Notion, now begins to be the center of a growing ecosystem. Image: Notion, Xataka.

Instead of putting everything into a single application that could be overwhelming, Notion is creating independent but fully interconnected applications. It is an intelligent strategy that allows them to compete in each category without sacrificing the user experience. The real magic occurs when you link Notion databases with date fields to the calendar: suddenly, all those elements appear directly on my calendar alongside my normal events. For example, I have a database in Notion with all the scripts of the podcast, and each one has its publication date.
Now I can see those dates directly on my calendar, mixed with my meetings and personal events.

The database with the Loop Infinito scripts that I prepare every day. By including a publication date they have everything they need to appear, if I want, in Notion Calendar. Image: Xataka.

This unified view is not only practical; it is a paradigm shift. It makes it possible to see events and tasks in a temporal context that previously required several tools. And not only that: you can attach Notion pages directly to calendar events. For a meeting I can link the agenda, previous notes or related documents, all without leaving Notion.

Toward a more mature experience

The iOS version is also very well executed. Unlike many applications that simply shrink their desktop version, Notion Calendar for iOS (it is also available for Android) is truly adapted to the mobile experience. You can choose whether you want to see 1, 2 or 3 days when opening the app, something much more flexible than the native calendar. Again: more than the intention of launching a calendar, what can be sensed here is a strategy that fuses information management with time management. When I saw the initial announcement of Notion Calendar, my first reaction was a mixture of interest and skepticism. Do we really need another calendar application? The answer lies in the objective: Notion is not simply building a calendar, it is building that bridge between information management and time management. It is as if Notion were saying: "You already have all your information organized with us, why not your time too?" And it makes sense. If you spend much of your day in Notion, having your calendar integrated right there makes everything more consistent. Is it worth it for someone who does not use Notion? Surprisingly, I think so. Even without taking advantage of the integrations, it is a solid, well-designed calendar application with functions that exceed Apple's or Google's calendar in several respects.
For me, in fact, it has already replaced Apple's. And the most interesting thing is to see how Notion is evolving from …

Big Tech begins to backtrack on its investment in data centers

Everything was frenzy in the data center segment just a few weeks ago. The Big Tech companies fought to see which one could spend more money in the face of the theoretical (and inevitable?) AI revolution. Microsoft was one of the champions of this bet, but the panorama is changing, and some now argue the segment has been oversized.

2 GW less. As reported by Bloomberg, Microsoft has abandoned its plans to create several new data centers in the US and Europe. The combined power of these projects would be 2 GW according to analysts at the firm TD Cowen, and the reason attributed to the decision is singular: it now turns out that there is an excess supply of clusters dedicated to artificial intelligence. Or, put another way: there will already be enough data centers dedicated to AI.

There is already talk of a "data center bubble." Joe Tsai, chairman of the Chinese group Alibaba, warned these days precisely about the potential existence of a bubble in AI data centers. To him, the millionaire fever for these ambitious projects is beginning to look indiscriminate, and he highlighted how in some cases there may be no clear customers for those resources. Alibaba plans to invest 52 billion dollars in data centers, but over three years, far from the 100 billion dollars of Amazon, the 80 billion of Microsoft, the 75 billion of Alphabet or the 65 billion of Meta in a single year. An exaggerated demand is being projected. This executive also spoke of the hypothetical 500-billion-dollar investment of the Stargate project: "I think, in a way, people are investing ahead of the demand they are seeing today, but they are projecting much greater demand than there may actually be." China, in fact, is accumulating idle data centers.

Microsoft relaxes its strategy. The Redmond firm, these analysts said, made this decision shortly after loosening ties with OpenAI, a company in which it has invested around 13 billion dollars.
That will lead the company headed by Sam Altman to turn to the cloud services of other partners.

Google and Meta seize the opportunity. The withdrawal of these projects means that Microsoft has cancelled some of those contracts and postponed others. Interestingly, Google and Meta seem to have taken advantage and appropriated some of the projects Microsoft abandoned in Europe. The details of the projects from which the firm has withdrawn are not known, nor whether that change of plans could affect already-signed projects such as the data centers announced in Aragon.

We already have enough. A Microsoft spokesman indicated in a statement to Bloomberg that the company has already made a significant investment. "While we may strategically reduce or adjust our infrastructure in some areas, we will continue to grow strongly in all regions," he explained. "This allows us to invest and allocate resources to growth areas for our future." In recent times we are also seeing how scaling no longer pays off as much, and GPT-4.5 is a good demonstration of it.

What about the ambition. At the beginning of the year we learned that Microsoft expects to invest 80 billion dollars throughout fiscal year 2025 in the construction of new data centers. Those intentions remain according to its spokesman, but the company expects the growth rate to slow down next fiscal year, with efforts focusing on filling those data centers with servers and other equipment.

In Xataka | The B300 GPU is Nvidia's new beast for AI. And we already know what it is preparing for 2026 and 2027

China is accumulating idle data centers, and it is not the only one

When ChatGPT burst onto the scene in November 2022, it unleashed one of the most intense technological races in recent years. Companies and governments rushed to take positions so as not to be left out of the rise of artificial intelligence. At the center of that reaction were data centers: the key infrastructure that makes it possible to train the language models behind chatbots and other AI-based applications. American giants such as Microsoft, Google, Amazon and Meta announced the expansion of their infrastructure beyond their borders, with multimillion-dollar projects that also reached Spain and unleashed a genuine fever for this type of facility in the region. The AI earthquake also shook China, where the central government declared its development a national priority and promoted the creation of new infrastructure to sustain it.

The data center boom begins to stagger

According to data from the analysis firm IDC, between 2022 and 2024 more than 200 projects linked to data centers focused on artificial intelligence were tendered, distributed across 28 provinces and 81 Chinese cities. The growth rate shot up compared with previous years, with a wave of "intelligent computing" initiatives that not only sought to strengthen the country's digital infrastructure but also promised to boost local economies. Several well-known names in that part of the world took part: Alibaba, ByteDance (TikTok's parent company), Tencent, Baidu and DeepSeek, all betting hard on this terrain. The objective was clear: if artificial intelligence was going to shape the future, China could not afford to fall behind the United States. In order not to lose positions in this race, the Asian giant needed to move fast, very fast. Although neither companies nor governments said it openly, each new project was announced in full awareness that the technology was not yet mature enough, and the business model not at all defined.
The bet, as usually happens in this type of initiative, was based on the expectation that, sooner or later, it would generate relevant economic value, directly or indirectly, for those promoting it. Despite the multimillion-dollar investments in new data centers, China's enthusiasm for large-scale language models is losing strength. As MIT Technology Review reports, more than half of the recently built computing resources remain unused. Added to this situation are factors such as the lack of technical and market experience of many of the actors who bet on this type of infrastructure simply because it was the trend. The result: dozens of smaller data centers are looking for customers willing to pay for their use, but the truth is that, although China is a huge market, demand is not responding as expected. The country's large technology groups are already handling their own infrastructure, and smaller companies, instead of training their own models in these centers, are opting for pay-per-use solutions. Finally, analysts point out that many of the data centers built in recent years were designed with pre-training workloads in mind, that is, long and demanding processes that require huge volumes of data. However, current demand focuses on inference: running already-trained models to offer real-time responses. And that is where many of these facilities are not prepared.

A phenomenon that extends beyond China

According to TD Cowen analysts cited by Bloomberg, Microsoft has reportedly cancelled new data center projects in the United States and Europe. The company has not made official statements, so it is not yet clear which facilities would be affected. However, experts point to a concrete cause: the reduction of commitments with OpenAI, the AI startup in which Microsoft has invested billions. For years, OpenAI depended exclusively on Microsoft's cloud infrastructure. But that changed recently, when it opened up to other computing suppliers.
In parallel, Microsoft maintains plans to invest 80 billion dollars in data centers during its current fiscal year, which ends in June. Even so, analysts expect that investment pace to decelerate afterwards, an unexpected move.

Images | DC Studio | Scott Rodgerson. In Xataka | Personalized GPTs are one of OpenAI's great inventions. Now Google has just released its own in Gemini
