The bet of “Everything on subscriptions” has not turned out as Microsoft expected

Microsoft bought Activision for almost 70,000 million with a clear and undisguised idea: put the successive installments of ‘Call of Duty’ in Game Pass from day one. A strategy that, on paper, would boost the service’s subscribers and change the tides of the industry. Eighteen months later, perhaps the numbers have not turned out to be so favorable and the company is rectifying that strategy at full speed. The most curious thing? After some changes, users end up paying more for a service that offers less. Price changes. Six months ago, Game Pass Ultimate cost 17.99 euros per month and included from day 1 the latest ‘Call of Duty’ to date, ‘Black Ops 6’. In October 2025 Microsoft raised the price to 26.99 euros, 50% hitjust two weeks before the premiere of ‘Black Ops 7’. Now on April 21, 2026 announces which lowers it to 20.99, but without the future ‘Call of Duty’ on day one. PC Game Pass, in parallel, goes to 12.99 euros from the previous 14.99, although it is also above the 11.99 it cost before the last increase in price. Not so discount. Despite the apparent discount, the April subscriber pays three euros more than the subscriber in September of last year, but also renounces the main claim for the service. Future ‘Call of Duty’ will arrive on Ultimate and PC Game Pass “during the following holiday season, approximately one year after” their commercial release, while ‘Black Ops 6’ and ‘Black Ops 7’ remain in the catalog. Only half a year has passed after the total restructuring with the purchase of Activision… but with the rate somewhat more expensive than it was not too long ago. Why it didn’t work. The strategy made sense: if ‘Call of Duty’ is the most profitable franchise on the market, offering it on the first day within the subscription would make Game Pass an almost impossible proposition to refuse. The numbers did not match. A report last October estimated that Microsoft had missed out on $300 million in revenue by including ‘Black Ops 6’ on the service in 2024. That same report noted that 82% of the game’s full-price sales occurred on PlayStation 5. Things got worse with ‘Black Ops 7’. Due to its presence on Game Pass from day one, the game’s launch sales fell more than 60% in some marketsand in the United States it ended 2025 as the fifth best-selling title of the year, the lowest position for a franchise game in almost two decades. Subscription was cannibalizing sales without growing enough to make up for it. The accounts don’t work out. Perhaps Microsoft’s accounts collided with an indisputable reality: there is not enough Xbox to support the expenses of a blockbuster of the caliber of Call of Duty, mainly with subscriptions. In November 2025 Calculations placed Xbox Series X/S at 34.10 million units sold compared to 86.12 million for PS5. Of course, the difference grows quarter by quarter. Giving away a game on day one that still costs $69.99 on PlayStation meant giving up margin in the most profitable territory to monetize Call of Duty. What point are we at? Christopher Dring, editor of The Game Business, pointed that the decision has also been made due to an imminent launch: ‘Forza Horizon 6’ arrives on Xbox in May and is, right now, the third most desired game on Steam, with nearly 2.7 million people who have included it in their wishlists. It is a good asset, with its previous arrival on the console, to increase the subscriber base, and the price drop may interest more than one player with doubts. In Xataka | Game Pass is already an unsustainable investment: more than 2,000 euros for each generation of console and without anything owned

They have kidnapped agents from Anthropic, Google and Microsoft for the sake of science. The three companies ended up paying

In some development teams it is already becoming common to rely on artificial intelligence agents to review incidents, analyze code changes and move through tasks that were previously left in human hands. The problem appears when these systems not only read information that may come from outside, but also operate in spaces where they coexist. sensitive keys, tokens and permissions. That is what recent research puts on the table: we are not simply facing a useful tool that can make mistakes, but rather an architecture that can also become dangerous if it is deployed without very clear limits. The alarm has been turned on Aonan Guan and Johns Hopkins researchers Zhengyu Liu and Gavin Zhong after demonstrating attacks against three agents deployed on the aforementioned platform: Claude Code Security Review, from Anthropic, Gemini CLI Action, from Google, and GitHub Copilot Agent, a GitHub tool under Microsoft. According to your documentation, The failures were communicated in a coordinated manner and ended in financial rewards paid by the companies, but what is relevant is that they point to a broader problem. This is how they managed to twist the agents from within The name that Guan gives to the discovery helps a lot to understand what this is all about: “Comment and Control.” The idea is simple to explain, although the substance is not so simple. Instead of setting up an external infrastructure to direct the attack, GitHub itself acts as an entry and exit channel: the attacker leave the instruction in a titlean incident or a comment, the agent processes it as if it were part of normal work and the result ends up reappearing within that same environment. Everything stays at home, and that is precisely the key to the problem. And that “everything stays at home” is not a minor detail, but the basis of what the research describes. The three agents share a very similar logic: they read normal content from GitHub, incorporate it as a work context, and from there, execute actions within automated flows. The clash appears because that same space not only contains text sent by third parties, but also tools, permissions and secrets that the agent needs to operate. The first case Guan details concerns Claude Code Security Review, an Anthropic GitHub action designed to review code changes and look for possible security flaws. Up to this point, everything is within what was expected. The problem, as the researcher explains, is that it was enough to introduce malicious instructions in the title of a pull requestwhich is the request that someone sends to propose changes to a project, so that the agent will execute commands and return the result as if it were part of your review. The team then managed to go a step further and demonstrate that it could also extract credentials from the environment. The interesting thing is that the same scheme also appeared in the other two services, although with nuances. At Google, Gemini CLI Action could be pushed to reveal the GEMINI_API_KEY from instructions snuck into an issue and its comments; In GitHub Copilot Agent, the variant was even more worrying, because the attack was hidden in an HTML comment that a person did not see on the screen, but the agent did process when another person assigned it to the case. In both scenarios, the background was the same again: apparently normal content that ended up twisting the behavior of the system until exposing credentials or sensitive information within GitHub itself. Guan assures that the pattern made it possible to leak API keys, GitHub tokens and other secrets exposed in the environment where the agent ran, that is, just the credentials that can later open the door to much more delicate actions. Who does this affect? Especially to repositories that run agents in GitHub Actions on content sent by untrustworthy collaborators and, in addition, give them access to secrets or powerful tools. The researcher himself clarifies that the risk depends a lot on the configuration: by default GitHub does not expose secrets to pull requests from forksbut there are deployments that open that door. And here another layer of the matter appears, less technical but just as important. As published by The RegisterAnthropic, Google, and GitHub ended up paying bounties for the findings, but none of the three had published public notices or assigned CVE at the time of that information. Guan was quite clear about this: he said he knew “for certain” that some users were still stuck on vulnerable versions and warned that, without visible communication, many may never know that they were exposed or even being attacked. So although there were mitigations and changes in documentation or in the internal treatment of reports, there was no equivalent public notice for all those potentially affected. Anthropic settled the case on November 25, 2025 and paid $100 Google rewarded the discovery on January 20, 2026 with $1,337 GitHub closed the case on March 9, 2026 with a payment of $500 What makes this case especially delicate is that GitHub does not seem like the end of the road, but rather the first visible showcase. Guan argues that the same pattern can probably be reproduced in other agents who work with tools and secrets within automatic flows, and there he mentions from Slack-connected bots to Jira agentsmail or deployment automation. The logic is the same again: if the system has to read external content to do its job and also has enough access to act, the field is fertile for someone to try to twist it from within. The conclusion that Guan reaches is not about selling a magic solution, but about returning to a fairly classic idea in security: giving each system only what is essential to do its job. If an agent reviews code, they shouldn’t have access to tools or secrets they don’t need; If you’re just summarizing issues, it wouldn’t make sense for you to write to GitHub or touch sensitive credentials. That … Read more

France has begun to retire Windows from its administration. It is the beginning of his divorce from Microsoft, Google and Amazon

Digital sovereignty in Europe has gone from being a theoretical concept to something increasingly tangible and desirable with respect to the technology we consume. It is no longer just a trend that is increasingly more individual people are tryingbut has also become an object of desire for administrations and companies. The path to becoming independent from big tech in the United States is not easy and while there are startups like Mistral who gets rich in the processthere is a state that has decided to take a brave step forward: France. In a global environment where data and infrastructure are geopolitical weapons, the French Government, through the Interministerial Directorate for Digital (DINUM), has launched an aggressive roadmap to regain control over their information systems, thus reducing the hegemony of non-EU technological solutions. And it has started with Windows. The decision. In a high-level inter-ministerial seminar, DINUM together with ANSSI, the State Purchasing Directorate and the DGE formalized the most ambitious commitment to digital sovereignty adopted to date by a Western European power. Or what is the same: France wants to exit the American technological ecosystem in a systematic, planned way and with specific deadlines. It is not an experiment, it is state policy. The guideline is clear: map and reduce dependence on technology suppliers from outside the EU. The measure is not a veto but rather a mandatory transition towards a model where public administration must prioritize local or open source solutions, especially in critical services and sensitive data processing. As has declared the Minister of Action and Public Accounts David Amiel: “ We can no longer accept that our data, our infrastructure and our strategic decisions depend on solutions whose rules, prices, evolution and risks we do not control.” Why is it important. From a systems engineering and cybersecurity point of view, the measure is vital for issues such as protecting against Cloud Act of the United States, the law that allows its authorities to access data stored in American companies regardless of where the servers are located. On the other hand, it guarantees that the state maintains its necessary technical capabilities to operate its own infrastructure without depending on proprietary “black boxes” and to heal itself in the event of a change in conditions or other external problems. But this phased migration is much more than an OS change: it involves dismantling the entire associated ecosystem, certificates and applications designed for Windows. It means rebuilding the digital foundations of the state from the roots so that they function with total autonomy and without foreign parts, without citizens noticing the change on the surface. Context. Our daily personal, professional and bureaucratic lives live in an ecosystem governed by hyperscalersthose technology companies like Microsoft, Google or Amazon that dominate storage and cloud computing. This mention is not random: they alone eat more than 60% of the cloud cake, as Statista collects. The increase in cyber threats and the US technological monopoly in the West and its increasingly invasive turn to the privacy of others have done the rest. France has been maturing the doctrine for years “Cloud au Center“. While the ANSSI audited the dependencies on critical infrastructures, its sovereign cloud was being forged as a real alternative. In addition, the European regulatory framework, with the NIS2 directive wave cyber resilience lawhas created the ideal breeding ground. With tools like TchapVisio, FranceTransfert and Socle Numérique (alternatives to WhatsApp, Teams, WeTransfer or Microsoft 365, respectively) France no longer only has a plan, but a real operational base on which to scale. The plan towards sovereignty. It is neither a toast to the sun nor does it have vague and diffuse measurements or distant dates, but concrete, tangible movements and which is either already being implemented or is scheduled to be completed before the end of the year: DINUM abandons Windows and migrates its jobs to Linux. It is the first central State agency to do so. Already underway. Migration of 80,000 agents from the Caisse Nationale d’Assurance Maladie (equivalent to Social Security) to sovereign tools: Tchap, Visio and FranceTransfert. Already underway. Migration of the health data platform to a reliable European solution. Scheduled for the end of 2026. Duties for each ministry: present a dependency reduction plan, which includes databases, antivirus, AI or collaborative tools. For this fall. Yes, but. France has a basic skeleton and a legal framework, as well as public-private coalitions to accelerate the transition through concrete and measurable public commitments. But it won’t be easy. Exiting Windows involves disassembling Active Directory and what is behind it, something that costs a lot of time and money. And migrating 80,000 agents to new tools is not so much a technology problem but rather a problem of implementing new management. Also, go out where. Many European solutions still do not reach the integration, ease of use and capacity (especially in AI) of American big tech, which implies a step backwards in terms of quality. But even if it were possible, moving from a proprietary infrastructure to a sovereign one implies an enormous investment in time, personnel training and data migration. Finally, maintaining and evolving our own infrastructure requires specialized and experienced personnel in a market where talent is scarce and expensive. In Xataka | The CEO of Mistral sends a message to Europe: the end of being the technological vassal of the United States In Xataka | Europe seeks to become independent from Microsoft Office. Your alternative is already here, but not without controversy Cover | Clint Patterson and Arno Senoner

Europe seeks to become independent from Microsoft Office. Your alternative is already here, but not without controversy

For a few months now, and seeing how the situation is, in Europe a feeling of change has awakened about the technology we consume. Movements have appeared among users to abandon software and hardware from American companiesbut that is something that is also impacting governments and among own European companies. And something that seems minor, but is not at all, is the European software A-Team that has come together to create Euro-Office, the alternative to Microsoft Office. And it hasn’t started off on the best foot. Euro-Office. The name couldn’t be more apt, but something must be said: it doesn’t come out of nowhere. This is an initiative that was born as a fork direct from OnlyOffice. Android users Do you know what a fork is? and, basically, it is taking another software… copying it. The desired changes are made and it is launched independently. Since it is usually free or open source software, there are no problems creating a new version. The software will not be a standalone thing, but rather a package consisting of a text editor, spreadsheet, PDF editor and a presentation tool. Support includes formats such as DOCX, XLSX, PPTX and ODF versions. Come on, it wants to be an alternative to Office, but also to Docs and any other suite. Where does it come from?. Perhaps the most interesting thing about the project is that it is not an initiative of a university, a startup or a specific country. The project was made public a few days ago and has nsuch powerful ombres behind such as IONOS, Nextcloud, Eurostack, XWiki, BTactic, Soverin and OpenProject, among others. In fact, it seems that Proton is also out there (which apart from its own suite, has cloud storage systems, email and VPNbeing one of the strongest alternatives to the Google suite). And the common narrative is that it is a European ‘front’ to reduce dependence on American suites in sensitive environments. Because yes, when a Government, for example, saves documents in the cloud of Google or any other foreign company, who is to say that there is no access. This is what the text editor looks like Digital sovereignty. As I said at the beginning of the article, Europe seeks sovereignty in different areas. In technology, they want to become a power in chip manufacturing (they already have part of the way done by having ASMLthe company more cutting-edge when it comes to creating machines that allow advanced chips to be manufactured). They also want to stop depending on NASA or SpaceX for space exploration, so we have gotten into that race. And in the digital sovereignty becomes independent from American and Russian services. For this reason, Euro-Office is considered from the beginning as a service integrated into the GDPR that is not subject to external jurisdictions such as the US CLOUD Act and that is integrated into public administration, education, government-regulated companies, critical infrastructure, health or education. For everyone. And since changing so much is complicated, the intention is to make the transition as simple as possible for users. This is where maximum compatibility with Microsoft formats comes into play, but also a familiar interface so as not to generate friction. And, above all, it was born with the desire to focus that independence on software. Because until now we had LibreOffice and OnlyOffice, but what is sought is to stop waging war on their own and for all European organizations to go in unison The controversy. Here may be the question, and also the controversy. If there was already something, why spend time developing something else and not use that already existing alternative as the “official” one? Well, according to the promoters of Euro-Office, because collaboration with OnlyOffice was not viable. They quote the Russian roots of the project (although the headquarters are in Latvia) and decisions such as the withdrawal of functions in the mobile app as some of the reasons why the fork was the last, but necessary, resort. From OnlyOffice hold that Euro-Office violates certain terms of its license, citing intellectual property theft and copyright infringement. And it has not stopped at “well I’m angry”, but something more: OnlyOffice has accused Nextcloud of trying to sign its staff to take them to the EuroOffice project. Next steps. The commotion goes further because it has been pointed out that, if it is a fork of an app of Russian origin, they do not know to what extent Euro-Office can introduce yourself as something “purely European.” But, in any case, it is evident that there is a growing interest in becoming independent from non-European technology and this suite has a version 1.0 planned for this summer. The preliminary version It’s already on Github. The most complicated thing remains: moving the very heavy transatlantic that is the public organizations of the different European countries that want to join this. Also see how they convince those who already use European suites such as those from The Document Foundation -LibreOffice- or the British Collabora to switch back to Euro-Office. In Xataka | Schrödinger’s Office: at this point it is impossible to know if Microsoft keeps it alive or if everything is AI and Copilot

Houston, we have a problem with Outlook. Microsoft spends millions on AI, but Artemis II does not escape the failures of its email

On April 2, we experienced a historic event for humanity: the mission Artemis II It successfully took off towards the moon after more than 50 years without orbiting near the Earth’s satellite. Although the takeoff was a success, the path to get here was not without problems: it already had to delay the first date launch and also the second. Even on the official day there were problems. In the previous hours it was necessary check an anomaly in a temperature sensor of a battery abort system and also appeared another incident in the flight termination system (the safety mechanism that makes it possible to destroy the rocket if it deviates from its trajectory and becomes a threat). When the Orion spacecraft was flying almost 150,000 kilometers from Earth according to FortuneCommander Reid Wiseman encountered a mundane problem faced by any mortal with a computer and Microsoft email: an Outlook crash. The incident. The launch of Artemis II could be followed live and in that live, approximately 13 hours and 15 minutes after the broadcast began there is a fragment where the problem appears: “I see that I have two Microsoft Outlook accounts, and neither one works. If you could connect remotely and check Optimus and those two Outlook accounts, that would be great.” At first, Wiseman had issues related to the Optimus software, but then he pointed out a more trivial concern: There were two instances of Outlook running on his personal computing device. As a curiosity, the live stream to follow the takeoff still available on YouTube. Why it is important. The Artemis II mission is historic and the stream has left for posterity its first hours of flight and this anecdote that constitutes what is probably the first Microsoft technical support ticket generated from space. Beyond the joke, the episode shows that today’s space exploration and its cutting-edge technology coexist with commercial productivity software and its common failures. When an agency standardizes its entire infrastructure on a single technological ecosystem, the problems of that ecosystem also become problems of the mission. Tap to go to the post There is a support ticket from the Moon. As with any standard corporate ticket, the user first reported the incident, the technical team took over remotely, and finally closed the case. Houston accepted the request for remote access to the commander’s device, identified in records as PCD 1, and about an hour later, Outlook was back up and running. After 14 hours and 20 minutes of broadcast, someone from mission control communication said: “We managed to open Outlook. It will appear as “offline”, as expected”, as pick up Tom’s Hardware. Why they use Outlook in space. That there is Microsoft software on board is not something casual or improvised: Microsoft is a strategic partner of NASA that provides everything from productivity software to cloud data infrastructure and artificial intelligence (NASA Earth Copilot), hardware and mixed reality and Minburn Technology Group is your partner for software support and maintenance. In fact and according to NASAthe personal devices of the astronauts on the Orion spacecraft are Microsoft Surface Pro and the software they run is Commercial Off-The-Shelf, That is, standard commercial software for everyday tasks like talking to your family or managing your photos and videos. Another thing is the spacecraft and main flight systems: these are powered by specialized radiation-resistant hardware and specialized software with rigorous maintenance. The bathroom was also broken. The Outlook failure was not the only technical problem in the first hours of the flight, as can be seen in the broadcast. About two hours after launch, a malfunction light came on in the ship’s waste management system: the urine extractor fan had jammed. This component is responsible for sucking urine into a collector, avoiding the uncomfortable and unhygienic effects of microgravity. NASA confirmed shortly after the toilet problem had been solved. In Xataka | NASA had been refusing to allow its astronauts to carry iPhones for decades. For Artemis II you have made a historic decision In Xataka | The Artemis II astronauts will carry out experiments in what will be their own study models Cover | POT and Ed Hardie

In 1993 Microsoft created Encarta to revolutionize knowledge. Twenty years later it would be devastated by a tsunami

It became so popular that its logo and the sound of their intros They became two brands just as identifiable as those of Nokia or Windows. If – like the person writing this – you had to go to school or high school between the second half of the 90s and the first half of the 2000s, talk about the Encarta It does not require large presentations. If not, don’t worry; It won’t take us much time. Before Wikipedia offered free online knowledge and even the use of the Internet became popular, Microsoft launched a digital encyclopedia that revolutionized the sector and became a phenomenon between more or less 1993 and 2009. Its name: Encarta. Today, ironies of history, “Encarta” is one more entry in the index of other encyclopedias; but there was a time when it transformed our way of accessing knowledge. From having to spend their eyelashes and fingertips scrolling through pages in search of information, students began to search for information with the click of a button. The Encarta offered an agile, comfortable and above all didactic way to satisfy curiosity. With articles, yes; but also with videos, audios and even virtual visits and games. You could read about Nepalese temples in the Salvat. Or open the Encarta and “tour” one. Its “pull” was so great that it put the old paper encyclopedias in trouble. When the Spanish edition was presented in early 1997, those responsible presumed that the Encarta CD-ROM, a format that you could store in a drawer or even a folder, contained information that It was equivalent to 29 volumes and 1.2 meters of shelving. Not only that. The Encarta cost 24,900 pesetas, four times less than an equivalent printed encyclopedia. To make matters worse, his landing in Spain was protected by Santillanaa publishing house with considerable weight in school classrooms. How to compete with that? The product was liked and published in Spanish and other languages. He did well until, with the same ones with which he had become a phenomenon, ended up succumbing to the competition. In a way, his success is due to his good sense of smell in the 90s; its decline, to the inability to adapt in the 2000s. This is your story. Objective: reinvent the old encyclopedias In the mid-1980s Microsoft He began to think about the idea of ​​creating a digital encyclopedia. The idea was ambitious. Those from Redmond wanted, neither more nor less, to rethink the concept and operation of a product apparently as mature and closed as the volumes that publishers’ commercials were dedicated to selling door to door. To make its debut in a big way, the multinational tried to negotiate a license with the creators of what was probably the most respected publication internationally: the Encyclopædia Britannica. It didn’t go well for them. In the 1980s, paper volumes of Britannica were sold and They left huge profits. As Enrique Dans remembershis books cost about $250 to produce and the selling price ranged between $1,500 and $2,200, depending on the quality. Why would the firm want to digitize content on a CD and risk killing the goose that lays the golden eggs? Microsoft did not give up and looked for ways to move the idea forward. He even had a name for the initiative: Project Gandalf. Some time later he closed a contract with Funk & Wagnalls to use your New Encyclopediaof 29 volumes, in a database that was created at the end of that same decade. To complete its contents, years later two other McMillan encyclopedias would be added, the Collier’s and New Merit Scholar. They were not the Britannica; but it would have to do. However, doubts arose in Redmond about whether or not the project was viable and they decided to park it. It was resumed at the turn of the decade, in 1991, when Microsoft decided to go all out. In 1993, the first edition of the Encarta Encyclopedia was launched, which included the 25,000 Funk & Wagnalls articles and extra material, such as images and some animations. The tool was comfortable, much more agile than the kilometric tomes and even fun, but it started with a huge mistake: the shot was centered wrong. At the beginning of the 90s there were still many houses without a PC and the marketing price was exclusive. When it came out, the Encarta cost about $400, which greatly limited its range. The cost deterred customers and was not too far from that of another competitor that was testing the same niche with a recognizable brand, Compton, which also launched your own multimedia version in 1990, with text and supports such as images and sounds. In Redmond they knew how to react and soon they were deploying a more aggressive strategy. They launched promotions that allowed you to get the Encarta for 99 dollarsthey included their CD with the Windows software package and negotiated with manufacturers to incorporate it into their computers, a tactic not unlike that used with Windows and Office. The promotion of Microsoft itself gave the final push. The new encyclopedia gained fame and began to chain editions, translate into different languages and enrich content with multimedia supports. In 1995, abridged versions of some articles were offered for Microsoft Network ISP subscribers, and starting in ’96, standard and deluxe editions began to be released, an enriched version that could be updated month by month. In 1998, its creators went one step further and acquired the rights to several electronic encyclopedias. The product was growing and, above all, it demonstrated that the sector was experiencing a clear paradigm shift. The best example: in 1996 the once powerful company Britannica ended up underselling for their difficulties. “It allows young and old to explore the world by themes and characters,” their promoters boasted in the Spanish market. And so it was, indeed. Through articles, photos, illustrations, graphs, maps, timelines, recordings, videos and even virtual tours, Encarta won over an entire generation of students. … Read more

Microsoft killed the traditional Xbox by saying that everything was an Xbox. Now he wants to resurrect it with Project Helix

Microsoft has quietly withdrawn its “This is an Xbox” campaign, the initiative with which it had spent 16 months trying to convince the world that any device (television, mobile phone, tablet) It was technically an Xbox.. The deletion coincides with the replacement at the top managementthe debut of Project Helix at GDC and a market paradox: Sony and Microsoft have become, at the same time, the main defenders of the concept “a console is a console.” The campaign. The series of ‘This is an Xbox’ ads were launched under the presidency of Sarah Bond and functioned as the great manifesto of the post-hardware era of Microsoft Gaming. Now it has disappeared without an official statement: the blog entry that opened it on Xbox Wire gives error 404and searching for the term in the official Xbox news repository only returns one article about ROG Xbox Ally. The files indicate that the page was still accessible on March 1, 2026. What was it about? The idea behind “This is an Xbox” was, in theory, reasonable: expand the ecosystem beyond its own hardware, bet on the streaming in the cloud as a gateway and normalize that playing Xbox did not require purchasing an Xbox console. The problem is that the argument, taken to its extreme, destroyed the reason for the hardware. The campaign generated more confusion than interest, with fans wondering why they would buy an Xbox if the titles were available on any platform. The rejection. Apparentlythe initiative was not well received internally, and the company made some strategic lurches. For example, the announcement of an Xbox mobile store in summer 2024 never materialized. A few months later, with the arrival of Asha Sharma as the new CEO of Microsoft Gaming and the departure of Phil Spencer and Sarah Bond, the campaign has ended up being withdrawn. The phrase with which Sharma summed up this new change of direction speaks for itself: “The plan is the plan until it isn’t“. More from Helix. The same day that the 404 of “This is an Xbox” was discovered, Microsoft had a presence at GDC 2026 with the Developer Summit dedicated to Project Helix. Jason Ronald, vice president of Next Generation at Xbox, presented the technical details of the upcoming hardware: a console powered by a custom AMD SoC, co-designed for next-generation DirectX and FSR technology, and which the company describes as “an order of magnitude leap” in gaming performance. ray tracing: pscaling Next Generation ML, ML Multi-Frame Generation and Ray Regeneration for games with path tracing The technical details that AMD provided completed the picture: the custom chip is built on RDNA 5 architecture and TSMC’s 3nm process, and incorporates a dedicated NPU that will power all advanced rendering capabilities, including FSR Diamond. Developer alpha kits will begin shipping in 2027, and the company is committed to maintaining compatibility with games from four generations of Xbox. Not everything is perfect. The complicating point in the “return to consoles” story is that Microsoft told the developers at GDC that “build for PC” is the correct approach going forward, suggesting that Project Helix is, at heart, a PC disguised as a console. That is, it is closer to the ambitious project of Valve with its Steam Machine that of the Sony gives up making more PC games. In addition, Xbox Mode will arrive on Windows 11 in April, bringing the console experience directly to the desktop PC, and the Play Anywhere catalog already exceeds 1,500 titles. The Sony thing. It is commented that Sony is returning to the old strategy of exclusives as a hardware sales lever after the PC ports did not work as expected. Part of the problem was one of timing: games arrived on PC months or years after the console launch, making it difficult to build a stable audience on the platform. There is Steam data very significant: ‘Marvel’s Spider-Man Remastered’, for example, reached a peak of just 66,000 simultaneous players, a figure that did not justify the continued investment in big-budget game ports. Sony and Microsoft, two companies that took opposite paths in the last generation (one opening up to the PC, the other trying to dissolve the very idea of ​​the console), have simultaneously reached the same conclusion. A console is a console, and hardware has to have value. In Xataka | “We will not flood our ecosystem with soulless AI garbage.” We already know what Asha Sharma wants to do as CEO of Microsoft Gaming

The MacBook Neo is everything Microsoft dreamed of with the disastrous Windows 8

It was 2012 and Windows 8 He defied all canons. The mouse and keyboard were no longer enough: Microsoft wanted let’s touch the computerthat we handle it like an iPhone. That ambition led to the birth of one of the operating systems most original and brave in history. And also one of the most hated. Its greatest architect, Steven Sinofskyhas compared that launch almost 15 years ago with that of MacBook Neo which has just occurred and has left a clear message: with Windows 8 Microsoft was right. The only problem is that they arrived too soon. The Mac Neo is a “paradigm shift”. In its ‘Hardcore Software’ newsletter, Sinosky counted how he had bought one of the new MacBook Neo in “citrus” color with 512 GB of storage and “it completely blew my mind. It is a computer that changes the paradigm.” Their impressions coincide with other independent reviews: the performance of this device is indistinguishable from that of a conventional MacBook Air in everyday tasks. And that despite use a phone chip and not a “pure” laptop one. Windows 8 nostalgia. The use of the Neo has generated a feeling of melancholy and sadness in Sinosky when remembering his time at Microsoft. This Apple product is in fact the culmination of a concept that he tried to push more than a decade ago. At Microsoft they believed that a Windows laptop with an ARM processor made sense, and Sinofsky led that vision that led to the launch of Windows 8 and later Windows RT and the Surface RT. we were right. The MacBook Neo is for this former Microsoft executive the demonstration that he and his company were right when they tried to launch that product. According to him Windows on ARM and the Original Surface from 2012 They were not a technical error: that computer had an NVIDIA Tegra chip, 2 GB of RAM and 64 GB of storage, and “it had no problems running Office or browsing.” In his opinion, the hardware and software were not green – a very debatable point – and the failure was something else. People don’t like changes. Sinofsky explains that the mistake was trying to move the ecosystem to a new application model too quickly. “People wanted the old Windows application model,” but there was no way at the time to make it more efficient or secure, “it was designed for another era.” Microsoft certainly had the problem that its installed base was mostly conservative users: proposing a change as big as that, jump to an ARM architecture for goodit was unviable. Apple knew how to transition. Apple’s triumph with its ARM chips was due to the fact that its transition process has lasted almost two decades. During that time the company has been eliminating old code and obsolete APIs, allowing a smooth transition to its own Apple Silicon chips. Being early is not being wrong. Sinofksy also highlights how often being first on an idea—as was the case with Windows 8 or the Surface ARM—is often mistaken for being wrong, when in fact the problem was the execution of the ecosystem transition and not the concept itself. Reasonable sacrifices. Although there are clear hardware limitations (fewer ports, slightly different screen, smaller trackpad), they are irrelevant compared to the efficiency and portability of the device. The MacBook Neo is the definitive Chromebook. Apple’s affordable equipment is for this manager a “better Chromebook” focused on productivity, which is just the rescue plan he proposed for Windows RT after his departure from Microsoft in 2012. His vision, he argues, was the right one: the transition to ultra-efficient ARM devices was the inevitable future of personal computing. Yes, but. Sinofsky’s arguments are powerful, but also debatable. To begin with, Windows 8 and RT were designed to be much more “touchable”, but the touch interface has never gone beyond being an accessory in convertibles with Windows. Apple has in fact not touched the MacBook Neo operating system and has moved away from the idea of ​​the iPad converted to laptop. This is a MacBook with a cell phone chip, yes, but with a desktop operating system designed to be used with a keyboard and mouse. Without further ado. The condemnation legacy. There is another element that made it almost impossible for Windows RT to succeed: Microsoft had been feeding a monster called Windows on x86 architectures for a quarter of a century. End users could certainly have assumed an architectural change, but things were much more complicated in companies, where Windows adoption was massive. And of course, there are the apps. Applications that ran well on x86 ran poorly or not at all on Windows RT with ARM chips. Although Microsoft tried to address that problem —keep doing it with the “standard” of PC Copilot+—, he never completely succeeded and the public perception was clear: I don’t trust that the app I use on my x86 PC works well on an ARM PC. Apple overcame that obstacle with its Rosetta emulation layer (an invisible bridge) and the support of users and developers, but for them it was clearly simpler: they did not have the burden of millions of computers running legacy applications in offices and servers. Microsoft attempted a radical “clean slate” that left users without their long-standing programs. The Copilot+ PCs promised something like this. Microsoft actually wanted to resurrect the concept recently. The launch of the Copilot+ PCs relied heavily on ARM chips such as those manufactured by Qualcomm. The promise was that we would have cheap laptops, with enormous autonomy and that also no longer had compatibility problems with the software. The reality? The prices are basically the same as those of Intel/AMD equivalents, and although there are improvements in autonomy, the perception is that there is nothing particularly differential in this bet by Microsoft and some manufacturers. This is an opportunity. But all is not lost. Microsoft and manufacturers have in the MacBook Neo a demonstration that the concept … Read more

Microsoft wants Copilot to do more complex tasks. To achieve this, it has turned to Anthropic AI

For a long time, when we talked about artificial intelligence at Microsoft, there was one name that came up again and again: OpenAI. The relationship between both companies was decisive for the takeoff of ChatGPT and also for the launch of Copilot. But the AI ​​board is moving quickly. New models, new players and increasingly intense competition are pushing large technology companies to diversify their bets. In that context, Microsoft’s latest move is understood. The advertisement. Microsoft has decided to integrate Anthropic technology within Copilot, the assistant that is already part of tools such as Outlook, Teams or Excel within Microsoft 365. Among the new features is coworka tool based on Anthropic technology aimed at facilitating tasks within the work environment. But that’s not all: Claude’s models will also be available within the Copilot chatbot alongside the more advanced OpenAI models, thus expanding the capabilities of the assistant without depending on a single artificial intelligence provider. From asking for something to delegating work. Microsoft explains that Cowork is designed to go a step beyond the classic model of an assistant who answers questions or writes texts. The idea is that Copilot can take care of entire tasks within Microsoft 365. When the user makes a request, the system converts it into a work plan that runs in the background. To do this, it uses data from Outlook, Teams or Excel. From there, in theory, you propose actions, ask for clarification if needed, and allow the user to review or approve each step before the changes are applied. Some examples. Let’s imagine, for example, that we ask Copilot to review our agenda in Outlook. The system could analyze the calendar, detect conflicts between meetings and identify lower priority meetings. From there I would propose different adjustments, such as rescheduling some appointments or reserving blocks of time to focus on more important tasks. Once those suggestions are reviewed and approved, the system itself could apply the changes automatically, accepting, rejecting or rescheduling meetings and reserving blocks of time to focus on other tasks. The strategy. As we noted above, the move also reflects how Microsoft’s AI strategy is changing. The company has maintained a very close relationship with OpenAI for years and continues to be one of its largest shareholders, with a stake close to 27% after investments of around $13 billion since 2019. However, the rise of new models and the rapid evolution of the sector are pushing large technology companies to not depend on a single technology. Incorporating Anthropic tools within Copilot points precisely in that direction: building an ecosystem capable of relying on different models depending on the task. Platforms before models. What we are seeing with decisions like this is that the race for AI is not limited to developing increasingly advanced models. It’s also about deciding where those capabilities are going to live. In the case of Microsoft, the answer seems quite broad: The company has been integrating Copilot into more and more products and services in its ecosystem (and also external ones). For some users, this constant presence can be very useful; For others it can be somewhat invasive. But beyond these perceptions, the movement clearly shows Microsoft’s strategy. On the whole. So this is not just about adding another technology within Copilot, but rather reinforcing the idea that Microsoft wants to turn this assistant into a meeting point for different AI capabilities within its software. Incorporating Anthropic models alongside those of OpenAI points precisely to that scenario. Rather than relying on a single technology, the company appears to be laying the groundwork for a Copilot capable of combining different solutions as the AI ​​market continues to evolve. Images | Microsoft In Xataka | The best and worst of the Internet we know has been built on anonymity. AI brings bad news

“We will not flood our ecosystem with soulless AI garbage.” We already know what Asha Sharma wants to do as CEO of Microsoft Gaming

Friday night has been busy in the gaming world with a movement that, more than a change of cards, represents a paradigm shift in Microsoft’s video game division: the end of the Spencer era and the resignation of Sarah Bond as president of Xbox. Phil Spencer has left the company after almost 40 years, 12 of which he has been leading the gaming area. The new CEO of Microsoft Gaming is Asha Sharma. Who is Asha Sharma. The 36-year-old Indian-American’s CV includes Instacart, where she was director of operations for three years, until she left the firm for Microsoft in 2024. She previously served as vice president of product and engineering at Meta, leading, among other things, the company’s messaging apps. And more than a decade ago he worked in the marketing area of ​​Microsoft. Another leadership profile. Spencer’s leadership was almost evangelical: his era was characterized by rebuilding the brand after the discreet launch of the Xbox One in 2013expansion through acquisitions such as that of Activision Blizzard for 69,000 million dollars and its total commitment to Game Pass. However, Xbox has still not won the console war and its studios have been chaining cancellations and closures in recent times. Sharma’s career is meteoric, but she lacks a track record within the video game industry: she is neither a designer nor a dev, she is an operations and technology executive who comes from leading enterprise AI teams at Microsoft. The new Sharma aims more at operational efficiency, AI and platform ubiquity. Asha Sharma’s roadmap with Xbox. Sharma has already published its first statement where it establishes three axes: Great video games. His message is reassuring for fans: there will be iconic franchises, a commitment to creativity and innovation, and complete trust in Matt Booty. The return of Xbox. You want to put the console back in. center, something that with Spencer had been blurred. Of course, without giving up PC, mobile phones and cloud gaming. The future of gaming and AI. Sharma promises not to flood his ecosystem with artless garbage: “Games are and always will be art, created by humans and with the most innovative technology we offer.” Surprising from someone who comes precisely from there. In summary it would be: AI yes, but with a head. Unknowns and challenges. Its first message is promising but vague and leaves many key questions in an area where finding balance is complicated. If Microsoft, which is the largest player in the sector by capitalization, puts someone without gaming DNA on the front line, it sends a signal of where the business is going that points to platforms, subscriptions, generative AI, platforms… the question is whether that is compatible with making great games. On the other hand, Sharma mentions that games are “art made by humans” but also that AI will “evolve and influence.” We will have to see what the conciliation is like. In addition, neither she nor Booty have clarified What will happen to the studies that Microsoft has closed. Finally, the Xbox Everywhere model invites you to play on any device and makes more sense than ever, so there is no doubt to wonder about the future of consoles as devices. In Xataka | Video games have grown a lot this year. But the money goes to China, Roblox and the owners of mobile platforms In Xataka | Windows was the kingdom of gaming for decades: Microsoft knows that something has gone wrong, and promises these changes Cover | Microsoft

Log In

Forgot password?

Forgot password?

Enter your account data and we will send you a link to reset your password.

Your password reset link appears to be invalid or expired.

Log in

Privacy Policy

Add to Collection

No Collections

Here you'll find all collections you've created before.