Japan is renovating its bullet train with 5G-enabled windows and noise-cancelling cabins

High-speed rail is entering a mature phase in which the differentiator is no longer so much speed as productivity and comfort. In short: it wants to become a fully fledged alternative to flying in the business segment. While Spain has consolidated its position as the country with the world's second-largest high-speed network and a leader in technological interoperability, Japan (a pioneer with the Shinkansen in 1964) wants to regain its hegemony through deep digitalization. JR Central, the operator of the Tokaido Shinkansen, has announced that it will equip the next premium suites of its famous bullet trains with windows that have integrated 5G antennas and active ambient noise cancellation that requires no headphones.

The news. The improvements are not cosmetic, but a serious bid to turn the premium car into a work or rest environment comparable to a private office.

5G antennas on the glass. The technology is provided by AGC, a Japanese company that weaves conductive microfibers into the glass to form an antenna connected to the on-board Wi-Fi router. While conventional systems bounce the 5G signal around the inside of the train before it reaches the router, antennas integrated into the glass keep a direct line of sight to the outside base stations. In other words: more stable Wi-Fi with a stronger signal.

Integrated ANC. The system is called Personalized Sound Zone (PSZ) and was developed by NTT. It works like active noise-cancelling headphones: it detects the waveform of the ambient sound and projects its inverse to cancel it out. The key difference is that it does not need to cover your ears: it relies on a combination of microphones, speakers and spatially optimized low-latency processing.

Why it matters. JR Central's bullet train reaches speeds of up to 285 km/h, which means it passes mobile base stations so quickly that it often needs to reconnect to another radio.
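To get a feel for the timescales involved, here is a rough, illustrative Python sketch; the cell diameters are hypothetical round numbers, not figures from JR Central:

```python
# Illustrative only: time between cell crossings at Shinkansen speed.
# The cell diameters below are hypothetical round numbers.
speed_ms = 285 / 3.6  # 285 km/h converted to metres per second

for cell_m in (1_000, 2_000, 5_000):
    seconds = cell_m / speed_ms
    print(f"{cell_m} m cell: a crossing roughly every {seconds:.0f} s")
```

Even with generously sized cells, a crossing every minute or less leaves little margin for a connection that also has to bounce the signal around the cabin before it reaches the router.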
This phenomenon is known as handover, and it degrades connection quality: it is the Achilles heel of connectivity on high-speed trains around the world. Yes, those internet outages also happen on the AVE. Finding a solution in the windows is technically elegant and scalable. From the perspective of the premium segment, robust connectivity plus acoustic isolation without headphones puts it in direct competition with business-class flights on domestic routes.

Of course, the price is still unknown. For now, JR Central has only confirmed that the suites will be more expensive than first-class Shinkansen Green Car seats, which already cost around 40% more than a standard unreserved ticket. An example: a standard ticket from Tokyo Station to Kyoto (a trip of 2 hours and 15 minutes) costs 13,320 yen, and the Green Car costs 18,840 yen, that is, about 71 and 100 euros respectively.

Context. The Tokaido Shinkansen is the busiest high-speed line in the world. Private compartments disappeared from the line in 2003 with the withdrawal of the Series 100 double-decker trains, which included private cabins. This new initiative marks a return to that format, in style, two decades later. As for the glass antenna technology, the collaboration between AGC, NTT Docomo and Ericsson dates back to 2018. In fact, in 2019 this group of companies achieved the world's first 5G communication using an integrated fused silica glass antenna to transmit and receive 28 GHz signals, with average download speeds of 1.3 Gbps and peak download speeds of 3.8 Gbps over a range of 100 meters. What JR Central is now announcing is the first commercial application in high-speed trains.

And in Spain? In connectivity there is a gap in both technology and approach. The AVE has had Wi-Fi since December 2016 thanks to a system of outdoor 4G-LTE antennas combined with satellite links, routers, servers and access points.
That is, precisely the approach JR Central has just moved beyond: capturing the signal outside and redistributing it inside. AGC's solution for the Shinkansen eliminates this weak point by maintaining a direct line of sight to the base stations, something especially critical at high speed. Of course, while PlayRenfe is universal for all travelers, the new Shinkansen Wi-Fi will be reserved for the luxury suites.

Yes, but. The deployment is very limited: in the initial phase only a couple of suites will be installed on six trains, so coverage is marginal compared with the total fleet of operational units, and JR Central has not published a roadmap for expanding these technologies. On the other hand, NTT's noise-cancellation technology applied to a train poses structural challenges of its own, from noise variations at 285 km/h to pressure changes in tunnels. The system's real effectiveness under those conditions remains to be verified.

In Xataka | There was a day when Japan was the leading high-speed country. It has been surpassed by China, a victim of its own success

In Xataka | In 2015, Japan showed the world a train capable of reaching 600 km/h. Ten years later we still know nothing about it

Cover | Fikri Rasyid

France wants to replace Windows with Linux. Extremadura and Munich tried it before, and both failed

On April 8, 2026, the French Interministerial Digital Directorate (DINUM) announced that it will migrate its workstations from Windows to Linux. It ordered all ministries to present a plan by the fall with the aim of eliminating dependencies on non-European software. The announcement in fact goes beyond swapping Windows for Linux: it also covers collaborative tools, antivirus, AI, databases, virtualization and telecommunications. It is, on paper, the largest operation to replace proprietary software with free software that a Western state has ever attempted. And if the history of this type of project teaches us anything, it is that many have ended in failure.

French sovereignty. It is not that France is a lover of free software; what has changed is its relationship with the United States. Trump's tariffs accelerated a debate that had been postponed for years: to what extent is it sustainable to depend on US digital infrastructure? French companies like OVHcloud and Scaleway kept growing throughout 2025, and France had already taken some steps in this direction. In January 2026 it announced a plan to replace Microsoft Teams and Zoom with its own videoconferencing platform, called Visio, with the aim of having its 2.5 million public employees use it. So far 40,000 of them are using it, and it remains to be seen whether the rollout will ever be complete.

Spain tried it in 2002. The Junta de Extremadura is one of the most famous cases of an attempt to replace Windows with Linux in public administration. In 2001 it launched LinEx, a Linux distribution based on Debian, and tried to deploy it massively in the education and health systems of the autonomous community.
The model was imitated in other Spanish regions: Andalusia had Guadalinex, Valencia had LliureX, Madrid had MAX, Galicia had Galinux, Catalonia had Linkat and Castilla-La Mancha had Molinux. All of these projects proposed an alternative to the absolute dominance of Windows on the desktops of public employees, and all of them failed, but the biggest failure was the one that promised the most: LinEx.

LinEx myths and realities. Although the distribution worked reasonably well in the education and health environments mentioned above, it never fully penetrated the general administration of the autonomous community. In 2011 the project was transferred to a state foundation due to budget cuts, and by then only 1% of the workstations of the Extremaduran regional administration used free software. The final blow came when SAP, which managed the community's medical records system, decided to stop supporting Linux. That sent the affected agency back to Windows, and in 2024 the Junta formally eliminated the obligation to use gnuLinEx.

Rise and fall of Linux in Munich. An even more famous case at the European level was Munich. In 2003 the city council announced that it would migrate 14,000 Windows computers to LiMux, its Debian-based Linux distribution. By 2013 the project looked like a success: 12,000 computers had been migrated and, in theory, more than 11 million euros had been saved in licenses and other costs. In 2014, however, complaints about lost productivity began, along with a debate that ended abruptly: at the end of 2017, Munich's leaders decided to migrate the 29,000 PCs used by their employees from LiMux to Windows 10. The initial migration was never completed, and in many cases a mix of Windows and Linux systems was needed to keep processes running, which proved inefficient and never eliminated the dependency on Windows, and especially on legacy applications.

But there are quiet successes.
LinEx and LiMux failed in Spain and Germany, but there are precedents showing that abandoning Windows in favor of Linux can work. GendBuntu, a version of Ubuntu deployed by the French National Gendarmerie, proves it. The organization was already a pioneer in adopting the OpenOffice.org office suite in 2005, and since 2008 its plan has been to abandon Windows in favor of its own Linux distribution. As of June 2024, GendBuntu ran on 103,164 workstations, which represents 97% of the organization's IT estate. It has also saved around two million euros per year on licenses and reduced total cost of ownership (TCO) by 40%.

Another promising example: Schleswig-Holstein. This German state began its migration from Windows and Office to Linux and LibreOffice in 2021. By early 2026 it had completed almost 80% of the transition across its 30,000 workstations, and according to its own figures that saved about 15 million euros in licenses in 2026 alone. A one-time investment of 9 million euros is planned to complete the remaining 20% of the process, which is still tied to certain specialized applications and will therefore take a little longer. This is the model closest to the French initiative: gradual migration and, above all, political will sustained across successive legislatures.

What distinguishes success from failure. The cases that work share three characteristics. First, gradual and phased migration, not sudden and massive. Second, real internal technical support that goes beyond political declarations. And third (probably the most important), a will sustained beyond a single electoral cycle. The failures share three others: trying to migrate everything at once, underestimating the cost of legacy applications, and depending on the government not changing, which certainly contributed, for example, to the failure of LinEx.

A colossal challenge.
Installing Linux on a computer is trivial today: the learning curve has been reduced significantly and day-to-day use is very similar to that of Windows or macOS. The real problem lies in the applications that run on top of it. Public administrations often rely on critical software tailor-made for Windows, forms that only work in certain browsers (including the old Internet Explorer), management systems with no Linux equivalents, or vendors that simply do … Read more

France has begun to retire Windows from its administration. It is the beginning of its divorce from Microsoft, Google and Amazon

Digital sovereignty in Europe has gone from being a theoretical concept to something increasingly tangible and desirable in the technology we consume. It is no longer just a trend that more and more individuals are embracing; it has also become an object of desire for administrations and companies. The path to independence from US big tech is not easy, and while there are startups like Mistral thriving in the process, one state has decided to take a bold step forward: France. In a global environment where data and infrastructure are geopolitical weapons, the French government, through the Interministerial Directorate for Digital (DINUM), has launched an aggressive roadmap to regain control over its information systems and thus reduce the hegemony of non-EU technological solutions. And it has started with Windows.

The decision. In a high-level interministerial seminar, DINUM, together with ANSSI, the State Purchasing Directorate and the DGE, formalized the most ambitious commitment to digital sovereignty adopted to date by a Western European power. In other words: France wants to exit the American technological ecosystem in a systematic, planned way and with specific deadlines. It is not an experiment; it is state policy. The guideline is clear: map and reduce dependence on technology suppliers from outside the EU. The measure is not a veto but a mandatory transition towards a model in which public administration must prioritize local or open-source solutions, especially for critical services and sensitive data processing. As the Minister of Public Action and Accounts, David Amiel, has declared: "We can no longer accept that our data, our infrastructure and our strategic decisions depend on solutions whose rules, prices, evolution and risks we do not control."

Why it matters.
From a systems-engineering and cybersecurity standpoint, the measure is vital for issues such as protection against the US Cloud Act, the law that allows American authorities to access data stored by American companies regardless of where the servers are located. It also guarantees that the state maintains the technical capabilities needed to operate its own infrastructure without depending on proprietary "black boxes", and to react in the event of a change in conditions or other external problems. But this phased migration is much more than an OS change: it involves dismantling the entire associated ecosystem of certificates and applications designed for Windows. It means rebuilding the digital foundations of the state from the roots so that they function with total autonomy and without foreign parts, without citizens noticing the change on the surface.

Context. Our daily personal, professional and bureaucratic lives unfold in an ecosystem governed by hyperscalers, the technology companies such as Microsoft, Google and Amazon that dominate storage and cloud computing. That mention is not random: between them they take more than 60% of the cloud market, according to Statista. The rise in cyber threats, the US technological monopoly in the West and its increasingly invasive approach to privacy have done the rest. France has been maturing its "Cloud au Centre" doctrine for years. While ANSSI audited the dependencies of critical infrastructure, its sovereign cloud was being forged as a real alternative. In addition, the European regulatory framework, with the NIS2 directive and the Cyber Resilience Act, has created the ideal breeding ground. With tools like Tchap, Visio, FranceTransfert and Socle Numérique (alternatives to WhatsApp, Teams, WeTransfer and Microsoft 365, respectively), France no longer has just a plan but a real operational base on which to scale.

The plan towards sovereignty.
This is no empty gesture: there are no vague measures or distant dates, but concrete, tangible moves that are either already being implemented or scheduled for completion before the end of the year:

- DINUM abandons Windows and migrates its workstations to Linux. It is the first central state agency to do so. Already underway.
- Migration of 80,000 agents of the Caisse Nationale d'Assurance Maladie (the equivalent of Social Security) to sovereign tools: Tchap, Visio and FranceTransfert. Already underway.
- Migration of the health data platform to a trusted European solution. Scheduled for the end of 2026.
- Homework for each ministry: present a dependency-reduction plan covering databases, antivirus, AI and collaborative tools. Due this fall.

Yes, but. France has a basic skeleton and a legal framework, as well as public-private coalitions to accelerate the transition through concrete, measurable public commitments. But it will not be easy. Exiting Windows involves dismantling Active Directory and everything behind it, which costs a great deal of time and money. And migrating 80,000 agents to new tools is not so much a technology problem as a change-management problem. There is also the question of where to go: many European solutions still do not match the integration, ease of use and capabilities (especially in AI) of American big tech, which implies a step backwards in quality. And even where alternatives exist, moving from proprietary infrastructure to sovereign infrastructure implies an enormous investment in time, staff training and data migration. Finally, maintaining and evolving one's own infrastructure requires specialized, experienced personnel in a market where talent is scarce and expensive.

In Xataka | The CEO of Mistral sends a message to Europe: the end of being the technological vassal of the United States

In Xataka | Europe seeks to become independent from Microsoft Office.
Its alternative is already here, but not without controversy

Cover | Clint Patterson and Arno Senoner

In 1994, a programmer created a "temporary" interface for Windows. Three decades later it is still with us

Windows is one step away from turning 40. The first version of the operating system appeared in November 1985, and it has not stopped evolving since. However, Microsoft tends to take a long time to update some components of its products. With Windows 10, for example, it released a renewed user interface, but it was not until years after launch that it began to retire some icons from the Windows 95 era. Now, in Windows 11, it is renewing programs like Paint and Notepad. Regardless of how modern Windows 11 may feel, and all the new features its updates bring, the system still retains some elements we could classify as historical. Among them is the utility to format disks.

Currently, if you format a storage drive from Windows 11 you will see a pop-up window practically identical to the one you could find decades ago. In fact, we know exactly who created it. A former Microsoft programmer named Dave Plummer recently shared some interesting facts about this part of the operating system. The now entrepreneur says he created the Format dialog box one rainy morning in late 1994. He recounts that they were migrating millions of lines of user-interface code from Windows 95 to Windows NT, and that the formatting section differed greatly between the two systems, so a new user interface had to be created. Plummer took on the task. The programmer did not set out to do a definitive job, but to provide a temporary solution with the help of a sheet of paper, a pen, Visual C++ 2.0 and the Resource Editor. "It wasn't elegant, but it would do until the elegant user interface arrived," he says. Plummer also set the 32 GB limit for formatting FAT32 volumes that same morning. It is curious, because FAT32 is capable of working with larger volumes, although to create volumes of that capacity you need to use the command line.
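As a back-of-the-envelope illustration of how arbitrary that cutoff is, here is a small Python sketch, assuming classic 512-byte sectors and FAT32's 32-bit total-sector field (the ceiling usually cited for the filesystem):

```python
# Hedged sketch: the 32 GB cap was a dialog-box choice, not a FAT32 limit.
SECTOR_BYTES = 512            # classic sector size assumed here
MAX_SECTORS = 2**32 - 1       # 32-bit total-sector count in the boot sector

fat32_ceiling = MAX_SECTORS * SECTOR_BYTES   # just under 2 TiB
dialog_limit = 32 * 1024**3                  # the 32 GB limit set in 1994

print(fat32_ceiling // 1024**3)        # ~2047 GiB, i.e. just under 2 TiB
print(fat32_ceiling // dialog_limit)   # 63: the dialog allows ~1/64 of that
```

Under these assumptions the filesystem itself can address volumes roughly 64 times larger than the dialog permits, which is why the command line can go further.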
The disk-formatting dialog appeared in Windows NT-based operating systems, such as Windows 2000 and Windows XP, and it has been with us ever since. All this time it has essentially remained a temporary solution created in 1994.

Images | Windows | Genbeta

In Xataka | Intel is hunting for new customers. Its next goal: convince Elon Musk and make chips for Tesla

The MacBook Neo is everything Microsoft dreamed of with the disastrous Windows 8

It was 2012 and Windows 8 defied every convention. The mouse and keyboard were no longer enough: Microsoft wanted us to touch the computer, to handle it like an iPhone. That ambition gave birth to one of the most original and daring operating systems in history. And also one of the most hated. Its chief architect, Steven Sinofsky, has compared that launch of almost 15 years ago with that of the MacBook Neo, which has just taken place, and has left a clear message: with Windows 8, Microsoft was right. The only problem is that it arrived too soon.

The Mac Neo is a "paradigm shift". In his 'Hardcore Software' newsletter, Sinofsky recounted how he bought one of the new MacBook Neos in "citrus" color with 512 GB of storage and how "it completely blew my mind. It is a computer that changes the paradigm." His impressions coincide with independent reviews: the device's performance is indistinguishable from that of a conventional MacBook Air in everyday tasks. And that despite using a phone chip rather than a "pure" laptop one.

Windows 8 nostalgia. Using the Neo stirred melancholy in Sinofsky as he remembered his time at Microsoft. This Apple product is in fact the culmination of a concept he tried to push more than a decade ago. At Microsoft they believed a Windows laptop with an ARM processor made sense, and Sinofsky led the vision that resulted in Windows 8 and, later, Windows RT and the Surface RT.

We were right. For this former Microsoft executive, the MacBook Neo is proof that he and his company were right when they tried to launch that product. According to him, Windows on ARM and the original Surface from 2012 were not a technical error: that computer had an NVIDIA Tegra chip, 2 GB of RAM and 64 GB of storage, and "it had no problems running Office or browsing." In his opinion, the hardware and software were not immature (a very debatable point) and the failure lay elsewhere.
People don't like change. Sinofsky explains that the mistake was trying to move the ecosystem to a new application model too quickly. "People wanted the old Windows application model," but at the time there was no way to make it more efficient or secure: "it was designed for another era." Microsoft certainly had the problem that its installed base was mostly conservative users: proposing a change as big as that, a jump to an ARM architecture for good, was unviable.

Apple knew how to manage the transition. Apple's triumph with its ARM chips came because its transition process lasted almost two decades. During that time the company kept eliminating old code and obsolete APIs, enabling a smooth move to its own Apple Silicon chips.

Being early is not being wrong. Sinofsky also highlights how being first with an idea (as with Windows 8 or the ARM Surface) is often mistaken for being wrong, when in fact the problem was the execution of the ecosystem transition, not the concept itself.

Reasonable sacrifices. Although there are clear hardware limitations (fewer ports, a slightly different screen, a smaller trackpad), they are irrelevant compared with the device's efficiency and portability.

The MacBook Neo is the definitive Chromebook. For Sinofsky, Apple's affordable machine is a "better Chromebook" focused on productivity, which is precisely the rescue plan he proposed for Windows RT after his departure from Microsoft in 2012. His vision, he argues, was the right one: the transition to ultra-efficient ARM devices was the inevitable future of personal computing.

Yes, but. Sinofsky's arguments are powerful, but also debatable. To begin with, Windows 8 and RT were designed to be far more touch-friendly, yet the touch interface has never been more than an accessory in Windows convertibles. Apple, in fact, has not touched the MacBook Neo's operating system and has moved away from the idea of the iPad converted into a laptop.
This is a MacBook with a phone chip, yes, but with a desktop operating system designed to be used with a keyboard and mouse. Nothing more.

The condemning legacy. There is another element that made it almost impossible for Windows RT to succeed: Microsoft had been feeding a monster called Windows on x86 for a quarter of a century. End users might have accepted an architectural change, but things were much more complicated in companies, where Windows adoption was massive. And then there are the apps. Applications that ran well on x86 ran poorly, or not at all, on Windows RT with ARM chips. Although Microsoft tried to address that problem (and keeps trying with the Copilot+ PC "standard"), it never fully succeeded, and the public perception was clear: I don't trust that the app I use on my x86 PC will work well on an ARM PC. Apple overcame that obstacle with its Rosetta emulation layer (an invisible bridge) and the support of users and developers, but its task was clearly simpler: it did not carry the burden of millions of computers running legacy applications in offices and on servers. Microsoft attempted a radical clean slate that left users without their long-standing programs.

The Copilot+ PCs promised something like this. Microsoft actually tried to resurrect the concept recently. The launch of the Copilot+ PCs relied heavily on ARM chips such as those manufactured by Qualcomm. The promise was cheap laptops with enormous battery life and, at last, no software-compatibility problems. The reality? Prices are basically the same as those of Intel/AMD equivalents, and although there are battery-life improvements, the perception is that there is nothing particularly differentiating in this bet by Microsoft and some manufacturers.

This is an opportunity. But all is not lost. Microsoft and the manufacturers have in the MacBook Neo a demonstration that the concept … Read more

The MacBook Neo is the biggest existential threat to the Windows laptop market. And the manufacturers have no answer

Catacrac. That is the sound of the announcement Apple has just made with the MacBook Neo. It is modest in specifications, yes, but it has a surprising price/performance ratio considering it comes from Apple. The company, which seemed like it would never "stoop" to a "cheap" product, has ended up doing just that. And in the process it has created an extraordinary threat to Windows laptops, with a product aimed squarely at the waterline of many manufacturers.

A perfect machine for many people. We are all looking for the best product at the best price, and the MacBook Neo strikes a fantastic balance. It is far from the best laptop you can find, but it is a device with a very reasonable configuration for many people, because many people use a laptop for tasks that need no more power or features. Apple has also hit the nail on the head with the price: for an Apple product, those 700 euros almost seem like a bargain.

A textbook masterstroke. While Windows laptop manufacturers tie themselves in knots justifying why a laptop should cost 1,500 euros to do everything you want (never mind the AI options), Apple has in its hands a product that overturns the perception of value. The MacBook Neo does not seek to win performance races, but to be the machine that any student, office worker or home user buys without looking at alternatives. In 2026, true innovation is not including an incredible NPU but offering a product that solves a need, and doing so at a price that previously seemed unthinkable by Apple's standards.

Remembering netbooks. Almost 20 years ago the industry tried to move in this direction with netbooks. Those Windows laptops were (very) modest, crude and cheap, and generated plenty of expectation, but reality soon set in. Their limitations were so obvious that they were not worth it, and the concept of the "modest, cheap and functional laptop" was perhaps ahead of its time.

Cupertino has arrived on time.
Apple seems to have arrived at the right moment, because we have been saying for years that mobile chips were already extraordinarily powerful and were going to waste both in our smartphones and (especially) in iPads. The MacBook Neo is what netbooks should always have been (well, perhaps a little expensive for a netbook), with the difference that here the specifications promise to be far more adequate.

A slap for Windows on ARM. The appearance of this machine is also a heavy blow to all the machines that have tried to prove that Windows on ARM made sense. We have seen several over the years, and everything seemed to indicate that Microsoft and the manufacturers had a chance, but they have ended up making computers that were basically clones of their Intel/AMD variants in almost every respect. More battery life and plenty of AI functions, yes, but often at high prices and with software limitations, because the Windows-on-ARM ecosystem is nowhere near as mature as Apple's with macOS, which completed that transition after the launch of the M1 in 2020.

There is hope for Microsoft and its users. Windows hardware manufacturers will now have to react and come up with competitive options. And they certainly have the potential to do so. Qualcomm has its Snapdragon chips, and NVIDIA already has its laptop SoCs almost ready (we saw them at CES), so we may be looking at a "second netbook era" in which the MacBook Neo competes with Windows/ARM machines on price and features. Of course, the real performance, battery life and reliability of these future devices, including Apple's, remain to be seen. Suddenly Apple has a catalog of "affordable" products that puts its competitors in trouble.

Beyond the Chromebook. The MacBook Neo could be seen as a "Chromebook killer", but Google has stopped promoting Chromebooks and manufacturers no longer pay them much attention either. In fact, the future of Google laptops seems to run through Android, not ChromeOS.
While the MacBook Neo can certainly be a very reasonable device for students, it is really an attack on the conventional "home laptop" with which HP, Dell and ASUS have always triumphed. Apple's prestige works strongly in its favor here, and it may win over not only young people but also many users who saw Apple as an aspirational brand too exclusive for their budgets.

Memory makes everything more expensive… except the MacBook Neo. What is more, this launch could not come at a crueler moment for Windows laptop manufacturers. All of them have been warning that they will have to raise prices because of the RAM crisis, but Apple has done just the opposite: instead of presenting more expensive products (well, it has done that too), the firm has unveiled a functional, affordable bet that does not punish consumers. Sacrifices must be made, yes, but they are reasonable ones, especially in light of the circumstances. Apple has shown that you can be "humble" on price without losing your identity, and now it remains to be seen how Windows hardware manufacturers respond. Because what is clear is that a response will come. And it is likely that, after all, this launch will end up being very good news for us, the users.

In Xataka | Apple made a splash with its cheapest iPhone. And the iPhone 17e is coming to repeat the play

Windows 11 is already on 1 billion devices. It got there faster than Windows 10, and that says more than it seems

If we had to bet on which of the two operating systems users love more, Windows 10 would still have good odds. Not only because it was a solid launch, but because it came at the right time: in July 2015, with the mission of erasing the bad memory left by Windows 8 and Windows 8.1. For years, Windows 10 was the comfortable choice, but Microsoft has been playing a different game for some time.

Windows 11 is doing well, very well. Not only is it growing, it is doing so at a pace that leaves little room for doubt. According to data shared by Satya Nadella during the presentation of Microsoft's financial results (fiscal second quarter), Windows 11 has reached the symbolic milestone of 1 billion devices, with year-on-year growth of 45%. It is a huge figure in itself, but even more so for what it suggests: that the migration is finally accelerating.

A strategy that has worked. The reading fits with something we have been seeing for a long time: Microsoft has stepped on the accelerator to push the jump to Windows 11. And it has not always been easy. In fact, until not so long ago the consensus was different. Unofficial figures from November 2024, crossed with historical data, described disappointing, slower-than-expected adoption. Windows 11 seemed to advance with difficulty, as if the public could not find enough reasons to abandon Windows 10. But the pace has changed, and not just a little.

Arriving before Windows 10. The comparison yields a particularly striking detail: Windows 11 has reached 1 billion devices sooner than Windows 10 did. In numbers, Windows 11 needed 1,576 days (almost four years and five months) to reach that barrier, while Windows 10 took 1,706 days (four years, eight months and two days). Even so, it is worth putting in perspective: Microsoft set an even more aggressive goal for Windows 10, aiming for it to be installed on 1 billion devices in just three years.

A goal that changed.
That plan was ambitious, yes, but it also had fine print. In its roadmap, Microsoft planned to count part of the mobile ecosystem as "installations": Windows Phone and Windows 10 Mobile. The problem is that that future never arrived. The collapse of Windows Phone and the subsequent cancellation of the project left that approach meaningless, and Microsoft ended up adjusting its expectations. In fact, in April 2015 Terry Myerson, then head of Windows, was already talking about "1 billion devices" within "two or three years" of launch. A more elastic formulation, less categorical, and much easier to walk back when the board changes.

A milestone amidst challenges. Because the jump from Windows 10 to Windows 11 is not, nor has it been, a smooth transition for everyone. The first wall is technical: hardware requirements. Many computers are left out of the official update because they lack TPM 2.0 or a compatible processor. In other words, some users are pushed to buy new equipment even when theirs still works reliably. The second obstacle is more intangible, but just as important: the experience. Windows 11 arrived with visible changes compared to Windows 10 (design, interface, organization) and also a different philosophy, with a greater presence of AI-powered functions, new features that may arrive at any time and a model of constant evolution that does not always work in its favor. Added to this is the usual noise: a chain of incidents after some recent updates that have set tongues wagging. Windows 11 is a solid system, but also one in constant transformation, and that has a cost. Despite everything, Windows 11 is advancing. Perhaps out of pure inertia, perhaps because of the end of Windows 10 support, or maybe because the PC market is moving again. What is relevant is that Windows 11 is gaining ground at a pace Microsoft can read as a victory.
Although, deep down, the industry has already changed enough for Windows to stop being king within Microsoft itself. Today it represents less than 10% of the Redmond giant's revenue. The real jewel in the crown, and the big strategic bet, is elsewhere: Azure. Images | Microsoft, Andrey Matveev. In Xataka | We have been waiting for the new Siri for a year and a half. Now it's just around the corner with an unexpected twist: Google

Windows 95 had a little secret that made rebooting faster. The reason lay in its chaotic architecture

If you used other operating systems before Windows 95, it is hard not to remember the feeling of facing something completely new. Its proposal introduced elements we now take for granted, such as the Start menu, the taskbar and Plug and Play, and it did so at a time when starting a PC was almost a small ritual. But beneath that familiar interface hid a complex architecture, the result of the forced coexistence of DOS inheritances, 16-bit Windows and the first 32-bit layers. That design, as inelegant as it was effective, gave rise to unexpected behaviors that still surprise today. Few users knew that Windows 95 hid an alternative route to the classic reboot. It was enough to hold down the Shift key while restarting from the graphical interface for the system to display the warning "Windows is restarting" instead of following the path of a cold reboot, as described by Raymond Chen. The difference was not spectacular, but it was noticeable at a time when every minute of startup counted. That small gesture activated an internal mechanism designed to avoid, whenever possible, starting from scratch.

The shortcut that did not restart completely. Behind this behavior was a precise technical decision. Chen details that Windows 95 used a flag called EW_RESTARTWINDOWS when invoking the old ExitWindows function, still 16-bit. With that instruction, the system did not order a cold reboot of the computer but something more limited: close Windows and relaunch it. The objective was to save steps, as long as the internal situation allowed it, although this optimization depended on everything fitting together correctly. Once that alternative route was taken, the process followed a very specific sequence. The 16-bit Windows kernel was shut down first. The 32-bit virtual memory manager was then turned off and the processor returned to real mode, the most basic state of the system.
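The shutdown order described above can be sketched as a toy model. This is illustrative Python only, nothing like the real 16-bit code; the EW_* values match the ones documented for the 16-bit ExitWindows API, while everything else is an assumption for the sketch:

```python
# Toy model of the two Windows 95 restart paths Raymond Chen describes.
# Illustrative only: the real implementation was 16-bit code, not Python.

EW_RESTARTWINDOWS = 0x42  # close Windows and relaunch it
EW_REBOOTSYSTEM = 0x43    # cold-reboot the whole machine

def exit_windows(flag):
    """Return the shutdown sequence for the chosen restart flag."""
    steps = [
        "shut down the 16-bit Windows kernel",
        "turn off the 32-bit virtual memory manager",
        "return the CPU to real mode",
    ]
    if flag == EW_RESTARTWINDOWS:
        # Fast path: Windows is relaunched without rebooting the PC.
        steps.append("hand control back to win.com to restart Windows in protected mode")
    else:
        steps.append("cold-reboot the entire computer")
    return steps

for step in exit_windows(EW_RESTARTWINDOWS):
    print("-", step)
```

The point of the model is that both paths share the same three teardown steps; only the final step differs, which is why holding Shift could save so much time.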
At that point, control returned to win.com with a special signal asking for something very specific: restart Windows in protected mode without going through a full computer boot. With control back in win.com, the most fragile part of the process began. The program had to simulate a clean boot of Windows, as if it had just been run from scratch, which involved, in Chen's words, resetting the command-line options and returning some global variables to their original values. Although the work was largely clerical, it was especially delicate because win.com was written in assembly. There were no abstractions or modern conveniences. The decisive point was memory. When win.com was executed, like any .com file, it received all available conventional memory. However, it freed up almost all memory beyond its own code so that Windows could load a large contiguous block when entering protected mode. If during the session a program reserved memory within the space that win.com had left free, the memory became fragmented. In that scenario, win.com could no longer recreate the original map it expected and, Chen explains, was forced to abandon the fast restart and fall back to a hard reset. When everything fell into place, the process continued without turning back: win.com jumped directly to the code responsible for booting Windows in protected mode, recreating the virtual machine manager and lifting the 32-bit layers again. From there, the graphical interface loaded as usual and the user returned to the desktop. The difference was subtle but real: Windows had not had to reboot the entire system to get to that point. This type of shortcut was only viable in a system built on cross-compatibilities. Windows 95 had to coexist with DOS software, 16-bit Windows programs and Win32 applications, and that mix forced Microsoft to accept inelegant but very practical solutions.
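The fragmentation condition that decided between the two paths can be modeled with a small sketch. This is illustrative Python with made-up addresses and names; the real check was hand-written assembly inside win.com, per Chen:

```python
# Sketch of the check that decided between fast restart and hard reset.
# Hypothetical addresses; only the decision logic mirrors Chen's account.

# The contiguous block win.com freed so Windows could load (made-up bounds).
FREED_START = 0x2000
FREED_END = 0xA0000

def can_fast_restart(live_allocations):
    """Fast restart works only if the freed region is still one hole:
    no surviving allocation may overlap it."""
    for start, size in live_allocations:
        if start < FREED_END and start + size > FREED_START:
            return False  # region fragmented: win.com cannot rebuild its map
    return True

def restart(live_allocations):
    if can_fast_restart(live_allocations):
        return "fast restart: relaunch Windows in protected mode"
    return "hard reset: reboot the whole computer"

# Clean session: nothing was left allocated inside the freed block.
print(restart([]))
# A resident program kept memory inside the freed region during the session.
print(restart([(0x3000, 0x400)]))
```

A single surviving allocation anywhere inside the freed block is enough to force the slow path, which is why the optimization only worked "when everything fit correctly".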
The developers took advantage of this complexity to introduce hidden optimizations that could speed up restarts, although they could sometimes end in crashes. The obsession with saving memory led to very imaginative solutions. Chen explains that in assembly it was common to recycle code that was never going to run again as if it were free memory. In win.com, the first bytes of the entry point were reused as a global variable, under the premise that this code was only executed once. Since the quick restart never returned to that initial point, the system could allow that shortcut without affecting the process. That shortcut also showed its seams. Chen recalls that some users detected errors after performing several consecutive quick reboots, something he was unable to reproduce consistently. His hypothesis is that some driver was not reinitializing properly, leaving the system in a strange state, and that strangeness ended up taking its toll later. It is no surprise that this behavior was never presented as a documented feature, but it sums up the spirit of Windows 95 well: inventive, ambitious, and full of compromises. Images | Microsoft. In Xataka | Schrödinger's Office: at this point it is impossible to know if Microsoft keeps it alive or if everything is AI and Copilot

In 1976 Boston completed its most stunning skyscraper. Then its windows became lethal guillotines

The John Hancock Tower was conceived in the late 1960s as modern Boston's great statement of authority: a minimalist, elegant and almost "invisible" skyscraper, designed to reflect the sky with enormous panels of lightly tinted blue glass, with mullions reduced to a minimum and no elements to break its purity, topped by a floor plan that visually sharpened the corners and a vertical slit that further stylized the mass. But there was a big mistake.

The modernist dream of a glass needle. The skyscraper was the kind of building that wanted to seem inevitable, as if it had always been there, while at the same time demonstrating that "corporate architecture" could be a piece of urban art. In other words, it pursued a clear aesthetic ambition, but one that implied an enormous risk: betting everything on glass and geometric precision, where any failure stops being a defect and becomes a dangerous spectacle.

The first shock of reality. From the beginning, the project lived under the spotlight because it sat in the Back Bay neighborhood, very close to Trinity Church, a historic landmark that already carried symbolic and emotional weight in the city and that threatened to be dominated by the shadow and presence of the new colossus. There were protests and design adjustments, but the real conflict soon arrived below ground: the excavation and temporary retaining walls deformed and gave way in the mud and clay fill characteristic of the area, damaging sidewalks, utilities and even nearby buildings. Trinity Church ended up suing, won million-dollar compensation, and the skyscraper, before it even existed, was already seen as a work too ambitious for the ground that supported it.

The glass scandal. The episode that turned the tower into a black legend of architecture occurred while it was still unfinished: with the Boston winds, the panels began to crack and fall away, and the glass fragments began to fall to the street like a kind of lethal rain.
The authorities even cordoned off areas and closed streets when the wind picked up, and the image of the "brilliant" building was replaced by a much more humiliating one: windows covered with plywood sheets, a partially bandaged tower in the city center that earned nicknames like "Plywood Palace" and jokes like "the tallest wooden building in the world." In a skyscraper meant to represent absolute control, the failure was not only technical: it was a direct reputational blow, one in which the very symbol of its modernity (glass) had become a joke and a threat.

Why it failed. At first the wind was suspected as the main actor, with its suction and channeling effects around the building, and tests were run in wind tunnels with models of the surroundings, but the core of the problem was in the window itself. The system was apparently too rigid: the reflective layer and its bond to the metal frame did not allow flexing, and in a structure subjected to continuous vibrations, oscillations and thermal cycles, that lack of "play" became the breaking mechanism. Stresses were transmitted to the glass instead of being absorbed, cracks propagated, and the result was inevitable: enormous, very heavy panels, weighing hundreds of kilos, failed repeatedly until the unthinkable had to be accepted in a newborn corporate icon: every panel had to be replaced. (Image: the tower at the time, its fallen windows replaced with plywood.)

The expensive remedy. The solution was drastic: remove and replace the entire glazing with more robust, tempered, heat-treated glass, in an operation that cost several million and prolonged the ordeal for years. The project, announced with grandeur and reasonable budgets, ended up spiraling into delays: the inauguration was postponed, the numbers skyrocketed and the tower went from promise to public embarrassment.
Even so, mass glass replacement was the only way out, because it was not a matter of fixing a few defective pieces but of correcting a façade concept born with a structural fragility incompatible with Boston's real climate and loads. (Image: the building today.)

The final twist. And when it seemed the worst had already happened, the most disturbing blow came: later calculations suggested that, under certain wind patterns, the building could have a stability problem more serious than assumed, with unforeseen twisting and dangerous behavior on its narrower faces. The tower also swayed enough to make occupants on the upper floors dizzy. The city discovered that the beauty of minimalism had a physical price. The answer was twofold: on the one hand, install a huge tuned-mass damping system, two gigantic weights mounted on springs and shock absorbers to oppose the swaying and "return" the building to its center; on the other, reinforce the structure with tons of diagonal steel bracing. It was, in essence, re-engineering an already-built icon so that it would keep standing with the dignity promised since the first render.

The paradox: from shame to object of desire. The most fascinating thing is that, after such a disastrous start, the tower ended up establishing itself as an admired and recognized piece, going on to receive prestigious awards and become an inseparable element of the Boston skyline. As architecture critics noted at the time, it was the kind of redemption that only happens when a building survives its own crisis: the public ends up remembering its silhouette and its reflection, not the panic of closed streets or the wooden boards covering the missing glass.
The Hancock went from being a cautionary lesson for modern architecture (a reminder that aesthetics does not negotiate with physics) to being, precisely because it overcame that technical hell, a work with a certain aura of resilience, almost a monument to the obsession with fixing the irreparable. One more thing. Over time, the tower kept its place as the tallest skyscraper in New England, but its story continued to play out in the practical terrain of money, tenants and identity: …

The community has made it clear that it does not want AI in Windows, and Microsoft has ignored it. So users have taken matters into their own hands

Microsoft’s obsession with putting AI in every corner of Windows is logical at the present moment (after all, it is what everyone is doing). The problem is that the community has been very clear about it: they don’t want it. Microsoft has continued with its plan to flood Windows 11 with AI, but there is now a way to avoid it.

Winslop. The name comes from the play on words between Windows and "slop", the term used for AI garbage, that is, very low-quality content. It is a free tool whose purpose is to remove all traces of AI from the system. Its creator makes clear that he is not anti-Windows; in fact, he states that he likes the platform. What he does not like is the direction it is taking.

Cleaning Windows of AI. Winslop is totally free and can be downloaded from GitHub. The interface looks like old versions of Windows and consists of a list of all the changes we can apply. There is an option that inspects the system and proposes the changes to make, or we can tick the boxes we want, depending on the level of cleaning we are after. The list is quite long and is divided into categories; these are some of the functions we found:

- System: shows technical details on a blue screen instead of the sad face, optimizes system responsiveness, speeds up shutdown time…
- Microsoft Edge: stops it being the default browser, disables the Copilot icon, removes the shopping assistant, hides sponsored links when opening a new tab…
- Interface: turns off transparency effects, hides taskbar search, turns off Bing search…
- Gaming: disables DVR recording, power throttling and visual effects.
- Privacy: disables activity history and location tracking.
- Advertisements: removes ads system-wide.
- AI: hides Copilot from the taskbar and disables Windows Recall.

Bloatware. There is more. Winslop is divided into three tabs: Windows 11, applications and extensions. From the apps section we can remove pre-installed applications such as Bing News, Bing Weather, Windows Camera and many more.
As in the other section, pressing the 'Inspect System' button gives us a list of suggested removals, and we tick the ones we want. It's not the first. We recently told you about a tool born with the same objective (although with a less punchy name), RemoveWindowsAI. Like Winslop, it disables all AI functions, but beyond its feature list, the important thing is that its mere existence was already a symptom of community fatigue. The fact that another app has appeared only confirms it. The AI PC. The obsession with turning Windows into an agentic system has collided head-on with what the community is asking for, to the point that Microsoft is losing favor with users. A year ago, PCs with AI promised to be a revolution, but they have come face to face with reality, and even historic brands like Dell are changing their discourse. Microsoft is left alone. Image | Winslop. In Xataka | There's a reason AI PCs aren't hurting Apple: nobody asked for AI PCs
