Microsoft needs 500 million PCs to jump to Windows 11. Its new list of compatible CPUs does just the opposite

Microsoft has a calendar problem and a communication problem. It has been almost two months since Windows 10 lost official support, leaving millions of users in security limbo. Although Windows 11 recently managed to surpass its predecessor in market share, adoption is still an unresolved issue for Redmond. In this scenario, where clarity is vital for laggards, the company has updated its hardware documentation in the least intuitive way possible, wreaking havoc on anyone trying to figure out whether their old PC qualifies for the upgrade.

A labyrinth of compatibility. Until recently, Microsoft's documentation was explicit: you looked up your exact model and were left in no doubt. Now, as specialized media have reported, that specificity has disappeared from the list of compatible Intel chips. The new list groups processors by generic families and redirects to the manufacturer's website. This forces users to investigate on their own and also produces some absurd situations: entire series such as the "Celeron 3000" appear listed as compatible when they are not. In that family, which launched a decade ago, only one chip is actually supported (the Celeron 3867U).

Erasing the chosen ones. The confusion now also punishes Microsoft's own customers. Processors that are compatible have disappeared from the official list, as is the case with the Core i7-7820HQ in the Surface Studio 2. This chip was an exception the firm made for its own hardware (as a Kaby Lake part, it should not qualify), but by removing the reference, the implicit message for anyone who owns this premium device is that it is no longer suitable. Curiously, the lists dedicated to AMD and Qualcomm (ARM) processors keep the model-by-model detail.

The user resists. This change, which given the context should have brought more clarity, comes at a time when the market is stubborn. An estimated 500 million PCs are technically capable of running Windows 11, yet their users have simply chosen not to upgrade.
The barriers were already high at launch: from the technical demands of TPM 2.0 to Microsoft's insistence on forcing an online account and its services during installation. Obscuring the basic hardware requirements now only adds more friction for a user base that was already reluctant to abandon the stability of Windows 10.

A lifesaver with small print. For those still trapped in the old system, security comes at a price. Microsoft has activated the Extended Security Updates program for home users for the first time, granting an extra year of patches. Although regulatory pressure has made this additional year free in Europe, it is just a temporary patch. Those who do not upgrade are already using a vulnerable operating system and exposing themselves to security risks.

PCs with Windows 11 are changing from the inside. In the photo, the Surface Pro 12 with a Qualcomm ARM chip. Image: Javier Penalva for Xataka

ARM is another option. It is certainly paradoxical that, while Microsoft neglects clarity on its traditional platform (x86 chips), it keeps pouring resources into its ARM push with the Snapdragon X to compete with Apple. The company is trying to energize sales of Windows 11 computers by leaning on AI and Copilot+. But if compatibility management on today's millions of computers becomes a labyrinth, users' confidence in jumping to Windows 11 erodes. For the more technical, third-party tools like Flyoobe remain the escape route to upgrade without restrictions.

The exit from the maze. Beyond the information chaos, the roadmap for users who remain on Windows 10 is clear: the ideal solution is to make the leap to Windows 11, a process that is still free. If the hardware fails the official requirements, there is always the option of "tricks" to install the system on non-compatible computers.
This also opens a new window for Linux: distributions have greatly simplified their use and installation, and thanks to compatibility layers such as Steam Proton, even the old excuse of the lack of video games is no longer a real impediment.

In Xataka | The amazing history of ARM, the architecture that triumphs in mobile phones and that was born more than 30 years ago at Acorn Computers

In 1995 a program came out that promised to double your PC's RAM. In the best of cases, all it did was avoid wasting extra resources

The 90s were wonderful in the world of software and hardware: epic trolling like the $299 announcement of the first PlayStation, the legendary Windows 95 key, or the PlayStation emulator presented by Steve Jobs himself. In the middle of the decade, a program came out that promised the impossible: doubling the amount of RAM in your PC. Its name was SoftRAM 95 and, although it makes us raise an eyebrow today, in its day it sold hundreds of thousands of copies at $80 each. And, spoiler: it was of absolutely no use.

SoftRAM 95, the miracle solution for your PC's RAM. The launch of a program like this is a product of its time, one in which users could afford to be less 'savvy' (for perfectly logical reasons) and in which an entire industry was learning and improvising as it went. Those were times when the cleverest got the results, but a company called Syncronys Softcorp learned its lesson the hard way.

The year was 1995 and Windows 95 was beginning to revolutionize homes. Although the Microsoft system made controlling a PC more accessible than ever (unfortunately for Steve Jobs), the hardware still had a brutal barrier to entry: the price. PCs were still expensive devices, very expensive, so skimping on components saved a few dollars. RAM was one of those components for which you paid its weight in gold per KB, but... what if there were a program that, for a few dollars, doubled the amount of memory in your PC? What if it did all this without touching a single piece of your equipment?

That is where the Californian Syncronys Softcorp saw an opportunity and (we can now say, in bad faith) launched its program: SoftRAM 95. It went on sale in August 1995 and it is estimated that around 600,000 copies were sold by December of that same year. In those days, that was truly outrageous. The logical question is how it achieved what it promised.
The short answer is that it compressed memory: when the operating system needed to swap data from RAM to the hard drive, SoftRAM 95 compressed it before writing it, reducing the amount of disk space needed and leaving more room available in RAM. The concept, roughly speaking, is sound, and the program's interface told you that yes, congratulations, you now had double the amount of RAM.

The long answer is that it didn't do what it promised. Although technically on the right track, this process was tremendously ambitious at the time for one reason: the speed of both the RAM and the primitive hard drives of the era was so absurdly slow that the goal simply could not be met. Syncronys's management knew this, but they didn't care: the money was pouring in, since each license cost about 30 dollars.

Under the magnifying glass of the press... and Microsoft. Things quickly went wrong, however. PC Magazine put the software through the kind of analysis these tests should always involve: checking whether the program really did what it promised. Using blocks of data to evaluate whether the compression was effective, they found that processing times were exactly the same with compressible data and with random data that could not be compressed. They concluded that the only thing SoftRAM did was show an animated screen that gave users the impression it was working when, in reality, it was doing absolutely nothing.

Beyond the press, the ones who really got their hands on the software were Bryce Cogswell and Mark Russinovich, the engineers who would later found Sysinternals (eventually acquired by Microsoft) and who dissected the program at the code level. They essentially confirmed PC Magazine's well-founded suspicion and pointed out that the program never actually worked.
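The methodology PC Magazine applied can be illustrated with a tiny experiment. This is a hedged sketch in modern Python, not the magazine's actual harness: a real page compressor shrinks redundant data dramatically and behaves measurably differently on random, incompressible data, whereas SoftRAM's processing times were identical for both, betraying that no compression was happening.

```python
# Sketch of a compress-before-swap pipeline, and of PC Magazine's test idea:
# feed it highly compressible pages and incompressible random pages, then
# compare the results. A genuine compressor produces very different numbers.
import os
import time
import zlib

PAGE_SIZE = 4096  # a typical x86 memory page


def swap_out(page: bytes, compress: bool) -> bytes:
    """Simulate writing one page to the swap file, optionally compressed."""
    return zlib.compress(page) if compress else page


def measure(pages, compress):
    """Return (elapsed seconds, total bytes written) for swapping out pages."""
    start = time.perf_counter()
    written = sum(len(swap_out(p, compress)) for p in pages)
    return time.perf_counter() - start, written


compressible = [b"\x00" * PAGE_SIZE for _ in range(1000)]   # highly redundant
random_pages = [os.urandom(PAGE_SIZE) for _ in range(1000)]  # incompressible

t_comp, size_comp = measure(compressible, compress=True)
t_rand, size_rand = measure(random_pages, compress=True)

# Redundant pages shrink to a fraction of their size; random pages do not
# shrink at all. SoftRAM showed no such difference, because it did nothing.
print(f"compressible pages written: {size_comp / (1000 * PAGE_SIZE):.2%} of original")
print(f"random pages written:       {size_rand / (1000 * PAGE_SIZE):.2%} of original")
```

If anything like this had been running inside SoftRAM, the two workloads could not have produced identical timings, which is exactly the discrepancy the magazine exposed.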
That is, the paging driver (the component that supposedly compressed RAM before transferring it to the hard drive) unloaded itself right as it was loaded, so it never did anything at all beyond displaying false numbers while the operating system worked exactly as it would have anyway, whether the program was installed or not.

When I said earlier that Syncronys's management knew, it was not hindsight. When everything was revealed, the company admitted that no RAM compression was being carried out and, in addition, it emerged that they had sold the software even though its own developers had warned that the product was not ready. And this wasn't a "ship it now, patch it later" situation like many current games, because in 1995 Internet updates were not the norm.

Just when the company thought the worst was over, the US Federal Trade Commission arrived. Following its investigation, Syncronys finally acknowledged that it had misrepresented the performance of its product and was banned from selling any more copies of either version, SoftRAM for Windows 3.1 or SoftRAM 95. In total, both versions had placed 700,000 copies on the market, and Syncronys declared bankruptcy in July 1998, owing 4.5 million dollars.

The idea did not die with SoftRAM. In the end, the best SoftRAM could do was not eat up your PC's resources, and it was one of those attempts to sell anything at all to a still somewhat naive market. For PC World, SoftRAM ranks, alongside AOL and RealPlayer, among the worst technology products of all time. But of course, with 2025 eyes, you may be wondering... what about solutions like Windows Vista's ReadyBoost or memory expansion on mobile phones? That's a different matter: although both promise to improve performance by using "extra memory", they are something very different from what SoftRAM did. ReadyBoost, for example, let you use a pen drive's memory as a cache to speed up access to frequently used data.
It acted as an extension of the system's virtual memory, and the theory is sound, but once again we ran into the speed limitation of USB drives…

Apple is withstanding the push of AI PCs because AI PCs have been met with complete indifference

On paper, everything looked lovely. Copilot+ PCs were meant to resurrect and reinvent the PC, turning it into a device with which you could do much more with much less effort. There was a lot of talk about TOPS of compute, about how AI would do all sorts of things for us, and about an argument that would boost sales. You know what? Their impact has been practically zero.

For better or worse, the PC segment has not undergone major changes. Sales have not suddenly started to grow, nor have they plummeted. If Copilot+ PCs were meant to boost sales, they certainly don't seem to have succeeded. But at least they don't seem to have had a negative impact either.

Chart: own elaboration. Data: IDC

The arrival of AI features on PCs should in theory have had an impact on PC sales by boosting them, but also, in theory, on Mac sales, from which it should have stolen some share if AI had been an important argument. As we know, Apple has barely emphasized the AI functions of its machines. Although it introduced Apple Intelligence in June 2024, it did so in a very limited way, and almost a year and a half later its functions remain modest.

Chart: own elaboration. Data: Apple quarterly reports.

People keep buying Macs, but not because of Apple Intelligence: they buy them because they are just that, Macs. This has been evident throughout this period, in which sales have remained relatively stable.

The Mac is very much a Mac. The recent introduction of the MacBook Pro M5 could encourage sales toward the end of the year, but where Apple seems to have a winning horse is the MacBook Air M4, which has been on the market for only eight months and offers an enviable price-performance ratio. In the US, for example, you can get it right now for 800 dollars (before taxes). Here, for 949 euros. Few Windows laptops can compete with Apple's offering, which is surprisingly balanced and has extraordinary room for maneuver thanks to its Apple M4 chip.
When we tested the Acer Swift Go 14 AI, for example, we found a device that, at 719 euros, is undoubtedly cheaper and boasts 16 GB of RAM and a 512 GB SSD, but falls short on its chip, the Qualcomm Snapdragon X Plus. In Geekbench it scores around 2,400 points single-core and 10,500 multi-core; the Apple M4 scores around 3,600 and 15,000 points respectively. Acer's proposal, like those of other manufacturers selling Copilot+ PCs, is comparatively decent on paper, but the TOPS argument and the AI functions still fail to land. They are there and can help, but they are not a decisive argument at the moment, at least judging by the sales of these devices.

PC sales may pick up in the short term, but if they do it will probably not be because of AI features; it will be for the simple reason that official Windows 10 support has ended (although that has its small print) and many users and companies may have decided to renew their IT infrastructure. Meanwhile, the promise that AI was going to revolutionize our PCs remains just that: a promise.

Apple, it seems, can rest easy. And it can: this last quarter the Mac division grew 13% in revenue compared to the same period of the previous year. Not bad.

In Xataka | Microsoft is already thinking about what the computers of 2030 will be like and has come to a conclusion: touching is overrated

Image | Wesson Wang

NVIDIA is leading the creation of a memory standard for AI PCs

Jensen Huang, the co-founder and CEO of NVIDIA, is convinced that in the future most users will have a personal "artificial intelligence (AI) supercomputer". At the beginning of last January, at CES in Las Vegas (USA), he led the presentation of Project Digits, a very compact personal computer capable of running models of up to 200 billion parameters, larger, therefore, than GPT-3. This computer is mainly intended for researchers, developers and students, although a good portion of the latter can hardly afford the $3,000 (about 2,870 euros) that the cheapest version of this machine costs. Its heart is a GB10 SoC that integrates a GPU with Blackwell architecture and a 20-core Grace CPU with ARM architecture. It works side by side with 128 GB of unified, low-power LPDDR5X memory, although the currently available memory standards do not seem to convince NVIDIA for this usage scenario.

NVIDIA leads the development of the SOCAMM memory standard. Project Digits is just the spearhead. Presumably, NVIDIA will in the future launch other personal computers with even greater AI capabilities, but it seems they will not use any of the memory technologies currently available. According to Sedaily, the company headed by Jensen Huang has allied with the South Korean companies Samsung and SK Hynix, and also with the American Micron Technology, to develop a new memory standard known as SOCAMM (System on Chip Advanced Memory Module).

NVIDIA, Samsung, SK Hynix and Micron are, for now, leaving JEDEC out. These three companies are the planet's biggest memory chip manufacturers, so there is no doubt that they are the best allies NVIDIA could turn to. Especially if the SOCAMM standard is being devised without the participation of JEDEC (Joint Electron Device Engineering Council), the global organization responsible for developing the standards used by the semiconductor and microelectronics industry.
Apparently that is exactly what is happening: NVIDIA, Samsung, SK Hynix and Micron are, for now, leaving JEDEC out. In any case, what we know right now about SOCAMM memory, beyond the fact that it will be used in the next batch of personal computers for AI, looks very promising. Although this information has not been officially confirmed by any of the companies involved in its development, it seems the SOCAMM modules are being designed on the basis of LPCAMM memory (Low-Power Compression Attached Memory Modules). According to Sedaily, the SOCAMM standard will be very efficient from an energy point of view; it will have more I/O ports than conventional LPCAMM and DRAM modules (up to 694 ports); it will make it easy to expand the memory initially installed in AI equipment; and, finally, these modules will be much more restrained in physical size (though not in storage capacity) than conventional DRAM modules. If this standard helps make AI hardware more accessible to everyone, it will be welcome. However, it is still too early to be confident that it will.

Image | Nvidia

More information | Sedaily

In Xataka | The 20 most important personal computers in the history of technology
