The best AI agents: fast and easy to use, able to do tasks for you without complications or long installations

Let us walk you through the best fast and easy-to-use AI agents, with no complicated installations or configurations. These agents are less complete and powerful than the more advanced ones, but they let you explore how artificial intelligence can do tasks for you. We have kept the list short to stick to the best alternatives: many are quite popular, others are less well known, and we even close with an open source option for privacy lovers.

Claude Cowork

Claude Cowork is possibly the best and simplest tool to test what an AI agent can do for you, in a controlled way. It is a paid feature available inside Claude's desktop application, with prices starting at 15 euros per month. Claude Cowork lets Claude's AI manage files and use applications on your computer: you tell it what you want, and Claude finds the best way to do it. If you also install the Claude in Chrome extension in your browser, Cowork will be able to do things for you on the web as well.

Perplexity Comet

Comet is the artificial intelligence browser from Perplexity, a platform that started as an AI-based search engine and is now much more: a chatbot that lets you use various AI models, such as Gemini, GPT or Claude. The Comet browser's peculiarity is that it can use AI to do tasks for you, such as browsing on your behalf, interacting with websites, automating tasks, searching for and filtering information, managing workflows, and other chores like comparing prices across multiple pages.

Manus on Telegram

Manus is an autonomous AI agent: you give it a high-level objective and it works on its own to achieve it. Tasks are asynchronous, so you can ask it to do something, turn off the computer, and receive a notification when the work is completed.
Manus can also be used in Telegram chats as a bot. With this, you can use Manus directly from the messaging app, without opening its official website or application, and then access the results of whatever you asked for: AI research, web development, design and so on.

ChatGPT Agent

ChatGPT also has an agent mode in its application. With it, ChatGPT can interact directly with web pages and act on your behalf to book appointments, create presentations and perform other complex tasks. Of course, to use it you will need a paid subscription.

Genspark

This platform is a kind of all-in-one AI workspace. It is not exactly a chatbot, but it works much like the agent concept: planning tasks, choosing the right tools for them, and chaining the steps autonomously. With this tool you can create applications, documents, designs, images, music, spreadsheets and more. It has a free plan with limited access, although you will have to pay for full access. It also offers more than 80 tools and eight language models of different sizes, each suited to a task.

AgentGPT

This was one of the first services to make AI agents accessible from the browser without having to install anything. It works like the previous ones: you write what you want in natural language, the agent divides it into subtasks, and then executes them autonomously.

Kuse

Kuse is an open source alternative for using an agent capable of helping you with tasks on your computer. It can generate documents and presentations, transform doc files and PDFs, create mind maps, interact with YouTube videos and more. It is therefore an open alternative to Claude Cowork, where you can decide which AI models to use by connecting them via their API, or even installing them directly on your computer.
In Xataka Basics | How to create a Telegram bot that sends you a Gemini-generated summary of each email you receive in Gmail and other email services

China is building submarines faster than anyone else. And that’s a problem for the United States.

At a tense geopolitical moment on a global scale, with several open fronts such as Greenland, whose melting ice is exposing nuclear submarines, China has just achieved a historic milestone: it is manufacturing nuclear submarines faster than any other country in the world, according to a report by the International Institute for Strategic Studies (IISS). This is a complete surprise for the United States, the power that until now held this title, and it threatens the advantage Washington has maintained for decades.

Brief notes on nuclear submarines. Without delving into their characteristics, it is worth distinguishing the main types. The SSBN is a nuclear-powered submarine designed to launch ballistic missiles with nuclear warheads (some with intercontinental range). They are strategic second-strike platforms, practically undetectable, and they guarantee that anyone who attacks first will receive a response. The SSN and SSGN are nuclear attack submarines (the latter carrying guided missiles), true sea-control weapons: they can attack land or sea targets, block routes and operate for months without resupply.

Context. American hegemony underwater has lasted for decades, but Beijing's roadmap calls for modernizing its military capabilities by 2035: it already has the largest surface fleet in the world, in the Pentagon's words, and has now hit the accelerator to reach the last bastion of the United States: the depths.

The data. China has surpassed the United States in the pace of launching nuclear-powered submarines (SSN/SSBN). Between 2021 and 2025, the Asian giant launched 10 units compared with Washington's seven, according to what the IISS has discovered through satellite analysis of the Bohai shipyard in Huludao (northern China), the epicenter of the industrial leap. In a decade, China has gone from trailing far behind to leading the race.

Why it matters.
This shift in underwater hegemony has three implications, one of which points directly at the US:

Nuclear deterrence. The new Type 094 submarines and the future Type 096 expand China's capacity for nuclear response in the face of a possible nuclear attack. A preemptive strike becomes strategically unfeasible.

Maritime control of commercial routes. SSGNs with high-speed missile systems add a layer of threat to foreign combat groups in the Indo-Pacific, complicating access for the US and its allies to potentially conflictive areas such as the South China Sea or Taiwan. At a time when the United States is betting on boardings as a display of maritime control, China has in this fleet a safeguard for its commercial routes.

The United States cannot keep up with that pace. John Phelan, US Secretary of the Navy, acknowledged in Congress that "all of our programs are a disaster, honestly. Our best-performing program is six months behind schedule and 57% over budget." Phelan points to the erosion of this industry, which according to the Government Accountability Office faces problems such as aging infrastructure and a shortage of qualified labor.

The surprise figures. The IISS Military Balance 2025 offers other interesting figures that help diagnose where both powers stand on nuclear submarines. Launch rate from 2021 to 2025: seven for the US versus 10 for China. The difference in tonnage is notable: China's launches total 79,000 tons, versus 55,500 for the US. Active nuclear fleet: the United States wins by a landslide, with 65 units compared with China's 12 (plus another 46 conventional ones).

Quantity vs quality. As we saw in the previous point, the United States still leads in numbers (for now), and that is not the only reason for optimism for the country led by Trump.
CNN echoes the IISS report, which explains that "Chinese designs are almost certainly behind American and European submarines in terms of quality." Among other qualities, in noise: Chinese submarines are noisier, which makes them more vulnerable, they explain. But as a retired US Navy captain warns via USNI, the biggest fleets win.

In Xataka | In the midst of rearmament, Spain has just surprised Europe: 5,000 million for 34 warships and four submarines

In Xataka | The new fear of Western fleets is not nuclear. They are conventional submarines armed with surprise and a flag: China

Cover | CRS Report RL33153, "China Naval Modernization: Implications for US Navy Capabilities - Background and Issues for Congress" by Ronald O'Rourke, February 28, 2014 - United States Naval Institute News Blog, Public Domain

Is it worth paying extra to spin faster?

Life takes many turns, but not as many as a washing machine. When you start looking at washing machines, it is normal for your eyes to land on the capacity or the price. But what are those RPM figures on the spec sheet? They are not just another number; they matter, because a washing machine with a higher RPM usually also carries a higher price. Is it worth spending more to get more RPM in your new washing machine? Since there is no simple answer, let me explain what it means for a washing machine to have more or fewer RPM, and whether that extra cost is worth it.

Faster is not always better, and the same applies inside your washing machine. When a washing machine finishes the wash cycle, everything in the drum is dripping with water. Then comes the turn of the spin cycle, the part that makes a washing machine look like it is about to take off. Basically, the drum begins to rotate at very high speed, pressing the clothes, sheets or whatever we have loaded against the walls of the drum. Why? So that the water is forced out through the holes in the drum by centrifugal force.

Now that we know what the spin cycle does, it is time to talk about RPM, or revolutions per minute. There is not much mystery here: it is a unit of measurement that tells us, in this case, how many revolutions the drum makes per minute. The figures are impressive: if our wash cycle spins at 1,200 RPM, the drum is making 20 revolutions per second. Numbers aside, what does more or less RPM mean when spinning? The higher the drum's revolutions per minute, the greater the centrifugal force generated and, therefore, the more water is extracted from the clothes. So, in short, choosing a very high RPM for our wash cycle means the clothes come out with much less water.
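The revolutions-per-second figure above is simple arithmetic, and it can be checked in a few lines, together with a rough estimate of the force at the drum wall. The 25 cm drum radius below is an assumed typical value, not a figure from the article:

```python
import math

def rpm_to_rev_per_sec(rpm: float) -> float:
    """Convert revolutions per minute to revolutions per second."""
    return rpm / 60.0

def centrifugal_g(rpm: float, drum_radius_m: float = 0.25) -> float:
    """Centripetal acceleration at the drum wall, in multiples of g.
    a = omega^2 * r, with omega in rad/s. The 0.25 m radius is an
    assumed typical drum size, not a value from the article."""
    omega = rpm * 2.0 * math.pi / 60.0  # angular speed in rad/s
    return omega ** 2 * drum_radius_m / 9.81

print(rpm_to_rev_per_sec(1200))  # 20.0 revolutions per second, as stated
print(centrifugal_g(1200))       # roughly 400 g pressing clothes to the wall
```

Under these assumptions, a 1,200 RPM spin subjects the laundry to a few hundred times the force of gravity, which is why the water has no choice but to leave.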
This is ideal if you put in a load of laundry and want it dry as soon as possible, or even if you are going to use the dryer afterwards, because clothes that come out with less water need less time in the dryer. If we stopped there, logic would say it is always better to spin at 1,400 or 1,600 RPM to pull dry or almost dry clothes out of the washing machine. Spoiler: it is not like that. These high-speed spins make a lot more noise, so you will not want to run them at night. Besides, they leave clothes very wrinkled, which means quality time with the iron in hand. Using them occasionally is nothing to worry about, but in the long run 1,400 or 1,600 RPM wears out clothing fibers much faster, so your clothes will get damaged. And then there is what hits our pocket: these RPMs make the washing machine consume a little more energy. It is not a drama, especially since you will then need the dryer (which uses far more) or the clothesline for less time. Still, it is something to keep in mind, because it adds to the washing machine's higher price.

The good and the bad of both options, face to face:

1,200 RPM
- The good 🟢: It is cheaper and will make your clothes suffer less.
- The bad 🔴: It will take you longer to dry your clothes (even with a dryer).
- Ideal for: Homes that air-dry clothes and take great care of each garment.

1,400 or 1,600 RPM
- The good 🟢: Your clothes will come out practically dry, so your laundry will be ready sooner.
- The bad 🔴: They make more noise and clothes suffer more; they also come out more wrinkled.
- Ideal for: Humid climates where air-drying clothes is torture, or families who do a lot of laundry.

Let's do the math to see which one suits you better. As you can see, both have advantages and disadvantages.
Choosing the spin speed will almost always depend on the type of clothing or fabric we put in the washing machine, but here the question is whether the extra that 1,400 or 1,600 RPM gives you is worth it. Above all because a washing machine is an appliance with a very long useful life: you are going to live with it for many years, so it is better to choose well.

If at home you tend to air-dry and do not use the dryer much, a 1,200 RPM washing machine will be plenty. Why? Actual use: you do not get the highest spin power, but you do get a balanced one, sufficient for most people. So? You spend less on your new washing machine and still have enough spin to take out clothes with little water. And, in the long run, you will take better care of your clothes.

However, in a humid area such as northern Spain, you will know what an ordeal it is to dry clothes outdoors. If we add intensive use of the dryer, and you run so many loads a week that you have no room to hang everything, then you may want to opt for a washing machine with 1,400 or 1,600 RPM. Why? Actual use: you live in an area where it is (almost) impossible to air-dry clothes because of the humidity, and you want to use the dryer as little as possible. So? You spend a little more, but you will take the clothes out of the washing machine practically dry. A quick pass of the dryer and…

Windows 95 had a little secret that made rebooting faster. The reason lay in its rather chaotic architecture

If you used other operating systems before Windows 95, it is hard not to remember the feeling of facing something completely new. That release introduced elements we take for granted today, such as the Start menu, the taskbar and Plug and Play, and it did so at a time when starting a PC was almost a small ritual. But beneath that familiar interface hid a complex architecture, the result of the forced coexistence of DOS inheritances, 16-bit Windows and the first 32-bit layers. That design, as inelegant as it was effective, gave rise to unexpected behaviors that still surprise today.

Few users knew that Windows 95 hid an alternative route to the classic reboot. It was enough to hold down the Shift key during a restart initiated from the graphical interface for the system to display the warning "Windows is restarting" instead of following the path of a cold restart, as described by Raymond Chen. The difference was not spectacular, but it was noticeable at a time when every minute of startup counted. That small gesture activated an internal mechanism designed to avoid, whenever possible, starting from scratch.

The shortcut that did not restart completely

Behind this behavior lay a precise technical decision. Chen details that Windows 95 used a flag called EW_RESTARTWINDOWS when invoking the old ExitWindows function, still 16-bit. With that instruction, the system did not order a cold restart of the computer, but something more limited: close Windows and restart it. The objective was to save steps, as long as the internal situation allowed it, although this optimization depended on everything fitting together correctly.

Once that alternative route was taken, the process followed a very specific sequence. The 16-bit Windows kernel shut down first. The 32-bit virtual memory manager was then turned off and the processor returned to real mode, the most basic state of the system.
At that point, control returned to win.com with a special signal asking for something very specific: restart Windows in protected mode without going through a full computer boot. With control back in win.com, the most fragile part of the process began. The program had to simulate a clean boot of Windows, as if it had just been run from scratch, which involved, in Chen's words, resetting the command line options and returning some global variables to their original values. Although the work was largely clerical, it was especially delicate because win.com was written in assembly: there were no abstractions or modern conveniences.

The decisive point was memory. When win.com was executed, like any .com file, it received all available conventional memory. However, it freed up almost all memory beyond its own code so that Windows could load a large contiguous block when entering protected mode. If during the session a program reserved memory within the space win.com had left free, memory became fragmented. In that scenario, win.com could no longer recreate the original map it expected and, Chen explains, it was forced to abandon the fast restart and fall back to a hard reset.

When everything fell into place, the process continued without turning back. win.com jumped directly to the code responsible for booting Windows in protected mode, recreating the virtual machine manager and lifting the 32-bit layers again. From there, the graphical interface loaded as usual and the user returned to the desktop. The difference was subtle but real: Windows had not had to reboot the entire system to get to that point. This type of shortcut was only viable in a system built on cross-compatibilities. Windows 95 had to coexist with DOS software, 16-bit Windows programs and Win32 applications, and that mix forced it to accept inelegant but very practical solutions.
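As a rough illustration of the constraint Chen describes, here is a toy sketch of the decision win.com faced: the fast path was only possible if the conventional-memory block it had originally freed was still one contiguous free region. Everything here (names, addresses, the memory-map representation) is invented for illustration; this is not real Windows 95 code:

```python
# Toy model of win.com's fast-restart check, per Chen's description.
# All names, addresses and data structures are invented for illustration.

def can_fast_restart(free_regions: list[tuple[int, int]],
                     expected_start: int, expected_size: int) -> bool:
    """free_regions: (start, size) pairs of currently free conventional
    memory. The fast path needs the single original block back intact."""
    return (expected_start, expected_size) in free_regions

def restart(free_regions, expected_start=0x2000, expected_size=0x8000):
    """Choose between the fast 'restart Windows' path and a hard reset."""
    if can_fast_restart(free_regions, expected_start, expected_size):
        return "fast restart: reload Windows in protected mode"
    return "hard reset: full reboot"

# Nothing allocated since shutdown: the freed block is intact.
print(restart([(0x2000, 0x8000)]))
# Something grabbed memory inside the block, fragmenting it.
print(restart([(0x2000, 0x3000), (0x6000, 0x4000)]))
```

The second call shows the failure mode Chen mentions: once the original contiguous map cannot be recreated, the only safe option is the full reboot.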
The developers took advantage of this complexity to introduce hidden optimizations that could speed up restarts, although they could sometimes end in crashes. The obsession with saving memory led to very imaginative solutions. Chen explains that in assembly it was common to recycle code that was no longer going to run as if it were free memory. In win.com, the first bytes of the entry point were reused as a global variable, on the premise that this code was executed only once. Since the quick restart did not return to that initial point, the system could allow that shortcut without affecting the process.

That shortcut also showed its seams. Chen recalls that some users reported errors after performing several consecutive quick reboots, something he was never able to reproduce consistently. His hypothesis is that some driver was not reinitializing properly, leaving the system in a strange state, and that strangeness ended up taking its toll later. It is no surprise that this behavior was never presented as a documented feature, but it sums up the spirit of Windows 95 well: inventive, ambitious, and full of compromises.

Images | Microsoft

In Xataka | Schrödinger's Office: at this point it is impossible to know if Microsoft keeps it alive or if everything is AI and Copilot

The greenhouse gas that warms the planet faster than CO₂

In November 1776, while traveling on horseback between Italy and Switzerland, Carlo Giuseppe Campi saw bubbles in the marshes surrounding Lake Maggiore. He approached and decided to investigate them. Almost by accident he discovered that they were flammable, and he told his friend Alessandro Volta. Years later, Volta identified this gas as methane. Since then we have not stopped having problems with it.

Colorless, odorless and highly flammable, methane (CH₄) is a gas composed of one carbon atom and four hydrogen atoms. It is the simplest hydrocarbon and, in fact, the fundamental component of natural gas (and therefore a key fuel for boilers, power plants and part of industry). Beyond the energy context, methane also appears in biological and geological processes: it is a compound that arises naturally from the anaerobic decomposition of organic matter. That is, in wetlands, in landfills, in the digestive system of ruminants, or in large pockets underground. Methane is also used for many other things: it is a raw material for the chemical industry and an essential part of the production of hydrogen, ammonia or methanol.

But the global conversation has not been talking about methane for decades because of any of that. Curiously, the big problem with methane is that it is a much more powerful greenhouse gas than carbon dioxide. From what we know, it traps about 82 times more heat than CO₂ (taking a 20-year period as a reference). If we broaden the focus and use the 100-year term, its global warming potential is 29.88 times greater than that of CO₂. The only good news, so as not to paint too gloomy a picture, is that it has a short atmospheric lifetime (11.8 years on average), compared with CO₂'s much longer one. This explains why, despite trapping much more heat, methane's long-term impact is not as great. So?
Well, it is an "accelerator" of short-term warming and, in that sense, a first-order problem for us. Not only because we are not making progress, but because if we manage to reduce it, it can deliver relatively rapid climate benefits. The problem is that it is not an easy thing to solve. On a planetary scale, annual methane emissions are in the hundreds of millions of tons, and 40% of them come from natural sources that we cannot directly control. The other 60% is due, broadly speaking, to human sources. According to the Global Methane Budget, there are three main causes: agriculture and rice, fossil fuels, and waste.

Agriculture and livestock

For years, experts have discussed the impact of livestock farming (especially ruminants such as cows and sheep). The calculation, in any case, is complex: not only is it difficult to estimate methane production from enteric fermentation (digestion), but things as 'simple' as manure management suffered from an "information blackout" that makes them very hard to evaluate. To this (and it is important) you must add rice. Every year we consume more than 500 million metric tons of rice. That is a lot of rice (it is the main source of calories for 3 billion people), but it is also a lot of methane: favored by floods that leave wide plains without oxygen, the gas rises to the surface.

Fossil fuels

Methane leaking throughout the oil, gas and coal chain is also difficult to measure, but less so. After all, leaks in wells and equipment, vents, inefficient flaring, outdated compressors, pipes and storage are money wasted, and if we know how to measure anything, it is money. The International Energy Agency estimates that the production and use of fossil fuels generated about 120 million tons of methane emissions in 2023.
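The warming-potential figures cited above make it easy to put that IEA number into CO₂-equivalent terms; a quick sketch using only values already mentioned in the article:

```python
def co2_equivalent(methane_mt: float, gwp: float) -> float:
    """Megatonnes of CO2 with the same warming effect as the given
    megatonnes of methane, over the horizon the GWP value refers to."""
    return methane_mt * gwp

FOSSIL_CH4_2023_MT = 120      # IEA estimate for fossil fuels in 2023
GWP_20, GWP_100 = 82, 29.88   # warming potentials cited in the article

print(co2_equivalent(FOSSIL_CH4_2023_MT, GWP_100))  # ~3,586 Mt CO2e
print(co2_equivalent(FOSSIL_CH4_2023_MT, GWP_20))   # 9,840 Mt CO2e
```

In other words, fossil-fuel methane alone is equivalent to several billion tonnes of CO₂, and nearly 10 billion over the short 20-year horizon where methane bites hardest.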
Waste, landfills and wastewater

This case is the simplest, and the one that most clearly shows that the methane problem does not really matter much to us: landfills, wastewater and other waste accumulation sites are especially conducive to methane generation (through pure anaerobic activity), and since we do not capture it, it is released into the atmosphere. Thus, the atmospheric concentration of methane remains high and rising: NOAA estimated that, between 2023 and 2024, it went from 1,915.73 ppb to 1,921.79 ppb on average. And, as I say, that is a shame, because methane is surely one of the fastest routes: according to UNEP/CCAC, a strong reduction in human emissions (up to 45% this decade, with available measures) "could avoid almost 0.3 °C of warming by 2045."

Biomethane (also called "renewable natural gas") is the term we have coined for methane of biological origin, obtained above all by upgrading biogas: the CO₂ and other contaminants in it are removed until a gas rich in CH₄ is achieved, comparable in almost every respect to natural gas. This process yields a fuel that can be injected into the gas network. In other words, it is an efficient way to take advantage of (and make the capture and processing economically attractive for) a whole series of waste streams: from manure and sewage sludge to municipal waste or agro-industrial remains. Obviously, "green methane" does not automatically mean "zero environmental impact." It only means a biological origin and usability like natural gas. For its environmental impact to be low, other things are required, such as leak control, the origin of the waste, and its impact on the network as a whole.

Image | Katie Rodriguez

In Xataka | The importance of the colors of hydrogen and what it means if it is green, brown, blue or turquoise

OpenAI's obsession was to train its models like crazy. Now it is to run them faster than anyone else

OpenAI has signed an agreement estimated at more than $10 billion with Cerebras Systems, a startup that designs advanced AI chips dedicated to one thing: running AI models as fast as possible. It is a unique alliance, not only because of that change of focus, but because there is a conflict of interest.

What has happened. The firm led by Sam Altman has committed to purchasing 750 MW of computing capacity from Cerebras over the next three years. Sources cited by The Wall Street Journal put the value of this alliance at more than $10 billion. We are therefore looking at an operation extraordinary in size, but peculiar in form and substance.

What Cerebras does. The firm, based in Sunnyvale, California, was founded in 2015 by former engineers from SeaMicro, a company purchased by AMD in 2012. The startup designs artificial intelligence chips specifically aimed at the inference stage of AI models, that is, at executing them.

More tokens per second, please. When we use ChatGPT or any AI model, what we are watching is an AI model performing inference. Some "write" faster than others, and that speed of displaying text in responses is measured in tokens per second. NVIDIA chips are typically great for the training phase, but not so much for inference. Chips from companies like Cerebras, or from the well-known Groq, which has just been "bought" by NVIDIA, are designed precisely to run those models at full speed and achieve very high tokens-per-second rates.

The AI is already good. Now it wants to be fast. NVIDIA's recent "purchase" of Groq makes it clear that Jensen Huang's company wanted the ability to offer those ultra-fast inference chips, and now OpenAI seems to want something very similar with its Cerebras deal.
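The tokens-per-second metric mentioned above is simply generated tokens divided by wall-clock time; a minimal sketch, where `generate` is a stand-in for any real inference call, not a specific API:

```python
import time

def tokens_per_second(token_count: int, elapsed_s: float) -> float:
    """Inference throughput: tokens generated divided by wall time."""
    return token_count / elapsed_s

def measure(generate, prompt: str) -> float:
    """Time any generate(prompt) -> list-of-tokens callable.
    `generate` is a placeholder for a real model call."""
    start = time.perf_counter()
    tokens = generate(prompt)
    return tokens_per_second(len(tokens), time.perf_counter() - start)

# A model that emits 500 tokens in 2 seconds runs at 250 tokens/s.
print(tokens_per_second(500, 2.0))
```

Inference-specialized chips like those from Cerebras or Groq compete precisely on pushing this single number as high as possible for a given model.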
AI models have already achieved remarkable performance in many scenarios, and although they are not perfect, companies now want them not only to work well but to work very, very fast, with responses, even long ones, appearing almost instantly.

OpenAI wants more computing power. This operation also helps Sam Altman's company with another objective: to obtain (and reserve) as much computing capacity as possible, in anticipation of demand for these AI models growing non-stop in the coming months and years. According to the WSJ, OpenAI already has more than 900 million weekly users, and its managers have frequently commented that they continue to face computing capacity problems.

Cerebras grows. This agreement reinforces Cerebras' position in a market that clearly demands this type of solution. The firm is negotiating a $1 billion investment round that would bring its valuation to $22 billion, nearly tripling the current figure of around $8.1 billion. In the past it has raised $1.8 billion, according to PitchBook.

Conflict of interest. The agreement also draws attention for an important reason: Sam Altman, CEO of OpenAI, is also an investor in Cerebras (he is listed at the bottom of Cerebras' website), and indeed his company at one point considered acquiring Cerebras, although in the end that operation did not bear fruit. We are therefore faced with a deal that theoretically benefits Altman on both sides, which is worrying.

How will OpenAI pay for this party? This new agreement once again reignites the debate about OpenAI's ability to meet its credit and debt obligations. In 2025 it generated about $13 billion in revenue, but that enormous amount remains minuscule next to the contracts it has signed with Oracle, Microsoft or Amazon, which amount to about $600 billion that will have to come from somewhere. From where? It is a good question. We will see if they can end up answering it.
In Xataka | The alliance between Oracle and OpenAI is not just about data centers: it is about overtaking Google, Apple and Microsoft on the right

Science reveals that the weight returns four times faster than with a diet

The era of "miracle" drugs to treat obesity is entering a phase of raw scientific maturity, thanks to the time that has passed since their arrival on the market. Despite years of big headlines pointing to great weight losses with Ozempic, science can now offer more answers to the key question we should be asking: what happens when we stop injecting?

The problem. When a drug is newly released to the market, its long-term effects are not exactly known, since patients need to have been taking it long enough for those effects to show, and above all for the effect of withdrawing the drug to become visible. To answer this for Ozempic we have a study led by the University of Oxford that is not minor at all: it analyzed more than 9,300 adults across 37 different clinical trials. The conclusion they drew is quite clear: patients who stop treatment regain weight at a rate of 0.4 kg per month.

The comparison. At first glance this figure seems really low, but if we compare it with other weight-loss methods, the magnitude of the problem is anything but minuscule. The study itself indicates that in behavioral programs, such as a diet plus increased physical activity, the effect after withdrawal is a regain of 0.1 kg per month. Thus, the rebound effect of slimming drugs returns you to your initial weight in approximately a year and a half, while after a change in eating and exercise habits it takes around four years.

New-generation drugs. But this is a simple average across the different medications on the market. Within GLP-1 agonists, the most powerful drugs also show a much greater rebound effect. For example, with Wegovy or Mounjaro, where the initial loss was 14.7 kg, the rebound was seen to shoot up to 0.8 kg per month.
An effect that tells us the body tries to recover lost ground at twice the speed seen with previous-generation drugs.

Cardiovascular health. Beyond aesthetics, science had shown that these drugs can reduce the risk of heart attacks and improve metabolic health. But it seems these effects are entirely temporary. Specifically, the study found that approximately a year and a half after stopping the medication, most cardiometabolic markers return to their pre-treatment levels: blood pressure rises again, diabetes markers reverse their improvement, and total cholesterol also returns to its risk levels.

Why the rebound is so fast. The answer could lie in our own biology. Scientists believe this effect occurs because, by injecting massive doses of GLP-1 agonists (a hormone produced in small quantities when we eat), we may be destabilizing our own cell receptors, or even blocking the body's natural production of this satiety hormone. That is why, when the drug is withdrawn, the system cannot immediately produce the hormone again as before (as if it had to switch the system back on), and the body's satiety signaling goes deaf. The result? Logically, appetite returns with great intensity, the patient eats much more because they do not feel sated, and the weight climbs back up.

The myth of the magic bullet. There are hardly any miracles in medicine, even if we call these drugs miraculous. The reality is that they are not the definitive solution for obesity: real-world data indicates that the majority of patients stop treatment after 12 months because of its high cost, the fatigue of injecting, or side effects.
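The regain timelines cited by the study can be sanity-checked with simple division. The 7.2 kg figure below is an assumed average loss implied by combining the article's 0.4 kg per month rate with its "approximately a year and a half" claim; it is not a number the study reports directly:

```python
def months_to_regain(weight_lost_kg: float, regain_kg_per_month: float) -> float:
    """Months until the lost weight is fully regained at a constant rate."""
    return weight_lost_kg / regain_kg_per_month

# Wegovy/Mounjaro case from the study: 14.7 kg lost, 0.8 kg/month regained.
print(months_to_regain(14.7, 0.8))  # ~18.4 months, the "year and a half"

# Assumed average case: 7.2 kg lost at the average 0.4 kg/month rate.
print(months_to_regain(7.2, 0.4))   # 18 months, consistent with the claim
```

The arithmetic lines up: even the strongest drugs' larger losses disappear in roughly the same year and a half, because the regain rate scales up with them.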
On top of that, there is little awareness that this treatment is merely an aid to self-regulation: it must be accompanied by dietary change and physical activity that continue once the treatment ends. Otherwise, the injections will have been of no use at all.

A paradigm shift. This meta-analysis marks a turning point. Science tells us that GLP-1 drugs are extraordinarily effective, but they are not a cure. If we treat them as a six-month "crash plan", we condemn the patient to a yo-yo effect more aggressive than that of any miracle diet of the past. The solution, according to Qi Sun and the Oxford researchers, lies not only in the syringe but in public policy: taxes on ultra-processed foods, subsidies for fruit and vegetables, and urban planning that encourages exercise. Without a change in environment, the drug is just a temporary truce in a war the body is programmed to win.

Images | David Trinks, Towfiqu barbhuiya

In Xataka | If you want a "miracle" weight loss drug, you no longer turn to Ozempic: the competition is beginning to surpass it

AI has allowed developers to program faster than ever. That’s turning out to be a problem.

Anyone who has tried it knows it: programming with AI can be wonderful, especially if you have (almost) no idea how to program. This is where generative AI models have had their first, and probably clearest, revolution, and developers were the first to embrace the technology. The appearance of GitHub Copilot in 2021 showed that it was no longer necessary to type out so much code, because the machine was already doing it for you, and since then the advance of generative AI in programming has been overwhelming.

The question is whether it has been positive, and the answer is not at all clear. It is evident that AI has allowed millions of people who were not programmers to turn their ideas for applications and games into reality, and millions of professionals to save time by not having to write repetitive boilerplate code, focusing instead on more important and productive parts of their work.

The industry, of course, has been especially insistent on this vision of the segment's transformation. Satya Nadella (CEO of Microsoft) and Sundar Pichai (CEO of Alphabet/Google) boasted months ago that about 25% of the code at their companies is generated by AI. Jensen Huang went further, declaring that at this point no one should learn to program anymore because AI would do it for us.

These are forceful statements, but behind them lies another reality: all that glitters is not gold in the world of AI for programmers. MIT Technology Review spoke with more than 30 developers and experts in the field and reached some interesting conclusions.

AI is a better programmer than ever. At least according to the benchmarks: in August 2024 OpenAI presented SWE-bench Verified, a benchmark intended to measure the ability of generative AI models to program. At the time, the best model could solve only 33% of the tasks proposed by that benchmark.
A year later, the best models already exceed 70%. (Current ranking of the best models on SWE-bench Verified: several now pass 70% of the tests. Source: SWE-bench.) The evolution has been dizzying: we have witnessed the birth of a new style of programming called "vibe coding", and all the big players have built powerful programming tools to ride the wave. We have OpenAI Codex, Gemini CLI and Claude Code, for example, joined by startups like Cursor and Windsurf, which have also known how to capitalize on this fever for AI-assisted programming.

All of these tools promise basically the same thing: that you will program more and better. Productivity theoretically skyrockets, and while more code is certainly being written than ever thanks to AI, programmers have gone from writing their own code to reviewing what the machines generate. Recent studies reveal that veteran developers who believed they had been more productive actually weren't: they estimated they had been 20% faster because they could move forward without getting blocked, but in the tests carried out the tasks took them 19% longer than they would have without AI.

There is another problem: code quality is not necessarily good, and as we say, developers must review that code before it can go into production. The latest survey from Stack Overflow, one of the largest developer communities in the world, contained a notable data point: the positive perception of AI tools had fallen from 70% in 2024 to 60% in 2025.

There are limitations, but everything has already changed. Those interviewed by MIT Technology Review generally agreed with these conclusions. Generative AI programming tools are great for producing repetitive code, writing tests, fixing bugs, or explaining code to new developers. However, they still have important limitations, the most notable being their short memory.
These models can only handle a fraction of a professional workload at once: if your codebase is large, the model may be unable to "consume" and understand it all in one go. For small projects, great; for large developments, probably not so much. The problem of hallucinations also affects code, and in repositories with many components, AI models can get lost and fail to understand the structure and its interconnections. These problems can pile up and end up causing exactly the opposite of what the tools were meant to achieve.

Several experts, however, explained in that piece that it is actually difficult to go back. Kyle Daigle, COO of GitHub, said that "the days of coding every line of code by hand are likely behind us." Erin Yepis, an analyst at Stack Overflow, noted that although the unbridled optimism about AI has cooled somewhat, that is actually a sign of something else: programmers are embracing the technology while acknowledging its risks.

And then there is another reality, one repeated day after day and seemingly undeniable: the AI we have today is the worst we will ever have. It may not be tomorrow or next week, but the AI that programs will keep getting better, and there may come a point when these limitations disappear. Whether they do or not, what is clear is that AI has changed programming forever.

Image | Mohammad Rahmani

In Xataka | OpenAI has turned ChatGPT into mainstream AI. In the business world the game is being won by its great rival

Nike wants to make slow runners faster. Its solution: powered sneakers

Nike already has sneakers you can put on without using your hands, and even some that lace themselves. Now the brand has crossed a new frontier: motorized footwear that helps you walk and run faster. With a design reminiscent of an exoskeleton, Nike compares it to the way an electric bicycle works.

Project Amplify. That is the name of this footwear system, currently in the testing phase. For its creation, Nike has partnered with the robotics company Dephy. It consists of a shoe plus an ankle cuff with a motor and rechargeable battery that transmits power through a drive belt. The shoe can be worn on its own or with the cuff.

Booster. What Project Amplify does is "amplify the natural movement of the lower leg and ankle": it gives you a boost so you can walk or run faster and for longer with the same effort. According to Nike, the system is similar to that of electric bicycles, whose motor assists your pedaling and reduces the energy demand on your muscles. Nike says it is like having "a second set of calf muscles."

Hacking speed. Nike has invested heavily in research and development to design footwear that boosts athletic performance. Its previous innovations focused on combining cutting-edge designs with advanced materials that offer a "rebound" or propulsion effect. One example is the Vaporfly technology that proved decisive in setting records in major marathons, as in the controversial case of Eliud Kipchoge, who ran a marathon in under two hours. There are also the Nike Super Spikes worn by several athletes at the Tokyo Olympics, whose impact was undeniable: up to three athletes broke the record in the 400-meter hurdles. Examples like these highlight the importance of footwear and opened a debate about the limits of technology in sport; the effect has even been described as "mechanical doping".

For the slow ones.
For now we will not see "motorized" athletes at the Olympics. Nike makes clear that the system is not designed for elite athletes but for amateurs who run at a fairly slow pace and want to go faster with less effort: runners with paces above 6 or 7 minutes per kilometer. Another use case for these motorized shoes is making urban trips more quickly, for example walking to work.

Images | Nike

In Xataka | The Alicante sneakers that are succeeding in Silicon Valley and that have Zuckerberg as their best ambassador
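For readers more used to speeds than paces, the threshold figures above convert directly; a trivial helper of my own, not anything from Nike:

```python
def pace_to_speed_kmh(min_per_km: float) -> float:
    """Convert running pace (minutes per kilometre) to speed in km/h."""
    return 60.0 / min_per_km

print(pace_to_speed_kmh(6.0))  # 6 min/km is 10.0 km/h
print(pace_to_speed_kmh(7.0))  # 7 min/km is ~8.6 km/h
```

So the target audience runs at roughly 8.5 to 10 km/h or slower.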

The winner is not whoever shoots fastest, but whoever types best

The video game 'Final Sentence' is the perfect proof that no mechanic is too dry or too niche: with good design and attractive packaging, you can have some of the most relevant content creators on the internet typing lapidary sentences nonstop, as if they were secretaries to some oil magnate. Typing for the masses, in a title whose final version is not yet available but whose demo is already sweeping Steam.

What it is. 'Final Sentence' is an independent video game with an overwhelmingly simple concept: a typing battle royale in which up to 100 players compete to survive on the speed and precision of their writing, and where every spelling mistake is potentially fatal. Developed by the independent Lithuanian studio Button Mash (actually a single person), it creates an experience that some media have compared to 'Squid Game' or to titles like 'Buckshot Roulette'. The inspiration, according to its creator, comes from his own clumsiness at the keyboard and the search for more entertaining ways to improve.

Why has it been so successful? A series of factors have combined to turn it into a bombshell. First and foremost, the launch of a playable demo as part of the latest Steam Next Fest in October. It also takes a mechanic we know well, the free-for-all at the heart of hits like 'Fortnite', and gives it a facelift. Easy to understand and difficult to master (meaning anyone can start playing immediately), it is also implacably brutal with errors: any mistake is severely punished. That gives it both an addictive pull and a viral quality that has led many content creators to try it.

The pressure of 100. The battle royale pits you against groups of between 40 and 100 players (although smaller private games can also be organized).
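The "write or die" rule described above can be sketched in a few lines. This is my own illustration of the concept, not the game's actual code; the 5-characters-per-word convention is the usual way typing speed is measured:

```python
def score_attempt(prompt: str, typed: str, seconds: float) -> dict:
    """Illustrative 'write or die' check: any mismatch with the prompt
    eliminates the player; an exact match is scored in words per minute
    (using the standard 5-characters-per-word convention)."""
    if typed != prompt:
        return {"eliminated": True, "wpm": 0.0}
    wpm = (len(typed) / 5) / (seconds / 60)
    return {"eliminated": False, "wpm": round(wpm, 1)}

print(score_attempt("write or die", "write or die", 4.0))  # survives at 36.0 wpm
print(score_attempt("write or die", "write or dye", 4.0))  # one typo: eliminated
```

The real game layers timing pressure on top: the slowest survivors are eliminated too, not just those who mistype.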
The interesting part is in the most crowded matches: watching rivals finish their sentences one after another, players can feel elimination approaching, because it is not only those who make mistakes who are punished, but also those who are slowest to finish typing what they have been ordered to write.

Who has played it. Figures as widely followed on the Spanish-speaking internet as IlloJuan, ElRubius, Genuine993 and PNKeasy have tried the game, and you only have to browse platforms like YouTube to find thousands of gameplay videos, all squeezing the tension and terror out of this very peculiar concept. The reason? The idea of "write or die" is extremely juicy, and practically any viewer, regardless of culture or language, can identify with it.

Playing to write. Since the legendary 'The Typing of the Dead', which let you connect a keyboard to the Dreamcast for a literary version of 'The House of the Dead', the very specific subgenre of typing games has gone through multiple mutations, to the point of generating its own variants. There are competitive online titles ('TypeRacer', 'NitroType', 'Ratatype Race'), narrative and adventure games ('Epistory', 'The Textorcist', 'Type to Continue'), and more educational or casual offerings such as 'TypingClub', the Typing.com platform or the hilarious ZType. The important thing is to keep your keys well oiled, because any work tool can become a weapon there...

In Xataka | Transcribing at full speed with a keyboard with only 21 keys: the job of a stenotypist, according to someone who has been in it for 35 years
