We believed that no open model could outperform GPT-5. A Chinese startup proves us wrong

A Chinese startup called Moonshot AI has just launched Kimi K2 Thinking, a gigantic open model with a trillion parameters that has done something that seemed almost impossible: surpass the best proprietary models from companies like OpenAI, Google or Anthropic. If we thought that "open source" models could never compete with GPT-5, Gemini 2.5 Pro or Claude, we were wrong.

What has happened. The lab had already announced Kimi K2 in July at that same gigantic size of one trillion parameters, and has now released the "Thinking" version with the same Mixture of Experts architecture (32 billion active parameters). According to the company, the model can sustain stable agentic tool use across 200 to 300 sequential calls. In other words: it can chain long sequences of actions autonomously and, apparently, without error. And that is not even the best part: it surpasses GPT-5 and Claude Sonnet 4.5 in several tests, and it costs far less than those models.

The benchmarks. Moonshot reports that Kimi K2 Thinking achieves the highest scores on Humanity's Last Exam (general knowledge, 44.9%) and BrowseComp (agentic browsing, 60.2%). It is nearly at Claude's level on the SWE-bench software development test, and also close to the top on another well-known benchmark, LiveCodeBench v6. It is true that in some tests it still trails its "Western" rivals slightly, but the achievement is spectacular.

More benchmarks. The team at Artificial Analysis has shared its first conclusions after evaluating the model with various tests. They highlight its behavior in agentic tasks that simulate the model acting as a customer service agent: in that test it scored 93% of the maximum, far surpassing all its competitors (GPT-5 Codex High scored 87%, for example). More tests are coming, but for now the prospects are fantastic. And on top of that, it's cheap.
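The sustained tool use described above follows the standard agent loop: the model emits a tool call, the runtime executes it, and the observation is fed back as context for the next decision, hundreds of times in a row. A minimal sketch of that loop in Python (the scripted "model", tool registry and stop condition are illustrative assumptions, not Moonshot's actual API):

```python
# Minimal agent loop: the model repeatedly picks a tool, the runtime runs it,
# and the observation is appended to the context for the next decision.
# The "model" here is a scripted stand-in, not a real LLM call.

def run_agent(model, tools, task, max_steps=300):
    context = [("task", task)]
    for _ in range(max_steps):
        action = model(context)          # e.g. ("search", query) or ("finish", answer)
        name, arg = action
        if name == "finish":
            return arg, context
        observation = tools[name](arg)   # execute the tool call
        context.append((name, observation))
    raise RuntimeError("step budget exhausted")

# Toy example: take three "countdown" tool steps, then finish.
def scripted_model(context):
    steps = sum(1 for kind, _ in context if kind == "countdown")
    return ("finish", "done") if steps == 3 else ("countdown", steps)

tools = {"countdown": lambda n: f"tick {n}"}
answer, transcript = run_agent(scripted_model, tools, "count to three")
```

The "200 to 300 sequential calls" claim is about exactly this loop staying coherent: each extra step adds another observation to the context without the model losing the thread.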
CNBC reports that training the model cost $4.6 million, a ridiculous figure considering that training proprietary models like GPT-5 cost an estimated $500 million. Using the Kimi K2 Thinking API is also very affordable: $0.6 per million input tokens and $2.5 per million output tokens. GPT-5 Chat costs $1.25 and $10 respectively, while Claude Sonnet 4.5 costs $3 and $15.

The details. The model uses INT4 quantization to improve its efficiency without compromising the precision and quality of its responses. Its context window, the "size" of the data we can feed in when prompting, is 256K tokens: a relatively modest figure for large models, but still notable. And, being a good open model, we can download it to run locally... if we have a real monster at our disposal. The weights take up 594 GB, and by joining two Mac Studio M3 Ultra machines, for example, it can run locally relatively smoothly at about 15 tokens per second.

Alibaba is behind it, yes. Although the model is developed by an independent startup called Moonshot, the firm is financially backed by Alibaba, which is becoming an absolute powerhouse in this field. The giant is no longer content with developing its own outstanding models (Qwen is the clear example): it is also financing the development of others, such as Kimi K2 and K2 Thinking.

China and its love for open AI models. Over the last few months we have seen how China has come to dominate the field of open AI models (open weights, not strictly "open source"). The Asian giant has adopted an overwhelming release cadence, with ever-better models that until now seemed to be several steps behind the large proprietary offerings from OpenAI, Anthropic or Google. That is no longer the case.

The race is lively. This achievement represents a new vote of confidence for the open models coming from Chinese companies.
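The INT4 quantization mentioned above stores weights as 4-bit integers plus a shared scale, trading a little precision for roughly a 4x memory saving versus FP16. A minimal symmetric-quantization sketch in pure Python (illustrative only; Kimi K2's actual scheme, applied with quantization-aware training, is more elaborate):

```python
# Symmetric INT4 quantization: map floats to integers in [-7, 7]
# using one shared scale, then reconstruct approximate values.

def quantize_int4(weights):
    scale = max(abs(w) for w in weights) / 7 or 1.0  # guard against all-zero input
    q = [max(-7, min(7, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

weights = [0.12, -0.98, 0.45, 0.01]
q, scale = quantize_int4(weights)
restored = dequantize(q, scale)
# Reconstruction error is bounded by half a quantization step (scale / 2).
max_err = max(abs(w - r) for w, r in zip(weights, restored))
```

The point of the exercise: each weight now fits in 4 bits instead of 16, which is how a trillion-parameter model ends up at 594 GB rather than several times that.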
It is true that these models are huge, which makes them very difficult for end users to run in practice, but they present an interesting alternative for companies.

Image | idnaklss with Midjourney

In Xataka | There are many "internal" races within the greater AI race. And Alibaba is winning open source

When Facebook launched its own Tinder, we didn't think it could succeed. We were wrong

It was 2018 when Facebook announced Facebook Dating, although it did not arrive in Spain until 2020. At that time, dating apps like Tinder were riding a pandemic-driven boom, but Facebook had been losing users for a while and the idea had taken hold that it was a place for older people. Meta recently shared usage data for its dating service, and the numbers have silenced the skeptics.

21.5 million. That is the number of daily active Facebook Dating users across the 52 countries where the service is available, according to The New York Times, which notes that it surpasses Hinge, a very popular dating app (especially in the United States) with 15 million users. It is the first time Facebook has shared usage data for its dating service since its launch.

Popular among young people. A Pew Research Center study, reported by TechCrunch, confirmed the exodus of young Facebook users, which fell from 71% in 2014 to just 32%. The most surprising thing about Facebook Dating is that it is succeeding precisely among the youngest: according to Sensor Tower data from last year, it has at least 1.77 million users between the ages of 18 and 29 in the United States, representing 24% growth.

Free. Other apps such as Tinder, Bumble or Hinge have adopted subscription models that give paying users perks, like seeing who liked you before anyone else. All of this is free on Facebook Dating, and that is its main asset against its competitors. Tinder is the app with the most paying users, but it has been losing subscribers for years.

They don't need it. That Facebook does not charge us for its dating service can be read as a generous gesture, but the reality is that Meta's income is astronomical: last quarter it brought in $51.24 billion, much of it thanks to the advertising served in its apps.

Image | Gemini/Meta

In Xataka | Meta does not have the most advanced AI of all, but it does have something much more important: a business plan

Everyone is developing chips that compete with NVIDIA’s. They are in the wrong race

Qualcomm announced on Monday that it is working on AI accelerator chips, which means new competition for NVIDIA. The company that dominates the AI hardware landscape is watching a large group of competitors try to erode its position, but the problem for all of these companies is not the chips. It is something else, something called CUDA.

What has happened. Qualcomm has announced the AI200 chip, which will go on sale in 2026, and the AI250, which will follow in 2027. Both will work in liquid-cooled rack systems. Qualcomm's servers may hold up to 72 chips based on the Hexagon NPUs from the company's Snapdragon SoCs.

Inference yes, training no. The company has revealed that its chips focus on inference (running AI models), not training. Its rack systems will have lower operating costs than those of cloud providers, Qualcomm says. Each rack consumes 160 kW, a figure comparable to the consumption of some racks based on NVIDIA GPUs. There are no details yet about the price of the chips, the cards or the racks that will integrate them, nor about how many NPUs each rack can offer. What we do know, according to CNBC, is that Qualcomm's accelerator cards will support up to 768 GB of memory, more than NVIDIA or AMD offer in their current models.

Chips for third parties. The other important point is that Qualcomm will sell its AI chips and other components separately, allowing large AI companies to "customize" their own racks around Qualcomm silicon. It is the same philosophy the company has adopted in the world of mobile SoCs. Investors viewed the news with exceptional optimism, and Qualcomm shares rose 11% in Monday's session.

NVIDIA dominates with an iron fist. In the AI chip segment, the king is NVIDIA. The company is the absolute protagonist of this market and, according to CNBC, maintains a 90% market share, which has helped rocket its valuation to $4.5 trillion.
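Going back to Qualcomm's announced figures, a quick back-of-the-envelope check puts them in context. This assumes, for illustration, one 768 GB accelerator card per chip slot in a full 72-chip rack, which the announcement does not confirm:

```python
# Back-of-the-envelope numbers from Qualcomm's announcement.
chips_per_rack = 72      # up to 72 Hexagon-NPU-based chips per rack
rack_power_kw = 160      # stated rack consumption
card_memory_gb = 768     # memory supported per accelerator card (assumed one per chip)

power_per_chip_kw = rack_power_kw / chips_per_rack          # ~2.2 kW per accelerator
rack_memory_tb = chips_per_rack * card_memory_gb / 1024     # 54 TB per full rack
```

Roughly 2.2 kW per accelerator is in the same ballpark as current high-end data center GPUs, which is why the 160 kW rack figure is described as comparable to NVIDIA-based racks.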
That dominance could now be threatened by the avalanche of chips arriving from various manufacturers.

All against NVIDIA. AMD has its excellent Instinct accelerators, Google has its TPUs, Amazon its Trainium, Microsoft its Maia and Huawei its Ascend. All of them are striking alternatives to NVIDIA chips, and little by little these solutions are being integrated into more and more data centers. But the real problem is not the hardware: it is the software.

The great challenge is to defeat CUDA. The de facto standard developers use in the AI industry is CUDA, a platform that lets you squeeze the full capabilities out of NVIDIA chips in the field of artificial intelligence. This hardware-plus-software combination is much more mature than that of its competitors, who have the hardware part solved (or are on the right track) but lack a platform comparable to CUDA. AMD has ROCm, which is especially interesting because it is open source, but for now its features still do not match CUDA's.

Reinvent the wheel? CUDA has been on the market for almost two decades, which means the majority of academic research and pioneering models (such as AlexNet, trained on ImageNet) were written for CUDA. It is not just a language: it is a vast collection of libraries, optimized frameworks (like cuDNN), debugging tools and a huge community. Developing a competitor is basically reinventing the wheel, migrations are expensive, and companies and startups will not find them easy to take on.

China is also in the fight. And of course, if there is another great protagonist in this race, it is China. The Asian giant, previously dependent on NVIDIA, is seeking to break free of this manufacturer, and alongside the development of advanced AI chips it is also trying to build its own AI software to displace CUDA.

In Xataka | AI is the best thing happening to nuclear fusion. The construction of ITER is already accelerating

We thought dinosaurs were on the verge of extinction before the meteorite. We were wrong

The most emblematic mass extinction in Earth's history occurred, without a doubt, 66 million years ago. It marked the end of the Cretaceous and, with it, the disappearance of the non-avian dinosaurs. But what was that extinction really like? It is the big question experts have asked themselves, and one that is finally beginning to be illuminated. For decades the scientific community has debated whether dinosaurs were already in decline before they abruptly went extinct, or whether they were wiped out while still thriving. This is where a new study published in the journal Science, with participation from the Spanish researcher Jorge García-Girón of the University of León, sheds light on the debate. Simply put, the research refutes the idea of a prolonged decline and suggests that dinosaurs were diverse and organized into distinct ecological regions just before the asteroid impact.

The fossils of the south. Much of the uncertainty on this issue comes from a bias in the fossil record: the only well-dated faunas spanning the extinction boundary come from northern North America (the famous Hell Creek Formation). That made it impossible to know whether the extinction pattern observed there was a global or a local phenomenon. In this case, the research team focused on a fossil-rich unit much further south, in the San Juan Basin of New Mexico, known as the Naashoibito Member. The age of this unit had been controversial for years and was often considered much older. But by applying geochronology techniques, argon dating and magnetostratigraphy, the study has finally achieved a precise date. The results are conclusive: the Naashoibito Member dates to the latest Cretaceous, right up to 66 million years ago. This means that the fossils found there, which include a variety of species, preserve some of the last known non-avian dinosaurs.
They lived at most 340,000 years before the asteroid impact and were contemporaries of the Hell Creek fauna.

Separated by climate. This finding is crucial because, for the first time, it allows us to compare two different faunas from the very end of the Cretaceous. And the result refutes the idea of decline we all had in mind. The study not only dates the fossils but also uses powerful ecological models to analyze the diversity of terrestrial vertebrates across North America. The results show that, far from forming a homogeneous, cosmopolitan fauna, dinosaurs maintained high diversity and clear endemism until the end. In other words, dinosaurs were going strong and divided into distinct regional assemblages; the study identifies two clear bioprovinces, north and south, that remained stable during the late Cretaceous. What separated these faunas? The analysis suggests the main factor was temperature. More than a simple geographic division, different dinosaur communities were adapted to different climates: the data suggest the warmer southern regions may have been more hospitable to sauropods, while the colder, more temperate northern regions were more suitable for hadrosaurines.

The conclusion. The sum of the evidence points directly to non-avian dinosaurs being abruptly annihilated at the end of the Cretaceous. They were not in decline, as was once thought, so they were not already condemned to extinction had the catastrophic impact never happened. Instead, their ecosystems were diverse and biogeographically compartmentalized. The extinction was sudden and, as the later fossil record demonstrates, was followed almost immediately by the rapid diversification and rise of mammals.

Images | Vaibhav Pixels

In Xataka | A museum kept bones for 20 years thinking they were rubble.
Now we know that Mexico had its own T-Rex

Something has gone wrong in the European automotive industry. The conflict over Nexperia now threatens to paralyze factories

The European automotive industry is beginning to feel the squeeze. Manufacturers have received a clear signal that something is not right: Nexperia, one of their main chip suppliers, can no longer guarantee deliveries. Sector associations warn that the room for maneuver is very limited. This is not a technical problem or a strike, but the chain effect of an international dispute that threatens the very foundations of a key industry for the Old Continent. It was on October 16 that the European Automobile Manufacturers' Association (ACEA) officially warned of possible production stoppages if the Nexperia supply interruption was not resolved immediately. According to ACEA, the affected chips are used in electronic control units, and current inventories will only last a few weeks.

The turning point: a blacklist. At the end of September came a move that many in the sector identify as the trigger of the current crisis. The United States Bureau of Industry and Security updated its Entity List to extend restrictions to subsidiaries controlled by already-sanctioned companies. Nexperia, owned by Wingtech, thus fell under the scope of the measures. Since then, tensions have accelerated: the Dutch government intervened in the company and China responded by blocking the export of certain components. Nexperia's role in the automotive industry is less showy than that of the large chip manufacturers, but it is essential: its chips are integrated into the electronic modules and control units (ECUs) of many of the vehicles produced in Europe. The company, based in the Netherlands and with a strong presence in Asia, is known for its volume and reliability. Precisely for this reason, its inability to maintain deliveries has set off alarms on both sides of the supply chain.

The impact in Europe. Initial warnings have turned into contingency plans.
ACEA is calling for a coordinated response between European authorities and the affected countries, aware that the supply chain is going through a delicate moment. In Germany, CNBC points out, Volkswagen has formed a special team to evaluate possible risks and keep communication channels open with its suppliers.

The company tries to buy margin with a new supplier. "We have an alternative supplier that could compensate for Nexperia's lack of semiconductors," Christian Vollmer, head of production for the VW brand, told Handelsblatt. According to the outlet, talks with that company have been under way for weeks. Although the find provides some oxygen, the transition will not be immediate and the risk of interruptions remains on the table. The group says that, for now, there is no operational impact, but admits the scenario could change in the short term.

The echo crosses the Atlantic. Concern has also reached the United States. The Alliance for Automotive Innovation, which brings together manufacturers such as General Motors, Ford, Toyota and Volkswagen, has called for a quick resolution of the conflict. Its CEO, John Bozzella, warned Reuters that if chip shipments "do not resume soon," auto production "will be affected in the United States and other countries." Some companies in the group acknowledge that their plants could feel the impact starting next month.

Japan braces for the blow. Japan is also preparing for impact. The Japan Automobile Manufacturers Association (JAMA) explained that its members have received notifications from Nexperia warning of supply interruptions. According to the organization, the affected chips are part of the control systems of numerous models, and their shortage could have consequences for global production. Mitsubishi Electric, which has had agreements with Nexperia since 2023, said it is already studying substitutes.
A geopolitical chessboard that is already reaching the assembly line. The Nexperia case can no longer be understood as merely an industrial problem. The Dutch government's intervention and the confrontation with the company's Chinese subsidiary have turned it into the new point of friction between Europe, Beijing and Washington. The Netherlands justified its decision by the need to protect the strategic supply of semiconductors, while China maintains that the subsidiary acts in accordance with local legislation. At the center of the dispute, Nexperia is trying to keep operating under two increasingly opposed regulatory frameworks.

The factories are on guard. The next few weeks will be decisive in measuring the real scope of the conflict. Manufacturers are adjusting their inventories and reviewing alternative suppliers, while sector associations keep up the diplomatic pressure to unblock the situation. From Sweden, Volvo Cars CEO Håkan Samuelsson told the Financial Times that although his company, owned by the Chinese group Geely, does not face immediate problems, "there will be some factories that will have to stop." He believes the key is to react quickly and apply the lessons learned from the semiconductor crisis during the pandemic.

Images | Nexperia | Caesar Salazar

In Xataka | I also carried my bike in the car without a second thought. Until the DGT reminded me that it could fine me 200 euros

In its race to make advanced chips, China has tried to copy ASML. It is not going well

China continues to make extraordinary progress in manufacturing its own advanced chips, but it still has a big problem: it currently has no extreme ultraviolet (EUV) photolithography equipment of its own. It is, of course, working on developing this technology, and one of the strategies it is following to overcome the challenge is unusual... and almost obvious.

Reverse engineering. In his 2010 book 'Copycats', professor Oded Shenkar argued that imitators often end up triumphing over innovators. Although the Western view is the opposite, in China copying is seen positively, and reverse engineering is an important tool for absorbing technologies. That is what the country has supposedly attempted, as reported by The National Interest (TNI).

From producing for the world to producing for themselves. We have already reviewed the conclusions of the book 'Apple in China', a perfect example of how, by delegating production to China, Western companies ended up contributing to the country's development and specialization. The trade war has logically pushed China to seek independence, in the face of the vetoes it is suffering, by developing its own technological solutions.

From DUV to EUV. There has already been significant progress in this area: we recently covered how a Chinese manufacturer already has a prototype deep ultraviolet (DUV) machine for producing relatively advanced chips. The crucial challenge for even more advanced chips is having EUV photolithography machines, but solving that first problem is an important stepping stone toward EUV technology. And this is where something unusual has come to light.

Let's see how it works inside. As reported by TNI, China has been "caught" trying to reverse engineer an ASML DUV photolithography machine.
Not so much to mass-produce these machines, sources indicate, but because Chinese technicians are trying to learn how they work in order to replicate them and, from there, develop more advanced machines and chips.

It's not just any breakdown. It seems that while disassembling one of these ASML systems, Chinese technicians damaged it, which forced them to call in official ASML engineers to fix the problem. When they arrived, the ASML staff discovered that the machine had not simply broken down: it had been taken apart and then reassembled.

ASML's de facto monopoly. ASML's EUV photolithography machines are considered the most complex and advanced in the world, and today the Dutch company holds a de facto monopoly on such systems. These are the machines that grant access to the production of the most advanced chips, such as those used in NVIDIA's modern AI accelerators, and they have become the true bottleneck of the semiconductor industry.

Beyond the damaged machine. The incident reveals two crucial points. The first is Beijing's extreme urgency to control chip production from start to finish. The second is that the challenge of building these machines goes beyond merely copying hardware: lithography systems require extraordinary technical mastery of fields such as precision optics and materials science.

Too many obstacles? China may have brilliant engineers, but ASML's machines also depend on a highly specialized supply chain, which undoubtedly makes it difficult for such a machine to be built entirely in China. A good example is Zeiss SMT, the German company that supplies the ultra-precision optical systems and mirrors needed for EUV and advanced DUV photolithography.

A long way to go. This alleged incident reveals the difficulties China is going through in its quest for advanced photolithography machines.
Nikkei Asia was already discussing in July how complex it is to achieve a "Chinese ASML." That analysis cited Didier Scemama, director of hardware research at BofA Global Research, who estimated that China is still years away from anything similar: "It may take 5, 10, 15 years, we don't know. Will it be competitive with what ASML does? It's highly unlikely, but it will be good enough for China."

Image | Zeiss

In Xataka | The Netherlands has just declared war on China in the most important battle of the century: control of semiconductors

We had long believed that dark matter existed. A new study suggests we were wrong

For decades, cosmology has rested on a pillar as fundamental as it is mysterious: dark matter. It is the invisible glue that, according to the standard model, holds galaxies together and prevents their stars from being flung out by centrifugal force. It supposedly represents 27% of the universe, but it has a problem: nobody has ever seen or directly detected it. We simply trust that it is there. Now a study published in Galaxies challenges that conception.

The study. Research led by physicist Rajendra P. Gupta of the University of Ottawa proposes an idea as elegant as it is radical: what if dark matter does not really exist? According to his work, this ghost component could actually be an illusion, a side effect of something we had taken for granted: that the fundamental constants of nature are actually constant.

Why it matters. To understand the magnitude of this proposal, first recall the origin of the problem. We have to go back to the 1970s, when astronomer Vera Rubin noticed that stars at the edges of galaxies revolve at roughly the same speed as those near the center. This completely defied Newtonian expectations: it is as if a person sitting at the outer edge of a merry-go-round moved at the same speed as one sitting near the axis. Physically, the outer stars should be flung away. The solution the scientific community adopted was the existence of "dark matter," an invisible mass that generates the extra gravity needed to keep galaxies cohesive. This concept became the cornerstone of the cosmological model known as ΛCDM (Lambda Cold Dark Matter). The model works incredibly well at explaining the large-scale universe, but after searches with ultrasensitive detectors and experiments at the LHC, we have not found a single dark matter particle. It has only ever been 'detected' indirectly, through its gravitational effect on visible objects.

The proposal. This is where Gupta's idea comes in.
His model, called CCC+TL (Covarying Coupling Constants + Tired Light), rests on two distinct ideas. The first is the so-called 'covarying coupling constants' (CCC): the model suggests that the fundamental constants of physics, such as the speed of light (c) or the universal gravitational constant (G), are not fixed. Instead, they evolve as the universe expands. This is not a completely new idea (the physicist Paul Dirac already flirted with it), but Gupta integrates it into a complete cosmological model. The second idea is 'tired light' (TL), a concept inherited from the old hypothesis of the same name, which postulates that light loses energy over its journey through the cosmos. Gupta's model suggests that the redshift of light from distant galaxies is not due solely to the expansion of the universe, but to a combination of both effects. Although 'tired light' as the sole explanation has been widely refuted, its inclusion in this hybrid model is key to his calculations.

New terms. Once these two ideas are plugged into Einstein's field equations with variable constants, new mathematical terms appear in Gupta's model, which the author has baptized "α-matter" and "α-energy." And here lies the real magic: these terms, which are not a physical substance but an effect of the evolution of the laws of physics, generate the extra gravitational attraction we had until now attributed to dark matter. Dark matter would not be something to find, but a mathematical mirage.

Putting it to the test. Bear in mind that theories can be beautifully written and look great on paper, but they have to be demonstrated. To do so, Gupta used the SPARC database, a high-quality catalog with the rotation curves of 175 galaxies. The method he used was the reverse of the traditional one.
Instead of adding dark matter to justify the rotation curves, Gupta took the observed curves and used his model to "subtract" the effect of the "α-matter." The result should be the rotation curve generated by visible (baryonic) matter alone. He illustrates this with a graph taking the galaxy NGC 3198 as an example: the solid line (Vo) is the rotation speed observed in the galaxy; the dotted line (Vb) is the speed it should have if only visible matter existed, according to SPARC's estimates; and the dashed line (Vbx) is the visible-matter prediction calculated by Gupta's model. The similarity between his model's prediction and the baryonic-matter estimate is remarkable, and the author repeated the exercise for several galaxies with promising results, leading to a very forceful conclusion.

A new paradigm. If the CCC+TL model is correct, its implications are enormous. Not only would it eliminate the need for dark matter; according to the author, it could also explain dark energy and other cosmological enigmas, such as why the first galaxies observed by James Webb seem more mature than they should.

But be cautious. This is, for now, a "proof of concept," as the author himself points out. That means it uses simplifications, such as treating galaxies as perfect spheres, something far removed from reality. In addition, its reliance on "tired light" is a point of friction with conventional cosmology. Models like this one must demonstrate that they can explain, with the same precision as ΛCDM, key observations such as the cosmic microwave background or the accelerated expansion of the universe.

A new avenue. What is clear from this research is that the scientific community is exploring alternatives, especially when the predominant model shows cracks, such as the absence of any direct proof of dark matter. Gupta's model is, for now, a fascinating possibility.
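The flat-rotation-curve anomaly driving all of this is easy to see numerically: Newtonian gravity predicts orbital speed should fall off as 1/sqrt(r) beyond the visible mass, while observed curves stay roughly flat. A toy illustration in Python (point-mass approximation; the mass and radii are illustrative round numbers, not SPARC data):

```python
import math

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
M = 2e41         # enclosed visible mass, kg (~1e11 solar masses)
KPC = 3.086e19   # one kiloparsec in meters

def keplerian_speed(r_m):
    # Expected orbital speed if all the mass sits inside radius r: v = sqrt(GM/r).
    return math.sqrt(G * M / r_m)

radii_kpc = [5, 10, 20, 40]
speeds = [keplerian_speed(r * KPC) for r in radii_kpc]

# Doubling the radius should cut the predicted speed by a factor of sqrt(2)...
ratio = speeds[0] / speeds[1]
# ...yet observed curves like NGC 3198's stay near-flat at large radii. That gap
# is what LCDM fills with dark matter and CCC+TL fills with "α-matter" terms.
```

Rubin's observation was precisely that real galaxies refuse to follow this declining curve, which is why something extra, whether substance or mathematics, has to supply the missing gravity.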
A reminder that in science the most entrenched truths can be questioned, and that the solution to the universe's greatest mysteries might not lie in finding something new, but in ...

'Borderlands 4' has been the big launch of September. It is also an example of what is wrong in current video games

The last quarter of the year is not just an important moment in technology: in the video game market it is also a crucial one. Titles like 'Ghost of Yotei' or 'Metroid Prime 4' will arrive over the next few months, but the one that has just landed is one of the bombshells of the final stretch of 2025: 'Borderlands 4'. Reviews agree it is a tremendously fun game, but also something players have not failed to notice: an example of what is wrong with today's big-budget game industry. And in this story the head of the studio is being cast as the villain, when we should look beyond him: at the technology and the optimization. Let's review.

The situation. 'Borderlands 4' launched last week and the reception from players was good. It is a continuist, really fun title that, on top of that, looks spectacular thanks to its characteristic art style and to what the Unreal Engine 5 engine (keep that name in mind) allows it to offer. Steam's review system is quite broken, but it is a thermometer of player sentiment, and at first the reactions were mostly positive. All that has since changed to 'Mixed', which implies there is more bad than good. The problem is not the game itself: even those leaving negative reviews admit it is a true 'Borderlands' and great fun. The problem is the game's optimization, especially on PC. Players complain that the minimum and recommended specs published by the developer, Gearbox, do not match the experience they are having on machines that meet or even exceed them.

Randy Pitchford. That is the name of the head of Gearbox and one of the people behind the success of 'Borderlands'. He is also one of the most controversial figures in the industry, due both to his actions and to his recent statements about video game prices, with that "real fans will find a way to pay for them."
He is the kind of guy who rarely goes unnoticed and who uses his social networks to spar with any user, and with the wave of criticism of his game he has not missed the chance to get himself into trouble. There have been countless run-ins between Randy and users these days, and the problem is that he is doing it directly on X (formerly Twitter). This direct communication between executives and the public is a double-edged sword, because it creates a sense of closeness, but large companies have communication teams precisely so that what is happening here does not happen. If we take a look at some Steam reviews, many reflect discontent with the game precisely because of Randy’s words. Among the executive’s messages there is everything from encouraging users to request a refund if they disagree with the performance (something that, after two hours of play, can no longer be done), to telling them to “accept the reality of the relationship between your PC’s hardware and the software you want it to run”, to “program your own engine and we will be your customers”, plus a series of comments that a good part of the community has read as “Randy is calling those of us without an RTX 5090 poor”. The reaction. And then there is what, unfortunately, happens so often in the world of video games: people who cross the line. One example is the following message that Pitchford himself shared on X after receiving it by email: “You are a damn idiot. If you think that ‘premium’ games exist or that there are ‘premium’ players, you are a damn r—. You are the closest a white man can come to being a ‘n—r’”, a racial insult, basically. This kind of message, more typical of ‘Gamergaters’ than of anyone civilized, is a sad reality in certain forums, and Randy himself commented that it had hurt him. Several developers have since shown their support for the Gearbox CEO over the situation. Another one with Unreal Engine 5. With all this context in place, let’s turn to what directly affects the game: the engine it is built on.
When Unreal Engine 5, the latest version of the Epic Games engine, was unveiled, it left us all open-mouthed. The PS5 demo was impressive and promised a great deal. The problem came when the games began to arrive. A game using Unreal Engine 5 usually means bulky hardware requirements, mediocre performance (even on consoles) in many cases, and a series of launch failures that keep players from enjoying it as they should. And it is a problem because the engine greatly eases game development, which makes it tempting for studios, but there are already players who, on seeing the UE5 logo, start to tremble at how the title will run on their machine. Passing the buck. It has happened with recent examples such as ‘Metal Gear Solid Delta’ or the remake of ‘The Elder Scrolls IV: Oblivion’, and Tim Sweeney, CEO of Epic Games, defended his own turf a few weeks ago, stating that the engine is good but that studios should begin optimization work at an earlier phase of development rather than cramming it in at the end. The big underlying problem. However you look at it, and returning to ‘Borderlands 4’, it is clear that there is a problem. Digital Foundry is an outlet expert in the technical analysis of video games, performance being one of the parameters it measures. The outlet says there are decisions it does not understand, such as limiting real-time cinematics to 30 fps, the overload that seems to occur when there are many enemies on screen, the overly aggressive level of detail (LOD), or the pop-in that occurs while walking around the world. And that on the…

Japan sent the wrong creature to eradicate an island’s snakes. The disaster was so great that it has taken half a century to fix

Once again, desperate situations lead to extreme measures. Saving one species sometimes means “exterminating” another. We have seen it in South Africa and its plan to annihilate mice, in injecting radioactive material into rhino horns, in cases of wild cat hunting, and in the plan to exterminate half a million owls. Sometimes, however, things do not turn out as governments imagine. In Japan they know this perfectly well. The incident of ’79. The story begins in 1979 on the Japanese island of Amami Ōshima, in Kagoshima Prefecture. That year the Amami rabbit (Pentalagus furnessi) was rediscovered, an endemic species considered a “living fossil” because of its evolutionary age. Before the finding, the rabbit was thought to be on the verge of extinction due to habitat loss and hunting. The discovery marked a before and after for the conservation of the species and highlighted the importance of protecting the island’s natural environment, home to many other unique species. It was an event that also underlined the need for greater conservation efforts on Amami Ōshima, for example by trying to eradicate or control the snake population. The wrong “bomb”. Thus, within a few months, Japan launched a plan: introduce about 30 mongooses on the island with the intention of wiping out the population of snakes, specifically the habu (Trimeresurus flavoviridis), which posed a threat to local inhabitants. The idea, on paper, was a flawless plan: the mongooses, which are natural snake predators, would reduce the number of habus and improve safety on the island at every level. But the project was far from infallible. The mongoose was not the ideal creature to eradicate snakes, in the first place because mongooses are active during the day and therefore could not catch the nocturnal habus, which went on inhabiting the island without problem for the following decades. What happened as a consequence had a huge ecological impact.
A specimen of Trimeresurus flavoviridis. Predation of endemic species. During the day, instead of focusing on the snakes, the mongooses began to prey on a wide range of native species, including several that until then had no natural enemies on the island. That seriously affected the local fauna, especially endemic and endangered species, such as the very Amami rabbit whose rediscovery had been so happily announced just months earlier. Thousands of mongooses. The situation reached such a point that the mongooses, brought in to eradicate a pest, had become an even larger and more dangerous one, reaching around 10,000 individuals at their peak around the year 2000. In truth, Japan had already started a mongoose control project in 1993, one that expanded over time. How? About 30,000 traps were placed on the island to capture the animals, and cameras with sensors were installed to monitor them. In addition, local residents formed the so-called Amami Mongoose Busters, a team specialized in capturing mongooses (they caught thousands). The end? In 2018 came the last official capture of a mongoose on the island. It happened in April, and since no animal had been captured for a long period, the panel of experts tasked with determining whether the animal had been eradicated from the island estimated, in February of last year, that the eradication rate was between 98.8% and 99.8%, reaching the preliminary conclusion that it is reasonable to think the mongooses have been eradicated from the island under current circumstances. Finally, on September 3, 2024, Japan’s Ministry of the Environment declared the eradication of non-native mongooses on the island of Amami-Oshima, which UNESCO has declared a Natural World Heritage site. The statement was based on the scientifically grounded opinion of the expert group, taking into account that no mongoose capture had been confirmed for more than six years since the last one in April 2018. A unique case.
The Ministry itself did not hide what a disaster the 1979 attempt to control the snakes had been. In fact, as the administration announced, it is one of the largest cases in the world in which non-native mongooses established for so long have been eradicated. After the statement, the government explained that it will withdraw the traps placed on the island, although it will keep watching with cameras to prevent a new group of these small creatures from settling in again. After all, if it took half a century to get them out of there, any contingency measure is more than understandable. A version of this article was published in 2024. Image | Animalia, Tanaka Juuyoh, Patrick Randall In Xataka | We have just found a surprising remedy against Argentine ant pests: doses of caffeine In Xataka | The impossible mission of controlling the invasive pest that is eating the European pine: biomolecules, pine nuts and citizen science

There was a day when Volkswagen wanted to have “the Bentley of the people”. It went wrong

If the history of the automobile has shown us anything, it is that this industry is completely irrational, extremely competitive and very conservative. The electric car is demonstrating it clearly: the number of brands has exploded and China wants to gain a foothold on European soil. The reality is that only Tesla seems to have found the right path, and China is advancing in Europe… but with combustion engines. That conservatism is nothing new. Building a brand from scratch is only possible with a huge, sustained and almost blind economic effort over years and years (like Tesla’s), or with the help of state backing (as with Xiaomi and its association with one of China’s national car companies). But it is also almost impossible to change the perception the customer has of you. Winning over the customer and climbing the ranks of the market can take decades. A good example is Hyundai and Kia, which have some of the best-selling cars in our country but had to start earning market share by selling cars much cheaper than the competition’s. Quick moves, however, those that try to position a brand in a higher segment almost out of nowhere, or that seek to compete with premium brands with a model that matches them in features but also in price, are generally a recipe for failure. There is a good handful of examples, and the Volkswagen Phaeton is undoubtedly one of the most representative. History of a failure. Luckily for those who like cars, and unluckily for manufacturers, the purchase of a car is irrational. It has an inevitable component of aesthetic taste, but also of perceived quality, affinity with the brand, and an image and history built on the past. That is why, for example, Renault failed with the Vel Satis and the Avantime, even though they were very good vehicles that tried to position themselves above the mainstream. Nor has Stellantis (and PSA before it) managed to return DS to its luxury past despite multiple attempts. And something similar happened to the Volkswagen Phaeton.
By order of Ferdinand Piëch, then president of the Volkswagen Group, the Germans set out to assault the premium market with a saloon that came to be called “the Bentley of the people”. The intention was to compete with the Mercedes S-Class, the BMW 7 Series and, curiously given that it belongs to the same group, the Audi A8. The bet was so bold that the option of matching (or beating) its rivals on equipment and materials at a lower price was not even considered. The car went head to head on every front (price included), and Volkswagen’s executives took a beating. In fact, the Volkswagen Phaeton was not even a rebodied Audi A8. Yes, it shared some aluminum panels with the four-ring saloon, as Km77 explained in the early 2000s, but its development started from a blank sheet, and the Volkswagen Dresden factory was even built for it, known for its glass structure and for being the plant that, with the Phaeton discontinued, now handles the company’s electric models. In that assault on the heavens, the Phaeton was sold at above 66,000 euros, so the German triad could look it in the eye. An Audi A8 sold at the time for slightly under 69,000 euros. A BMW 7 Series started from 67,500 euros. The Mercedes S-Class was indeed significantly more expensive, starting above 71,000 euros. In those early years of the new century, all the German luxury saloons shared two things: they all had versions above 120,000 euros, and they all had gigantic engines. And the Volkswagen Phaeton was not going to be any less. Its “smallest” engine was already a V6, in diesel and gasoline versions. From there, one could keep dreaming: Volkswagen’s bet was also sold with a 4.2 gasoline V8, the famous 5.0 V10 TDI and a massive 6.0 gasoline W12 offered in 420 and 450 hp versions. The average consumption of the latter was around 14.5 liters per 100 km (with the homologations of the time).
As for equipment, the Phaeton was not badly served: heated, electric seats with memory and massage, bi-xenon headlights, four-zone climate control, a leather interior trimmed with wood, and the option of replacing the rear bench with two individual armchairs to improve comfort. Developing the car, therefore, was never going to be cheap. The 1,100 million euros that, according to Autoweek, the Germans invested in its development attest to that. Automotive News, however, puts the figure at 2,000 million euros. But despite the expensive development and the quality of the product, selling the Phaeton was not simple either. To the point that, according to this last outlet, the Germans lost 28,101 euros on each unit sold. Keep in mind that the company had made a huge investment in machinery, employees and a new factory (the Dresden glass plant) to launch a car that would live up to the quality of a vehicle in its price range. It is said that Ferdinand Piëch handed down a series of non-negotiable requirements for putting the car on the street, among them the ability to maintain an interior temperature of 22ºC while cruising at a sustained 300 km/h in an outside temperature of 50ºC. And all this even though the car was limited to 250 km/h: purely to over-engineer the car and leave no open door to customer disappointment. But the market did not respond, despite the fact that Volkswagen filed as many as 100 patents during its development. Estimates that aimed at 20,000 units sold per year proved impossible to meet, even as the years passed: during the decade and a half the car was on sale, only 84,253 units were sold in total. Volkswagen’s most optimistic forecasts, collected by Diariomotor, even spoke of exceeding 35,000 units sold and, at a stretch, 50,000. Seeing one of those “Bentleys of the people” was not as hard as seeing a true Bentley, but of course it…
