Someone has created the first complete piece of advanced malware by vibecoding with AI. It's called VoidLink and it raises an important question

For a long time, developing advanced malware seemed reserved for actors with experience, time and considerable technical capacity, especially as operating systems and many platforms have been tightening their defenses. But the landscape is changing. What we have seen in recent years is that artificial intelligence does not just summarize texts or answer questions; it can also very visibly accelerate software creation when given precise instructions. And that leaves us facing a reality that is difficult to ignore: the same tool that simplifies legitimate tasks can also reduce part of the effort needed to create malicious code.

That change begins to take concrete form with VoidLink. In its analysis, Check Point presents it as one of the strongest pieces of evidence so far of advanced malware developed largely with the help of AI. There is, however, an important nuance in the investigation itself: the company says it detected the malware at an early stage, that it was not deployed against victims and that it was not used in active attacks. But that is precisely why the discovery is so revealing, because it gave access to development materials that rarely come to light.

How VoidLink was built and why it changes the game

VoidLink was not, at least on paper, a minor piece or a rudimentary experiment. The cybersecurity firm describes it as a malware framework for Linux with a modular architecture, designed to maintain stealthy, prolonged access in cloud environments. Its analysis mentions components such as eBPF and LKM rootkits, as well as specific modules for cloud enumeration and follow-on activity in container environments. That level of maturity is precisely what separates it from previous cases associated with simpler code.

One of the most striking twists in the case is who seems to have been behind it.
Check Point explains that, given its internal structure and the pace of evolution observed, VoidLink gave the impression of having come from a large team, with different profiles and a fairly defined work plan. But the evidence collected by the firm points to something very different: a single actor who, according to the investigation, had AI support during different phases of development. There is another relevant element: that actor was apparently no rookie, but someone with a solid technical base and previous experience in cybersecurity.

The most revealing part of the case is how the project was built. The firm describes a working method based on what it calls Spec Driven Development, which works as follows:

You define what you want to build.
That idea is translated into architecture, tasks, sprints and delivery criteria.
The implementation is delegated to the model.

In the exposed materials there were development plans, technical documentation, coding standards, deployment and testing guides, as well as an organization by teams and phases that supports this model. One of the recovered artifacts, dated December 4, 2025, further suggests that VoidLink had reached a functional phase in less than a week and exceeded 88,000 lines of code.

That is precisely what separates VoidLink from other precedents. Check Point maintains that this is the strongest evidence yet of malware created almost entirely with the help of AI. "This is the first confirmed case of advanced AI-generated malware, created with the speed, structure and sophistication of an entire engineering organization," the company claims. The question now is how far malicious actors can go with these kinds of techniques.

Images | Xataka with Nano Banana | Check Point

In Xataka | The Booking hack is a little more disturbing: "Tracking phishing" attacks are here to stay

In 2013, Amazon created a Kindle so good it has proven to last forever. And now it has decided that it must end

Amazon has announced that, starting May 20, 2026, Kindle devices released in 2012 and earlier will no longer have access to the Kindle Store. You will still be able to access the books already downloaded on those devices, bearing in mind that you should not factory reset the Kindle: if you do, you will not be able to register it with your Amazon account again.

Goodbye to old Kindles. If you have an early Kindle, starting in May you won't be able to download books from the official Amazon store or register the device again after restoring it. Specifically, these are the affected models:

Kindle 1st Gen (2007)
Kindle DX and DX Graphite (2009 and 2010)
Kindle Keyboard (2010)
Kindle 4 (2011)
Kindle Touch (2011)
Kindle 5 (2012)
Kindle Paperwhite 1st Gen (2012)
Kindle Fire 1st Gen (2011)
Kindle Fire 2nd Gen (2012)
Kindle Fire HD 7 (2012)
Kindle Fire HD 8.9 (2012)

Amazon is sending an email to affected users, offering a 20% discount on new Kindle devices and credit toward purchasing new books. Likewise, all the purchases made on the old device will remain available after logging in to a new one with the same account.

It's not the first time. Amazon has long wanted tight control over how books get onto its Kindles. One of its most recent updates killed a flagship feature: being able to send books to the device via USB. In the same way, users were required to keep their Kindle updated to access the store. In practice, this meant limiting features, such as downloading books from outside the Kindle Store, to push users to install those more restrictive versions if they wanted to retain access.

Almost a paperweight. A book reader to which you cannot download more books is not very useful. It is a questionable decision considering that this type of device is meant to have a useful life limited only by its hardware: the screen eventually giving out, which is difficult with electronic ink, or being left without a battery replacement.
Amazon has decided to end the life cycle of a product that still had fight left in it. Not because the hardware has stopped working, but because maintaining its compatibility no longer fits the company's business model or its current ecosystem.

In Xataka | We enter book month with sales on Kindle: you can now buy the eReader for less than 100 euros

In 1994, a programmer created a "temporary" interface for Windows. Three decades later it is still with us

Windows is one step away from turning 40. The first version of the operating system appeared in November 1985, and since then it has not stopped evolving. Microsoft, however, tends to take a long time to update some components of its products. With Windows 10, for example, it released a renewed user interface, but it was not until years after launch that it began to get rid of some icons from the Windows 95 era. Now, in Windows 11, it is renewing programs like Paint and Notepad.

Regardless of how modern Windows 11 may feel, and all the new features that come with its updates, the system still retains some elements we could class as historical. Among them is the utility for formatting disks. Today, if you format a storage drive from Windows 11, you will find a pop-up window practically identical to the one you could find decades ago. In fact, we know exactly who created it.

The Format drive dialog in Windows 10

A former Microsoft programmer named Dave Plummer recently shared some interesting facts about this part of the operating system. The now entrepreneur says he created the Format dialog box one rainy morning in late 1994. They were migrating millions of lines of user-interface code from Windows 95 to Windows NT, and the formatting section was very different between the two systems, so a new user interface had to be created. Plummer took on the task.

The programmer did not set out to do a definitive job, but to provide a temporary solution with the help of a sheet of paper, a pen, Visual C++ 2.0 and the Resource Editor. "It wasn't elegant, but it would do until the elegant user interface arrived," he says in the message. Plummer also set the 32 GB limit for formatting FAT volumes that morning. It is a curious detail, because FAT32 is capable of working with much larger volumes, although to create them you need to use the command line.
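The gap between that hard-coded cap and what the format itself can address is easy to check. A back-of-the-envelope sketch (the figures come from the FAT32 on-disk layout, a 32-bit total-sector field and classic 512-byte sectors, not from Plummer's account):

```python
# Back-of-the-envelope: the 32 GB cap in the Format dialog is a policy
# choice, not a limit of the FAT32 on-disk format.
SECTOR_BYTES = 512          # classic sector size
MAX_SECTORS = 2**32         # 32-bit total-sector field in the boot sector

fat32_max_bytes = SECTOR_BYTES * MAX_SECTORS   # addressable volume size
dialog_cap_bytes = 32 * 1024**3                # the cap picked in 1994

print(f"FAT32 addressable: {fat32_max_bytes / 1024**4:.0f} TiB")   # 2 TiB
print(f"Format dialog cap: {dialog_cap_bytes / 1024**3:.0f} GiB")  # 32 GiB
print(f"Ratio: {fat32_max_bytes // dialog_cap_bytes}x")            # 64x
```

In other words, with 512-byte sectors the dialog allows a mere sixty-fourth of what the volume format can address, which is why larger FAT32 volumes have to be created from the command line.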
The disk-formatting interface shipped in Windows NT-based operating systems, such as Windows 2000 and Windows XP, and it has been with us ever since. All this time it has remained, essentially, the temporary solution created in 1994.

Images | Windows | Genbeta

In Xataka | Intel is hunting and capturing new customers. Its next goal: convincing Elon Musk and making chips for Tesla

In 1967, a war veteran believed that getting around a computer could be easier. So he created the first mouse

Things were clear from minute one. When Douglas Engelbart, head of the Augmentation Research Center (ARC) at Stanford, wanted to interview a new recruit, he would hand him a pencil attached to a brick and ask him to write his name on a piece of paper. Difficult, right? joked Engelbart, a doctor in electrical engineering and a pioneer of computer development. Well, he explained to the candidates, people would run into the same problems if no one offered them more agile, simpler tools for using computers.

He wasn't talking just to talk. Engelbart, together with one of his colleagues, the engineer William English, was the father of the first computer mouse in the 1960s. Except that one was not called a mouse but XY Position Indicator for a Display System, and its design was quite different from the modern peripherals we use today. To begin with, it was made of wood and had a pair of metal wheels. This is its story.

Make it easy for people: "Click"

In the early 1960s Engelbart, a World War II veteran with a recent PhD and just a couple of years of experience at the Stanford Research Institute, today known as SRI, had a clear idea: he wanted accessible technology. And simple. In 1945, while serving in the US Navy, he had read an article by the inventor Vannevar Bush encouraging scientists to bring knowledge to the street, and he was determined to carry that slogan into his own field.

The golden opportunity came when the Department of Defense, through DARPA, gave him the backing needed to set up his own center within SRI: the ARC. There he had nearly fifty people working for him, and efforts were focused on answering one question: what would the future of communicating with computers look like? By then computing had been in development for decades, IBM had manufactured the IBM 650, and the team was convinced of the sector's enormous potential.
The question was how to use it and keep the systems from being as unwieldy as a pencil stuck to a brick. At the time, the most popular devices for pointing at a screen were light pens, a system similar to the one used in military radars. Since 1961, however, Engelbart had been mulling over an alternative to make interaction with computers more efficient: mount a pair of small wheels that would roll across a table so the user could steer the screen cursor with them. One would rotate horizontally and the other vertically, and its operation would be very similar to that of the planimeter commonly used by surveyors, geographers and architects.

The idea had been jotted down in his notebook, but in the 1960s, with the financial backing of DARPA, his own team and extra help from NASA, Engelbart was able to dig into it. The veteran and his colleagues gathered the best pointing equipment in existence and held a kind of brainstorming session that produced half a dozen proposals for working with monitors, some of them quite curious, such as a joystick or a light pen. Perhaps the most striking of all was a mechanism that was fixed under the table and operated with the knee.

A prototype nicknamed "mouse"

Also among that amalgam was a small device built by Bill English after reviewing, with Engelbart, his notes from the beginning of the decade. The prototype basically consisted of a carved redwood block with two wheels crimped to the bottom and a button on top. Its name: XY Position Indicator for a Display System. Its appearance, compact and with a cable protruding, ended up earning it the nickname "mouse." It was so comfortable that it won out over the laboratory's other alternatives, and the team adopted it as a standard piece in their research. SRI applied for the mouse patent in 1967 and received it in 1970. Engelbart and his companions did not stop there.
They kept looking for a "companion" for the mouse: another device the user could operate with their free hand to enter commands and text. After several tests they settled on a device similar to a telephone, with five keys. They also ran tests to refine the mouse design as much as possible. "We did a lot of experiments to see how many buttons it should have. We tried up to five. We decided on three. That's all we could fit in. Now the three-button mouse has become standard, except for the Mac," Engelbart himself recalled in 2004, in an interview with Wired.

With all this material and the rest of the inventions developed by his team, the war veteran decided to stage a demonstration. A big one. In 1968 they organized what is now known as the "mother of all demos," a historic conference in San Francisco at which Engelbart showed all the functions they had developed over the preceding years. "For 90 minutes, the stunned audience of more than a thousand professionals witnessed many of the features of modern computing for the first time: live video conferencing, document sharing, word processing, windows, and a strange pointing device jokingly referred to as 'the mouse.' The elements of the screen were linked to others through associative links or hypertexts," explains the Computer History Museum. "People were amazed. In one hour, it defined the era of modern computing," English told the New York Times in 1996.

Shortly after that historic feat, however, the team began to lose momentum. Some staff questioned the lab's direction, DARPA cut its funding, and other research centers began to emerge, such as Xerox's in Palo Alto (PARC). The result? Many of Engelbart's employees sought new homes, and with them went the very concept of the mouse. The device, now fitted with a trackball, ended up incorporated into the Xerox Alto computer, and in 1983 Apple commercialized it with its Lisa computer.
Some time later, as the Washington Post recalls, Steve Jobs' company was behind almost half of … Read more

A robot-rental industry has sprung up in China that has sent prices plunging in a year, but it comes with an asterisk

From spring 2025 to winter 2026, renting a humanoid robot for a business event in China has gone from costing between 10,000 and 20,000 yuan a day to being listed at 1,796. Robot dogs already go for 78 yuan a day on JD.com, less than 10 euros. A drop of 80% in twelve months.

Why it matters. Beyond the price war, this is the first real at-scale laboratory for the humanoid-robot business, and what happens there says a lot about the actual state of an industry that attracts a lot of financing but still needs a human behind each machine.

Between the lines. The most interesting number here is not any of the above, but this: every robot deployed today arrives with a human engineer behind it. That technician handles transportation, calibration, live operation and the unforeseen. The actual model is not 'Robot as a Service' but 'Robot + Person as a Service.' The logic of SaaS (marginal costs that approach zero at scale) does not apply here: each new unit in the catalog implies a new payroll. The bottleneck, then, is not the supply of machines but the supply of people able to operate them.

The context. Qingtianzu, the platform controlled by Zhiyuan Robotics and backed by Hillhouse Capital, connects more than 200 suppliers with companies that need robots for presentations, openings or weddings, like a marketplace. During the Chinese New Year, its orders grew 70% and topped 5,000 in one week. JD.com saw searches for "robot" increase 25-fold. The demand exists; the problem is the cost structure.

Yes, but. Rents have fallen 80%, yet operating costs have barely budged: transportation, engineers, insurance, logistics all remain basically the same. The payback period cited by operators (about six to eight months) assumes around ten monthly orders at 2,500 yuan on average. That works during peak demand; outside the holiday weeks, the rhythm breaks down.

The big question.
65% of orders are for entertainment and marketing: robots that dance or parade at fairs and other cute but short-lived acts. Intermittent uses by definition. To build a stable base, the sector needs to get into factories, hospitals and logistics. But experts have already warned that most current humanoids are in the "cerebellum" phase, executing instructions without autonomous decision-making. That leap, according to the most optimistic estimates, will take about five years.

The big picture. In a matter of months, China has built an industry with funded platforms, distributed logistics and real demand. It is the first country to bring humanoid robots to the mass market, even if it is to perform in shopping centers and shake hands at dealerships. TrendForce foresees more than 50,000 units shipped in 2026, 700% more. The sector has its own precedent: drones for shows, which took off not for industrial uses but for night-time light shows in cities across China. Robot rental may follow the same script. The difference is that an autonomous drone no longer needs a pilot. The humanoid robot still does.

In Xataka | There is a Chinese startup creating the most amazing robots of the moment. It's called X Square

Featured image | Andy Kelly

In 1987, a doctoral student had a problem displaying images on his Mac, so he created an app. Today it is the most-used image editor in history

Maybe with Nano Banana some people have banished Photoshop, but the image editor has been the tool accompanying photography professionals for decades, almost on a par with their cameras. In fact, it achieved something within reach of very few technology products: becoming a verb and even entering the dictionary. We photoshop an image and we google things on the internet. Like many other milestones, Photoshop was born by chance: it was the result of a screen that did not know how to show grays.

In figures. In Photoshop's almost 40 years of life, the editor has accumulated some astronomical numbers. Its launch price in 1990 was $895; no joke, that would be equivalent to around $2,100 today. It has never been home software but professional software. Adobe closed last year with a record turnover of $23.77 billion. In 2024, revenue was $21.51 billion, of which subscriptions represented $20.52 billion. In 2013 Adobe bet everything on the subscription, and time has proven it right: in twelve years it went from $4 billion in annual revenue to almost $24 billion in 2025.

How it all started. It is 1987, and Thomas Knoll is pursuing a doctorate in computer vision at the University of Michigan. He has a problem: his Mac Plus has a monochrome screen unable to display grayscale images, only pure black and white. So he writes a few lines of code to fix it. He calls it Display. His little program does the trick, but that is it: he has no intention of commercializing it. The one with a nose for business is his brother John, who at the time works at Industrial Light & Magic (George Lucas' company in charge of the Star Wars special effects) and convinces him to develop a full program. Brothers and partners, they license it to Adobe Systems Incorporated in 1988.

From layers to AI.
Photoshop 1.0 saw the light of day in February 1990 as an editor that required only 2 MB of RAM and an 8 MHz processor to run, the minimum specifications for a Mac. To put that in context: today Photoshop recommends 16 GB of RAM, 8,000 times more. It included tools as iconic to its users as the lasso and the magic wand. But if one technical leap made the difference, it was the enormously useful layers, which arrived in 1994 with Photoshop 3.0. Before layers, the editor was destructive: each change overwrote the original image. Almost 30 years later another functional milestone arrived, AI, in the form of Generative Fill: being able to add or delete objects with a prompt. Despite the controversy over authorship and the future of retouching, its numbers were incontestable: by April of last year it had already generated more than 22 billion images since launch, according to Adobe.

The risky move to subscriptions. Before the tricky decision to put AI in its suite, Adobe made another risky move: in 2013, when we had not yet succumbed to subscription-ocracy, it announced it would stop selling Photoshop under a perpetual license and start renting it. At the time, almost 50,000 customers signed a petition against the decision and Adobe's shares fell 12%. Once again, time and pocketbooks seem to have proven the company right: it has multiplied its revenue by six.

In Xataka | 16 years ago a student from Barcelona was looking for an easy way to edit PDFs. The website he created is one of the most viewed on the internet

In Xataka | 30 years ago he created a player for the university: today his app has more than 6 billion downloads and is still free and without ads

Cover | University of Michigan

In 1993 Microsoft created Encarta to revolutionize knowledge. Twenty years later it would be devastated by a tsunami

It became so popular that its logo and the sound of its intros became as identifiable as Nokia's or Windows'. If, like the person writing this, you went to school or high school between the second half of the '90s and the first half of the 2000s, Encarta needs no great introduction. If not, don't worry; it won't take us long.

Before Wikipedia offered free online knowledge, and even before internet use became widespread, Microsoft launched a digital encyclopedia that revolutionized the sector and became a phenomenon between roughly 1993 and 2009. Its name: Encarta. Today, in one of history's ironies, "Encarta" is just another entry in the index of other encyclopedias; but there was a time when it transformed the way we accessed knowledge. Instead of wearing out their eyes and fingertips leafing through pages in search of information, students began to find it at the click of a button. Encarta offered an agile, comfortable and above all didactic way to satisfy curiosity. With articles, yes; but also with videos, audio and even virtual tours and games. You could read about Nepalese temples in the Salvat. Or open Encarta and "tour" one.

Its pull was so great that it put the old paper encyclopedias in trouble. When the Spanish edition was presented in early 1997, those responsible boasted that the Encarta CD-ROM, a format you could store in a drawer or even a folder, held information equivalent to 29 volumes and 1.2 meters of shelving. Not only that: Encarta cost 24,900 pesetas, four times less than an equivalent printed encyclopedia. And on top of it all, its landing in Spain was backed by Santillana, a publishing house with considerable weight in school classrooms. How could anyone compete with that? The product was well received and was published in Spanish and other languages.
It did well until it ended up succumbing to competitors armed with the very weapons that had made it a phenomenon. In a way, its success came down to good instincts in the '90s; its decline, to an inability to adapt in the 2000s. This is its story.

Objective: reinvent the old encyclopedias

In the mid-1980s Microsoft began toying with the idea of creating a digital encyclopedia. The idea was ambitious. Redmond wanted nothing less than to rethink the concept and workings of a product as apparently mature and settled as the volumes that publishers' sales reps sold door to door. To debut in a big way, the multinational tried to negotiate a license with the creators of probably the most respected publication in the world: the Encyclopædia Britannica. It did not go well. In the 1980s, Britannica's paper volumes sold well and left huge profits. As Enrique Dans recalls, its books cost about $250 to produce and sold for between $1,500 and $2,200, depending on the quality. Why would the firm want to digitize its content onto a CD and risk killing the goose that laid the golden eggs?

Microsoft did not give up and looked for ways to push the idea forward. It even had a name for the initiative: Project Gandalf. Some time later it signed a contract with Funk & Wagnalls to pour its 29-volume New Encyclopedia into a database, which was built at the end of that decade. To round out the contents, two Macmillan encyclopedias, Collier's and the New Merit Scholar, would be added years later. They were no Britannica, but they would have to do. Doubts arose in Redmond, however, about whether the project was viable, and it was shelved. It was picked up again at the turn of the decade, in 1991, when Microsoft decided to go all in.
In 1993, the first edition of the Encarta encyclopedia launched, including the 25,000 Funk & Wagnalls articles plus extra material such as images and some animations. The tool was comfortable, much more agile than the kilometric tomes and even fun, but it started with a huge mistake: the aim was off. At the beginning of the '90s many households still had no PC, and the launch price was exclusive. When it came out, Encarta cost about $400, which greatly limited its reach. The cost deterred customers and was not far below that of another competitor testing the same niche with a recognizable brand: Compton, which had launched its own multimedia version in 1990, with text plus supporting images and sounds.

Redmond knew how to react, and soon it was deploying a more aggressive strategy. It launched promotions that let you get Encarta for $99, bundled the CD with the Windows software package, and negotiated with manufacturers to preinstall it on their computers, a tactic not unlike the one used with Windows and Office. Microsoft's own promotion gave the final push. The new encyclopedia gained fame and began to chain together editions, translations into different languages and content enriched with multimedia. In 1995, abridged versions of some articles were offered to subscribers of the Microsoft Network ISP, and starting in '96, standard and deluxe editions were released, the latter an enriched version that could be updated month by month. In 1998, its creators went a step further and acquired the rights to several electronic encyclopedias.

The product was growing and, above all, demonstrating that the sector was undergoing a clear paradigm shift. The best example: in 1996, the once-powerful Britannica ended up being sold off cheaply because of its difficulties. "It allows young and old to explore the world through themes and characters," its promoters boasted in the Spanish market.
And so it was, indeed. Through articles, photos, illustrations, graphs, maps, timelines, recordings, videos and even virtual tours, Encarta won over an entire generation of students. … Read more

China has now created the first chemical map of the hidden face of the Moon

While NASA stumbles over the Moon, China is going like a rocket. Not literally, but nearly. The satellite has once again become a priority in space exploration for its potential in scientific research, but also as a mine and even as a 'battery,' and everyone wants their share of the pie. China is completing steps at astonishing speed toward its goal of reaching the satellite, and it has just hit another milestone: the first chemical map of the hidden side of the Moon. And it is something with the potential to accelerate the next steps there.

In short. An investigation conducted by the Chinese Academy of Sciences, Tongji University and the Shanghai Institute of Technical Physics has produced a chemical map of the entire satellite. That includes something that until now was unexplored in this sense: the hidden face. Until now, the almost half of the lunar surface that stays hidden from our eyes was "uncharted chemical territory" because, well, we had not been there. The Apollo missions collected materials that, together with observation missions, allowed this chemical profile of the satellite to be drawn up, but only for the visible side. In short, for where we had been. The Chang'e-6 mission changed that when, in June 2024, it returned from the far side with about two kilos of material from the South Pole–Aitken basin.

AI. Those were the first samples ever collected from the far side and the only thing researchers could cling to if they wanted to build that chemical profile of the satellite; it is, so to speak, the Moon's ID card. To create the chemical map they used artificial intelligence: they fed in the sample data along with orbital spectral data collected by the multiband imager aboard Japan's Kaguya probe and, after a process of data cleaning and refinement, mapped the distribution of six large groups of oxides.
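As a rough illustration of what this kind of calibration involves (a hypothetical sketch, not the team's actual pipeline): fit a model on locations where ground-truth sample chemistry is known, then predict oxide abundance everywhere the orbiter captured spectra. Every number below, band reflectances, coefficients and the FeO target, is invented for the example.

```python
import numpy as np

# Hypothetical sketch: calibrate orbital reflectance against ground-truth
# sample chemistry, then predict one oxide (here "FeO wt%") map-wide.
rng = np.random.default_rng(0)

# Five spectral bands at 40 "calibration" sites (e.g. sample-return sites)
bands = rng.uniform(0.05, 0.35, size=(40, 5))            # reflectance
true_coef = np.array([12.0, -8.0, 5.0, 20.0, -3.0])      # unknown in reality
feo = bands @ true_coef + 4.0 + rng.normal(0, 0.1, 40)   # "measured" FeO wt%

# Least-squares fit: FeO ~ X @ w, with an intercept column appended
X = np.hstack([bands, np.ones((40, 1))])
w, *_ = np.linalg.lstsq(X, feo, rcond=None)

# Predict over a "map" of new spectra (every pixel the orbiter observed)
map_bands = rng.uniform(0.05, 0.35, size=(1000, 5))
map_X = np.hstack([map_bands, np.ones((1000, 1))])
feo_map = map_X @ w
print(f"Predicted FeO range: {feo_map.min():.1f} to {feo_map.max():.1f} wt%")
```

The real work reportedly went into exactly the step this toy skips: cleaning and refining the spectra so that a model trained on a handful of ground-truth points generalizes to the whole surface.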
We are talking about iron, titanium, aluminum, silicon, calcium and magnesium, and this allows a hypothetical historical profile of the Moon to be drawn. For example, we now know that the far-side highlands have a higher concentration of magnesian rocks than the visible side. And if you are thinking "so what," this indicates that the Moon's magma ocean crystallized asymmetrically: first in one hemisphere and then in the other.

Importance. There is still data to be teased out, but this chemical map matters more than it may seem. It is a different way of mapping the satellite and, well, it conditions everything we want to do on the Moon in the near term. Broadly speaking, it is a key advance for understanding both the elemental composition and the geological evolution of the satellite. It also makes it possible to build a chronology of impacts, and something more "useful": it is a guide for future missions. With data on soil composition and on the probability of finding more or fewer resources in certain areas, the chemical map lets landing sites be selected on very specific grounds. For example, if future missions want to focus on collecting regolith rich in certain elements, the chemical map is a thread of clues to pull on.

Future. We are no longer talking about "well, when we return to the Moon…"; we are talking about powers with very clear plans not only to send automated probes but to set foot, again, on the satellite. NASA's Artemis program, which keeps accumulating problems, will stage the first crewed flight around the Moon in 50 years, and future trips are aiming for lunar landings. China, for its part, wants to send the Chang'e 7 probe to the south pole in search of ice, Chang'e 8 to test using resources directly on the satellite, and it is planning crewed flight missions for 2028 and a Moon landing in 2030.
Russia was also in the race with its Luna program, as well as the creation of a space base in collaboration with China, but its solo projects have been delayed. The fact that we now have the first chemical map of the whole satellite is, therefore, not just an achievement for scientific curiosity but also a guide for those future missions on the ground.

In Xataka | Mars was the great space battleground between China and the US. Now it's the Moon and there's too much at stake

Before the Incas, a civilization created an impregnable empire in the heights of Peru. Its secret: feces

The coastal desert of southern Peru is one of the most arid environments on the planet, but that was no impediment for a civilization that managed to prosper here with more than 100,000 people, before the arrival of the Inca empire. Its secret was seabird guano, and science has now demonstrated just how much bird droppings were the real economic and demographic driver of the Chincha Kingdom.

The feeding problem. During the Late Intermediate Period, approximately 1000 to 1400 AD, the Chincha Valley became a pre-Inca superpower. But to sustain its growth and maintain some 30,000 workers, it needed to produce food on a large scale, and more specifically corn, the basis of its diet. The problem is that the Peruvian coast is not exactly the most fertile place in the world, so the population faced a serious food problem. The solution was to look to the sea and to the islands full of guano birds, and more specifically to their droppings and their power as fertilizer, a solution that allowed the Chincha to prosper and grow very strong in the region.

The confirmation. To confirm this theory, a scientific team analyzed stable isotopes of carbon, nitrogen and sulfur in 35 ancient corn cobs and 11 seabirds found in tombs in the Chincha Valley. Plants that absorb nutrients from fertilizers derived from marine animals show a very specific chemical signature, with high levels of nitrogen-15.

The results. The conservative threshold used to establish guano use was set at a value of +20‰, and the Chincha corn averaged +19.4‰, with peaks of up to +27.4‰. Thanks to radiocarbon dating, scientists have been able to place the beginning of this large-scale agricultural practice around the year 1250 AD, a date that coincides almost exactly with the rise and expansion of the Chincha Kingdom.

What we knew.
Modern chemistry only confirms what archeology and history had already hinted at, since the iconography of the period is full of references to this agronomic practice. In textiles, friezes and ceramics of the Chincha culture, corn is constantly depicted alongside guano-producing birds such as the guanay cormorant, the Peruvian booby and the pelican. Even Spanish colonial chroniclers, such as Inca Garcilaso de la Vega, recorded the practice, describing how the indigenous people applied guano to corn through irrigation systems, and documented the strict taboo laws later imposed by the Incas to protect the birds on which the fertilization of their fields depended. Killing a guano bird or disturbing its nests was a crime punishable by death.

A great revolution. Mastery of guano technology not only filled the stomachs of the Chincha, it made them a key player in Andean geopolitics. When the Inca empire began its expansion, it did not conquer the Chincha, given their strength; instead, the two formed a strategic alliance. The Chincha controlled the precious fertilizer and dominated the maritime trade routes, exchanging guano for luxury goods such as prized Spondylus shells. This agricultural base allowed the Chincha Kingdom to negotiate its integration into the Inca empire from a position of power and privilege.

Images | Ames Wainscoat

In Xataka | Prehistory was also ‘woke’: a woman from 7,000 years ago suggests that gender was not an immovable barrier

A teenager in Mexico created a Hombres G fan website in 1998, when the band had already broken up. Nine years later they filled Las Ventas

In 1998, Mexican Francisco Romero was 15 years old, had a new computer and a school assignment to complete. Aiming for the best grade, he created a website about his favorite group: Hombres G, a Spanish band that by then had already dissolved. What began as an academic exercise ended up becoming the band’s first digital fan community, with thousands of members spread around the world. It was also the trigger that convinced David Summers and his bandmates to return to the stage.

How it all started. In 1998, having internet at home in Mexico was not common: just a marginal fraction (2-3%) of the Mexican population had access to the network under those conditions. Even so, Francisco Romero, a teenager who had just gotten his first computer, set about completing a school project in which students were asked to create a web page. Romero chose Hombres G as the subject. He had discovered the Madrid group, which had been dissolved for five years by then, through friends from high school. And since finding documentation about the band was difficult (there were only two pages about Hombres G on the internet), he decided to create a community.

Meeting point. The website, as Romero himself explains, was titled Club ‘We’re still crazy… so what?’, a reference to ‘We’re crazy… or what?’, the title of one of the group’s first albums. The success was immediate: in its first five months it received hundreds of requests from Mexico, Spain, Colombia, Peru and Japan, in the days before recommendation algorithms and ubiquitous search engines. The messages came, above all, from fans who had not had a space to talk about the band for years, and who had not stopped listening since the last album the group had released, in 1993: ‘Bikini history’.

The contact. At the end of 2000, an anonymous user left a complimentary message on the page, to which Romero responded politely.
Three days later, another message arrived from the same sender, who turned out to be one of the band’s two guitarists: “Please don’t give out my email, I’m Dani Mezquita.” They later established telephone contact, which led to increasingly frequent conversations. The significant fact: Mezquita was then working as marketing director at DRO East West, the Warner Music label that had released almost all of the band’s albums. From his position he had noticed something: at the end of 2000, a Hombres G compilation was the third best-selling album in Mexico. A group with no activity, no tour, no active label, not a single public appearance in years. In other words, an active and completely underserved fan base. With those data on the table, and as told in the documentary ‘The Best Years of Our Life’ (whose theatrical release is scheduled for April 30), the members met and proposed a modest return, with three or four concerts in Mexico.

It gets out of hand. From there, expectations skyrocketed. The reunion tour ended up totaling 70 performances during 2002 and 2003, including a concert at Las Ventas before 20,000 people and stops in several cities across Latin America and the United States. The album that accompanied the comeback, ‘Dangerous Together’, was initially released only in America, which says a lot about where the weight of the comeback was leaning. When it arrived in Spain, it went Platinum. In recognition of Romero’s importance in this return, he has continued working with the band from Mexico ever since. And so we come to the present: on April 25, 2025, Hombres G performed before more than 60,000 people at the GNP Seguros Stadium in Mexico City, as part of a tour titled ‘Thank you, Mexico Tour’. A name that makes clear just how much the group’s very survival is owed to a modest student from that city.
In Xataka | Three millennia of pop: the oldest song in the world is 3,400 years old and we can still hear it
