A woman from 7,000 years ago suggests that gender was not an immovable barrier

For decades, our vision of European prehistory has been dominated by a fairly rigid idea about the division of labor in early communities: men were assigned certain tasks and women others. However, bones have a fascinating habit of disproving our prejudices, as has now happened after the analysis of human remains found in Hungary.

What has been seen. A new analysis of human remains dating back more than 7,000 years has revealed an elderly woman buried not only with typically "masculine" grave goods, but also with marks on her bones showing that she did the same physical work as the men. It is a finding that marks a before and after in our understanding of gender roles in prehistory.

The rule and the exception. To understand the magnitude of the find, an international research team thoroughly analyzed 125 adult skeletons from different cemeteries in Hungary. The researchers already knew that these communities had structured gender norms, since the funerary "rules" were very clear: men were buried lying on their right side and accompanied by polished stone tools, while women lay on their left side and their grave goods usually consisted of belts made of shells. Up to this point everything seemed to fit a perfect binary system, until the researchers came across the skeleton of an elderly woman who, unlike the rest, had been buried with polished stone tools, the classic "masculine" status symbol of her culture.

They went further. If the grave goods in this burial were already an anomaly by the standards of the time, the biomechanical analysis of the skeleton ended up surprising the scientists. The researchers did not limit themselves to looking at which objects accompanied the dead: they cross-referenced those data with the patterns of physical activity imprinted on the bones, such as the natural wear of different skeletal areas.
Basically, bones adapt and deform according to the postures and loads we endure throughout our lives, which is why they can tell us a great deal about the work we did over the long term. Here the researchers discovered that the men of this community tended to have marks associated with prolonged kneeling and intensive use of the arms, probably due to the use of specific tools or to carrying work. The women did not have these marks because they did not carry out those tasks.

The surprise. The skeleton that attracted so much attention showed the same bone marks and joint wear from kneeling that the men had. Not only was this woman buried as a man: she lived, worked, and moved like one of them.

Neolithic gender. The study brings a fascinating conclusion to the table: Neolithic societies did have marked gender roles and a structured division of labor, but it was not something set in stone that "condemned" a person to a job for being a man or a woman. As the researchers put it, the roles were "generalized but flexible." The fact that this community decided to bury a woman with the honors of a man, recognizing the role she played in life, shows that in Europe seven thousand years ago there was room for exceptions.

Images | engin akyurt

In Xataka | 2,000 years ago Epicurus had already understood the secret of pleasure: "Nothing is enough for the man to whom enough is too little."

Spain has entrusted part of the fight against gender violence to an algorithm. And it is translating into fatal failures

A woman named Lina went to the police last January to report her ex-partner, and her case entered Viogén. This system, based on an algorithm, determined that Lina was a "medium" risk person. Three weeks later she was allegedly murdered by her ex-partner. It is not the first time something like this has happened, and it shows that we have a serious problem with our potential dependence on algorithms.

The origin of Viogén. The Interior Ministry developed the Viogén system (Comprehensive Monitoring in Cases of Gender Violence) in 2007. Among its objectives was to produce a risk prediction and, depending on that prediction, to monitor and protect the victims.

How it works. The system is based on collecting and analyzing data from various sources such as police complaints, protection orders, or criminal records. When filing the complaint, for example, the victim is asked a series of questions about the episode of aggression, her situation, her children, the aggressor's profile, and aggravating vulnerabilities such as economic dependence.

Risk levels. From that evaluation, each case is assigned one of four risk levels (1: low, 2: medium, 3: high, 4: extreme). Each level entails specific measures, ranging from the allocation of telecare devices to restraining orders. At the extreme risk level, women receive 24-hour police surveillance.

Viogén 2. The system has evolved since its creation, and in recent months its second version, Viogén 2, has been rolled out. As explained in Artículo 14, the algorithm was updated with changes such as eliminating the "no risk detected" category and making it harder to deactivate open cases. A new supervised-inactivation mode establishes police control mechanisms for a period of six months, extendable to one year. That makes it possible to keep monitoring cases in which police experts have found no risk for the woman, or only a low one.

Zero protocol.
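The tiered scheme described above can be made concrete with a minimal sketch. This is purely illustrative: the real Viogén scoring model is not public, so the thresholds, score scale, and measures below are assumptions, not the actual system.

```python
# Hypothetical sketch of a Viogén-style triage: map an assessment score to
# one of the four risk levels the article describes (1 low, 2 medium,
# 3 high, 4 extreme) and to the measures attached to each level.
# All thresholds and measure lists here are illustrative assumptions.

MEASURES = {
    "low": ["periodic phone follow-up"],
    "medium": ["telecare device", "occasional police checks"],
    "high": ["restraining-order monitoring", "frequent police checks"],
    "extreme": ["24-hour police surveillance"],
}

def assess(score: float) -> tuple[int, str, list[str]]:
    """Map a normalized risk score in [0, 1] to (level number, name, measures)."""
    if score < 0.25:
        level, name = 1, "low"
    elif score < 0.5:
        level, name = 2, "medium"
    elif score < 0.75:
        level, name = 3, "high"
    else:
        level, name = 4, "extreme"
    return level, name, MEASURES[name]
```

The article's core criticism maps directly onto a design like this: if the scoring function systematically underestimates risk, a case such as Lina's lands in "medium" and never triggers the protective measures reserved for the higher tiers.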
There are also changes that will allow the victim to request, in a "voluntary, manifest and repeated" way, the inactivation of cases with no detected, low, or medium risk. Even so, the so-called "zero protocol" is designed to minimize the risk for victims who express their wish not to file a complaint. According to the Equality Ministry's macro-survey, the vast majority of victims do not report, and they too must be protected: institutions only know of 21.7% of cases, according to that survey.

Tragedies everywhere. The problem is that the system is not entirely effective. The alleged murder of Lina is the latest example of Viogén's limited reliability. In October 2024, a 56-year-old woman was killed despite having asked for help twice. Earlier in 2024, another woman was killed by her partner, and she was also part of the Viogén system.

The algorithm seems to minimize the risk. In Lina's case, for example, the Viogén system assigned her a "medium" risk, and that seems to happen on more occasions. In September 2024, 96,644 women were within the Viogén system, but only 12 of them were considered extreme risk, roughly 0.01% of the total. Both the Minister of Equality, Ana Redondo, and the Minister of the Interior, Fernando Grande-Marlaska, played down the errors, acknowledging that "the model is not infallible, but it saves many lives."

New alarms about AI and algorithms. In recent times we are seeing more and more cases in which excessive confidence is placed in algorithms used by administrations and public institutions on especially sensitive issues. AI keeps making mistakes. It happened with the Veripol system, which used AI to detect false complaints: its real reliability was highly debatable. A bit earlier, in March, we had the 'Ábalos case' scandal, in which an AI used to transcribe the statements of witnesses and defendants made mistakes and ended up turning some paragraphs into gibberish.
The AI facial-recognition system being used, for example, in video surveillance cameras in Madrid has also raised privacy alarms. In the United Kingdom, an AI was used to predict crimes, Minority Report style, and its results were dismal. Attempts to apply AI in judicial and police processes have likewise produced worrying conclusions.

Lack of transparency. These systems are also commonly criticized for their lack of transparency. Veripol is a good example, but there have been others. In 2024 we talked about the Bosco system, used by electricity companies to decide who can and cannot receive the social subsidy on their electricity bill. The Government refused to share the source code, citing reasons of public security and national defense. Nor is it a problem exclusive to Spain: there is, for example, an algorithm that suggests to US judges what sentences to impose, but its code is secret. In such delicate matters, the lack of transparency about how these algorithms work is especially worrying.

Weren't there agencies for this? In 2021 the creation of a Spanish Agency for the Supervision of Artificial Intelligence (AESIA) was announced. It was apparently intended to monitor compliance with the Digital Services Act (DSA) on platforms such as the big social networks, and in fact in 2022 Seville was chosen to house the first European Centre for Algorithmic Transparency (ECAT).

What about AESIA. More recently we have seen AESIA finally take shape, with its A Coruña headquarters due to start operating in 2025 and to focus, in theory, on the application of the EU AI Act.
Its objective is, in theory, to carry out "measures for the minimization of significant risks to the safety and health of people, as well as to their fundamental rights, that may derive from the use of AI systems." The cases of Viogén, Veripol, and the 'Ábalos case' all seem likely to fall precisely within that area, and it remains to be seen whether this agency's activity manages to ensure that both the algorithms used and their application are sound.

Image | James Harrison | National Police

In Xataka | We live a concentration crisis. Experts …
