
There is a non-proliferation treaty for nuclear weapons. What we need in 2025 is one that prevents killer robots

In 1139 Pope Innocent II prohibited the use of the crossbow, describing it as "a weapon detestable to God and unworthy of Christians," although he considered it acceptable for fighting the infidels. The measure was largely ignored, and the crossbow continued to be used for centuries afterward. The history of weapons has always been linked to such prohibitions, and now there is an especially delicate one: the one affecting the so-called "killer robots."

Prohibited weapons. The crossbow ended up being just one of many examples in the evolution of military weapons and their use in armed conflicts. In 1970 the Nuclear Non-Proliferation Treaty entered into force to curb the spread of nuclear weapons, and international law also prohibits the use of chemical weapons, biological weapons, and antipersonnel mines. These agreements are usually ratified not by every country in the world, but by the vast majority.

Beware of autonomous weapons. As Reuters reports, the United Nations has called a meeting to regulate autonomous weapons controlled by artificial intelligence. This type of armament is increasingly used in modern armed conflicts, and experts warn that it is time to put limits on this lethal technology.

The Ukraine war as an example. What is happening in the war in Ukraine shows how these autonomous weapons are being used. Drones and robots are being deployed extensively against Russian troops, and there are already drones launching drones to attack other drones. The importance of these weapons has even been affected by the trade war between the US and China, which puts the production and export of these autonomous vehicles at risk. The New York Times has already warned of the rise of the fearsome "kamikaze drones" and their use in this conflict.

Ten years talking about banning killer robots. At Xataka we have long been talking about the danger of AI-powered weapons and the drift toward the infamous "killer robots." The debate over a potential ban goes back a long way, and organizations such as Human Rights Watch have been trying since 2015 to ban them before it is too late. Researchers themselves warned of that danger in 2017, and Brad Smith, president of Microsoft, claimed that these killer robots are "unstoppable." Even so, many countries have continued to develop them, and there is no consensus on setting limits in this dangerous area.

Deadline. The Secretary-General of the United Nations, António Guterres, has set 2026 as the deadline for all countries to establish clear rules on the use of weapons with artificial intelligence. His words are clear: these autonomous weapons systems are "politically unacceptable, morally repugnant" and should be prohibited.

There is no consensus. What is missing is precisely that consensus. Alexander Kmentt, head of arms control at the Austrian Foreign Ministry, put it this way: "Time is pressing to stop the nightmares that some of the most prestigious experts warn of," he said. Major figures of the technology world such as Elon Musk and Demis Hassabis already warned of the problem in 2018 and asked the UN to ban autonomous weapons.

The military resists. Diplomatic efforts run up against military resistance: according to Reuters, armed forces oppose regulation because it could blunt the advantages these technologies offer on the battlefield. This latest meeting of the Convention on Certain Conventional Weapons (CCW) is the most recent in a series of gatherings held since 2014. Participants have called for "a legally binding treaty" for UN member states.

But some countries prefer to go at their own pace. Many countries support such a general agreement, but the US, Russia, China, and India prefer national regulations or the application of existing international law, according to Amnesty International. A US Pentagon spokesperson told Reuters that "we are not convinced that existing laws are insufficient" and stressed that autonomous weapons could pose a lower risk to civilians than conventional weapons.

And since there is no regulation, there is proliferation. The absence of limits is driving rapid development of these autonomous weapons. Experts at the Future of Life Institute have tracked the deployment of about 200 autonomous weapons systems in Ukraine, the Middle East, and Africa. Russian forces have deployed some 3,000 Veter kamikaze drones in Ukraine, according to that data, and as we have noted on several occasions at Xataka, Ukraine for its part has made drones one of its critical tools for striking Russian targets.

Duality. As my colleague Javier Jiménez explained in an excellent piece he wrote in 2018, another problem with this debate is that "it is very difficult to determine what to prohibit and what not in a world as thoroughly computerized as warfare." The key is not so much technological as ethical, and here we face a dual-use technology that can serve both civil and military purposes. His reflection was clear: "No one is going to give up a strategic military asset over an ethical issue," he said. He concluded that "beyond alarmism, we need tools" to identify, monitor, and control the development of these weapons, because "neither good intentions nor self-control have worked well in the past."

A lot of money at stake. As always, one of the driving factors in this industry is that there is a lot of money at stake, all the more so amid a renewed fever for increasing defense budgets. Laura Nolan, of the activist organization Stop Killer Robots, made it clear that there is no guarantee technology companies will act responsibly when developing these systems: "In general, we do not trust industries to self-regulate... there is no reason why defense or technology companies should be more worthy of trust."

In Xataka | Ukraine has found a workaround for China's drone veto: it's called Hell, it's a "homemade" missile, and it doubles the attack range

In Xataka | This is how the professor who defends robot soldiers sees the war of the future
