Autonomous weapons are here, but the world is not ready for them


This may be remembered as the year the world learned that lethal autonomous weapons have shifted from a future worry to a battlefield reality. It was also the year policymakers failed to agree on what to do about it.

On Friday, the 120 countries participating in the United Nations Convention on Certain Conventional Weapons could not agree on whether to restrict the development or use of lethal autonomous weapons. Instead, they pledged to continue and “strengthen” the discussions.

“This is very disappointing, and it’s a really missed opportunity,” said Neil Davidson, a senior science and policy adviser at the International Committee of the Red Cross, a humanitarian organization based in Geneva.

The failure to reach an agreement came roughly nine months after a United Nations report concluded that lethal autonomous weapons had been used for the first time in armed conflict, during the Libyan civil war.

In recent years, more and more weapons systems have incorporated elements of autonomy. Some missiles, for example, can fly within a given area without specific instructions, but they still generally rely on a person to launch an attack. Most governments say that, at least for now, they plan to keep a human “in the loop” when using such technology.

But progress in artificial intelligence algorithms, sensors, and electronics is making it easier to build more sophisticated autonomous systems, raising the prospect of machines that can decide on their own when to use lethal force.

A growing number of countries, including Brazil, South Africa, New Zealand, and Switzerland, argue that lethal autonomous weapons should be restricted by treaty, as chemical and biological weapons and landmines already are. Germany and France support restrictions on certain kinds of autonomous weapons, including those that could target humans. China supports a very narrow set of restrictions.

Other countries, including the United States, Russia, India, the United Kingdom, and Australia, oppose a ban on lethal autonomous weapons, arguing that they need to develop the technology to avoid being placed at a strategic disadvantage.

Killer robots have long captured the public imagination, inspiring both beloved sci-fi characters and dystopian visions of the future. A recent resurgence in artificial intelligence, and the creation of new computer programs capable of outperforming humans in certain domains, has prompted some tech luminaries to warn of the existential threat posed by smarter machines.

The issue became more urgent this year after the United Nations reported that Turkish-made Kargu-2 drones were used in Libya’s civil war in 2020. Forces aligned with Libya’s internationally recognized government reportedly launched the drones against troops supporting Libyan National Army leader General Khalifa Haftar, and the drones targeted and attacked people on their own.

“The logistics convoy and the retreating Haftar-affiliated forces … were hunted down and remotely engaged” by the drones, the report stated. The systems were “programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability.”

The news reflects how quickly autonomous weapons technology is improving. “This technology is developing much faster than the military and political discussions,” said Max Tegmark, an MIT professor and co-founder of the Future of Life Institute, an organization dedicated to addressing existential risks facing humanity. “By default, we are heading towards the worst outcome.”
