Should lethal autonomous weapons be banned?
Mary Wareham
Ten years ago, Human Rights Watch united with other civil society groups to co-found the Stop Killer Robots campaign in response to emerging military technologies in which machines would replace human control in the use of armed force.
There is now widespread recognition that weapons systems that select and attack targets without meaningful human control represent a dangerous development in warfare, with equally disastrous implications for policing. At the United Nations in October, 70 countries, including the United States, acknowledged that autonomy in weapons systems raises “serious concerns from humanitarian, legal, security, technological and ethical perspectives.”
Delegating life-and-death decisions to machines crosses a moral line, as they would be incapable of appreciating the value of human life and respecting human dignity. Fully autonomous weapons would reduce humans to objects or data points to be processed, sorted and potentially targeted for lethal action.
A U.N. Human Rights Council resolution adopted Oct. 7 stresses the central importance of human decision-making in the use of force. It warns against relying on nonrepresentative data sets, algorithm-based programming and machine-learning processes. Such technologies can reproduce and exacerbate existing patterns of discrimination, marginalization, social inequalities, stereotypes and bias — with unpredictable outcomes.
The only way to safeguard humanity from these weapons is by negotiating new international law.
Such an agreement is feasible and achievable. More than 70 countries see an urgent need for “internationally agreed rules and limits” on autonomous weapons systems. This objective has strong support from scientists, faith leaders, military veterans, industry and Nobel Peace laureates.
On Oct. 6, Boston Dynamics and five other robotics companies pledged to not weaponize their advanced mobile robots or the software they develop — and called on the robotics community to follow suit.
There’s now much greater understanding among governments of the essential elements of the legal framework needed to address this issue. There is strong recognition that a new international treaty should prohibit autonomous weapons systems that inherently lack meaningful human control or that target people. The treaty should also ensure that other weapons systems can never be used without meaningful human control.
The inability of the current discussion forum to progress to negotiations, due to opposition from some major military powers such as Russia and the United States, shows its limitations. A new path is urgently needed to negotiate new law. The United States should realize that it is in its interest to participate in drafting new law on killer robots.
Without a dedicated international legal standard on killer robots, the world faces an increasingly uncertain and dangerous future.