Could Robots Be More Deadly Than Humans?
TRANSCEND MEMBERS, 20 Mar 2017
20 Mar 2017 – The ever-greater use by the United States – both the military and the CIA – of drone air strikes in Yemen, Somalia, Syria-Iraq, Afghanistan, Pakistan and elsewhere has drawn enough attention that formal international consideration will now begin on the use of “killer robots” and the possibility of autonomous artificial-intelligence weapons. US drone strikes have become so routine and bureaucratic that they provoke only small protests within the USA, usually on the part of human rights groups protesting the killing of individuals without trial. (1)
Governments, often in cooperation with private industry, are working on wider uses of drones and other robotic military equipment such as autonomous tanks and trucks. The possibility of an “arms race” among technologically advanced States such as the USA, Russia, China, South Korea and Israel is real, especially at a time when mutual trust is in short supply.
Contemporary armament dynamics tend to acquire a momentum of their own and to resist social control. Essentially, an arms race can become a race in technology. The rhythm of technological advances far outstrips the pace of arms control negotiations, and new weapons reaching the production line make arms control agreements on older weapons systems obsolete.
For the moment, those concerned with arms control issues put the emphasis on human control of drones rather than a complete ban. There should always be a human who “pulls the trigger,” even if that person is far away; robots should not be completely autonomous. There is a “science fiction” fear that robots might prove even more deadly, and with less conscience, than humans. It is better not to test the hypothesis.
The most structured avenue of action for those of us concerned with the issue is the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects. There is a yearly, but only two-day, review of the Convention, held this year on 22-24 November 2017. A meeting of government experts in August will prepare the November sessions, so letters of concern on “killer robots” should be sent to Foreign Ministers before August. There are 107 States which have ratified the Convention, any one of which could take a leading role; hence the importance of contacting as many States as possible. The negotiating conference which led to the Convention was chaired by Ambassador Oluyemi Adeniji of Nigeria.
The Convention, which came into force in 1983, owes much to US actions during the war in Vietnam. Thanks to the participation of non-governmental organization (NGO) representatives, the prohibitions and restrictions most discussed during the negotiations on the Convention in 1978 and 1979 concerned napalm and flechettes – an ancestor of the cluster munitions later widely used. Flechettes were small metal arrows that a bomb would send off in all directions. They were so small that they could hardly be detected by an X-ray machine, in the rare cases when there was an X-ray machine anywhere near the fighting in Vietnam.
I was among the NGO representatives pushing the flechette issue. Our statement to the conference stressed: “We submit further that the development of the flechette is a particularly flagrant example of the abuse, for destructive purposes, of technology which should rather be directed to meeting the legitimate peaceful needs of mankind.” Although I hate to repeat myself, I can say the same thing today concerning “killer robots.”
Through NGO efforts, strongly supported by the Government of Sweden, we were able to get a Protocol to the Convention concerning fragments non-detectable by X-ray. Our effort had begun in 1973 at the time of a working group of the International Committee of the Red Cross dealing with incendiary and fragmentation weapons.
Although the United Nations has not yet been able to prevent armed conflicts, whether between States or in “civil wars,” there needs to be an effort to reduce the suffering that such conflicts cause. Today, as NGOs, we work to bring together concerns for international humanitarian law, human rights and respect for the dignity of each person, and arms control. As we have seen, it was a long road from 1973 to 1983, when the Convention came into force, and the road continues to today, when the spirit of the Convention needs to be upheld and applied. It was necessary to remain in close contact with Government representatives, the International Committee of the Red Cross, and the NGOs on the “front lines.” We face the same challenges today in regulating artificial-intelligence lethal weapons. While we know that humans can be destructive, we work in the knowledge that we can limit human destructiveness and turn real intelligence toward the common good.
(1) For a useful NGO analysis, see “Killer Robots and the Concept of Meaningful Human Control,” Human Rights Watch, April 2016.
René Wadlow, a member of the Fellowship of Reconciliation and of its Task Force on the Middle East, is president and U.N. representative (Geneva) of the Association of World Citizens and editor of Transnational Perspectives. He is a member of the TRANSCEND Network for Peace, Development and Environment.
This article originally appeared on Transcend Media Service (TMS) on 20 Mar 2017.
Anticopyright: Editorials and articles originated on TMS may be freely reprinted, disseminated, translated and used as background material, provided an acknowledgement and link to the source, TMS: Could Robots Be More Deadly Than Humans?, is included. Thank you.
This work is licensed under a CC BY-NC 4.0 License.