{"id":42820,"date":"2014-05-19T12:00:17","date_gmt":"2014-05-19T11:00:17","guid":{"rendered":"https:\/\/www.transcend.org\/tms\/?p=42820"},"modified":"2015-05-05T21:34:59","modified_gmt":"2015-05-05T20:34:59","slug":"nobel-peace-laureates-call-for-preemptive-ban-on-killer-robots","status":"publish","type":"post","link":"https:\/\/www.transcend.org\/tms\/2014\/05\/nobel-peace-laureates-call-for-preemptive-ban-on-killer-robots\/","title":{"rendered":"Nobel Peace Laureates Call For Preemptive Ban on \u201cKiller Robots\u201d"},"content":{"rendered":"<p><em>(Ottawa)\u2014May 12, 2014.<\/em><\/p>\n<p>Referring to the development of weapons that could select targets and kill people without any human intervention as \u201cunconscionable\u201d, 20 individuals and organizations who have won the Nobel peace prize today issued a joint statement endorsing the call for a preemptive ban on these fully autonomous weapons.<\/p>\n<p>The signatories\u2014which include Jody Williams (1997), Lech Walesa (1983), Archbishop Desmond Tutu (1984), President F.W. 
de Klerk (1993), President Oscar Arias S\u00e1nchez (1987), Shirin Ebadi (2003) and Tawakkol Karman (2011)\u2014warn that robotic machines are \u201calready taking the place of soldiers on the battlefield.\u201d In their statement, the Nobel laureates note the concern that \u201cleaving the killing to machines might make going to war easier and shift the burden of armed conflict onto civilians.\u201d\u00a0 They also \u201capplaud and support\u201d the efforts of the Campaign to Stop Killer Robots to move us back from a \u201cpossible future of robotic warfare.\u201d<\/p>\n<p>The Nobel peace laureates have released their statement on the eve of the first-ever multilateral talks on killer robots, taking place this week at the United Nations in Geneva.\u00a0 The Convention on Conventional Weapons (CCW) is hosting a meeting of experts Tuesday, May 13 to Friday, May 16.<\/p>\n<p>Fully autonomous weapons do not yet exist, but several robotic systems with various degrees of autonomy and lethality are currently in use by the US, China, Russia, Israel, South Korea, and the UK, and these and other nations are moving toward ever-greater autonomy in weapons systems.<\/p>\n<p>The Nobel Women\u2019s Initiative is a co-founder of the Campaign to Stop Killer Robots, a global coalition of 51 nongovernmental organizations in two dozen countries.<\/p>\n<p><strong>For more information, please contact:<\/strong><\/p>\n<p>Rachel Vincent,<br \/>\nDirector of Media &amp; Communications<br \/>\nPhone +1 613-569-8400 ext.113<br \/>\n<a href=\"mailto:rvincent@nobelwomensinitiative.org\">rvincent@nobelwomensinitiative.org<\/a><\/p>\n<p>Zuzia Danielski<br \/>\nOnline Media &amp; Outreach Coordinator<br \/>\nPhone +1 613-569-8400 ext. 
114<br \/>\n<a href=\"mailto:zdanielski@nobelwomensinitiative.org\">zdanielski@nobelwomensinitiative.org<\/a><\/p>\n<p>&nbsp;<\/p>\n<p><strong>Nobel Peace Laureates Call for Preemptive Ban on Autonomous Weapons<\/strong><\/p>\n<p>In April 2013 in London, a group of nongovernmental organizations \u2013 most associated with the successful efforts to ban landmines and cluster munitions \u2013 publicly launched the \u201cCampaign to Stop Killer Robots.\u201d\u00a0 Their efforts have helped bring the issue of fully autonomous weapons to a broader audience and spur governments to begin discussions on these weapons this May in Geneva.<\/p>\n<p>We, the undersigned Nobel Peace Prize Laureates, applaud this new global effort and whole-heartedly embrace its goal of a preemptive ban on fully autonomous weapons that would be able to select and attack targets on their own. It is unconscionable that human beings are expanding research and development of lethal machines that would be able to kill people without human intervention.<\/p>\n<p>Not all that long ago such weapons were considered the subject of science fiction, Hollywood and video games. But some machines are already taking the place of soldiers on the battlefield.\u00a0 Some experts in the field predict that fully autonomous weapons could be developed within 20 to 30 years; others contend it could even be sooner. With the rapid development of drones and the expansion of their use in the wars in Afghanistan and Iraq \u2013 and beyond \u2013 billions of dollars are already being spent to research new systems for the air, land, and sea that one day would make drones seem as quaint as the Model T Ford does today.<\/p>\n<p>Too many applaud the so-called success of drone warfare and extol the virtues of the weapons. While these unmanned aircraft can fly thousands of miles from home base on their own, they still require individuals watching computer screens to fire their weapons and attack a target. 
Already over 70 countries have drones and many are looking to develop methods to make them ever more autonomous and create new lethal robots that will, in fact, kill human beings on their own.<\/p>\n<p>Those who favor the development of autonomous lethal robots make many arguments on their behalf.\u00a0 They note that such machines do not put soldiers\u2019 lives at risk nor do they tire or become frightened.\u00a0 Emotion would not cloud their decision-making.\u00a0 They also say that ultimately lethal autonomous robots will be cheaper than manned systems and laud that feature in these times of cutting government budgets.<\/p>\n<p>But not everyone supports the arguments.\u00a0 In its very aptly titled report, \u201cLosing Humanity:\u00a0 The Case Against Killer Robots,\u201d Human Rights Watch outlined legal and other arguments against the development of such weapons.\u00a0 The report says that such robots will have serious challenges meeting tests of military necessity, proportionality and distinction, which are fundamental to the laws of war. Lethal autonomous weapons would also threaten essential non-legal safeguards for civilians. They would not be constrained by the capacity for compassion, which can provide a key check on killing civilians. These arguments were also brought to the fore in the report of the UN special rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns, presented to the UN Human Rights Council in May 2013.<\/p>\n<p>Of course a key argument for robotic weapons is that using them could reduce military casualties.\u00a0 On the flip side, many fear that leaving the killing to machines might make going to war easier and shift the burden of armed conflict onto civilians. 
The use of fully autonomous weapons raises serious questions of accountability.\u00a0 Who should be held responsible for any unlawful actions they commit?\u00a0 The military commander?\u00a0 The company that makes the robot?\u00a0 The company that produces the software?\u00a0 The obstacles to holding anyone accountable are huge and would significantly weaken the power of the law to deter future violations.<\/p>\n<p>While there has been some heated debate about the dangers and possible virtues of such weapons, until now it had almost exclusively occurred among scientists, ethicists, lawyers and the military. Even as killer robots loom over our future, there had been virtually no public discussion about the ethics and morality of fully autonomous weapons, let alone the implications and impact of their potential use.<\/p>\n<p>But the work of the campaign is changing that, and even in the lead-up to the April 23<sup>rd<\/sup> launch of the Campaign to Stop Killer Robots, interest and public awareness had begun to grow.\u00a0 The press has increasingly begun to report on killer robots, with both the New York Times and the Wall Street Journal running opinion pieces outlining the moral and legal perils of creating killer robots and calling for public discourse before it is too late.<\/p>\n<p>Lethal robots would completely and forever change the face of war and likely spawn a new arms race.\u00a0 Can humanity afford to follow such a path?\u00a0 We applaud and support the efforts of civil society\u2019s Campaign to Stop Killer Robots to help move us away from a possible future of robotic warfare.<\/p>\n<p><strong><br \/>\nIndividuals:<\/strong><\/p>\n<p>Mairead Maguire (1976)<\/p>\n<p>Adolfo P\u00e9rez Esquivel (1980)<\/p>\n<p>President Lech Walesa (1983)<\/p>\n<p>Archbishop Desmond Tutu (1984)<\/p>\n<p>President Oscar Arias S\u00e1nchez (1987)<\/p>\n<p>Rigoberta Mench\u00fa Tum (1992)<\/p>\n<p>President F.W. 
de Klerk (1993)<\/p>\n<p>President Jos\u00e9 Ramos-Horta (1996)<\/p>\n<p>Jody Williams (1997)<\/p>\n<p>John Hume (1998)<\/p>\n<p>Shirin Ebadi (2003)<\/p>\n<p>Muhammad Yunus (2006)<\/p>\n<p>Leymah Gbowee (2011)<\/p>\n<p>Tawakkol Karman (2011)<\/p>\n<p>&nbsp;<\/p>\n<p><strong>Organizations<\/strong>:<\/p>\n<p>American Friends Service Committee (The Quakers) (1947) \u2013 Shan Cretin, General Secretary<\/p>\n<p>Amnesty International (1977) \u2013 Salil Shetty, Secretary-General<\/p>\n<p>International Campaign to Ban Landmines (1997) \u2013 Sylvie Brigot-Vilain, Executive Director<\/p>\n<p>International Peace Bureau (1910) \u2013 Colin Archer, Secretary-General<\/p>\n<p>International Physicians for the Prevention of Nuclear War (1985) \u2013 Michael Christ, Executive Director<\/p>\n<p>Pugwash Conferences on Science and World Affairs (1995) \u2013 Jayantha Dhanapala, President<\/p>\n<p><a target=\"_blank\" href=\"http:\/\/nobelwomensinitiative.org\/2014\/05\/nobel-peace-laureates-call-for-preemptive-ban-on-killer-robots\/\" >Go to Original \u2013 nobelwomensinitiative.org<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>We, the undersigned Nobel Peace Prize Laureates, applaud this new global effort and whole-heartedly embrace its goal of a preemptive ban on fully autonomous weapons that would be able to select and attack targets on their own. 
It is unconscionable that human beings are expanding research and development of lethal machines that would be able to kill people without human intervention.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[105],"tags":[],"class_list":["post-42820","post","type-post","status-publish","format-standard","hentry","category-nobel-laureates"],"_links":{"self":[{"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/posts\/42820","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/comments?post=42820"}],"version-history":[{"count":0,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/posts\/42820\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/media?parent=42820"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/categories?post=42820"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/tags?post=42820"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}