<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>ICRAC in the media &#8211; ICRAC</title>
	<atom:link href="https://www.icrac.net/category/icrac-media/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.icrac.net</link>
	<description>International Committee for Robot Arms Control</description>
	<lastBuildDate>Fri, 22 Feb 2019 19:04:30 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.8.2</generator>
<site xmlns="com-wordpress:feed-additions:1">128339352</site>	<item>
		<title>LAWS: An Open Letter from AI &#038; Robotics Experts</title>
		<link>https://www.icrac.net/laws-an-open-letter-from-ai-robotics-experts/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Wed, 29 Jul 2015 08:15:26 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[ICRAC in the media]]></category>
		<category><![CDATA[YouTube video]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2858</guid>

					<description><![CDATA[Thousands of experts in artificial intelligence, robotics and related professions have signed an open letter, hosted by the Future of Life Institute, calling for a ban on autonomous weapons that select and engage targets without human intervention. You can read more on the importance of this letter to the current global effort of banning lethal [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p>Thousands of experts in artificial intelligence, robotics and related professions have signed an <a href="http://futureoflife.org/AI/open_letter_autonomous_weapons#signatories">open letter</a>, hosted by the <a href="http://futureoflife.org/">Future of Life Institute</a>, calling for a ban on autonomous weapons that select and engage targets without human intervention.</p>
<p>You can read more on the importance of this letter to the current global effort of banning lethal autonomous weapon systems (LAWS) on the <a href="http://www.stopkillerrobots.org/2015/07/aicall/">Campaign to Stop Killer Robots website</a>.</p>
<p>ICRAC is committed to the peaceful uses of robotics and the regulation of robot weapons. We welcome this open letter, and over the past few days numerous members of ICRAC have not only signed it but also commented on this latest development in various media outlets. Below is a selection of audio and video content in English, Italian and German.</p>
<p><iframe loading="lazy" src="http://www.cnet.com/videos/share/id/kje1bJWKBsCFql0OOXKhKB1Rtqf46pvn/" width="600" height="350" frameborder="0" seamless="seamless" allowfullscreen="allowfullscreen"></iframe></p>
<p><strong>CNET – Ban autonomous weapons, urge AI experts including Hawking, Musk and Wozniak</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Noel Sharkey and Thomas Nash</a>)<br />
Release date: 27 Jul 2015</p>
<p><iframe loading="lazy" src="https://s.embed.live.huffingtonpost.com/HPLEmbedPlayer/?segmentId=55aef80bfe34445cec000164&amp;autoPlay=false" width="570" height="321" frameborder="0"></iframe></p>
<p><strong>Huffpost Live – A.I. Experts Push For Military Robot Ban</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Heather Roff and Ian Kerr</a>)<br />
Hundreds of artificial intelligence experts and revered thinkers, including Stephen Hawking and Elon Musk, are calling for a global ban on military robots. We explore the issue and whether these autonomous weapons could lower the threshold for war.<br />
Release date: 28 Jul 2015</p>
<p><iframe loading="lazy" src="https://www.youtube.com/embed/XIpmztX1X68" width="560" height="315" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p><strong>CTV – Will the growth of killer robots set off a global arms race?</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Ian Kerr</a>)<br />
In this 5-minute interview with Canada AM host Beverly Thompson, Ian Kerr discusses the difference between semi-autonomous and autonomous weapons, the call to ban killer robots, why he is a signatory to the open letter, and why efficacy is not the only consideration in deciding whether to ban a dangerous use of technology.<br />
Release date: 04 Aug 2015</p>
<p><iframe loading="lazy" src="http://www.bbc.co.uk/programmes/p02y817k/player" width="400" height="500" frameborder="0"></iframe></p>
<p><strong>BBC World Service – Fighting off ‘Killer Robots’</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Heather Roff</a>)<br />
More than 1,000 tech experts, scientists and researchers have written a letter warning about the dangers of autonomous weapons and cautioning that a ‘military AI race is a bad idea’. One signatory, Heather Roff Perkins from the University of Denver, spoke to the BBC’s Dominic Laurie.<br />
Release date: 28 Jul 2015</p>
<p><iframe loading="lazy" src="https://w.soundcloud.com/player/?url=https%3A//api.soundcloud.com/tracks/217772360&amp;color=ff5500" width="100%" height="166" frameborder="no" scrolling="no"></iframe></p>
<p><strong>Killer robots: the coming arms race?</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Peter Asaro</a>)</p>
<p>Are you worried about killer robots? Last week, some of the most prominent thinkers in science and technology signed an open letter that warned of the coming arms race should militaries pursue the development and deployment of artificially intelligent weaponry. The letter was written by The Campaign to Stop Killer Robots, an international coalition of NGOs, and was signed by almost 14,000 people, including Stephen Hawking, Elon Musk, and Steve Wozniak. Today, we discuss the threat of fully autonomous weapons and artificially-intelligent warfare with PETER ASARO, professor of media studies at The New School and spokesperson for The Campaign to Stop Killer Robots, roboticist and Georgia Tech professor RONALD ARKIN, as well as GEORGE ZARKADAKIS, writer and AI architect.</p>
<p>Release date: 6 Aug 2015</p>
<p><iframe loading="lazy" src="https://w.soundcloud.com/player/?url=https%3A//api.soundcloud.com/tracks/218308595&amp;color=ff5500" width="100%" height="166" frameborder="no" scrolling="no"></iframe></p>
<p><strong>Ban Killer Robots Interview – Calgary Newstalk 770</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Ian Kerr</a>)<br />
In this 25-minute interview with Roger Kinkade and Rob Breakenridge of Newstalk770, CHQR Radio, Ian Kerr talks at length about autonomous weapons and the AI community’s call to ban “killer robots”. The conversation covers what a killer robot is, why such weapons are likely to be developed, what the dangers are if we don’t ban them, and a number of broader issues regarding the future of artificial intelligence.<br />
Release date: 8 Aug 2015</p>
<p>&nbsp;</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2858</post-id>	</item>
		<item>
		<title>Model United Nations Urges Ban on Killer Robots</title>
		<link>https://www.icrac.net/model-united-nations-urges-ban-on-killer-robots/</link>
		
		<dc:creator><![CDATA[mbolton]]></dc:creator>
		<pubDate>Mon, 06 Apr 2015 21:02:40 +0000</pubDate>
				<category><![CDATA[ICRAC in the media]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Ban Ki-moon]]></category>
		<category><![CDATA[Campaign to Stop Killer Robots]]></category>
		<category><![CDATA[CCW]]></category>
		<category><![CDATA[Convention on Certain Conventional Weapons]]></category>
		<category><![CDATA[Human Rights Watch]]></category>
		<category><![CDATA[International Committee for Robot Arms Control]]></category>
		<category><![CDATA[international humanitarian law]]></category>
		<category><![CDATA[Killer Robots]]></category>
		<category><![CDATA[LAWS]]></category>
		<category><![CDATA[Lethal Autonomous Weapons Systems]]></category>
		<category><![CDATA[Marten's Clause]]></category>
		<category><![CDATA[Model United Nations]]></category>
		<category><![CDATA[NMUN]]></category>
		<category><![CDATA[Pace University]]></category>
		<category><![CDATA[United Nations]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2518</guid>

					<description><![CDATA[UN Secretary Ban Ki-moon “energized” by students’ “serious discussions” on autonomous weapons systems In less than two weeks, diplomats from around the world will gather at the United Nations in Geneva to discuss potential global regulations on “lethal autonomous weapons systems” that would be able to select and attack targets without direct human control. But [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='mbolton' src='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://matthewbreaybolton.com">mbolton</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Matthew Bolton is professor of political science at Pace University in New York City. He is an expert on global peace and security policy, focusing on multilateral disarmament and arms control policymaking processes. He has a PhD in Government and Master's in Development Studies from the London School of Economics and a Master's from SUNY Environmental Science and Forestry. Since 2014, Bolton has worked on the UN and New York City advocacy of the International Campaign to Abolish Nuclear Weapons (ICAN), recipient of the 2017 Nobel Peace Prize. Bolton has published six books, including Political Minefields (I.B. 
Tauris) and Imagining Disarmament, Enchanting International Relations (Palgrave Pivot).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><em>UN Secretary Ban Ki-moon “energized” by students’ “serious discussions” on autonomous weapons systems</em></p>
<div id="attachment_2520" style="width: 325px" class="wp-caption alignleft"><a href="https://i0.wp.com/www.icrac.net.php53-3.dfw1-2.websitetestlink.com/wp-content/uploads/2015/06/11083604_10152619791851923_7167836955760066498_n.jpg" target="_blank"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2520" class="wp-image-2520" style="border: 0px; margin: 0px;" src="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2015/06/11083604_10152619791851923_7167836955760066498_n-300x200.jpg?resize=315%2C210" alt="" width="315" height="210" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/11083604_10152619791851923_7167836955760066498_n.jpg?resize=300%2C200&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/11083604_10152619791851923_7167836955760066498_n.jpg?w=960&amp;ssl=1 960w" sizes="auto, (max-width: 315px) 100vw, 315px" /></a><p id="caption-attachment-2520" class="wp-caption-text">UN Secretary General Ban Ki-moon addresses National Model UN conference in the General Assembly Room, 26 March 2015. Photo: NMUN.</p></div>
<p>In less than two weeks, diplomats from around the world will gather at the United Nations in Geneva to discuss potential global regulations on “<a href="http://www.unog.ch/80256EE600585943/%28httpPages%29/4F0DEF093B4860B4C1257180004B1B30?OpenDocument">lethal autonomous weapons systems</a>” that would be able to select and attack targets without direct human control.</p>
<p>But last week, at the <a href="http://nmun.org/nmun_ny.html">National Model UN conference in New York</a>, attended by some 2,500 undergraduate students from all over the world, a simulation of the UN General Assembly passed three resolutions calling for states to take action to prevent the threat of these “killer robots” to security, human rights and humanitarian law.</p>
<p>Addressing the closing ceremony of the conference, UN Secretary General Ban Ki-moon <a href="http://www.un.org/apps/news/infocus/sgspeeches/statments_full.asp?statID=2550#.VR1yOyLD8us">told students</a> he was “energized by this dynamic gathering” and its “serious discussions” on “cutting-edge issues on the international agenda”, such as “lethal autonomous weapons systems.”</p>
<p>“You are not just leaders of the future – you can start to lead right now,” he told them, “now is the time for your generation to build human solidarity around the world.”</p>
<p>The NMUN NY resolutions defined lethal autonomous robots as “weapons that can select and attack targets independently – without meaningful human input or control”, suggested all countries immediately adopt a national moratorium on such weapons, and urged the negotiation of an international ban through an additional Protocol VI at the Convention on Certain Conventional Weapons (NMUN NY 2015A/GA1-1-1).</p>
<p>Model UN is a simulation of diplomacy, negotiation and decision-making by international organizations. Students play the role of diplomats from Member States of the UN and discuss issues at the top of the global policymaking agenda. NMUN NY is one of the biggest undergraduate Model UN conferences in the world.</p>
<p>The students assigned to simulate the General Assembly First Committee – which deals with issues of disarmament and international security – spent several months learning about their countries’ policy positions, the General Assembly and the politics of killer robots. (See for example, their <a href="http://nmun.org/ny15_downloads/BGGs/NY15_BGG_GA1.pdf">background guide</a>). After debate and drafting in the First Committee, the resolutions were passed by students representing the full plenary body in the actual General Assembly Room at the UN in New York.</p>
<p>The resolutions also called attention to the “work and expertise” of civil society, particularly the <a href="http://www.stopkillerrobots.org/">Campaign to Stop Killer Robots</a>, <a href="http://www.hrw.org/">Human Rights Watch</a> and the <a href="http://icrac.net/">International Committee for Robot Arms Control (ICRAC)</a> (NMUN NY 2015A/GA1-1 -1, GA1-1-2 and GA1-1-3).</p>
<p>In a briefing, Dr. Matthew Bolton, <a href="http://pacenycmun.org/">Model UN</a> advisor for <a href="http://www.pace.edu/dyson/academic-departments-and-programs/political-science/faculty/matthew-bolton">Pace University New York City</a> and member of ICRAC, told students at the conference that when new weapons technologies are not adequately addressed by existing regulations, the <a href="https://www.icrc.org/eng/resources/documents/misc/57jnhy.htm">Martens Clause</a> in international law requires states to be guided by “the principle of humanity and the dictates of public conscience.”</p>
<p>“Avoid the temptation to think this simulation is a meaningless game,” said Bolton, “A statement of strong concern from you could be considered an expression of public conscience – a challenge to policymakers in the real world to take action against killer robots.”</p>
<p>&nbsp;</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='mbolton' src='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://matthewbreaybolton.com">mbolton</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Matthew Bolton is professor of political science at Pace University in New York City. He is an expert on global peace and security policy, focusing on multilateral disarmament and arms control policymaking processes. He has a PhD in Government and Master's in Development Studies from the London School of Economics and a Master's from SUNY Environmental Science and Forestry. Since 2014, Bolton has worked on the UN and New York City advocacy of the International Campaign to Abolish Nuclear Weapons (ICAN), recipient of the 2017 Nobel Peace Prize. Bolton has published six books, including Political Minefields (I.B. Tauris) and Imagining Disarmament, Enchanting International Relations (Palgrave Pivot).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2518</post-id>	</item>
		<item>
		<title>Campaign to Stop Killer Robots takes significant step forward at UN</title>
		<link>https://www.icrac.net/campaign-to-stop-killer-robots-takes-significant-step-forward-at-un/</link>
		
		<dc:creator><![CDATA[mbolton]]></dc:creator>
		<pubDate>Fri, 15 Nov 2013 03:23:27 +0000</pubDate>
				<category><![CDATA[ICRAC in the media]]></category>
		<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[Campaign to Stop Killer Robots]]></category>
		<category><![CDATA[CCW]]></category>
		<category><![CDATA[Convention on Conventional Weapons]]></category>
		<category><![CDATA[Dave Akerson]]></category>
		<category><![CDATA[Geneva]]></category>
		<category><![CDATA[Heather Roff]]></category>
		<category><![CDATA[ICRAC]]></category>
		<category><![CDATA[International Committee for Robot Arms Control]]></category>
		<category><![CDATA[Killer Robots]]></category>
		<category><![CDATA[Matthew Bolton]]></category>
		<category><![CDATA[Noel Sharkey]]></category>
		<category><![CDATA[Peter Asaro]]></category>
		<category><![CDATA[UN]]></category>
		<category><![CDATA[United Nations]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2433</guid>

					<description><![CDATA[ICRAC welcomes the historic decision taken by nations to begin international discussions on how to address the challenges posed by fully autonomous weapons. The agreement marks the beginning of a process that the campaign believes should lead to an international ban on these weapons to ensure there will always be meaningful human control over targeting decisions [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='mbolton' src='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://matthewbreaybolton.com">mbolton</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Matthew Bolton is professor of political science at Pace University in New York City. He is an expert on global peace and security policy, focusing on multilateral disarmament and arms control policymaking processes. He has a PhD in Government and Master's in Development Studies from the London School of Economics and a Master's from SUNY Environmental Science and Forestry. Since 2014, Bolton has worked on the UN and New York City advocacy of the International Campaign to Abolish Nuclear Weapons (ICAN), recipient of the 2017 Nobel Peace Prize. Bolton has published six books, including Political Minefields (I.B. 
Tauris) and Imagining Disarmament, Enchanting International Relations (Palgrave Pivot).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><iframe loading="lazy" src="https://player.vimeo.com/video/79472645" width="500" height="281" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p style="text-align: left;" align="center">ICRAC welcomes the historic decision taken by nations to begin international discussions on how to address the challenges posed by fully autonomous weapons. The agreement marks the beginning of a process that the campaign believes should lead to an international ban on these weapons to ensure there will always be meaningful human control over targeting decisions and the use of violent force.</p>
<p>At 16:48 on Friday, 15 November 2013, at the United Nations in Geneva, states parties to the Convention on Conventional Weapons adopted a report containing a decision to convene on May 13-16, 2014 for their first meeting to discuss questions related to “lethal autonomous weapons systems,” also known as fully autonomous weapons or “killer robots.” These weapons are at the beginning of their development, but technology is moving rapidly toward increasing autonomy.</p>
<p>“This is a very significant step forward for the International Committee for Robot Arms Control (ICRAC),” said Professor Noel Sharkey, Chairman of ICRAC. “We are now on the first rung of the international ladder to fulfill our goal of stopping these morally obnoxious weapons from ever being deployed.”</p>
<p>ICRAC was formed in 2009 to initiate international discussion on autonomous weapons systems. It is made up of experts in robotic technology, artificial intelligence, computer science, international security and arms control, ethics and international law. It is a co-founder of the Campaign to Stop Killer Robots.</p>
<p>The Campaign to Stop Killer Robots believes that robotic weapons systems should not be making life and death decisions on the battlefield; that would be inherently wrong, morally and ethically. Fully autonomous weapons are likely to run afoul of international humanitarian law, and there are serious technical, proliferation, societal, and other concerns that make a preemptive ban necessary.</p>
<p>“Law follows technology. With robotic weapons, we have a rare opportunity to regulate a category of dangerous weapons before they are fully realized, and the CCW is our best opportunity for regulation,” said Dave Akerson, an ICRAC legal expert.</p>
<p>A total of 117 states are party to the Convention on Conventional Weapons, including nations known to be advanced in developing autonomous weapons systems: the United States, China, Israel, Russia, South Korea, and the United Kingdom. Adopted in 1980, this framework convention contains five protocols, including Protocol I prohibiting non-detectable fragments, Protocol III prohibiting the use of air-dropped incendiary weapons in populated areas, and Protocol IV, which preemptively banned blinding lasers.</p>
<p>“This is a momentous opportunity to get states on the record and behind a ban on fully autonomous offensive weapons,” said Heather Roff, an ICRAC philosopher. “If we can gain enough support, we might succeed in banning a technology before it actually harms innocent civilians.”</p>
<p>The agreement to begin work in the Convention on Conventional Weapons could lead to a future CCW Protocol VI prohibiting fully autonomous weapons.</p>
<p>ICRAC, with the Campaign to Stop Killer Robots, supports any action to urgently address fully autonomous weapons in any forum. The decision to begin work in the Convention on Conventional Weapons does not prevent work elsewhere, such as the Human Rights Council.</p>
<p>Since the topic was first discussed at the Human Rights Council on 30 May 2013, a total of 44 nations have spoken publicly on fully autonomous weapons: Algeria, Argentina, Australia, Austria, Belarus, Belgium, Brazil, Canada, China, Costa Rica, Cuba, Ecuador, Egypt, France, Germany, Ghana, Greece, Holy See, India, Indonesia, Iran, Ireland, Israel, Italy, Japan, Lithuania, Madagascar, Mexico, Morocco, Netherlands, New Zealand, Pakistan, Russia, Sierra Leone, Spain, South Africa, South Korea, Sweden, Switzerland, Turkey, Ukraine, United Kingdom, and United States. All nations that have spoken out have expressed interest and concern at the challenges and dangers posed by fully autonomous weapons.</p>
<p>Together with the Campaign to Stop Killer Robots, ICRAC urges nations to prepare for extensive and intensive work next year, both within and outside the CCW context. We urge states to develop national policies, and to respond to the UN Special Rapporteur on Extrajudicial Executions’ call for national moratoria on fully autonomous weapons. We urge states to come back one year from now and agree to a new mandate to begin negotiations. The new process must be underscored by a sense of urgency.</p>
<p>Peter Asaro, vice-chairman of ICRAC, said: “The actions of the CCW this week are a hopeful first step towards an international ban on autonomous weapons systems.”</p>
<p>Matthew Bolton delivered a <a href="http://icrac.net/2013/11/icrac-delivers-statement-to-states-parties-to-the-convention-on-conventional-weapons-at-the-un-in-geneva-in/">statement</a> on behalf of ICRAC at the UN CCW meeting yesterday. As a group of experts, we are prepared to help any nation with expert discussions of autonomous weapons systems and to help develop clear definitions for the language to be used in a treaty to ban them. Video footage of the statement, ICRAC’s first ever statement in an official diplomatic forum, is <a href="http://vimeo.com/79472645" target="_blank">available here</a>.</p>
<p>ICRAC recently coordinated the circulation of a “<a href="http://icrac.net/2013/10/computing-experts-from-37-countries-call-for-ban-on-killer-robots/">Scientists Call</a>” to ban fully autonomous weapons systems, signed by more than 270 Computer Scientists, Engineers, Artificial Intelligence experts, Roboticists and professionals from related disciplines in 37 countries, saying: “given the limitations and unknown future risks of autonomous robot weapons technology, we call for a prohibition on their development and deployment. Decisions about the application of violent force must not be delegated to machines.”</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='mbolton' src='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://matthewbreaybolton.com">mbolton</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Matthew Bolton is professor of political science at Pace University in New York City. He is an expert on global peace and security policy, focusing on multilateral disarmament and arms control policymaking processes. He has a PhD in Government and Master's in Development Studies from the London School of Economics and a Master's from SUNY Environmental Science and Forestry. Since 2014, Bolton has worked on the UN and New York City advocacy of the International Campaign to Abolish Nuclear Weapons (ICAN), recipient of the 2017 Nobel Peace Prize. Bolton has published six books, including Political Minefields (I.B. Tauris) and Imagining Disarmament, Enchanting International Relations (Palgrave Pivot).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2433</post-id>	</item>
		<item>
		<title>News Roundup: Banning Lethal Autonomous Robots</title>
		<link>https://www.icrac.net/news-roundup-banning-lethal-autonomous-robots/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Thu, 31 Oct 2013 21:40:55 +0000</pubDate>
				<category><![CDATA[ICRAC in the media]]></category>
		<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[News]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2424</guid>

					<description><![CDATA[Over the last couple of months, lethal autonomous robots (LARs) went from a sidelined issue to not only a hotly and widely debated topic but most importantly an official item of United Nations (UN) arms control diplomacy. This post provides an overview over recent media coverage as well as events and statements given at the UN. The [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<div class="entry">
<p>Over the last couple of months, lethal autonomous robots (LARs) went from a <a href="http://www.whiteoliphaunt.com/duckofminerva/2012/11/global-civil-society-targets-killer-robots.html">sidelined issue</a> to a hotly and widely debated topic and, most importantly, an official item of <a href="http://www.stopkillerrobots.org/2013/10/unga2013/">United Nations (UN) arms control diplomacy</a>. This post provides an overview of recent media coverage as well as events and statements at the UN.</p>
<p>The First Secretary and representative from France&#8217;s Permanent Mission to the UN, Anais Laigle, reported that LARs will be discussed at the <a href="http://en.wikipedia.org/wiki/Convention_on_Certain_Conventional_Weapons">Convention on Certain Conventional Weapons</a> (CCW) conference in November. <a href="http://www.nbcnews.com/technology/terminator-hold-debate-stop-killer-robots-takes-global-stage-8C11433704">NBC</a> reported that Switzerland, France, and Egypt were interested in the regulation of LARs proposed by the <a href="http://icrac.net/">International Committee for Robot Arms Control</a> (ICRAC) and <a href="http://www.hrw.org/">Human Rights Watch</a> (HRW), and that &#8220;representatives&#8221; of Austria, Germany, the United States, France, Brazil, Morocco, and Algeria were also interested in regulation. The NBC article also suggests that discussion of LARs will be complicated by the secrecy of LAR programs, citing criticism from NYU Law human rights professor and ICRAC legal advisor <a href="http://icrac.net/who/">Sarah Knuckey</a>. She said:</p>
<blockquote><p>“We’re not going to know what laws are going to be programmed [into the robots], and where they’re going to be used.”</p></blockquote>
<p>The article also mentioned ICRAC’s opposition to such secrecy quoting Jody Williams who said:</p>
<blockquote><p>“I don’t like my tax dollars being used on weapons that are not even discussed in the public domain… We have every right and every responsibility to have a public discussion as to where war is going.”</p></blockquote>
<p>An article from <a href="http://www.fastcoexist.com/3020313/can-the-campaign-to-stop-killer-robots-save-us-from-weapons-that-kill-on-their-own">Co.EXIST</a> reported on the UN visit by activists, technologists, and others to raise awareness for a ban on LARs. The article cites Jody Williams&#8217; warning that governments are on a slippery slope toward developing LARs and that, for &#8220;in the loop&#8221; killer robots, it may &#8220;just mean the programmer&#8221; is &#8220;in the loop.&#8221; The article also mentions ICRAC chairman <a href="http://icrac.net/who/">Noel Sharkey</a> and his concerns about LARs, particularly whether they could distinguish targets, apply a &#8220;proportionality-test,&#8221; and properly recognize combatants&#8217; surrender. The article also reports that <a href="http://icrac.net/2013/10/computing-experts-from-37-countries-call-for-ban-on-killer-robots/">&#8220;272 engineers, computer scientists and roboticists&#8221; recently co-signed ICRAC&#8217;s letter</a> to the UN to implement a ban. The article clarifies that <em>ICRAC does not oppose non-autonomous weapons or peaceful autonomous robots</em>.</p>
<p>Another article in the <a href="http://www.ibtimes.co.uk/articles/515609/20131021/killer-robot-drone-war-asimov-united-nations.htm">International Business Times</a> reported on ICRAC&#8217;s call for &#8220;urgent international action&#8221; and HRW&#8217;s warning that LARs would be developed. HRW was quoted expressing concern that the development of LARs puts civilians at risk and stating that all killing decisions should be made by humans. The article further mentions the Pentagon&#8217;s downplaying of LAR development.</p>
<p><a href="http://www.computerworld.com/s/article/9243421/Activists_U.N._take_aim_at_killer_robots_">Computerworld</a> reported that human rights organizations and representatives have been at the UN calling for a ban on LARs. Like the Co.EXIST article, it quotes HRW&#8217;s Mary Wareham saying that HRW does not oppose armed robots but wants to ensure that humans remain in control of kill decisions, and expressing concern that the military is moving toward greater autonomy. The article also articulates Wareham&#8217;s concern that there are currently only policies, not laws, governing the use of LARs.</p>
<p>A recent article in <a href="http://world.time.com/2013/10/25/the-campaign-to-kill-killer-robots-gains-steam/">TIME</a> underscored a surge of activity and momentum in the effort to stop LARs. The article explains that although LARs are not currently in use, the X-47B and the South Korean robot sentry are signs of the increasing autonomy of armed robots. It also cites arguments from those opposing LARs, including the possibility of malfunction and hacking, as well as moral objections.</p>
<p><a href="http://rus.ruvr.ru/2013_10_22/Roboti-ubijci-LARS-kak-zerkalo-chelovecheskoj-morali-5565/">The Voice of Russia</a> published a lengthy article outlining concerns about LARs. It claims that Pentagon analysts expect operational autonomous robots within 20 years, with humans becoming the most insignificant factor in their use within 30 years. It cites Professor Christof Heyns, the UN&#8217;s Special Rapporteur on extrajudicial, summary or arbitrary executions, who voiced concern over legal responsibility for LAR killings and warned that their use would lead to &#8220;mechanical slaughter.&#8221; The article notes that many &#8220;experts&#8221; desire regulation of LARs. It draws a parallel to current drone use, arguing that the targeting mistakes of drones in use today could apply equally to LARs, and that LARs would not be able to react appropriately to unexpected behavior. It also outlines three pro-LAR arguments: that they could process more information and so make better decisions, that they would not be hampered by emotion, and that they could decide faster.</p>
<p>On 29 October, ten states spoke on fully autonomous weapons or lethal autonomous robots at the UN General Assembly First Committee:<br />
Costa Rica, Ecuador, Greece, Ireland, Japan, the Netherlands, Pakistan, Switzerland, the United Kingdom, and the United States. For all except Pakistan, Switzerland, the UK, and the US, this was the first time they had made a public statement on the topic. At the outset of the 2013 First Committee, Pakistan also expressed its concerns over the weapons, as did Austria, Egypt, and France. Relevant extracts from the statements follow.</p>
<blockquote><p><a href="http://www.reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/14Oct_Austria.pdf">Austria:</a> Prevention and accountability for deliberate targeting of civilians during war, as well as disproportionate collateral casualties as a result of military action, are at the centre of our concern. Today, arms technology is undergoing rapid changes. The use of armed drones in conflict situations is increasing. In a not too distant future, fully autonomous weapons systems might become available. As a result, the implications of these developments on IHL require urgent engagement by relevant UN forums and further discussion with a view to ensure that these weapons will not be used in a way that violates universally recognized principles of IHL such as the proportionality of the use of force or the obligation to distinguish between civilians and combatants.</p></blockquote>
<blockquote><p><a href="http://reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/29Oct_CostaRica.pdf">Costa Rica:</a> Furthermore, we worry that many problems identified with the use of armed drones would be exacerbated by the trend toward increasing autonomy in robotic weapons. My delegation feels that we should begin international dialogue soon on the issue of lethal autonomous robotics, and calls for States to consider placing national moratoria on their development, production and use and discuss eventual prohibition.</p></blockquote>
<blockquote><p><a href="http://www.reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/8Oct_Egypt.pdf">Egypt</a>: Egypt reiterates that technology should not overtake humanity. The potential or actual development of Lethal Autonomous Robotics raises many questions on their compliance with international humanitarian law, as well as issues of warfare ethics. Such issues need to be fully addressed. Regulations should be put into place before such systems (LARs) are to be developed and/or deployed.</p></blockquote>
<blockquote><p><a href="http://www.reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/8Oct_France.pdf">France</a>: We must look to the future and address its challenges. An important debate has emerged in recent months on the issue of Lethal Autonomous Robots (LARs). This is a key debate as it raises the fundamental question of the place of Man in the decision to use lethal force. It is also a difficult debate, as it highlights many ethical, legal and technical issues. It covers technologies which are not yet fully developed and which are dual-use. The terms of this debate need to be clarified. To be useful and allow progress, this discussion needs to be held in an appropriate disarmament forum, combining the necessary military, legal and technical expertise and all the States concerned.</p></blockquote>
<blockquote><p><a href="http://reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/29Oct_Ecuador.pdf">Ecuador</a>: My country believes that the international community should deepen the debate around unmanned aerial vehicles and fully autonomous armed robots. The high number of victims of indiscriminate drone use in civilian areas has raised serious ethical and legal questions; given that new military technologies may preclude human participation and responsibility in decision-making, an urgent discussion of these new problems in the field of conventional weapons is needed. &#8211; Google Translation</p></blockquote>
<blockquote><p><a href="http://reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/29Oct_Greece.pdf">Greece</a>: Greece remains firmly committed to the Convention on Certain Conventional Weapons (CCW) and its Protocols and continues to believe that the CCW remains the most appropriate forum for the discussion on a Protocol on Cluster munitions, as it includes both the most significant producers and users, and will thus be in a position to strike a delicate balance between military utility and humanitarian concerns. It is in this same forum that we believe that the topic of Lethal Autonomous Robotics (LARS) should be discussed considering that the CCW is in a unique position to gather the competent diplomatic, legal and military expertise to address this emerging issue.</p></blockquote>
<blockquote><p><a href="http://reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/29Oct_Ireland.pdf">Ireland</a>: The same principles which provide the foundation for the Arms Trade Treaty must also be applied to all topics of debate in relation to conventional weapons. Whether with regard to anti-personnel landmines, cluster munitions, transparency measures, the environmental impact of weapons, or the use of incendiary weapons, to name a few, our focus must always be to ensure respect for international humanitarian law and human rights, including the rights of women. These same principles must also apply to weapons which will be developed in the future, such as fully autonomous weapons systems. Constructive engagement and debate is essential to ensure that our actions comply with the principles which underlie the United Nations and international law.</p></blockquote>
<blockquote><p><a href="http://www.reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/29Oct_Japan.pdf">Japan</a>: Japan recognizes growing interests, in the international community, in the issues regarding fully autonomous weapons. We think it useful to start discussion about basic elements related to those weapons, including their definition. CCW, where military, legal and other arms control experts are involved, could provide an appropriate venue to address these issues. Japan looks forward to discussing these issues with other interested States and civil society.</p></blockquote>
<blockquote><p><a href="http://reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/29Oct_Netherlands.pdf">The Netherlands</a>: The possible development of Lethal Autonomous Robot Systems raises many legal, ethical and policy questions. In the Netherlands we have started a discussion on this issue with involvement of the ministries of Foreign Affairs and. Defence, relevant partners of civil society and academia in order to get a better understanding of the developments in this field and the related problems. In answering the question about the legality of weapon systems we are guided by international law and in particular by International Humanitarian Law. While developing new weapon systems, states should remain within the boundaries of international law. We will participate actively in discussions on LARS and in that regard support the proposal of the CCW chair for an informal discussion on LARS in the framework of CCW.</p></blockquote>
<blockquote><p><a href="http://www.reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/29Oct_Pakistan.pdf">Pakistan</a>: Another disturbing trend is the development of new types of conventional weapons like the Lethal Autonomous Robots (LARs), and the use of armed drones which cause indiscriminate killing of civilians. The use of drones, especially outside the zone of conflict or the battlefield, not only poses a legal challenge but also has serious human rights and humanitarian implications. It needs to be stopped immediately. The use of drones needs to be brought under international regulation before it spirals out of control. Similarly, LARs, which would choose and fire on pre-programmed targets on their own without any human intervention, pose a fundamental challenge to the protection of civilians and the notion of affixation of responsibility. They could alter traditional warfare in unimaginable ways. Their development needs to be addressed at the relevant international fora including at the UN and the CCW Conference of State Parties. The states that currently possess and use such weapons cannot afford to be complacent that such capabilities will not proliferate over time and hence they too shall become vulnerable unless such weapons&#8217; production is curtailed forthwith under an international regime.</p></blockquote>
<blockquote><p><a href="http://www.reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/16Oct_Pakistan.pdf">Pakistan</a>: Lethal Autonomous Robots (LARs) &#8211; that would choose and fire on pre-programmed targets on their own without any human intervention &#8211; pose a fundamental challenge to the protection of civilians and the notion of affixation of responsibility. &#8230; We recognize that consensus building will be a difficult task, but we take this opportunity to put forward some ideas that we feel are essential to promote greater global security: &#8230; Nine, The development and use of drones and Lethal Autonomous Robots (LARs) need to be checked and brought under international regulation. Besides the UNGA and its First Committee, the CCW Conference of State Parties also provides a forum to address these issues.</p></blockquote>
<blockquote><p><a href="http://reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/29Oct_Switzerland.pdf">Switzerland:</a> In conclusion, I would like to reiterate the importance of conventional arms in disarmament and international security. New technologies are changing warfare and challenges loom on the horizon. One emerging issue is that of &#8220;fully autonomous weapon systems&#8221; as highlighted in this year&#8217;s report of the Secretary-General&#8217;s Advisory Board on Disarmament Matters. We note with interest that the Secretary-General should consider commissioning a comprehensive study, involving UNIDIR and other research institutes and think tanks, in order to support the appropriate efforts. Switzerland is of the view that there is a need to understand, identify, and clarify the potential challenges associated with fully autonomous weapon systems and the relevant technology. Switzerland therefore recognizes the need for a structured intergovernmental dialogue in the existing forum of the Conventional Weapons Convention (CCW) on this issue. Switzerland stands ready to take an active part in the discussions.</p></blockquote>
<blockquote><p><a href="http://www.reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/29Oct_UK.pdf">United Kingdom:</a> I am looking forward to returning to Geneva for the meeting of States Parties to the Convention on Certain Conventional Weapons and our discussions on lethal autonomous robotics. This is an important issue, and one that sits well within the expert remit of the CCW. I hope that we can bring the UK’s expertise and experience to bear.</p></blockquote>
<blockquote><p><a href="http://reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/29Oct_US.pdf">United States:</a> Mr. Chairman, the United States is a High Contracting Party to the Convention on Certain Conventional Weapons and all of its five Protocols. The United States attaches importance to the CCW as an instrument that has been able to bring together states with diverse national security concerns. We look forward to the annual meetings of High Contracting Parties in November and to establishing a program of work for 2014 that will allow CCW States to continue supporting the universalization of the CCW and the implementation of all its Protocols. During this past year, questions have arisen regarding the development and use of lethal fully autonomous weapons in forums such as the Human Rights Council. As the United States delegation to the Human Rights Council stated, we welcome discussion among states of the legal, policy, and technological implications associated with lethal fully autonomous weapons in an appropriate forum that has a primary focus on international humanitarian law issues, if the mandate is right. The United States believes the CCW is that forum. CCW High Contracting Parties include a broad range of States, including those that have incorporated or are considering incorporating automated and autonomous capabilities in weapon systems. The CCW can bring together those with technical, military, and international humanitarian law expertise, ensuring that all aspects of the issue can be considered. Accordingly, we support an informal, exploratory discussion of lethal fully autonomous weapons and are engaged with our fellow CCW High Contracting Parties in formulating an appropriate mandate that will facilitate these discussions.</p></blockquote>
<p>As of today, a total of 28 states have spoken publicly on fully autonomous weapons: Algeria, Austria, Argentina, Belarus, Brazil, China, Costa Rica, Cuba, Ecuador, Egypt, France, Germany, Greece, Indonesia, Iran, Ireland, Japan, Mexico, Morocco, Netherlands, New Zealand, Pakistan, Russia, Sierra Leone, Sweden, Switzerland, United Kingdom, and the United States.</p>
</div>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2424</post-id>	</item>
		<item>
		<title>ICRAC Member and Campaign to Stop Killer Robots Deliver Statements at the UN General Assembly First Committee</title>
		<link>https://www.icrac.net/icrac-member-and-campaign-to-stop-killer-robots-deliver-statements-at-the-un-general-assembly-first-committee/</link>
		
		<dc:creator><![CDATA[mbolton]]></dc:creator>
		<pubDate>Wed, 30 Oct 2013 21:06:20 +0000</pubDate>
				<category><![CDATA[ICRAC in the media]]></category>
		<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[Statements]]></category>
		<category><![CDATA[YouTube video]]></category>
		<category><![CDATA[Arms Trade Treaty]]></category>
		<category><![CDATA[Article 36]]></category>
		<category><![CDATA[Campaign to Stop Killer Robots]]></category>
		<category><![CDATA[First Committee]]></category>
		<category><![CDATA[Human Rights Watch]]></category>
		<category><![CDATA[ICRAC]]></category>
		<category><![CDATA[Matthew Bolton]]></category>
		<category><![CDATA[NGOs]]></category>
		<category><![CDATA[nuclear weapons]]></category>
		<category><![CDATA[Pace University]]></category>
		<category><![CDATA[robotic weapons]]></category>
		<category><![CDATA[small arms and light weapons]]></category>
		<category><![CDATA[United Nations]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2413</guid>

					<description><![CDATA[On behalf of global civil society organizations, International Committee for Robot Arms Control member Matthew Bolton calls for disarmament and arms control “driven by the needs and rights of people most affected by armed violence.” The Campaign to Stop Killer Robots also spoke, calling for fully autonomous weapons to “be prohibited through an international treaty, [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='mbolton' src='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://matthewbreaybolton.com">mbolton</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Matthew Bolton is professor of political science at Pace University in New York City. He is an expert on global peace and security policy, focusing on multilateral disarmament and arms control policymaking processes. He has a PhD in Government and Master's in Development Studies from the London School of Economics and a Master's from SUNY Environmental Science and Forestry. Since 2014, Bolton has worked on the UN and New York City advocacy of the International Campaign to Abolish Nuclear Weapons (ICAN), recipient of the 2017 Nobel Peace Prize. Bolton has published six books, including Political Minefields (I.B. 
Tauris) and Imagining Disarmament, Enchanting International Relations (Palgrave Pivot).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<div id="attachment_2416" style="width: 310px" class="wp-caption alignleft"><a href="https://i0.wp.com/www.icrac.net.php53-3.dfw1-2.websitetestlink.com/wp-content/uploads/2015/06/NGO-Statement-to-UNGA-1.jpg"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2416" class="size-medium wp-image-2416" src="https://i0.wp.com/www.icrac.net.php53-3.dfw1-2.websitetestlink.com/wp-content/uploads/2015/06/NGO-Statement-to-UNGA-1-300x200.jpg?resize=300%2C200" alt="ICRAC member Dr. Matthew Bolton, presenting a statement on disarmament at the UN General Assembly’s First Committee on Tuesday. Photo by Shant Alexander for Control Arms." width="300" height="200" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/NGO-Statement-to-UNGA-1.jpg?resize=300%2C200&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/NGO-Statement-to-UNGA-1.jpg?w=1024&amp;ssl=1 1024w" sizes="auto, (max-width: 300px) 100vw, 300px" /></a><p id="caption-attachment-2416" class="wp-caption-text">ICRAC member Dr. Matthew Bolton, presenting a statement on disarmament at the UN General Assembly’s First Committee on Tuesday. Photo by Shant Alexander for Control Arms.</p></div>
<p><i>On behalf of global civil society organizations, <a href="http://icrac.net" target="_blank">International Committee for Robot Arms Control </a>member Matthew Bolton calls for disarmament and arms control “driven by the needs and rights of people most affected by armed violence.” The <a href="http://www.stopkillerrobots.org/" target="_blank">Campaign to Stop Killer Robots </a>also spoke, calling for fully autonomous weapons to </i><em>“be prohibited through an international treaty, as well as through national laws and other measures.” To watch <a href="http://new.livestream.com/accounts/5796840/NGOSpeeches?cat=event&amp;query=control" target="_blank">video footage of the NGO speeches, click here.</a></em></p>
<p>Dr. Matthew Bolton, a member of the International Committee for Robot Arms Control (ICRAC),  addressed the <a href="http://www.un.org/en/ga/first/">United Nations General Assembly First Committee</a> Tuesday afternoon, on behalf of <a href="http://www.article36.org/" target="_blank">Article 36</a> and other international non-governmental organizations (NGOs) working on disarmament, peacebuilding and humanitarian issues.</p>
<p>“We call for an approach to disarmament that is driven by the needs and rights of people most affected by armed violence, not by the discretion of states and organizations most responsible for it,” said Dr. Bolton to representatives of the 193 UN member states, as well as UN agencies and NGOs. The First Committee has responsibility for disarmament and international security.</p>
<p>The <a href="http://www.reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/29Oct_NGO-ways-of-work.pdf">NGO statement</a>, read by Dr. Bolton and endorsed by 11 organizations, congratulated states on “some noteworthy progress” in <a href="http://www.reachingcriticalwill.org/disarmament-fora/others/hlm-nuclear-disarmament">recent international discussions on the elimination of nuclear weapons</a>, the <a href="http://www.un.org/News/Press/docs/2013/sc11131.doc.htm">recent Security Council resolution on small arms and light weapons</a> as well as the <a href="http://www.un.org/disarmament/ATT/">Arms Trade Treaty</a>, signed by over 100 states since June.</p>
<p>Despite these developments in global policy making on controlling weapons, however, Dr. Bolton asserted that “now is not the time for resting on laurels.” The NGO statement identified numerous concerns, including the abuse of the consensus rule in disarmament forums, exclusion of meaningful civil society participation, lack of equal opportunities for women in decisionmaking and the marginalization of the voices of victims and survivors of armed violence.</p>
<p>“Creativity and new human-centered approaches must be a requirement for all states advocating nuclear disarmament, conventional arms control and reduced military expenditure,” said Dr. Bolton, reading the NGO statement. “We can and must replace stalemate and watered-down outcomes with alternatives that advance human security and social and economic justice.”</p>
<p>The Campaign to Stop Killer Robots also <a href="http://www.stopkillerrobots.org/wp-content/uploads/2013/10/KRC_StatementUNGA1_29Oct2013_delivered.pdf" target="_blank">delivered a statement </a>in the same session, calling for a prohibition on fully autonomous weapons.</p>
<p>“Our campaign believes that human control is essential to ensure the protection of civilians and to ensure compliance with international law,” said Mary Wareham of <a href="http://www.hrw.org/" target="_blank">Human Rights Watch</a>, delivering the statement on behalf of the campaign. “We seek a comprehensive and preemptive ban on weapons systems that would be able to select and attack targets without meaningful human intervention. These fully autonomous weapons or ‘lethal autonomous robots’ must be prohibited through an international treaty, as well as through national laws and other measures.”</p>
<p>Dr. Bolton is an expert on global disarmament policy and assistant professor of political science at <a href="http://pace.edu" target="_blank">Pace University</a>. He is author of <a href="http://us.macmillan.com/foreignaidandlandmineclearance/MatthewBolton"><i>Foreign Aid and Landmine Clearance: Governance, Politics and Security in Afghanistan, Bosnia and Sudan</i></a> (I.B. Tauris, 2010) and a forthcoming travelogue <a href="http://www.ibtauris.com/Books/Society%20%20social%20sciences/Politics%20%20government/International%20relations/Arms%20negotiation%20%20control/Political%20Minefields%20The%20Hidden%20Agendas%20Behind%20Clearing%20the%20Worlds%20Landmines.aspx?menuitem=%7BF66D6451-D7DF-403F-A234-82FAE9B3F795%7D"><i>Political Minefields</i></a> (I.B. Tauris, 2014). He has written widely on the politics of <a href="http://www.theguardian.com/commentisfree/cifamerica/2009/nov/26/obama-landmine-ban-treaty">landmines</a>, <a href="http://www.theguardian.com/commentisfree/2008/dec/06/armstrade-obama-white-house">cluster munitions</a>, <a href="http://thehill.com/blogs/congress-blog/foreign-policy/293325-arms-trade-treaty-keeping-weapons-from-terrorists-and-human-rights-abusers">the Arms Trade Treaty</a> and <a href="http://thehill.com/blogs/congress-blog/technology/295807-us-must-impost-moratorium-and-seek-global-ban-on-killer-robots">fully autonomous military robotics</a> (“killer robots”). He recently co-<a title="Futureproofing Is Never Complete: Ensuring the Arms Trade Treaty Keeps Pace with New Weapons Technology" href="http://icrac.net/2013/10/futureproofing-is-never-complete-ensuring-the-arms-trade-treaty-keeps-pace-with-new-weapons-technology/">authored an ICRAC Working Paper </a>on regulating robotic weapons with the Arms Trade Treaty.</p>
<p><a href="http://icrac.net/who/">ICRAC</a> is an international committee of experts in robotics technology, robot ethics, international relations, international security, arms control, international humanitarian law, human rights law, and public campaigns, concerned about the pressing dangers that military robots pose to peace and international security and to civilians in war.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='mbolton' src='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://matthewbreaybolton.com">mbolton</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Matthew Bolton is professor of political science at Pace University in New York City. He is an expert on global peace and security policy, focusing on multilateral disarmament and arms control policymaking processes. He has a PhD in Government and Master's in Development Studies from the London School of Economics and a Master's from SUNY Environmental Science and Forestry. Since 2014, Bolton has worked on the UN and New York City advocacy of the International Campaign to Abolish Nuclear Weapons (ICAN), recipient of the 2017 Nobel Peace Prize. Bolton has published six books, including Political Minefields (I.B. Tauris) and Imagining Disarmament, Enchanting International Relations (Palgrave Pivot).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2413</post-id>	</item>
		<item>
		<title>ICRAC’s Heather Roff debates Joshua Foust</title>
		<link>https://www.icrac.net/icracs-heather-roff-debates-joshua-foust/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Thu, 10 Oct 2013 21:28:10 +0000</pubDate>
				<category><![CDATA[ICRAC in the media]]></category>
		<category><![CDATA[News]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2274</guid>

					<description><![CDATA[Naval Postgraduate School CRUSER Robo-Ethics Continuing Education Series (RECES) Robo-Ethics Panelist Debate (Part 3)<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><a href="http://www.nps.edu/video/portal/Video.aspx?enc=TNDll5c9n3YZfpQds%2FU3FTD7pgzYSw0p">Naval Postgraduate School</a></p>
<p>CRUSER Robo-Ethics Continuing Education Series (RECES)<br />
Robo-Ethics Panelist Debate (Part 3)</p>
<p><a href="http://www.nps.edu/video/portal/Video.aspx?enc=TNDll5c9n3YZfpQds%2FU3FTD7pgzYSw0p"><img data-recalc-dims="1" loading="lazy" decoding="async" class="alignleft wp-image-2351 size-thumbnail" src="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2013/10/Roff-Foust--150x150.png?resize=150%2C150" alt="Roff Foust" width="150" height="150" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2013/10/Roff-Foust-.png?resize=150%2C150&amp;ssl=1 150w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2013/10/Roff-Foust-.png?zoom=2&amp;resize=150%2C150&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2013/10/Roff-Foust-.png?zoom=3&amp;resize=150%2C150&amp;ssl=1 450w" sizes="auto, (max-width: 150px) 100vw, 150px" /></a></p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2274</post-id>	</item>
		<item>
		<title>Everyone is a target</title>
		<link>https://www.icrac.net/everyone-is-a-target-2/</link>
		
		<dc:creator><![CDATA[Peter Asaro]]></dc:creator>
		<pubDate>Mon, 12 Aug 2013 23:11:47 +0000</pubDate>
				<category><![CDATA[ICRAC in the media]]></category>
		<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[Media]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[YouTube video]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2006</guid>

					<description><![CDATA[A new short documentary (8 minutes) by Amy Kohn – Autonomous Weapons: everyone is a target – features members of ICRAC giving the reason why we need to move forward with an international legally binding treaty to prohibit research, use and development of autonomous weapons – weapons that once activated can select targets and kill [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Peter Asaro' src='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.peterasaro.org/">Peter Asaro</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Dr. Peter Asaro is a philosopher of science, technology and media. His work examines the interfaces between social relations, human minds and bodies, artificial intelligence and robotics, and digital media.

His current research focuses on the social, cultural, political, legal and ethical dimensions of military robotics and UAV drones, from a perspective that combines media theory with science and technology studies. He has written widely-cited papers on lethal robotics from the perspective of just war theory and human rights. Dr. Asaro's research also examines agency and autonomy, liability and punishment, and privacy and surveillance as it applies to consumer robots, industrial automation, smart buildings, and autonomous vehicles. His research has been published in international peer reviewed journals and edited volumes, and he is currently writing a book that interrogates the intersections between military robotics, interface design practices, and social and ethical issues.

Dr. Asaro has held research positions at the Center for Cultural Analysis at Rutgers University, the HUMlab of Umeå University in Sweden, and the Austrian Academy of Sciences in Vienna. He has also developed technologies in the areas of virtual reality, data visualization and sonification, human-computer interaction, computer-supported cooperative work, artificial intelligence, machine learning, robot vision, and neuromorphic robotics at the National Center for Supercomputer Applications (NCSA), the Beckman Institute for Advanced Science and Technology, and Iguana Robotics, Inc., and was involved in the design of the natural language interface for the Wolfram|Alpha computational knowledge engine (winner of the 2010 SXSW Web Interactive Award for Technical Achievement), for Wolfram Research.

He is currently working on an Oral History of Robotics project that is funded by the IEEE Robotics and Automation Society and the National Endowment for the Humanities Office of Digital Humanities.

Dr. Asaro received his PhD in the History, Philosophy and Sociology of Science from the University of Illinois at Urbana-Champaign, where he also earned a Master of Arts from the Department of Philosophy, and a Master of Computer Science from the Department of Computer Science.</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><span style="font-size: 13px; line-height: 19px;">A new short documentary (8 minutes) by Amy Kohn – Autonomous Weapons: everyone is a target – features members of ICRAC giving the reason why we need to move forward with an international legally binding treaty to prohibit research, use and development of autonomous weapons – weapons that once activated can select targets and kill them without further human supervisory control.</span></p>
<p>ICRAC members discuss the difficulties of complying with international humanitarian law when using autonomous weapons, as well as international security issues and the moral red line.</p>
<p><iframe loading="lazy" src="//player.vimeo.com/video/55774055" width="500" height="281" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p><a href="http://vimeo.com/55774055">final video &#8211; robots</a> from <a href="http://vimeo.com/user9101249">Amy Kohn</a> on <a href="https://vimeo.com">Vimeo</a>.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Peter Asaro' src='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.peterasaro.org/">Peter Asaro</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Dr. Peter Asaro is a philosopher of science, technology and media. His work examines the interfaces between social relations, human minds and bodies, artificial intelligence and robotics, and digital media.

His current research focuses on the social, cultural, political, legal and ethical dimensions of military robotics and UAV drones, from a perspective that combines media theory with science and technology studies. He has written widely-cited papers on lethal robotics from the perspective of just war theory and human rights. Dr. Asaro's research also examines agency and autonomy, liability and punishment, and privacy and surveillance as it applies to consumer robots, industrial automation, smart buildings, and autonomous vehicles. His research has been published in international peer reviewed journals and edited volumes, and he is currently writing a book that interrogates the intersections between military robotics, interface design practices, and social and ethical issues.

Dr. Asaro has held research positions at the Center for Cultural Analysis at Rutgers University, the HUMlab of Umeå University in Sweden, and the Austrian Academy of Sciences in Vienna. He has also developed technologies in the areas of virtual reality, data visualization and sonification, human-computer interaction, computer-supported cooperative work, artificial intelligence, machine learning, robot vision, and neuromorphic robotics at the National Center for Supercomputer Applications (NCSA), the Beckman Institute for Advanced Science and Technology, and Iguana Robotics, Inc., and was involved in the design of the natural language interface for the Wolfram|Alpha computational knowledge engine (winner of the 2010 SXSW Web Interactive Award for Technical Achievement), for Wolfram Research.

He is currently working on an Oral History of Robotics project that is funded by the IEEE Robotics and Automation Society and the National Endowment for the Humanities Office of Digital Humanities.

Dr. Asaro received his PhD in the History, Philosophy and Sociology of Science from the University of Illinois at Urbana-Champaign, where he also earned a Master of Arts from the Department of Philosophy, and a Master of Computer Science from the Department of Computer Science.</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2006</post-id>	</item>
		<item>
		<title>Smart Robots? Perhaps not smart enough to be called stupid.</title>
		<link>https://www.icrac.net/smart-robots-perhaps-not-smart-enough-to-be-called-stupid/</link>
		
		<dc:creator><![CDATA[nsharkey]]></dc:creator>
		<pubDate>Mon, 18 Mar 2013 11:17:50 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[ICRAC in the media]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Opinion]]></category>
		<guid isPermaLink="false">http://icrac.net/?p=899</guid>

					<description><![CDATA[The New York Times has entered the discussion about the Campaign to Stop Killer Robots. Columnist Bill Keller has produced a well balanced article that looks at the pros and cons of a ban. For the ban, he notes that The arguments against developing fully autonomous weapons, as they are called, range from moral (“they [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p>The New York Times has entered the discussion about the Campaign to Stop Killer Robots. Columnist Bill Keller has produced a well-balanced article that looks at the pros and cons of a ban.</p>
<p>For the ban, he notes that</p>
<blockquote><p>The arguments against developing fully autonomous weapons, as they are called, range from moral (“they are evil”) to technical (“they will never be that smart”) to visceral (“they are creepy”).</p>
<p>“This is something people seem to feel at a very gut level is wrong,” says Stephen Goose, director of the arms division of Human Rights Watch, which has assumed a leading role in challenging the dehumanizing of warfare. “The ugh factor comes through really strong.”</p></blockquote>
<p>He then discusses the three international humanitarian law issues with autonomous robot weapons: (i) the inability to conform to the principle of distinction; (ii) the inability to conform to the principle of proportionality; and (iii) difficulties with accountability for mishaps or war crimes.</p>
<p>He brings out the usual suspect, Ron Arkin, to argue against a ban. Arkin still believes that robots could do better than humans because they don&#8217;t have emotional responses. Others argue that this is one of the main problems. The funniest comment on Keller&#8217;s article was a response to Ron Arkin:</p>
<blockquote><p>&#8220;Professor Arkin argues that automation can also make war more humane.&#8221; This guy has obviously been a civilian all his life. Only a civilian would believe there is a humane way to kill another human being. Does he get out of the house on a regular basis?</p></blockquote>
<p>But Arkin&#8217;s position in other respects does not now seem that removed from those calling for a ban: &#8220;He advocates a moratorium on deployment and a full-blown discussion of ways to keep humans in charge.&#8221; Keeping humans in charge is a subtle shift in Arkin&#8217;s position that is greatly appreciated. It moves us some way toward the discussions that should be had.</p>
<p>However, without a ban on the research and development of these weapons systems, they are going to end up in the US arsenal. Other countries have not said that they will observe a moratorium, and so we can expect an arms race that the US will not be able to resist.</p>
<p>In fact, in terms of a moratorium, Keller appears to have made an error of interpretation with regard to the recent <a title="Department of Defence directive" href="http://www.dtic.mil/whs/directives/corres/pdf/300009p.pdf">Department of Defense directive</a> (November 21, 2012): &#8220;Last November the Defense Department issued what amounts to a 10-year moratorium on developing them while it discusses the ethical implications and possible safeguards.&#8221;</p>
<p>ICRAC member Mark Gubrud picks up on this error in a comment after Keller&#8217;s piece:</p>
<blockquote><p>The DoD Directive (3000.09) does not impose any moratorium. It says that the United States will develop and use autonomous weapons.</p>
<p>Although it draws a line at AW that kill humans autonomously, it does not forbid crossing the line; rather, it sets forth the procedure for doing so. Four sub-cabinet level signatures are required. Other than that, the rules for AW that kill humans are essentially the same as for AW that target materiel, which the Directive approves already.</p>
<p>The directive also approves for immediate development and use &#8220;semi-autonomous weapons&#8221; which may automatically acquire, track, identify and prioritize potential targets, cue a human operator to their presence, and upon approval, engage them, automatically determining the timing of when to fire.</p>
<p>So, a semi-autonomous weapon system might detect a group of persons, highlight their dim outlines on a screen, and say to the operator &#8220;target group identified.&#8221; The operator says &#8220;engage&#8221; and the machine kills them.</p>
<p>Such a system already has every capability needed for full lethal autonomy. It has only been programmed to request approval. One trivial software modification will fix that, if the system doesn&#8217;t already have a switch to throw it into full autonomous mode.</p>
<p>DoDD 3000.09 approves such systems for immediate development, acquisition and use.</p>
<p>There is no moratorium; it is a full-speed charge into the unknown.</p></blockquote>
<p>Nonetheless, Keller is clearly on the right side of the issues and shows a clear understanding: &#8220;It’s a squishy directive, likely to be cast aside in a minute if we learn that China has sold autonomous weapons to Iran.&#8221;</p>
<p>Although Keller is not optimistic about the chances of getting a ban on killer robots, he supports it, and ICRAC appreciates him for that:</p>
<blockquote><p>I don’t hold out a lot of hope for an enforceable ban on death-dealing robots, but I’d love to be proved wrong. If war is made to seem impersonal and safe, about as morally consequential as a video game, I worry that autonomous weapons deplete our humanity. As unsettling as the idea of robots’ becoming more like humans is the prospect that, in the process, we become more like robots.</p></blockquote>
<p>It is well worth reading Bill Keller&#8217;s full story and the comments that follow: <a title="Smart Robots" href="http://www.nytimes.com/2013/03/17/opinion/sunday/keller-smart-drones.html?pagewanted=all">Smart Robots</a>.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">899</post-id>	</item>
		<item>
		<title>Killer robots must be stopped, say campaigners</title>
		<link>https://www.icrac.net/killer-robots-must-be-stopped-say-campaigners/</link>
		
		<dc:creator><![CDATA[nsharkey]]></dc:creator>
		<pubDate>Wed, 06 Mar 2013 11:41:59 +0000</pubDate>
				<category><![CDATA[ICRAC in the media]]></category>
		<category><![CDATA[News]]></category>
		<guid isPermaLink="false">http://icrac.net/?p=771</guid>

					<description><![CDATA[The Campaign to Stop Killer Robots was announced by Tracy McVeigh in the Sunday Newspaper the Observer on 24th February. This has created large positive media interest ahead of our the campaign to be launched in April this year. From the Observer: &#8220;Killer robots must be stopped, say campaigners A new global campaign to persuade [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p>The Campaign to Stop Killer Robots was announced by Tracy McVeigh in the Sunday newspaper The Observer on 24th February. This has created considerable positive media interest ahead of our campaign, to be launched in April this year.</p>
<p>From the Observer:<br />
&#8220;<strong>Killer robots must be stopped, say campaigners</strong></p>
<p>A new global campaign to persuade nations to ban &#8220;killer robots&#8221; before they reach the production stage is to be launched in the UK by a group of academics, pressure groups and Nobel peace prize laureates.</p>
<p>Robot warfare and autonomous weapons, the next step from unmanned drones, are already being worked on by scientists and will be available within the decade, said Dr Noel Sharkey, a leading robotics and artificial intelligence expert and professor at Sheffield University. He believes that development of the weapons is taking place in an effectively unregulated environment, with little attention being paid to moral implications and international law.</p>
<p>The Stop the Killer Robots campaign will be launched in April at the House of Commons and includes many of the groups that successfully campaigned to have international action taken against cluster bombs and landmines. They hope to get a similar global treaty against autonomous weapons.&#8221;</p>
<p>Read the full story in the <a title="Observer." href="http://www.guardian.co.uk/technology/2013/feb/23/stop-killer-robots">Observer</a></p>
<p>This was also picked up by the <a title="The Independent" href="http://www.independent.co.uk/life-style/gadgets-and-tech/news/killer-robots-need-to-be-stopped-now-before-it-is-too-late-says-human-rights-group-8509182.html">Independent</a> and <a title="Wired Magazine" href="http://www.wired.co.uk/news/archive/2013-02/24/killer-robot-ban">Wired Magazine</a></p>
<p>The UK tabloids were also in on the act: <a title="The Sun" href="http://www.thesun.co.uk/sol/homepage/news/4810725/killer-robots-a-danger-to-mankind-warns-human-rights-group.html">The Sun</a>, <a title="The Daily Mail" href="http://www.dailymail.co.uk/news/article-2283758/Ban-terminators-Nobel-peace-prize-winners-urge-world-leaders-stop-production-killer-robots-developed-future-wars.html">The Daily Mail</a>, <a title="The Daily Mail again" href="http://www.dailymail.co.uk/sciencetech/article-2283941/Video-game-aces-wage-wars-future-using-killer-robots.html">the Daily Mail again</a> and <a title="The Daily Star" href="http://www.dailystar.co.uk/posts/view/300524Time-to-terminate-the-killer-robots">The Daily Star</a>.</p>
<p>There are too many news reports around the world to list, but here are two quality ones from <a title="Norway" href="http://www.nrk.no/nyheter/verden/1.10926106">Norway</a><br />
and <a title="The Netherlands." href="http://www.nrc.nl/nieuws/2013/02/24/terminator-geen-sci-fi-meer-publieksactie-tegen-killer-robots/">The Netherlands</a>.</p>
<p>I will try to get time to give a full media report later.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">771</post-id>	</item>
		<item>
		<title>The rational approach to the inhumanity of automating death by machines</title>
		<link>https://www.icrac.net/the-rational-approach-to-the-inhumanity-of-automating-death-by-machines/</link>
		
		<dc:creator><![CDATA[nsharkey]]></dc:creator>
		<pubDate>Wed, 06 Mar 2013 11:01:09 +0000</pubDate>
				<category><![CDATA[ICRAC in the media]]></category>
		<guid isPermaLink="false">http://icrac.net/?p=753</guid>

					<description><![CDATA[Three days after the publication of the Human Rights Watch Report: Losing our Humanity: The case against killer robots, the US Department of Defence issued a directive that gave clearance for the development of autonomous weapons: weapons that once launched can select and engage targets without further intervention. Here is my response on the relationship [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p>Three days after the publication of the Human Rights Watch report <em>Losing Our Humanity: The Case Against Killer Robots</em>, the US Department of Defense issued a directive that gave clearance for the development of autonomous weapons: weapons that once launched can select and engage targets without further human intervention.</p>
<p>Here is my response on the relationship between these two documents in the <a title="Guardian Newspaper, December 2012" href="http://www.guardian.co.uk/commentisfree/2012/dec/03/mindless-killer-robots">Guardian Newspaper, December 2012.</a> The article has links to both the Human Rights Watch report and the Department of Defence directive.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">753</post-id>	</item>
	</channel>
</rss>
