<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Analysis &#8211; ICRAC</title>
	<atom:link href="https://www.icrac.net/category/analysis/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.icrac.net</link>
	<description>International Committee for Robot Arms Control</description>
	<lastBuildDate>Thu, 04 Oct 2018 10:45:53 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.8.2</generator>
<site xmlns="com-wordpress:feed-additions:1">128339352</site>	<item>
		<title>ICRAC Working Paper #3 (CCW GGE April 2018): Guidelines for the human control of weapons systems</title>
		<link>https://www.icrac.net/icrac-working-paper-3-ccw-gge-april-2018-guidelines-for-the-human-control-of-weapons-systems/</link>
		
		<dc:creator><![CDATA[nsharkey]]></dc:creator>
		<pubDate>Tue, 10 Apr 2018 11:33:10 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Working Papers]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=3998</guid>

					<description><![CDATA[Guidelines for the human control of weapons systems [PDF] Authored by Noel Sharkey, chair of ICRAC[1] Since 2014, high contracting parties to the CCW have expressed interest and concern about the meaningful human control of weapons systems. There is an extensive scientific and engineering literature on the dynamics of human-machine interaction and human supervisory control [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><img data-recalc-dims="1" decoding="async" class="alignnone size-medium wp-image-4001" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/04/ICRAC-WP3_CCW-GGE-April-2018-1.jpg?resize=300%2C263&#038;ssl=1" alt="" width="300" height="263" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/04/ICRAC-WP3_CCW-GGE-April-2018-1.jpg?resize=300%2C263&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/04/ICRAC-WP3_CCW-GGE-April-2018-1.jpg?w=667&amp;ssl=1 667w" sizes="(max-width: 300px) 100vw, 300px" /></p>
<p><strong>Guidelines for the human control of weapons systems [<a href="https://www.icrac.net/wp-content/uploads/2018/04/Sharkey_Guideline-for-the-human-control-of-weapons-systems_ICRAC-WP3_GGE-April-2018.pdf">PDF</a>]<br />
</strong></p>
<p>Authored by Noel Sharkey, chair of ICRAC<a href="#_ftn1" name="_ftnref1">[1]</a></p>
<p>Since 2014, high contracting parties to the CCW have expressed interest and concern about the meaningful human control of weapons systems. There is an extensive scientific and engineering literature on the dynamics of human-machine interaction and human supervisory control of machinery. A short guide is presented here consisting of two parts. Part 1 is a simple primer on the psychology of human reasoning. Part 2 outlines different levels for the control of weapons systems, adapted from human-machine interaction research, and discusses them in terms of the properties of human reasoning. This outlines which of the levels can ensure the legality of human control of weapons systems and guarantee that precautionary measures are taken to assess the significance of potential targets, their necessity and appropriateness, as well as the likely incidental and possible accidental effects of the attack.</p>
<ol>
<li><strong> A short primer on human reasoning for the control of weapons</strong></li>
</ol>
<p>A well-established distinction in human psychology, backed by over 100 years of substantial research, divides human reasoning into two types: (i) fast <em>automatic</em> processes needed for routine and/or well-practised tasks like riding a bicycle or playing tennis, and (ii) slower <em>deliberative</em> processes needed for thoughtful reasoning, such as making a diplomatic decision.</p>
<p>The drawback of deliberative reasoning is that it requires attention and memory resources, and so it can easily be disrupted by stress or by pressure to make a quick decision.</p>
<p>Automatic processes kick in first, but we can override them if we are operating in novel circumstances or performing tasks that require active control or attention. Automatic processes are essential to our normal functioning, but they have a number of liabilities when it comes to making important decisions such as those required to determine the legitimacy of a target.</p>
<p>Four of the known properties of automatic reasoning<a href="#_ftn2" name="_ftnref2">[2]</a> illustrate why it is problematic for the supervisory control of weapons.</p>
<ul>
<li><strong>neglects ambiguity and suppresses doubt</strong>. Automatic processes jump to conclusions. An unambiguous answer pops up instantly without question. There is no search for alternative interpretations or uncertainty. If something looks like it might be a legitimate target, in ambiguous circumstances, automatic reasoning will be certain that it is legitimate.</li>
<li><strong>infers and invents causes and intentions.</strong> Automatic reasoning rapidly invents coherent causal stories by linking fragments of available information. Events that include people are automatically attributed with intentions that fit a causal story. For example, people loading muckrakes onto a truck could initiate a causal story that they were loading rifles. This is called <em>assimilation bias</em> in the human supervisory control literature.<a href="#_ftn3" name="_ftnref3">[3]</a></li>
<li><strong>is biased to believe and confirm. </strong>Automatic reasoning favours uncritical acceptance of suggestions and maintains a strong bias. If a computer suggests a target to an operator, automatic reasoning alone would make it highly likely to be accepted. This is <em>automation bias</em>.<a href="#_ftn4" name="_ftnref4">[4]</a> <em>Confirmation bias</em><a href="#_ftn5" name="_ftnref5">[5]</a> selects information that confirms a prior belief.</li>
<li><strong>focuses on existing evidence and ignores absent evidence. </strong>Automatic reasoning builds coherent explanatory stories without consideration of evidence or contextual information that might be missing. What You See Is All There Is (WYSIATI)<a href="#_ftn6" name="_ftnref6">[6]</a>. It facilitates the feeling of coherence that makes us confident to accept information as true. For example, a man firing a rifle may be deemed to be a hostile target with WYSIATI when a quick look around might reveal that he is shooting a wolf hunting his goats.</li>
</ul>
<p>&nbsp;</p>
<ol start="2">
<li><strong> Levels of human control and how they impact on human decision-making</strong></li>
</ol>
<p>We can look at levels of human control for weapons systems by adapting research from the human supervisory control literature as shown in Table 1.<a href="#_ftn7" name="_ftnref7">[7]</a></p>
<table>
<tbody>
<tr>
<td width="594">A classification for levels of human supervisory control of weapons</td>
</tr>
<tr>
<td width="594">
<ol>
<li><strong>a human deliberates about a target before initiating any attack </strong></li>
<li><strong>program provides a list of targets and a human chooses which to attack</strong></li>
<li><strong>program selects target and a human must approve before attack</strong></li>
<li><strong>program selects target and a human has restricted time to veto </strong></li>
<li><strong>program selects target and initiates attack without human involvement</strong></li>
</ol>
</td>
</tr>
</tbody>
</table>
<p>&nbsp;</p>
<p><strong>Level 1 control is the ideal</strong>. A human commander (or operator) has full contextual and situational awareness of the target area at the time of a specific attack and is able to perceive and react to any change or unanticipated situations that may have arisen since planning the attack. There is active cognitive participation in the attack and sufficient time for deliberation on the nature of the target, its significance in terms of the necessity and appropriateness, and likely incidental and possible accidental effects. There must also be a means for the rapid suspension or abortion of the attack.</p>
<p><strong>Level 2 control could be acceptable</strong> if it is shown to meet the requirement of deliberating on potential targets. The human operator or commander should deliberatively assess necessity and appropriateness and whether any of the suggested alternatives are permissible objects of attack. Without sufficient time, or in a distracting environment, the illegitimacy of a target could be overlooked.</p>
<p>A rank ordered list of targets is particularly problematic as automation bias could create a tendency to accept the top ranked target unless sufficient time and attentional space is given for deliberative reasoning.</p>
<p><strong>Level 3 is unacceptable.</strong> This type of control has been experimentally shown to create what is known as <em>automation bias</em> in which human operators come to trust computer generated solutions as correct and disregard or don’t search for contradictory information. Cummings experimented with automation bias in a study on an interface designed for supervision and resource allocation of in-flight GPS guided Tomahawk missiles.<a href="#_ftn8" name="_ftnref8">[8]</a> She found that when the computer recommendations were wrong, operators using Level 3 control had a significantly decreased accuracy.</p>
<p><strong>Level 4 is unacceptable</strong> because it does not promote target validation and a short time to veto would reinforce automation bias and leave no room for doubt or deliberation. As the attack will take place <em>unless</em> a human intervenes, this undermines well-established presumptions under international humanitarian law that promote civilian protection.</p>
<p>The time pressure will result in operators neglecting ambiguity and suppressing doubt, inferring and inventing causes and intentions, being biased to believe and confirm, and focusing on existing evidence while ignoring absent but needed evidence. An example of the errors caused by the demands of a fast veto occurred in the 2003 Iraq war, when the U.S. Army&#8217;s Patriot missile system shot down a British Tornado and an American F/A-18, killing three pilots.</p>
<p><strong>Level 5 control</strong> <strong>is unacceptable</strong> as it describes weapons that are autonomous in the critical functions of target selection and the application of violent force.</p>
<p>It should be clear from the above that there are lessons to be drawn both from the psychology of human reasoning and from the literature on human-machine interaction. An understanding of this research is urgently needed to ensure that human-machine interaction is designed to achieve the level of human control needed to comply with international law in all circumstances.</p>
<p><strong>Conclusion: Necessary conditions for meaningful human control of weapons.</strong></p>
<p>A commander or operator should</p>
<ul>
<li>have full contextual and situational awareness of the target area at the time of initiating a specific attack;</li>
<li>be able to perceive and react to any change or unanticipated situations that may have arisen since planning the attack, such as changes in the legitimacy of the targets;</li>
<li>have active cognitive participation in the attack;</li>
<li>have sufficient time for deliberation on the nature of targets, their significance in terms of the necessity and appropriateness of an attack and the likely incidental and possible accidental effects of the attack and…</li>
<li>&#8230;have a means for the rapid suspension or abortion of the attack.</li>
</ul>
<p>&#8212;</p>
<p><a href="#_ftnref1" name="_ftn1">[1]</a> Special thanks to Lucy Suchman, Frank Sauer and Amanda Sharkey and members of ICRAC for helpful comments.</p>
<p><a href="#_ftnref2" name="_ftn2">[2]</a> D. Kahneman 2011: Thinking, Fast and Slow, Penguin Books. He refers to the two processes as System 1 and System 2; these correspond exactly to the terms automatic and deliberative, used here for clarity and consistency.</p>
<p><a href="#_ftnref3" name="_ftn3">[3]</a> J.M. Carroll and M.B. Rosson, ‘Paradox of the active user’, in J.M. Carroll (ed.), Interfacing Thought: Cognitive Aspects of Human-Computer Interaction (MIT Press, 1987), 80–111.</p>
<p><a href="#_ftnref4" name="_ftn4">[4]</a> K.L. Mosier and L.J. Skitka 1996: Human decision makers and automated decision aids: made for each other?, in: Mouloua, M. (Eds.): Automation and Human Performance: Theory and Applications, Lawrence Erlbaum Associates, 201–220.</p>
<p><a href="#_ftnref5" name="_ftn5">[5]</a> C.G. Lord, L. Ross and M. Lepper 1979: ‘Biased assimilation and attitude polarization: the effects of prior theories on subsequently considered evidence’, Journal of Personality and Social Psychology, 47, 1231–1243.</p>
<p><a href="#_ftnref6" name="_ftn6">[6]</a> Kahneman ibid.</p>
<p><a href="#_ftnref7" name="_ftn7">[7]</a> For a more in-depth understanding of these analyses and references see N. Sharkey 2016: Staying in the Loop. Human Supervisory Control of Weapons, in: Bhuta, Nehal et al. (Eds.): Autonomous Weapons Systems. Law, Ethics, Policy. Cambridge University Press, 23-38.</p>
<p><a href="#_ftnref8" name="_ftn8">[8]</a> M.L. Cummings 2006: Automation and Accountability in Decision Support System Interface Design, in: Journal of Technology Studies 32: 1, 23–31.</p>
<p>&nbsp;</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3998</post-id>	</item>
		<item>
		<title>Autonomous Weapon Systems and Strategic Stability</title>
		<link>https://www.icrac.net/autonomous-weapon-systems-and-strategic-stability/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Wed, 20 Sep 2017 16:28:00 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[ICRAC News]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php56-3.dfw3-2.websitetestlink.com/?p=3227</guid>

					<description><![CDATA[Building on arguments previously developed for a blog post, ICRAC’s Juergen Altmann and Frank Sauer discuss the strategic implications of autonomy in weapon systems in more depth in a recently published article in Survival. Here’s an excerpt from the introduction: In July 2015, an open letter from artificial-intelligence experts and roboticists called for a ban on autonomous weapon [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p>Building on arguments previously developed for a <a href="https://icrac.net/2016/04/speed-kills-why-we-need-to-hit-the-brakes-on-killer-robots/">blog</a> <a href="http://duckofminerva.com/2016/04/speed-kills-why-we-need-to-hit-the-brakes-on-killer-robots.html">post</a>, ICRAC’s Juergen Altmann and Frank Sauer discuss the strategic implications of autonomy in weapon systems in more depth in a recently published article in Survival. Here’s an excerpt from the introduction:</p>
<blockquote><p>In July 2015, an <a href="https://futureoflife.org/open-letter-autonomous-weapons/">open letter from artificial-intelligence experts and roboticists</a> called for a ban on autonomous weapon systems (AWS), comparing their revolutionary potential to that of gunpowder and nuclear weapons.</p>
<p>According to a <a href="https://icrac.net/2012/11/dod-directive-on-autonomy-in-weapon-systems/">2012 Pentagon directive</a>, AWS are weapon systems which, ‘once activated … can select and engage targets without further intervention by a human operator’. Proponents of AWS have suggested that they could offer various benefits, from reducing military expenditure to ringing in a new era of more humane and less atrocious warfare. By contrast, critics – some characterising AWS as <a href="http://www.stopkillerrobots.org/">‘killer robots’</a> – expect the accompanying political, legal and ethical risks to outweigh these benefits, and thus argue for a preventive prohibition.</p>
<p>AWS are not yet operational, but decades of military research and development, as well as the growing technological overlap between the rapidly expanding commercial use of artificial intelligence (AI) and robotics, and the accelerating ‘spin-in’ of these technologies into the military realm, make autonomy in weapon systems a possibility for the very near future. Military programmes adapting key technologies and components for achieving autonomy in weapon systems, as well as the development of prototypes and doctrine, are well under way in a number of states.</p>
<p>Accompanying this work is a rapidly expanding body of literature on the various technical, legal and ethical implications of AWS. However, one particularly crucial aspect has – <a href="http://duckofminerva.com/2015/04/the-new-mineshaft-gap-killer-robots-and-the-un.html">with exceptions confirming the rule</a> – received comparably little systematic attention: the potential impact of autonomous weapon systems on global peace and strategic stability. By drawing on Cold War lessons and extrapolating insights from the current military use of remotely controlled unmanned systems, we argue that AWS are prone to proliferation and bound to foment an arms race resulting in increased crisis instability and escalation risks. We conclude that these strategic risks justify a critical stance towards AWS.</p></blockquote>
<p><a href="http://www.tandfonline.com/doi/full/10.1080/00396338.2017.1375263">Read the full article here: Altmann, Jürgen/Sauer, Frank 2017: Autonomous Weapon Systems and Strategic Stability, in: Survival 59 (5): 117-142.</a></p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3227</post-id>	</item>
		<item>
		<title>Reflections on the 2016 CCW Review Conference</title>
		<link>https://www.icrac.net/reflections-on-the-2016-ccw-review-conference/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Wed, 08 Feb 2017 16:19:36 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php56-3.dfw3-2.websitetestlink.com/?p=3225</guid>

					<description><![CDATA[This is a guest post by Anna Khalfaoui. Anna is currently pursuing a LLM at Harvard Law School, having previously studied at Cambridge University and King’s College London. She specialises in public international law and international human rights law. Click here to read this post in braille Reflections on the Review Conference as a newcomer [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><em>This is a guest post by Anna Khalfaoui. Anna is currently pursuing a LLM at Harvard Law School, having previously studied at Cambridge University and King’s College London. She specialises in public international law and international human rights law.</em></p>
<p><a href="https://www.unibw.de/internationalepolitik/professur/team/Sauer/reflections-on-the-2016-ccw-review-conference.dxb/at_download/file">Click here to read this post in braille</a></p>
<p><strong>Reflections on the Review Conference as a newcomer to CCW</strong></p>
<p>The Fifth Review Conference of the Convention on Conventional Weapons (CCW) was a great success for advocates of a ban on fully autonomous weapons. Held at the United Nations in Geneva in December 2016, the Conference was also an opportunity for me to discover and reflect on the processes and challenges of the CCW, to which I was a newcomer.</p>
<div id="attachment_2865" style="width: 250px" class="wp-caption alignleft"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2865" class="wp-image-3231 size-medium alignleft" src="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2017/11/Anna_CCW-225x300.jpg?resize=225%2C300" alt="" width="225" height="300" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/Anna_CCW.jpg?resize=225%2C300&amp;ssl=1 225w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/Anna_CCW.jpg?resize=768%2C1024&amp;ssl=1 768w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/Anna_CCW.jpg?w=960&amp;ssl=1 960w" sizes="auto, (max-width: 225px) 100vw, 225px" /><p id="caption-attachment-2865" class="wp-caption-text">.</p></div>
<p>I became involved when I attended the Conference as part of Harvard Law School’s International Human Rights Clinic (IHRC). I also contributed to a report that IHRC co-published with Human Rights Watch the week before the Review Conference. <a href="https://www.hrw.org/report/2016/12/09/making-case/dangers-killer-robots-and-need-preemptive-ban"><em>Making the Case: The Dangers of Killer Robots and the Need for a Preemptive Ban</em></a> rebuts the major arguments against a prohibition on the development and use of fully autonomous weapons. These weapons, also known as killer robots and lethal autonomous weapons systems, would be able to select and engage targets without human intervention.</p>
<p>The Review Conference was a key step toward a ban because states parties agreed to formalise talks on killer robots by establishing a Group of Government Experts (GGE), which will meet for 10 days in 2017. This GGE creates the expectation of an outcome, as past GGEs have led to the negotiation of new or stronger CCW protocols. It provides a forum for states and experts to discuss the parameters of a possible protocol, which hopefully will take the form of a ban. The Review Conference also showed that support for a ban is gaining traction around the world. Argentina, Panama, Peru and Venezuela joined the call for the first time at the Conference, bringing to 19 the number of states in favour of a ban.</p>
<p>The establishment of a GGE was the news I eagerly awaited the entire week. When the Review Conference opened on December 12, this result did not seem guaranteed. Decisions under the CCW are adopted on the basis of consensus. This means that any state can block progress, and the Russian delegation, from the beginning of the week, forcefully opposed the move to set up a GGE. All other countries that addressed killer robots during the Review Conference explicitly supported establishing such a group. There was something strange about the risk of a single state blocking efforts openly promoted by numerous countries, and I wondered whether, faced with the threat of isolation, it would actually do so. Ultimately, this opposition appears to have been overcome by overwhelming support for more formal discussions.</p>
<div id="attachment_2865" style="width: 250px" class="wp-caption alignright"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2865" class="size-medium wp-image-3230 alignright" src="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2017/11/IMG_20161216_160200-225x300.jpg?resize=225%2C300" alt="" width="225" height="300" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_160200.jpg?resize=225%2C300&amp;ssl=1 225w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_160200.jpg?resize=768%2C1024&amp;ssl=1 768w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_160200.jpg?w=2000&amp;ssl=1 2000w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_160200.jpg?w=3000&amp;ssl=1 3000w" sizes="auto, (max-width: 225px) 100vw, 225px" /><p id="caption-attachment-2865" class="wp-caption-text">.</p></div>
<p>I first heard about fully autonomous weapons when I joined IHRC in September. At the Review Conference, I realised how invested I had become in this issue and how relieved I was when, on Friday, it became clear that Russia was not going to block a GGE. Fully autonomous weapons are still only under development. Yet, because they have the potential to dramatically change the way that wars are fought, it is incumbent upon us to address the dangers they pose before they find their way into military arsenals and onto the battlefield.</p>
<p>Several other points caught my attention throughout the week.</p>
<p>Firstly, I joined the Review Conference as part of the Campaign to Stop Killer Robots, an international coalition of non-governmental organisations (NGOs) working towards a preemptive ban on these weapons. In this capacity, I found it interesting and encouraging to observe the role played by civil society at the Review Conference, which included advocacy, releasing research publications and making statements during the sessions. In their public remarks, state representatives often explicitly acknowledged the work of specific NGOs and experts and the importance of civil society engagement in the dialogue. Many diplomats also attended side events organised by the Campaign, such as one on the need to adopt a ban rather than a regulatory approach to deal with the dangers associated with killer robots. In the never-ending discussions about the correct balance to strike between military interests and humanitarian concerns, civil society has a vital role to play in emphasising the importance of humanitarian protection and pushing states to adopt ambitious goals. Civil society’s efforts are all the more important when it comes to killer robots, which have the potential to revolutionise warfare and raise deep ethical questions.</p>
<div id="attachment_2865" style="width: 250px" class="wp-caption alignleft"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2865" class="size-medium wp-image-3229 alignleft" src="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2017/11/IMG_20161216_122851-225x300.jpg?resize=225%2C300" alt="" width="225" height="300" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_122851.jpg?resize=225%2C300&amp;ssl=1 225w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_122851.jpg?resize=768%2C1024&amp;ssl=1 768w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_122851.jpg?w=2000&amp;ssl=1 2000w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_122851.jpg?w=3000&amp;ssl=1 3000w" sizes="auto, (max-width: 225px) 100vw, 225px" /><p id="caption-attachment-2865" class="wp-caption-text">.</p></div>
<p>Secondly, I was surprised and concerned by the limited media coverage of the Review Conference, especially given that a Review Conference happens only once every five years and addresses matters of global concern. Discussions about killer robots should take into account the views of the public at large: delegating decisions about the use of lethal force to machines raises fundamental moral and ethical questions, and international law prohibits weapons that run counter to the dictates of the public conscience. Media coverage is important to raise the public’s awareness and facilitate its involvement in the debate. Civil society can contribute by engaging with the media and disseminating information about emerging weapons technologies that have the potential to affect societies and the world we live in. In so doing, civil society can promote media scrutiny and public participation and thereby put greater pressure on states to be ambitious and adopt encompassing solutions.</p>
<p>Finally, much of the debate at the Conference concentrated on the issue of finances. Financial constraints forced some discussions to take place in an informal setting without the use of official translators, and dozens of countries throughout the week expressed concern about the financial difficulties facing the CCW. Given that the Conference lasted only five days, it was regrettable that financial discussions took time away from the substantive issues. If this pattern continues, it risks undermining the effectiveness and impact of the GGE in 2017 and of the CCW as a whole. States parties should therefore take steps to resolve the situation by making their financial contributions as soon as possible.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3225</post-id>	</item>
		<item>
		<title>Arms Control for AWS: 2016 and beyond</title>
		<link>https://www.icrac.net/3219-2/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Wed, 07 Dec 2016 16:07:58 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[Opinion]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php56-3.dfw3-2.websitetestlink.com/?p=3219</guid>

					<description><![CDATA[After three informal meetings of experts, the Convention on Certain Conventional Weapons, during its Fifth Review Conference taking place from 12 to 16 December 2016 in Geneva, will decide on how to continue work on arms control for autonomous weapon systems. Below is a preview to an article published in the October 2016 issue of Arms Control [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p>After three informal meetings of experts, the Convention on Certain Conventional Weapons, during its <a href="http://www.unog.ch/80256EE600585943/(httpPages)/9F975E1E06869679C1257F50004F7E8C?OpenDocument">Fifth Review Conference taking place from 12 to 16 December 2016 in Geneva</a>, will decide on how to continue work on arms control for autonomous weapon systems. Below is a preview to an article published in the October 2016 issue of <a href="https://www.armscontrol.org/ACT/2016_10/Features/Stopping-Killer-Robots-Why-Now-Is-the-Time-to-Ban-Autonomous-Weapons-Systems">Arms Control Today</a>, outlining the perspectives for future AWS arms control.</p>
<p>Sauer, Frank 2016: Stopping ‘Killer Robots’: Why Now Is the Time to Ban Autonomous Weapons Systems, in: Arms Control Today 46 (8): 8-13.</p>
<p><a href="https://www.armscontrol.org/ACT/2016_10/Features/Stopping-Killer-Robots-Why-Now-Is-the-Time-to-Ban-Autonomous-Weapons-Systems">Click here to read the full article</a>.</p>
<p><a href="https://www.unibw.de/internationalepolitik/professur/team/Sauer/Why%20Now%20Is%20the%20Time%20to%20Ban%20AWS%20-braille.brf/at_download/file">NEW: Click here for the BRF file of the full article</a></p>
<blockquote><p>[F]our possible outcomes can be predicted for the CCW process. The first would be a legally binding and preventive multilateral arms control agreement derived by consensus in the CCW and thus involving the major stakeholders, the outcome referenced as “a ban.” Considering the growing number of states-parties calling for a ban and the large number of governments calling for meaningful human control and expressing considerable unease with the idea of autonomous weapons systems, combined with the fact that no government is openly promoting their development, this seems possible. It would require mustering considerable political will. Verification and compliance for a ban, as well as for weaker restrictions, would then require creative arms control solutions. After all, with full autonomy in a weapons system eventually coming down to merely flipping a software switch, how can one tell if a specific system at a specific time is not operating autonomously? A few arms control experts are already wrapping their heads around these questions.</p>
<blockquote class="twitter-tweet" data-lang="en">
<p dir="ltr" lang="en">Can <a href="https://twitter.com/hashtag/KillerRobots?src=hash&amp;ref_src=twsrc%5Etfw">#KillerRobots</a> (autonomous weapons systems) work as preventive arms control? More in October&#8217;s <a href="https://twitter.com/hashtag/ArmsControlToday?src=hash&amp;ref_src=twsrc%5Etfw">#ArmsControlToday</a> <a href="https://t.co/E7sDVzdmbn">https://t.co/E7sDVzdmbn</a> <a href="https://t.co/LwPSojH9Gr">pic.twitter.com/LwPSojH9Gr</a></p>
<p>— Arms Control Assoc (@ArmsControlNow) <a href="https://twitter.com/ArmsControlNow/status/786600390020194304?ref_src=twsrc%5Etfw">October 13, 2016</a></p></blockquote>
<p><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script></p>
<p>The second outcome would be restrictions short of a ban. The details of such an agreement are impossible to predict, but it is conceivable that governments could agree, for example, to limit the use of autonomous weapons systems, such as permitting their use against materiel only.</p>
<p>The third would be a declaratory, nonbinding agreement on best practices. Such a code of conduct would likely emphasize compliance with existing international humanitarian law and rigorous weapons review processes, in accordance with Article 36 of Additional Protocol I to the Geneva Conventions.</p>
<p>Finally, there may be no tangible result, perhaps with one of the technologically leading countries setting a precedent by fielding autonomous weapons systems. That would certainly prompt others to follow, fueling an arms race. In light of some of the most advanced standoff weapons, such as the U.S. Long Range Anti-Ship Missile or the UK Brimstone, each capable of autonomous targeting during the terminal flight phase, one might argue that the world is already headed for such an autonomy arms race.</p>
<p>Implementing autonomy, which mainly comes down to software, in systems drawn from a vibrant global ecosystem of unmanned vehicles in various shapes and sizes is a technical challenge, but doable for state and nonstate actors, particularly because so much of the hardware and software is dual use. In short, autonomous weapons systems are extremely prone to proliferation. An unchecked autonomous weapons arms race and the diffusion of autonomous killing capabilities to extremist groups would clearly be detrimental to international peace, stability, and security.</p>
<blockquote class="twitter-tweet" data-lang="en">
<p dir="ltr" lang="en">The nascent social taboo against machines autonomously making kill decisions &#8211; Frank Sauer in <a href="https://twitter.com/ArmsControlNow?ref_src=twsrc%5Etfw">@ArmsControlNow</a> <a href="https://t.co/nBTGtXLT5R">https://t.co/nBTGtXLT5R</a> <a href="https://twitter.com/hashtag/CCWUN?src=hash&amp;ref_src=twsrc%5Etfw">#CCWUN</a></p>
<p>— Mary Wareham (@marywareham) <a href="https://twitter.com/marywareham/status/788723233709101056?ref_src=twsrc%5Etfw">October 19, 2016</a></p></blockquote>
<p><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script></p>
<p>This underlines the importance of the current opportunity for putting a comprehensive, verifiable ban in place. The hurdles are high, but at this point, a ban is clearly the most prudent and thus desirable outcome. After all, as long as no one possesses them, a verifiable ban is the optimal solution. It stops the currently commencing arms race in its tracks, and everyone reaps the benefits. A prime goal of arms control would be fulfilled by facilitating the diversion of resources from military applications toward research and development for peaceful purposes—in the fields of AI and robotics no less, two key future technologies.</p></blockquote>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3219</post-id>	</item>
		<item>
		<title>Speed kills! Why we need to hit the brakes on “killer robots”</title>
		<link>https://www.icrac.net/speed-kills-why-we-need-to-hit-the-brakes-on-killer-robots/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Fri, 08 Apr 2016 17:42:09 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[Slider]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2875</guid>

					<description><![CDATA[by Juergen Altmann and Frank Sauer This analysis originally appeared as a guest post on duckofminerva.com Autonomous weapon systems: rarely has an issue gained the attention of the international arms control community as quickly as these so-called killer robots. “Once activated, they can select and engage targets without further intervention by a human operator“, according [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<div class="entry">
<p><em>by Juergen Altmann and Frank Sauer</em></p>
<p><em>This analysis originally appeared as a guest post on <a href="http://duckofminerva.com/2016/04/speed-kills-why-we-need-to-hit-the-brakes-on-killer-robots.html">duckofminerva.com</a></em></p>
<p>Autonomous weapon systems: rarely has an issue gained <a href="http://www.cornellpress.cornell.edu/book/?GCOI=80140100234530&amp;fa=author&amp;person_id=4999">the attention of the international arms control community</a> as quickly as these so-called killer robots. “Once activated, they can select and engage targets without further intervention by a human operator“, according to the <a href="http://www.dtic.mil/whs/directives/corres/pdf/300009p.pdf">Pentagon</a>. They are, judging from the skepticism prevalent in <a href="http://futureoflife.org/open-letter-autonomous-weapons/">epistemic communities</a> and <a href="http://duckofminerva.com/2016/03/building-social-science-knowledge-on-public-attitudes-and-autonomous-weapons.html">public opinion</a> alike, a controversial development.</p>
<p>Come next Monday, the United Nations in Geneva will begin its third informal <a href="http://www.unog.ch/80256EE600585943/%28httpPages%29/37D51189AC4FB6E1C1257F4D004CAFB2?OpenDocument">experts meeting</a> on this emerging arms technology. For the third year in a row, various technical, legal and ethical questions surrounding autonomous weapons will be discussed at the UN’s Convention on Certain Conventional Weapons (CCW): Where does <a href="http://duckofminerva.com/2015/01/autonomous-or-semi-autonomous-weapons-a-distinction-without-difference.html">autonomy</a> begin, where does <a href="http://www.article36.org/wp-content/uploads/2014/05/A36-CCW-May-2014.pdf">meaningful human control</a> end? Can these systems function in compliance with <a href="https://www.icrc.org/en/document/lethal-autonomous-weapons-systems-LAWS">international humanitarian law</a>? Who is <a href="https://www.hrw.org/news/2015/04/08/killer-robots-accountability-gap">accountable</a> if things go awry? Can “outsourcing” kill-decisions to machines be <a href="https://www.icrc.org/eng/resources/documents/article/review-2012/irrc-886-asaro.htm">morally acceptable</a> in the first place?</p>
<p>Depending on how CCW States Parties answer these questions, the still nascent social taboo that forbids the use of machines autonomously making kill-decisions might spawn a <a href="http://onlinelibrary.wiley.com/doi/10.1111/1468-2346.12186/abstract">human security regime</a> and be codified in a CCW protocol. In short, a ban might be in the cards for killer robots.</p>
<p>And in fact, there is an additional set of compelling reasons for preventive arms control that received comparably less attention so far (with <a href="http://duckofminerva.com/2015/04/the-new-mineshaft-gap-killer-robots-and-the-un.html">notable exceptions</a>, of course): the impact of killer robots on peace and stability.</p>
<p><strong>Stability: not a Cold War relic</strong></p>
<p>Stability became a key notion in Cold War international thought for two reasons. First, the arms race. Arms competition instability exists if the classic dynamic of one side deploying systems which lead adversaries to respond in kind and vice versa goes unchecked, with horizontal and vertical proliferation in tow. Crises were the second reason. Crisis instability exists if there are significant incentives to <em>initiate</em> an attack quickly. These can also arise when (conventional) war is already underway, hastening the escalation to higher levels of conflict, potentially even across the nuclear threshold due to a “use them or lose them” situation.</p>
<p>The vicious cycle of an uncurbed arms race as well as the dangers of <a href="http://www.palgrave.com/de/book/9781137533739">overboiling crises and deterrence failure</a> – backed up by the <a href="http://press.princeton.edu/titles/5301.html">accidental nuclear war</a> scares caused by early-warning slipups and <a href="http://www.penguinrandomhouse.com/books/303337/command-and-control-by-eric-schlosser/9780143125785/">human error</a> – provided cautionary tales and fueled the strive for stability via arms control during the Cold War, not only in the nuclear but also in the conventional realm with the Conventional Armed Forces in Europe (CFE) Treaty. IR and arms control literature documents these lessons. They carry over to the dawning age of autonomous weapons.</p>
<p><strong>Proliferation and arms race instability</strong></p>
<p>Strictly speaking, autonomous weapons do not exist yet. They are not to be confused with automatic defense systems capable of “firing without a human in the loop”. These are stationary or fixed on ships or trailers and mostly fire at inanimate targets such as incoming munitions. More importantly, they just repeatedly perform pre-programmed actions and operate in a comparably structured and controlled environment.</p>
<p>Autonomous weapon systems, in contrast, would have their own means of propulsion and be able to operate without human control or supervision in dynamic, unstructured, open environments over an extended period of time, potentially learning and adapting their behavior on the go. The military advantages – compared to today’s remotely piloted systems – are obvious. Think future autonomous combat drone sent off to seek, identify, track and attack targets on its own, and you’re spot on. They are called killer robots for a reason.</p>
<p>The <a href="http://sdi.sagepub.com/content/43/4/363.abstract">drone sector</a> gives an indication of what to expect. Between 2001 and 2015, the <a href="http://securitydata.newamerica.net/world-drones.html">number of countries with armed drones</a> has increased from two to ten (add Hamas and Hezbollah to that), and at least 11 countries are currently developing them.</p>
<p>Meanwhile, everything points toward weapon autonomy as the next logical step. The US, with its newly stated <a href="http://duckofminerva.com/2015/12/the-self-fulfilling-prophecy-of-high-tech-war.html">third offset strategy</a> explicitly embraces autonomy to achieve military-technological superiority and is consequently leading the way <a href="http://warisboring.com/articles/the-navys-first-carrier-drone-will-be-a-flying-gas-tank/">in the air</a>, <a href="http://www.defensenews.com/story/defense/land/army/2015/04/08/us-army-readying-unmanned-systems-doctrine/25473749/">on the ground</a>, <a href="https://www.youtube.com/watch?v=ITTvgkO2Xw4">on the sea</a> and <a href="http://www.naval-technology.com/features/feature-new-era-underwater-drones-unmanned-systems/">below it</a>. And while the US is the only country to have introduced a <a href="http://www.dtic.mil/whs/directives/corres/pdf/300009p.pdf">doctrine</a> for the deployment and use of autonomous weapon systems, claiming restraint, Deputy Secretary of Defense Bob Work just recently stated that <a href="http://www.defensenews.com/story/defense/land/army/2016/03/30/bob-work-autonomy-flight-ground-systems-robot-ai/82427024/">the delegation of lethal authority will inexorably happen</a>.</p>
<p>Absent an international ban, one would expect others to follow that lead. After all, who would allow a “<a href="http://duckofminerva.com/2015/04/the-new-mineshaft-gap-killer-robots-and-the-un.html">killer robot gap</a>”? Especially considering that implementing autonomy in already existing systems in a vibrant ecosystem of unmanned vehicles in various shapes and sizes is not the equivalent of starting a nuclear program from scratch – it’s a technical challenge, yes, but doable, particularly with significant portions of the hard- and software being dual-use. And we are not even considering technology export yet. In short, an unchecked robotics arms race is in the making – with weapons potentially proliferating to everyone, including <a href="http://duckofminerva.com/2016/03/autonomous-weapons-and-incentives-for-oppression.html">oppressive regimes</a> and <a href="http://duckofminerva.com/2015/04/the-new-mineshaft-gap-killer-robots-and-the-un.html">non-state actors</a>.</p>
<p><strong>Crisis escalation and instability</strong></p>
<p>Autonomous weapons are commonly projected as systems of systems operating in <a href="http://www.cnas.org/the-coming-swarm#.VTVl3Fwo1A8">swarms</a>. With that in mind, imagine a severe crisis, the swarms of adversaries operating in close proximity to each other. A coordinated attack of one could wipe out the other within missile flight time – that is, seconds. The control software would have to react fast in order to use its weapons before they are lost. Sun glint in visual data misinterpreted as a rocket flame, sudden, unforeseen moves of the enemy swarm, a simple software bug could trigger an erroneous “counter”-attack. And while this could happen on a small scale at first, the sequence of events developing from two autonomous systems of systems interacting at rapid speed could never be trained for, tested, or, really, foreseen. The <a href="http://www.cnas.org/sites/default/files/publications-pdf/CNAS_Autonomous-weapons-operational-risk.pdf">stock market</a> provides cautionary tales of such unforeseeable algorithm interactions. Introducing algorithms in conflict bears an enormous risk of uncontrolled escalation from crisis to war.</p>
<p>In addition, swarms of autonomous weapons would generate new possibilities for disarming surprise attacks. Small, stealthy or extremely low-flying systems are difficult to detect, the absence of a remote-control radio link makes detection even harder. Russia already was not very amused when the idea of using <a href="http://thebulletin.org/2010/novemberdecember/how-us-strategic-antimissile-defense-could-be-made-work-1">stealthy drones for missile defense</a> was floated in the US. It’s easy to see why. When nuclear weapons or strategic command-and-control systems are, or are perceived to be, put at risk by undetectable swarms that are hard to defend against, autonomous conventional capabilities end up causing instability at the strategic level.</p>
<p><strong>Hitting the brakes</strong></p>
<p>The case of autonomous weapon systems is not one of “<em>we</em> need them because <em>they</em> have them”. After all, no one has them – yet. We would be well-advised to keep it this way. Preventive arms control is prudent. Not only would a ban curb the looming arms race; it would also prevent the excessive acceleration of battle that threatens to escape human understanding and undermine the possibility of staying in control during crises. Sometimes humans make mistakes, and humans are <a href="http://duckofminerva.com/2016/02/strategic-surprise-or-the-foreseeable-future.html">slower than machines</a>. But <a href="http://www.cnas.org/sites/default/files/publications-pdf/CNAS_Autonomous-weapons-operational-risk.pdf">when things threaten to get out of hand, slow is good</a>. That is why we need to hit the brakes now.</p>
<p><em>ICRAC’s Juergen Altmann, PhD, Researcher and Lecturer, Technische Universität Dortmund, </em><em>is a physicist and peace researcher specialized in the assessment of military technology and preventive arms control. He was among the first scholars to study the </em><a href="http://sdi.sagepub.com/content/35/1/61.abstract"><em>military uses of nanotechnology</em></a><em>.</em></p>
<p><em>ICRAC’s Frank Sauer, PhD, Senior Research Fellow and Lecturer, Bundeswehr University Munich, is an International Relations scholar focusing on issues of international security. He is the author of </em><a href="http://www.palgrave.com/de/book/9781137533739"><em>Atomic Anxiety: Deterrence, Taboo and the Non-Use of U.S. Nuclear Weapons</em></a><em>. </em></p>
</div>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2875</post-id>	</item>
		<item>
		<title>LAWS: An Open Letter from AI &#038; Robotics Experts</title>
		<link>https://www.icrac.net/laws-an-open-letter-from-ai-robotics-experts/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Wed, 29 Jul 2015 08:15:26 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[ICRAC in the media]]></category>
		<category><![CDATA[YouTube video]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2858</guid>

					<description><![CDATA[Thousands of experts in artificial intelligence, robotics and related professions have signed an open letter, hosted by the Future of Life Institute, calling for a ban on autonomous weapons that select and engage targets without human intervention. You can read more on the importance of this letter to the current global effort of banning lethal [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p>Thousands of experts in artificial intelligence, robotics and related professions have signed an <a href="http://futureoflife.org/AI/open_letter_autonomous_weapons#signatories">open letter</a>, hosted by the <a href="http://futureoflife.org/">Future of Life Institute</a>, calling for a ban on autonomous weapons that select and engage targets without human intervention.</p>
<p>You can read more on the importance of this letter to the current global effort of banning lethal autonomous weapon systems (LAWS) on the <a href="http://www.stopkillerrobots.org/2015/07/aicall/">Campaign to Stop Killer Robots website</a>.</p>
<p>ICRAC is committed to the peaceful uses of robotics and the regulation of robot weapons. We welcome this open letter, and over the last couple of days numerous members of ICRAC have not only signed it but also commented on this latest development in various media. Below is a selection of audio and video content in English, Italian and German.</p>
<p><iframe loading="lazy" src="http://www.cnet.com/videos/share/id/kje1bJWKBsCFql0OOXKhKB1Rtqf46pvn/" width="600" height="350" frameborder="0" seamless="seamless" allowfullscreen="allowfullscreen"></iframe></p>
<p><strong>CNET – Ban autonomous weapons, urge AI experts including Hawking, Musk and Wozniak</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Noel Sharkey and Thomas Nash</a>)<br />
Release date: 27 Jul 2015</p>
<p><iframe loading="lazy" src="https://s.embed.live.huffingtonpost.com/HPLEmbedPlayer/?segmentId=55aef80bfe34445cec000164&amp;autoPlay=false" width="570" height="321" frameborder="0"></iframe></p>
<p><strong>Huffpost Live – A.I. Experts Push For Military Robot Ban</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Heather Roff and Ian Kerr</a>)<br />
Hundreds of artificial intelligence experts and revered thinkers, including Stephen Hawking and Elon Musk, are calling for a global ban on military robots. We explore the issue and whether these autonomous weapons could lower the threshold for war.<br />
Release date: 28 Jul 2015</p>
<p><iframe loading="lazy" src="https://www.youtube.com/embed/XIpmztX1X68" width="560" height="315" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p><strong>CTV – Will the growth of killer robots set off a global arms race?</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Ian Kerr</a>)<br />
In this 5 minute interview with Canada AM host, Beverly Thompson, Ian Kerr discusses the the difference between semi-autonomous and autonomous weapons, the call to ban killer robots, why he is a signatory to the open letter, and why efficacy is not the only consideration in deciding whether to ban a dangerous use of technology.<br />
Release date: 04 Aug 2015</p>
<p><iframe loading="lazy" src="http://www.bbc.co.uk/programmes/p02y817k/player" width="400" height="500" frameborder="0"></iframe></p>
<p><strong>BBC World Service – Fighting off ‘Killer Robots’</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Heather Roff</a>)<br />
More than 1,000 tech experts, scientists and researchers have written a letter warning about the dangers of autonomous weapons and cautioning that a ‘military AI race is a bad idea’. One signatory, Heather Roff Perkins from the University of Denver, spoke to the BBC’s Dominic Laurie.<br />
Release date: 28 Jul 2015</p>
<p><iframe loading="lazy" src="https://w.soundcloud.com/player/?url=https%3A//api.soundcloud.com/tracks/217772360&amp;color=ff5500" width="100%" height="166" frameborder="no" scrolling="no"></iframe></p>
<p><strong>Killer robots: the coming arms race?</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Peter Asaro</a>)</p>
<p>Are you worried about killer robots? Last week, some of the most prominent thinkers in science and technology signed an open letter that warned of the coming arms race should militaries pursue the development and deployment of artificially intelligent weaponry. The letter was written by The Campaign to Stop Killer Robots, an international coalition of NGOs, and was signed by almost 14,000 people, including Stephen Hawking, Elon Musk, and Steve Wozniak. Today, we discuss the threat of fully autonomous weapons and artificially-intelligent warfare with PETER ASARO, professor of media studies at The New School and spokesperson for The Campaign to Stop Killer Robots, roboticist and Georgia Tech professor RONALD ARKIN, as well as GEORGE ZARKADAKIS, writer and AI architect.</p>
<p>Release date: 6 Aug 2015</p>
<p><iframe loading="lazy" src="https://w.soundcloud.com/player/?url=https%3A//api.soundcloud.com/tracks/218308595&amp;color=ff5500" width="100%" height="166" frameborder="no" scrolling="no"></iframe></p>
<p><strong>Ban Killer Robots Interview – Calgary Newstalk 770</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Ian Kerr</a>)<br />
In this 25-minute interview with Roger Kinkade and Rob Breakenridge of Newstalk770, CHQR Radio, Ian Kerr talks at length about autonomous weapons and the AI community’s call to ban “killer robots”. The conversation covers what killer robots are, why they are likely to be developed, what the dangers are if we don’t ban them, and a number of broader issues regarding the future of artificial intelligence.<br />
Release date: 8 Aug 2015</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2858</post-id>	</item>
		<item>
		<title>Banning Lethal Autonomous Weapon Systems (LAWS): The way forward</title>
		<link>https://www.icrac.net/banning-lethal-autonomous-weapon-systems-laws-the-way-forward/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Fri, 13 Jun 2014 06:10:11 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2475</guid>

					<description><![CDATA[With ICRAC’s 2009 mission statement fulfilled and the issue of fully autonomous weapon systems picked up by the international community at the United Nations Convention on Conventional Weapons (CCW) in Geneva, ICRAC and the Campaign to Stop Killer Robots can celebrate a first success (Read more about this here and here and see the ICRAC [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p>With <a href="http://icrac.net/2014/05/icrac-celebrates-successful-fulfillment-of-its-2009-mission/">ICRAC’s 2009 mission statement fulfilled</a> and the <a href="http://icrac.net/2013/11/campaign-to-stop-killer-robots-takes-significant-step-forward-at-un/">issue of fully autonomous weapon systems picked up by the international community</a> at the United Nations <a href="http://www.unog.ch/80256EE600585943/%28httpPages%29/4F0DEF093B4860B4C1257180004B1B30?OpenDocument">Convention on Conventional Weapons</a> (CCW) in Geneva, ICRAC and the<a href="http://www.stopkillerrobots.org/2014/05/ccwexperts/"> Campaign to Stop Killer Robots</a> can celebrate a first success (Read more about this <a href="http://icrac.net/2014/03/lethal-autonomous-robots-and-the-un-convention-on-conventional-weapons/">here</a> and <a href="http://icrac.net/2014/04/the-road-to-geneva-icrac-and-the-campaign-headed-to-ccw/">here</a> and see the <a href="http://icrac.net/category/icrac-statements/">ICRAC statements</a> delivered in Geneva).</p>
<p>But we are only at the beginning. There is lots of work left to do in regard to raising awareness, clarifying the issue, fleshing out the details of the concepts involved, and moving the debate forward at both the CCW and other international fora.</p>
<p><strong>Some background: LARS or LAWS or Killer Robots? </strong></p>
<div id="attachment_2477" style="width: 182px" class="wp-caption alignleft"><a href="https://i0.wp.com/www.icrac.net.php53-3.dfw1-2.websitetestlink.com/wp-content/uploads/2014/06/Phalanx_CIWS-e1434435380807.jpg"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2477" class=" wp-image-2477" src="https://i0.wp.com/www.icrac.net.php53-3.dfw1-2.websitetestlink.com/wp-content/uploads/2014/06/Phalanx_CIWS-225x300.jpg?resize=172%2C229" alt="PHALANX – Automatic defensive systems to protect human life are not the problem. Source: Wikimedia Commons " width="172" height="229" /></a><p id="caption-attachment-2477" class="wp-caption-text">PHALANX – Automatic defensive systems to protect human life are not the problem. Source: Wikimedia Commons</p></div>
<p>Obviously, militaries all over the world already deploy systems which operate “on their own”, but these systems are currently confined to defensive functions such as the interception of rockets, artillery fire and mortars, either ship-based or stationary on land. The most prominent ones are <a href="http://en.wikipedia.org/wiki/Phalanx_CIWS">PHALANX</a>, <a href="http://en.wikipedia.org/wiki/MIM-104_Patriot">PATRIOT</a>, <a href="http://en.wikipedia.org/wiki/Iron_Dome">IRON DOME</a> and <a href="http://en.wikipedia.org/wiki/N%C3%A4chstbereichschutzsystem_MANTIS">MANTIS</a> (see also <a href="http://www.hrw.org/sites/default/files/reports/arms1112_ForUpload.pdf">Human Rights Watch 2012</a>), designed for use against such inanimate targets, if necessary, without human intervention (the rationale being that there may not be enough time for a human response). However, these defensive systems operate <em>automatically</em> rather than <em>autonomously</em>, simply performing repeated pre-programmed actions.</p>
<p>To distinguish them from these precursors, weapons systems are described as autonomous if they operate without human control or supervision, perhaps over a longer period of time, in dynamic, unstructured, open environments. In other words, these are mobile (assault) weapons platforms which are equipped with on-board sensors and decision-making algorithms, enabling them to guide themselves. As they could potentially have the autonomous capability to identify, track and attack humans or living targets, they are known as “lethal autonomous robots” (<strong>LARS</strong>) or, to use the CCW’s current terminology, “lethal autonomous weapons systems” (<strong>LAWS</strong>). Of course, one might just call them <strong>Killer Robots</strong> for short – because that is essentially what they are.</p>
<p><strong>Why “autonomy” in mobile weapon systems </strong>–<strong> and what are the problems with LAWS?</strong></p>
<p>As of today, the drive towards more autonomy is most apparent in applications underwater or in the air – in other words, in less complex but more inaccessible environments.<br />
In a nutshell, three driving factors can be identified, each put forward by proponents of LAWS.</p>
<ol>
<li>Transferring all the decision-making to the weapons system offers various benefits from a military perspective. After all, there is no longer any need for a control and communication link, which is vulnerable to disruption or capture and may well reveal the system’s location, and in which there is invariably some delay between the issuing of the command by the responsible person and the execution of the command. The time benefits already afforded by defensive systems are also valuable from a tactical perspective during military assaults. In the drone sector, a number of research and technology demonstrator programs have therefore been launched to develop (more) autonomous systems; examples are the <a href="http://en.wikipedia.org/wiki/Northrop_Grumman_X-47B">X-47B</a> in the US, <a href="http://en.wikipedia.org/wiki/BAE_Systems_Taranis">Taranis</a> in the UK, and the French <a href="http://en.wikipedia.org/wiki/Dassault_nEUROn">nEUROn</a> project.</li>
<li>As autonomous systems are immune to fear, stress and overreactions, some observers believe that they offer the prospect of more humane warfare (see <a href="http://icrac.net/2013/11/georgie-tech-techdebate-ron-arkin-vs-icracs-rob-sparrow/">Ron Arkin’s arguments</a> in his debate with ICRAC’s co-founder Rob Sparrow). Proponents further argue that not only are machines devoid of negative human emotions; they also lack a self-preservation instinct, so could well delay returning fire in extreme cases. All in all, this, it is argued, could prevent some of the atrocities of war.</li>
<li>Some observers draw attention to the superior efficiency of LAWS and their cost-cutting potential, especially due to the reduced need for personnel.</li>
</ol>
<p>But the <a href="http://stopkillerrobots.org/the-problem/">problems</a> raised by LAWS are manifold.</p>
<div id="attachment_2479" style="width: 251px" class="wp-caption alignright"><a href="https://i0.wp.com/www.icrac.net.php53-3.dfw1-2.websitetestlink.com/wp-content/uploads/2014/06/X-47B_110204-F-1162D-119.jpg"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2479" class=" wp-image-2479" src="https://i0.wp.com/www.icrac.net.php53-3.dfw1-2.websitetestlink.com/wp-content/uploads/2014/06/X-47B_110204-F-1162D-119-300x199.jpg?resize=241%2C160" alt="The US X-47B technology demonstrator. Source: Wikimedia Commons" width="241" height="160" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2014/06/X-47B_110204-F-1162D-119.jpg?resize=300%2C199&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2014/06/X-47B_110204-F-1162D-119.jpg?resize=1024%2C680&amp;ssl=1 1024w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2014/06/X-47B_110204-F-1162D-119.jpg?w=2000&amp;ssl=1 2000w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2014/06/X-47B_110204-F-1162D-119.jpg?w=3000&amp;ssl=1 3000w" sizes="auto, (max-width: 241px) 100vw, 241px" /></a><p id="caption-attachment-2479" class="wp-caption-text">The US X-47B technology demonstrator. Source: Wikimedia Commons</p></div>
<p>From a <strong>military </strong>perspective, there is a certain amount of tension between autonomous systems and military leadership structures. In fact, support from the military at the Expert Meeting in Geneva was noticeably muted. For them, it was mostly about toying with ideas and looking at ways of deploying these systems in strictly controlled scenarios – for example as anti-materiel weapons. But these scenarios remain highly artificial: it is entirely unclear how the use of force could be restricted to enemy military hardware alone in the chaos of battle. Also, with much of the military robotics revolution driven by commercial off-the-shelf technology (see below), the risks of proliferation and potentially destabilizing effects on peace and security quickly come to mind in the wider <strong>politico-military</strong> context (<a href="http://icrac.net/2013/04/arms-control-for-uninhabited-vehicles-detailed-study/">Altmann 2013</a>). And here another downside of unmanned systems in general – the lowering of the threshold for the use of military force (<a href="http://sdi.sagepub.com/content/43/4/363.abstract">Sauer/Schoernig 2012</a>) – might become an even more vexing problem with LAWS in the future.</p>
<p>From an <strong>international law</strong> perspective, there is considerable doubt that LAWS would be capable of distinguishing between civilians and combatants and of ensuring that the military use of force is proportionate. Numerous international law and robotics experts doubt that it is possible, in the foreseeable future, to pre-programme machines to abide by international law in the notoriously grey area of decision-making in times of war (<a href="http://www.icrc.org/eng/resources/documents/article/review-2012/irrc-886-sharkey.htm">Sharkey 2012</a>; <a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2211036">Wagner 2012</a>). A further objection to LAWS is that the body of international law is based on the premise of <em>human</em> agency; it is therefore unclear who would be legally responsible and accountable if people – particularly civilians – were unlawfully injured or killed by LAWS (<a href="http://profiles.arts.monash.edu.au/rob-sparrow/download/KillerrobotsForWeb.pdf">Sparrow 2007</a>). Lastly, the <em>Martens Clause</em>, which forms part of customary international law, holds that in cases not (yet) covered in the regulations adopted in international law, the principles of the laws of humanity and the dictates of the public conscience apply. And in fact, the general public has serious concerns about LAWS. The findings of a representative survey, unfortunately available only for the US at the moment, show that a majority (55%) of Americans are opposed to autonomous weapons on humanitarian grounds, with 40% “strongly opposed” (<a href="http://www.foreignaffairs.com/articles/139554/charli-carpenter/beware-the-killer-robots">Carpenter 2014</a>).</p>
<p>What follows from this is that the <strong>ethical</strong> dimension may well pose the greatest problem regarding LAWS. In short, it stands to reason that giving machines the power to decide on the use of force against people violates basic principles of humanity and is, <em>per se</em>, unacceptable (<a href="http://icrac.net/2012/11/the-principle-of-humanity-in-conflict/">Gubrud 2012</a>; <a href="http://www.icrc.org/eng/assets/files/review/2012/irrc-886-asaro.pdf">Asaro 2012</a>). In fact, the report of the CCW Expert Meeting emphasised this very point, stressing that LAWS could end up undermining human dignity, as these systems cannot understand or respect the value of human life, yet would have the power to determine when to take it away.</p>
<p><strong>LAWS on the international agenda: The challenges involved</strong></p>
<div id="attachment_2482" style="width: 310px" class="wp-caption alignleft"><a href="https://i0.wp.com/www.icrac.net.php53-3.dfw1-2.websitetestlink.com/wp-content/uploads/2014/06/Opinion_Ideology_AWS1.png"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2482" class="wp-image-2482 size-medium" src="https://i0.wp.com/www.icrac.net.php53-3.dfw1-2.websitetestlink.com/wp-content/uploads/2014/06/Opinion_Ideology_AWS1-300x192.png?resize=300%2C192" alt="" width="300" height="192" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2014/06/Opinion_Ideology_AWS1.png?resize=300%2C192&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2014/06/Opinion_Ideology_AWS1.png?w=742&amp;ssl=1 742w" sizes="auto, (max-width: 300px) 100vw, 300px" /></a><p id="caption-attachment-2482" class="wp-caption-text">No one likes a Killer Robot! Source: Charli Carpenter’s survey on “How Do Americans Feel About Fully Autonomous Weapons?”</p></div>
<p>With LAWS, the CCW has identified a new topic, which – according to seasoned CCW participants – has been placed on the agenda with unprecedented speed and is attracting lively interest from the international community. Exactly what is behind this is not entirely clear, though. On the one hand, it seems plausible that countries have discovered a genuine interest in a development which is deemed to require regulation, and, after the process on cluster munitions failed to conclude a new protocol in 2011, are keen to demonstrate the CCW’s capacity to act. However, the CCW has a fearsome reputation as a place where good ideas go to die a slow and silent death. So it is also possible that some countries which might have an interest in developing and deploying LAWS (from a purely military-technology perspective, this applies primarily to the US, Israel, China, Russia and the United Kingdom) will use the CCW process to stall for time and smother the anti-LAWS campaign over the coming years.</p>
<p>But the Campaign to Stop Killer Robots not only brought the issue of LAWS to the CCW’s attention in record time. The Campaign – <a href="http://stopkillerrobots.org/coalition/">a coalition of 52 NGOs from 24 countries</a> which provides a platform for coordinated civil society and academic activities – is, of course, also aware of the time sensitivity of the issue and other hurdles and intricacies involved.</p>
<p>Hence the first goal must be to work towards a CCW protocol banning LAWS as swiftly as possible – a preemptive ban, that is, which would come into effect before countries and the arms industry invest so much in LAWS that the window of opportunity for a preemptive solution closes.</p>
<p>The dual-use issue makes this especially pressing. Research on autonomous robots is already underway in countless university laboratories and companies, and there is massive commercial interest in robotics. The problem is that the integration of commercial off-the-shelf technology has long been a driver of developments in the field of military technology.</p>
<p>So, to be clear: The Campaign is all <em>for</em> research and innovation in all fields of autonomous systems and robotics. We at <strong>ICRAC</strong> especially like to say: <strong><em>We love robots!</em></strong> However, we want to see them used for peaceful purposes; that is, we love autonomous robots that have no “kill function” and are not deployed to coerce or terrorize human beings. The speed of technological progress makes drawing this line a challenging endeavor, especially since the potential military relevance of LAWS is, obviously, much greater than that of, say, blinding lasers (CCW Protocol IV), to which a comparison is sometimes drawn.</p>
<p>Where automation can serve to protect human life – as in the aforementioned defensive systems – it is not necessarily a problem; but when machines make life-and-death decisions without human intervention and responsibility, a line is arguably being crossed. Where and how to draw that line to preserve basic human dignity in the practice of warfighting will be the subject of upcoming debates. At this early stage in the CCW process it is still unclear whether the CCW is the right place to sort these things out and ensure a preemptive ban on lethal robots. The issue will thus have to be discussed in other fora as well, such as the <a href="http://www.stopkillerrobots.org/2014/05/hrc2014/">Human Rights Council</a> and the UN General Assembly First Committee.</p>
<p>Nevertheless, there are already two tangible results of the CCW process. First, five CCW parties (Cuba, Ecuador, Egypt, Pakistan and the Holy See) were already calling for a ban on LAWS at the informal CCW Expert Meeting, and no country vigorously defended or argued for the development and deployment of LAWS, although the Czech Republic and Israel underlined, in their statements in Geneva, that autonomous weapons systems may offer certain benefits. The US pursued a similar line of argument. Second, many countries (including Germany, Austria, France, Norway, the Netherlands, Switzerland and the United Kingdom) made one thing very clear: they want to see guarantees of what is now called “meaningful human control” over the use of armed force.</p>
<p><strong>“(Meaningful?) human control” as the way forward?</strong></p>
<p>The concept of “meaningful human control”, which was introduced into the CCW debate by Campaign NGOs (<a href="http://www.article36.org/wp-content/uploads/2014/05/A36-CCW-May-2014.pdf">Article36</a>) and has now been taken up by governments, is the counter-concept to “appropriate human involvement” in the operation of (semi-)autonomous weapons systems, as specified by the US in its <a href="http://www.dtic.mil/whs/directives/corres/pdf/300009p.pdf">Directive on Autonomy in Weapon Systems</a>, issued in November 2012. The argument is essentially that “appropriate human involvement” does not go far enough – for there may be certain circumstances in which <em>zero</em> human involvement is deemed “appropriate”.</p>
<p>The idea is that human control over life-and-death decisions must always be significant – in other words, it must be considerably more than none at all and, putting it bluntly, it must also involve more than the mindless pressing of a button in response to machine-processed information. According to current practice, a human operator of weapons must have sufficient information about the target and sufficient control of the weapon, and must be able to assess its effects, in order to be able to make decisions in accordance with international law. But how much human judgment can be transferred into a technical system and exercised by algorithms before human control ceases to be “meaningful” – in other words, before warfare is quite literally “dehumanized”?</p>
<p>One thing seems clear: in the future, certain time limits would have to apply if LAWS are not to become a reality across a broad front. The fact is that the human brain needs time for complex evaluation and decision-making processes – time which must not be denied to it in the interaction between human and machine, if the human role is to remain relevant; in other words, if the decision-making process is merely to be supported, not dominated, by the machine.</p>
<div id="attachment_2484" style="width: 280px" class="wp-caption alignleft"><a href="https://i0.wp.com/www.icrac.net.php53-3.dfw1-2.websitetestlink.com/wp-content/uploads/2014/06/ICRAC_CCWUN23.jpg"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2484" class=" wp-image-2484" src="https://i0.wp.com/www.icrac.net.php53-3.dfw1-2.websitetestlink.com/wp-content/uploads/2014/06/ICRAC_CCWUN23-300x180.jpg?resize=270%2C162" alt="Some of ICRAC’s members in discussion at the UN in Geneva" width="270" height="162" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2014/06/ICRAC_CCWUN23.jpg?resize=300%2C180&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2014/06/ICRAC_CCWUN23.jpg?resize=1024%2C613&amp;ssl=1 1024w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2014/06/ICRAC_CCWUN23.jpg?w=2000&amp;ssl=1 2000w" sizes="auto, (max-width: 270px) 100vw, 270px" /></a><p id="caption-attachment-2484" class="wp-caption-text">Some of ICRAC’s members in discussion at the UN in Geneva</p></div>
<p>The concept of “meaningful human control” is not yet fully fleshed out, and in the further course of the CCW process there will undoubtedly be considerable wrangling over precisely how it should be defined. In that process, the Campaign will be pressing for the greatest possible role for the exercise of human judgment – not only in relation to killing but also in other decisions on the use of violence or non-lethal force.</p>
<p>Against this background, members of ICRAC are currently (and have been for some time) thinking in more depth about what (meaningful) human control can and should mean, both in terms of differentiating discrete levels of supervisory control from an analytical perspective (<a href="https://www.mini-symposium-tokyo.info/ICRA2014/sharkey2014.pdf">Sharkey 2014</a>) and as a normative reminder to seek a definition that is as clear-cut and simple as possible, leaving as little room for interpretation as possible (<a href="http://gubrud.net/?p=272">Gubrud 2014</a>). In its working paper series, ICRAC has already been thinking even further ahead, pondering the design of legally binding instruments and suggesting verification and compliance measures for a possible future convention on autonomous weapons (<a href="http://icrac.net/wp-content/uploads/2013/05/Gubrud-Altmann_Compliance-Measures-AWC_ICRAC-WP2.pdf">Gubrud and Altmann 2013</a>).</p>
<p>There is lots of work to do, but considerable progress has been made. As a founding member of the Campaign to Stop Killer Robots, ICRAC will keep working towards a pre-emptive ban on lethal autonomous weapon systems to ensure that the future of robotics is a peaceful one.</p>
<p><em>This post is based on a policy paper titled <strong><a href="http://icrac.net/wp-content/uploads/2014/06/GGS_04-2014_Sauer_2014-06-13_en.pdf">Autonomous Weapons Systems – Humanising or Dehumanising Warfare?</a></strong>, published by the German “Development and Peace Foundation” as “Global Governance Spotlight 4|2014”.</em></p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2475</post-id>	</item>
		<item>
		<title>Is Russia Leading the World to Autonomous Weapons?</title>
		<link>https://www.icrac.net/is-russia-leading-the-world-to-autonomous-weapons/</link>
		
		<dc:creator><![CDATA[Mark Gubrud]]></dc:creator>
		<pubDate>Tue, 06 May 2014 05:32:05 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[News]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2453</guid>

					<description><![CDATA[The short answer is no. But Russia is testing and may deploy at its ICBM bases a lethal mobile system which has “automatic and semi-automatic control modes.” Additionally, Deputy Prime Minister Dmitry Rogozin has recently called for “robotic systems that are fully integrated in the command and control system, capable not only to gather intelligence [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Mark Gubrud' src='https://secure.gravatar.com/avatar/a0ed93015aa261386521e2fdb3b63ff65d79da29491562533b052108724bcdcc?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a0ed93015aa261386521e2fdb3b63ff65d79da29491562533b052108724bcdcc?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong>Mark Gubrud</strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<div class="entry">
<p>The short <a href="http://gubrud.net/?p=203">answer is no</a>. But Russia is testing and may deploy at its ICBM bases a lethal mobile system which has “automatic and semi-automatic control modes.” Additionally, Deputy Prime Minister Dmitry Rogozin has recently called for “robotic systems that are fully integrated in the command and control system, capable not only to gather intelligence and to receive from the other components of the combat system, but also on their own strike.”</p>
<p>However, <a href="http://www.stopkillerrobots.org/wp-content/uploads/2013/03/KRC_Status_4Nov2013.doc">Russia’s statements at the UN</a> have expressed concern about autonomous weapons as a threat to human life and international law, and Russia will be a participant in the Geneva CCW meeting. Moreover, a critical examination of claims that Russia is notably more aggressive in its early deployments of autonomous weapon systems than other nations, let alone that Russia is “leading a new robotic arms race,” shows these claims to be inflated and unwarranted. I have <a href="http://gubrud.net/?p=203">posted a detailed report at 1.0 Human.</a></p>
</div>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Mark Gubrud' src='https://secure.gravatar.com/avatar/a0ed93015aa261386521e2fdb3b63ff65d79da29491562533b052108724bcdcc?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a0ed93015aa261386521e2fdb3b63ff65d79da29491562533b052108724bcdcc?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong>Mark Gubrud</strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2453</post-id>	</item>
		<item>
		<title>Futureproofing Is Never Complete: Ensuring the Arms Trade Treaty Keeps Pace with New Weapons Technology</title>
		<link>https://www.icrac.net/futureproofing-is-never-complete-ensuring-the-arms-trade-treaty-keeps-pace-with-new-weapons-technology/</link>
		
		<dc:creator><![CDATA[mbolton]]></dc:creator>
		<pubDate>Sat, 19 Oct 2013 21:42:42 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[Working Papers]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2329</guid>

					<description><![CDATA[In a new working paper, International Committee for Robot Arms Control (ICRAC) members Matthew Bolton (Pace University) and Wim Zwijnenburg (IKV Pax Christi) stress the importance of making sure states control new weapons technologies, including robotic weapons, when the Arms Trade Treaty enters into force. It outlines strategies for civil society (such as the Control Arms campaign) and concerned states to counter [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='mbolton' src='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://matthewbreaybolton.com">mbolton</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Matthew Bolton is professor of political science at Pace University in New York City. He is an expert on global peace and security policy, focusing on multilateral disarmament and arms control policymaking processes. He has a PhD in Government and Master's in Development Studies from the London School of Economics and a Master's from SUNY Environmental Science and Forestry. Since 2014, Bolton has worked on the UN and New York City advocacy of the International Campaign to Abolish Nuclear Weapons (ICAN), recipient of the 2017 Nobel Peace Prize. Bolton has published six books, including Political Minefields (I.B. Tauris) and Imagining Disarmament, Enchanting International Relations (Palgrave Pivot).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p>In <a href="http://icrac.net/wp-content/uploads/2013/10/Futureproofing-ICRAC-Working-Paper-3-2.pdf">a new working paper</a>, <a href="http://icrac.net/" target="_blank">International Committee for Robot Arms Control (ICRAC) </a>members Matthew Bolton (<a href="http://www.pace.edu/dyson/academic-departments-and-programs/political-science/faculty/matthew-bolton" target="_blank">Pace University</a>) and Wim Zwijnenburg (<a href="http://www.ikvpaxchristi.nl/" target="_blank">IKV Pax Christi</a>) stress the importance of making sure states control new weapons technologies, including robotic weapons, when the <a href="http://www.un.org/disarmament/ATT/" target="_blank">Arms Trade Treaty</a> enters into force. It outlines strategies for civil society (such as the <a href="http://controlarms.org/en/" target="_blank">Control Arms campaign</a>) and concerned states to counter potential arguments from states or manufacturers acting in bad faith, who may claim erroneously that the treaty will not apply to robotic weapons. We recommend that civil society and concerned states:</p>
<ol>
<li>Unequivocally assert that the Arms Trade Treaty Scope includes both manned and unmanned conventional arms,</li>
<li>Build on the recent clarifications by the <a href="http://www.un.org/ga/search/view_doc.asp?symbol=A/68/140" target="_blank">Group of Governmental Experts of the UN Register of Conventional Arms </a>of the categories of weapons borrowed by the Arms Trade Treaty in its Scope. The Group authoritatively defined the categories as including armed aerial drones,</li>
<li>Develop and promote comprehensive National Control Lists of the weapons to be controlled by states party to the Arms Trade Treaty,</li>
<li>Influence the interpretation of the Arms Trade Treaty through careful monitoring and calling out states acting in bad faith, and</li>
<li>Build connections between the community working on the Arms Trade Treaty (such as <a href="http://www.controlarms.org/">Control Arms</a>) and those working on related campaigns (such as the <a href="http://www.stopkillerrobots.org/" target="_blank">Campaign to Stop Killer Robots</a>) and control regimes (such as the <a href="http://www.un.org/disarmament/convarms/Register/" target="_blank">UN Register</a>, <a href="http://www.mtcr.info/english/" target="_blank">Missile Technology Control Regime</a>, the <a href="http://www.wassenaar.org/" target="_blank">Wassenaar Arrangement </a>and dual-use equipment control programs).</li>
</ol>
<p><a href="http://icrac.net/wp-content/uploads/2013/10/Futureproofing-ICRAC-Working-Paper-3-2.pdf">Click here to read the full paper</a>.</p>
<p><a href="http://icrac.net/who/">ICRAC</a> is an international committee of experts in robotics technology, robot ethics, international relations, international security, arms control, international humanitarian law, human rights law, and public campaigns, concerned about the pressing dangers that military robots pose to peace and international security and to civilians in war.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='mbolton' src='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://matthewbreaybolton.com">mbolton</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Matthew Bolton is professor of political science at Pace University in New York City. He is an expert on global peace and security policy, focusing on multilateral disarmament and arms control policymaking processes. He has a PhD in Government and Master's in Development Studies from the London School of Economics and a Master's from SUNY Environmental Science and Forestry. Since 2014, Bolton has worked on the UN and New York City advocacy of the International Campaign to Abolish Nuclear Weapons (ICAN), recipient of the 2017 Nobel Peace Prize. Bolton has published six books, including Political Minefields (I.B. Tauris) and Imagining Disarmament, Enchanting International Relations (Palgrave Pivot).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2329</post-id>	</item>
		<item>
		<title>A meme is born: autonomous = secure</title>
		<link>https://www.icrac.net/a-meme-is-born-autonomous-secure/</link>
		
		<dc:creator><![CDATA[Mark Gubrud]]></dc:creator>
		<pubDate>Fri, 11 Oct 2013 21:46:20 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[News]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2332</guid>

					<description><![CDATA[One of Joshua Foust&#8217;s assertions in his debate with Heather Roff was that making weapons autonomous was necessary in order to secure them against the threat of hacking. I posted a response after Foust reiterated this surprising argument, and provided a few scraps of pseudo-evidence to support it, in an article which seems to have gone semi-viral on the internet&#8211;launching what seems [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Mark Gubrud' src='https://secure.gravatar.com/avatar/a0ed93015aa261386521e2fdb3b63ff65d79da29491562533b052108724bcdcc?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a0ed93015aa261386521e2fdb3b63ff65d79da29491562533b052108724bcdcc?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong>Mark Gubrud</strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p>One of Joshua Foust’s assertions in his debate with Heather Roff was that making weapons autonomous was necessary in order to secure them against the threat of hacking. I <a title="foustian meme" href="https://medium.com/i-m-h-o/a7c6981915e1">posted a response</a> after Foust <a href="http://www.defenseone.com/technology/2013/10/ready-lethal-autonomous-robot-drones/71492/">reiterated</a> this surprising argument, and provided a few scraps of pseudo-evidence to support it, in an article which seems to have <a href="http://www.checkarmaments.com/america-wants-drones-that-kill-without-humans-g757167530?language=en">gone semi-viral</a> on the internet–launching what seems likely to become a persistent meme (canard, for those less fond of neologisms) in this debate.</p>
<p>Briefly, Foust argues that teleoperated drones today are too vulnerable to hacking through their communications links, and that the solution is to make them fully autonomous. And just as briefly, I <a href="https://medium.com/i-m-h-o/a7c6981915e1">show</a> that this is wrong in a number of ways.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Mark Gubrud' src='https://secure.gravatar.com/avatar/a0ed93015aa261386521e2fdb3b63ff65d79da29491562533b052108724bcdcc?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a0ed93015aa261386521e2fdb3b63ff65d79da29491562533b052108724bcdcc?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong>Mark Gubrud</strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2332</post-id>	</item>
	</channel>
</rss>
