<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Frank Sauer &#8211; ICRAC</title>
	<atom:link href="https://www.icrac.net/author/fsauer/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.icrac.net</link>
	<description>International Committee for Robot Arms Control</description>
	<lastBuildDate>Mon, 09 Jun 2025 20:09:59 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.8.2</generator>
<site xmlns="com-wordpress:feed-additions:1">128339352</site>	<item>
		<title>ICRAC statement at the 2018 CCW States Parties Meeting</title>
		<link>https://www.icrac.net/icrac-statement-at-the-2018-ccw-states-parties-meeting/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Fri, 23 Nov 2018 08:34:47 +0000</pubDate>
				<category><![CDATA[Front Page]]></category>
		<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[Statements]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=4341</guid>

					<description><![CDATA[As delivered by Prof. Roser Martínez Quirante (in Spanish) &#160; Mr. President, representatives of nations, members of civil society, During the past 5 years, at the Convention on Conventional Weapons, we have seen a greater understanding of the problems and challenges posed by autonomous weapon systems. The ICRAC is satisfied with the general consensus on [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p>As delivered by Prof. Roser Martínez Quirante (in Spanish)</p>
<p>&nbsp;</p>
<p>Mr. President, representatives of nations, members of civil society,</p>
<p>During the past 5 years at the Convention on Certain Conventional Weapons, we have seen a growing understanding of the problems and challenges posed by autonomous weapon systems. ICRAC is pleased with the general consensus on the need to retain human control over these systems, in particular over the critical functions of target selection and engagement. We therefore believe that the time has come to establish binding legal mechanisms that restrict the use of autonomous weaponry, underlining the importance of human judgment in critical decisions.</p>
<p>During our participation in this convention, we have generated a large number of scientific articles, books and reports that emphasize three main classes of risk.</p>
<p>First, these weapons cannot guarantee compliance with international humanitarian law. We must not write a blank check to future technology. With the large-scale commercialization of AI we are indeed seeing great innovation in areas beneficial to humanity, but at the same time we are witnessing the emergence of many problems with bias in decision-making and facial-recognition algorithms, problems that could prove dramatic if applied in a wartime context.</p>
<p>If nations invest on the basis of techno-scientific speculation, we believe it will be practically impossible to return to the starting position once the new types of conflict these weapons herald materialize. We urge states to examine the reality of current technology and its limitations in the critical function of selecting legitimate targets.</p>
<p>Second, considerable moral values are at risk. No machine, computer, or algorithm is capable of recognizing a human being as such, nor can it respect them as a being with rights and human dignity. It only perceives them as bits of information. A machine, without intuition and without ethics or morals, cannot even understand what it means to be in a state of war, much less what it means to end a human life.</p>
<p>Decisions to end human life must be made by humans, and in a non-arbitrary way, in order to be justified. Furthermore, we must not mistake the fact that humans write computer programs to imply that the calculated results of those programs constitute human decisions. While accountability for the deployment of lethal force is a necessary condition for meeting minimum ethical standards in armed conflict, accountability alone is not enough; it also requires recognition of the human being and their dignity, and reflection on the value of life and the justification for the use of violent force.</p>
<p>Third, autonomous weapons systems pose a great danger to global security. The threshold for the application of military force will be lowered and the likelihood of conflict will increase. We are concerned that tried and tested human control mechanisms for double-checking and reconsideration, in which humans function as fail-safes or circuit-breakers, could easily be discarded. This, in combination with unforeseeable algorithmic interactions and their unpredictable outcomes, will increase crisis instability. In addition, the unilateral development and use of autonomous weapons by some States will provide strong incentives for their proliferation, including their use by actors not accountable to the legal frameworks governing the use of force. Do we really need this new arms race?</p>
<p>ICRAC, together with the other organizations involved in the Campaign to Stop Killer Robots, representing a large part of international civil society, urges the Convention to lay the foundations for an international treaty whose main objective is to preventively prohibit autonomous weapons, in clear application of the precautionary principle.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4341</post-id>	</item>
		<item>
		<title>ICRAC general statement at the August 2018 CCW GGE</title>
		<link>https://www.icrac.net/icrac-general-statement-at-the-august-2018-ccw-gge/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Wed, 29 Aug 2018 15:28:10 +0000</pubDate>
				<category><![CDATA[Front Page]]></category>
		<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[Statements]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=4263</guid>

					<description><![CDATA[As delivered by Prof. Noel Sharkey Mr Chairman, Over the last 5 years at the CCW we have seen an increased understanding of the issues and challenges posed by autonomous weapons systems. ICRAC is pleased with the general consensus that we must retain human control over weapons systems, in particular, over the critical functions of [&#8230;]]]></description>
										<content:encoded><![CDATA[<p><em>As delivered by Prof. Noel Sharkey</em></p>
<p>Mr Chairman,</p>
<p>Over the last 5 years at the CCW we have seen an increased understanding of the issues and challenges posed by autonomous weapons systems. ICRAC is pleased with the general consensus that we must retain human control over weapons systems, in particular, over the critical functions of selecting and killing targets. During our time at the CCW, we have produced many scientific papers and reports emphasising three major classes of risk.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-medium wp-image-4308 alignright" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/08/IMG_20180829_172118314_BURST000_COVER_TOP.jpg?resize=300%2C225&#038;ssl=1" alt="" width="300" height="225" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/08/IMG_20180829_172118314_BURST000_COVER_TOP.jpg?resize=300%2C225&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/08/IMG_20180829_172118314_BURST000_COVER_TOP.jpg?resize=160%2C120&amp;ssl=1 160w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/08/IMG_20180829_172118314_BURST000_COVER_TOP.jpg?resize=768%2C576&amp;ssl=1 768w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/08/IMG_20180829_172118314_BURST000_COVER_TOP.jpg?resize=1024%2C768&amp;ssl=1 1024w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/08/IMG_20180829_172118314_BURST000_COVER_TOP.jpg?w=2000&amp;ssl=1 2000w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/08/IMG_20180829_172118314_BURST000_COVER_TOP.jpg?w=3000&amp;ssl=1 3000w" sizes="auto, (max-width: 300px) 100vw, 300px" /></p>
<p><strong>First</strong>, we do not believe that IHL compliance can be guaranteed with autonomous weapons systems. Some argue that the technology will be able to comply with IHL in the future. But there is absolutely no evidence for that. We must not rely on hopeware and speculations about future technology. With the mass scale commercialisation of AI we are seeing great innovation but we are also seeing the emergence of many problems with bias in decision algorithms and face recognition (see my new ICRC blog post for more on this). If nations invest heavily on the basis of technical speculations, we believe that it will be difficult to put the toothpaste back in the tube when the humanitarian crises begin to emerge. We urge states to look at the plausibility of the current technology and how it falls short in the critical function of selecting legitimate targets.</p>
<p><strong>Second,</strong> there are considerable moral values at risk. No machine, computer or algorithm is capable of recognizing a human as a human being, nor can it respect the human as a being with human rights and human dignity.  A machine cannot even understand what it means to be in a state of war, much less what it means to have, or to end a human life. Decisions to end human life must be made by humans in order to be justified.  Further, we should not mistake the fact that humans write computer programs to imply that the calculated results of those programs constitute human decisions.  While accountability for the deployment of lethal force is a necessary condition for moral responsibility in war, accountability alone is not sufficient for moral responsibility.  This also requires the recognition of the human, respect for the human right to life and dignity, and reflection upon the value of life and the justification for the use of violent force.</p>
<p><strong>Third</strong>, Autonomous Weapons Systems pose great dangers to global security. The threshold for applying military force will be lowered and the likelihood of conflict will go up. We are concerned that tried and tested human control mechanisms for double checking and reconsidering, with humans functioning as fail-safes or circuit-breakers, would be discontinued. This, in combination with unforeseeable algorithm interactions and their unpredictable outcomes, increases crisis instability. In addition, the development and use of Autonomous weapons by <em>some</em> States will provide strong incentives for their proliferation, including their use by actors not accountable to legal frameworks governing the use of force. Do we really need a new arms race?</p>
<p><strong>Finally,</strong> we urge that nations urgently move towards negotiations for a legally binding instrument in further deliberations next year. I am going off script here but look &#8211; come on – and no offence intended – but I am a scientist and not a diplomat so in plain speech there are those here who have an interest in slowing down the move towards a ban while they quickly continue to develop the weapons. Don’t be fooled or bullied by these tactics or the mudslide of refining definitions. We ask you &#8211; please &#8211; get on with ridding us of these morally reprehensible weapons before it is too late.</p>
<p>Thank you, Mr Chairman</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4263</post-id>	</item>
		<item>
		<title>ICRAC statement on the human control of weapons systems at the August 2018 CCW GGE</title>
		<link>https://www.icrac.net/icrac-statement-on-the-human-control-of-weapons-systems-at-the-august-2018-ccw-gge/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Wed, 29 Aug 2018 08:40:49 +0000</pubDate>
				<category><![CDATA[Front Page]]></category>
		<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[Statements]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=4246</guid>

					<description><![CDATA[As delivered by Dr. Elke Schwarz Thank you, Mr Chairperson, The International Committee for Robot Arms Control is pleased to see states move away from the use of broad, brush-stroke terms such as in-the-loop, on-the-loop, the wider loop, human oversight, and appropriate human judgement. We agree with the working paper from Estonia and Finland that [&#8230;]]]></description>
										<content:encoded><![CDATA[<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="alignnone size-medium wp-image-4248" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/08/DlwQIN-XoAAcT7v.jpg-large.jpg?resize=300%2C225&#038;ssl=1" alt="" width="300" height="225" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/08/DlwQIN-XoAAcT7v.jpg-large.jpg?resize=300%2C225&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/08/DlwQIN-XoAAcT7v.jpg-large.jpg?resize=160%2C120&amp;ssl=1 160w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/08/DlwQIN-XoAAcT7v.jpg-large.jpg?resize=768%2C576&amp;ssl=1 768w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/08/DlwQIN-XoAAcT7v.jpg-large.jpg?resize=1024%2C768&amp;ssl=1 1024w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/08/DlwQIN-XoAAcT7v.jpg-large.jpg?w=2048&amp;ssl=1 2048w" sizes="auto, (max-width: 300px) 100vw, 300px" />As delivered by Dr. Elke Schwarz</p>
<p>Thank you, Mr Chairperson,</p>
<p>The International Committee for Robot Arms Control is pleased to see states move away from the use of broad, brush-stroke terms such as in-the-loop, on-the-loop, the wider loop, human oversight, and appropriate human judgement. We agree with the working paper from Estonia and Finland that complex definitions of autonomy and autonomous weapons systems are moving us in the wrong direction. As scientists we believe, following Einstein, that definitions should be as simple as possible but no simpler. In that spirit, we applaud the approach of the ICRC: the focus should be on the <em>critical functions</em> of target selection and the application of violent force. This counters concerns that a prohibition of autonomous weapons systems (AWS) would impede innovation in other civilian and non-lethal military applications.</p>
<p>&nbsp;</p>
<p>ICRAC holds that the way forward is to focus on the meaningful human control of weapons systems. For human control to be <em>meaningful</em> we need to examine how humans interact with machines and understand the types of human-machine biases that can occur in the selection of legitimate targets. Lessons should be learned from 30 years of research on human supervisory control of machinery and more than 100 years of research on the psychology of human reasoning. A combination of this work can help us to design human-machine interfaces that allow weapons to be controlled in a manner that is fully compliant with international law and the principles of humanity.</p>
<p>First, there should be a focus on what the human operator<strong> MUST</strong> <em>do</em> in the targeting cycle. This is <em>control in use</em> which is governed by targeting rules under International Humanitarian Law and International Human Rights Law. Further, international law rules that apply <em>after</em> the use of weapons – such as those that relate to human responsibility – must be satisfied.</p>
<p>Second, the design of weapon systems must render them <strong>INCAPABLE</strong> of operating <em>without</em> meaningful human control. This is <em>control by design</em>, which is governed by international weapons law. Under international weapons law, if a weapon system is, by its design, incapable of being sufficiently controlled, then such a weapon is illegal <em>per se</em>. Systems <strong>MUST</strong> be designed to ensure human responsibility and accountability.</p>
<p>Ideally the following three conditions should be followed for the control of weapons systems:</p>
<ol>
<li>a human commander (or operator) will have full contextual and situational awareness of the target area for each and every attack, and will be able to perceive and react to any change or unanticipated situation that may have arisen since the attack was planned;</li>
<li>there will be active cognitive participation in every attack, with sufficient time for deliberation on the nature of any target, its significance in terms of the necessity and appropriateness of attack, and the likely incidental and possible accidental effects of the attack; and</li>
<li>there will be a means for the rapid suspension or abortion of every attack.</li>
</ol>
<p>For further details please see our guidelines for the human control of weapons systems from the April meeting this year.</p>
<p>While systems must be designed to ensure safety and responsibility, we should not mistake the review of weapons and good design for a form of human control in itself. The responsibility to make decisions of life and death cannot be delegated to machines, nor to the review or design process of those machines.</p>
<p>Thank you Mr Chairperson</p>
<p>&nbsp;</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4246</post-id>	</item>
		<item>
		<title>ICRAC Statement at the 2017 CCW GGE Meeting</title>
		<link>https://www.icrac.net/icrac-statement-at-the-2017-ccw-gge-meeting/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Wed, 15 Nov 2017 20:11:20 +0000</pubDate>
				<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[Statements]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php56-3.dfw3-2.websitetestlink.com/?p=3350</guid>

					<description><![CDATA[ICRAC Statement to the 2017 UN CCW GGE Meeting Delivered by Noel Sharkey, Chair, on 15 November 2017 I speak on behalf of the International Committee for Robot Arms Control, a founding member of the Campaign to Stop Killer Robots. We would like to thank Ambassador Gill for his preparations of this important meeting. And [&#8230;]]]></description>
										<content:encoded><![CDATA[<p><strong>IC</strong><strong>RAC Statement to the 2017 UN CCW GGE Meeting</strong></p>
<p>Delivered by Noel Sharkey, Chair, on 15 November 2017</p>
<p>I speak on behalf of the International Committee for Robot Arms Control, a founding member of the Campaign to Stop Killer Robots. We would like to thank Ambassador Gill for his preparation of this important meeting, and we also thank all of the States Parties for their lively participation and their interesting points of view.</p>
<p>ICRAC has many concerns about the use and development of autonomous weapons systems but in this statement we are going to concentrate on three points which have come up in the discussions here: these are the dual use of autonomous systems, where we are now with autonomous weapons systems development, and finally the issue of definitions and human control.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-medium wp-image-3352 alignright" src="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2017/12/signal-2017-11-15-122318-300x225-300x225.jpg?resize=300%2C225" alt="" width="300" height="225" />First, the question of dual use: will a prohibition on LAWS inhibit innovation in autonomous systems that serve a practical good in society? The answer is clearly NO! Remember: we are not calling for a prohibition on the development or use of autonomous robots or autonomous functioning in the military or in the civilian sphere – except in one instance. We wish only to prohibit the development and use of autonomy in the critical functions of target selection and the application of violent force. Let us be totally clear here: restricting these critical functions in weapon systems will have absolutely no impact on civilian or even other military applications.</p>
<p>Second, it also became evident in the discussions over the last couple of days that some of the delegates believe that no one is as yet developing autonomous weapons systems, that they are a long way off, and that there are not even any working prototypes. The announcements from a number of companies in the hi-tech nations tell a very different story. In recent years we have heard about the development of fully autonomous fighter jets, tanks, submarines, naval ships, border protection systems and swarms of small drones. These have not been deployed as yet, but that will not take long. For example, this year Kalashnikov <a href="http://tass.com/defense/954894">announced that it is developing</a> a “fully automated combat module” based on neural networks that could allow a weapon to “identify targets and make decisions.” We cannot verify the truth of such claims, but it is nonetheless clear that the underlying technology that will enable self-targeting is here and could be deployed soon.</p>
<p>Finally, we at ICRAC are very concerned that we are already beginning to see the emergence of an arms race towards an ever-increasing level of autonomy in weapon systems. There are often token efforts to say that there is a human somewhere in the control loop or on the control loop exercising some form of human judgement or planning. This human control of weapons systems is the key component of what we should be focussing on in these discussions – not artificial intelligence, not different levels of autonomy for vehicles or semi-autonomous functions. It is sufficient to define autonomous weapons systems in a simple way such as “weapons systems that once launched can select targets and apply violent force without meaningful human control” – or something similar. It would be MOST valuable here to debate what kinds of human control states find acceptable. There are 30 years of research on human supervisory control of computing machinery, and we have never heard it mentioned here. So let us settle on a simple definition of autonomous weapons systems without delay and get down to the really important question of what is an acceptable level of human control. Why are we not discussing this here to find out what state experts think is acceptable and what is acceptable ethically and under IHL and IHRL?</p>
<p>ICRAC would like to recommend that States Parties schedule at least four weeks of talks in 2018 to discuss the human control of weapons systems and to start a process toward negotiating a legally binding instrument that ensures meaningful human control over weapon systems.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3350</post-id>	</item>
		<item>
		<title>Frequently Asked Questions on LAWS</title>
		<link>https://www.icrac.net/frequently-asked-questions-on-laws/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Sat, 11 Nov 2017 20:05:12 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[Slider]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php56-3.dfw3-2.websitetestlink.com/?p=3344</guid>

					<description><![CDATA[Memorandum for delegates at the Convention on Certain Conventional Weapons (CCW) Group of Governmental Experts (GGE) Meeting on Lethal Autonomous Weapons Systems (LAWS) Geneva, 13-17 November 2017 ICRAC is an international not-for-profit association of scientists, technologists, lawyers and policy experts committed to the peaceful use of robotics and the regulation of robot weapons. Please visit our [&#8230;]]]></description>
										<content:encoded><![CDATA[<p><strong>Memorandum for delegates at the Convention on Certain Conventional Weapons (CCW) Group of Governmental Experts (GGE) Meeting on Lethal Autonomous Weapons Systems (LAWS)</strong></p>
<p><strong>Geneva, 13-17 November 2017<img data-recalc-dims="1" loading="lazy" decoding="async" class="alignright wp-image-3347 size-medium" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/12/ICRAC_CCWUN24-300x225-300x225.jpg?resize=300%2C225&#038;ssl=1" alt="" width="300" height="225" /></strong></p>
<p><strong>ICRAC</strong> is an international not-for-profit association of scientists, technologists, lawyers and policy experts committed to the peaceful use of robotics and the regulation of robot weapons. Please visit our website <a href="http://www.icrac.net/">www.icrac.net</a> and follow us on Twitter <a href="https://twitter.com/icracnet">@icracnet</a></p>
<p><strong>ICRAC</strong> is a founding member of the Campaign to Stop Killer Robots <a href="http://www.stopkillerrobots.org/">www.stopkillerrobots.org</a></p>
<p>&nbsp;</p>
<p><strong>What is Artificial Intelligence (AI)?</strong></p>
<p>The term AI tends to evoke science-fiction tropes and even notions of “super intelligence”. But in reality, AI is just an umbrella term for computational techniques that automate tasks we would normally consider to require human intelligence. This does not mean that these software programs themselves are intelligent.</p>
<p>&nbsp;</p>
<p><strong>How fast is AI progressing?</strong></p>
<p>Enthusiasm about the progress of AI has increased considerably in the last couple of years, even though the underlying techniques have not improved much since the 1980s. This is largely because of two factors:</p>
<p>(i) the acquisition of big data sets with billions of examples;</p>
<p>(ii) plummeting costs for massive processing power.</p>
<p>Both factors provide an ideal environment for a cluster of computational techniques called Machine Learning (ML). The exploitation of ML has led to the mass commercialization of AI over a wide range of applications by various companies. So current AI progress is best described as spreading sideways rather than moving upwards.</p>
<p><strong> </strong></p>
<p><strong>Do civilian and military applications of AI differ?</strong></p>
<p><strong>Yes.</strong> What makes any autonomous system relying on AI computational techniques work is brittle software based on algorithms and statistics. Thanks to the availability of large amounts of training data, we will hopefully soon be able to make these techniques work in applications such as self-driving cars, to name a prominent example from the civilian sector. But this does not translate into military applications. Aside from the fact that cars and weapon systems are designed for completely different purposes, the comparably structured and regulated environment of road traffic does not compare at all to the adversarial, chaotic environment of the battlefield. The fog of war will only allow for faulty or, at best, noisy data. So beware of false equivalences!</p>
<p>&nbsp;</p>
<p><strong>Would LAWS be “precision weapons”?</strong></p>
<p><strong>Possibly (but their use would still be illegal). </strong>LAWS could take various forms. For instance, a swarm of hobby drones fitted with heat sensors and small explosive payloads could be programmed to attack everything that emits body temperature. Such a three-dimensional moving minefield of LAWS would be the opposite of a precision weapon.</p>
<p>But let’s assume, for the sake of the argument, LAWS designed with military-grade accuracy in mind. Fitted with better sensing and data processing hard- and software as well as payloads tailored to the system’s mission, those could be more precise than current weapon systems. But the technical potential for accuracy and the application of violent force to a legitimate target are two separate issues. Even the most high-tech precision weapon system has to be used in a manner that is legal under International Humanitarian Law (IHL).</p>
<p>IHL dictates that, when using a weapon system, constant care must be taken to avoid or minimize civilian casualties (principles of distinction and precautions in attack). It also prohibits launching or continuing an attack when the expected civilian losses exceed the military advantage sought (principle of proportionality). These concepts enshrined in IHL are only meaningful in the context of human judgment. Machines are a far cry from the reasoning that a human military commander acting responsibly and in compliance with the law would engage in. Machines will for the foreseeable future not be able to discriminate combatants from civilians, let alone judge which use of force or type of munition is proportionate in light of the military objective. Hence we cannot and must not expect modern weapon systems to free us from these legal obligations. On the contrary, we have to heed these principles in step with our growing technological capabilities.</p>
<p>For example, before launching an attack, and throughout its execution, IHL requires military commanders to take all feasible precautions to spare the civilian population, by making use of all the information from all sources available to them. An autonomous weapon system fitted with various sensors for targeting purposes would thus require a commander to make use of the data that is gathered and the additional information that is generated whilst using the system. A commander cannot choose to treat this new “smart” precision weapon akin to the “dumb” weapons of the past, that is, as if this information were not being made available by the system or as if it could be ignored. Instead, weapon technology and legal obligations go hand in hand. Consequently, the more sophisticated our weapon systems become, the more meaningful human control becomes <em>feasible</em> regarding the critical functions of identifying (“fixing”), selecting and engaging targets. And hence the more care for ensuring meaningful human control is <em>required</em>.</p>
<p>This is not a particularly new insight, of course; it is why advanced laser-guided munitions are used with tactics, techniques and procedures that differ from those of simple free-falling bombs. So, in sum, fully autonomous weapon systems (=LAWS), that is, systems designed in a way that would require commanders to abdicate meaningful human control, are simply incompatible with the way IHL demands weapons be used by human military commanders on the battlefield.</p>
<p>&nbsp;</p>
<p><strong>Would LAWS make war more humane?</strong></p>
<p><strong>No.</strong> It is sometimes argued that autonomy in weapons systems could make wars more humane by ensuring greater precision in targeting military objectives and by clearing the battlefield of human passions, such as anger, fear and vengefulness. Even assuming – but not conceding (see above: <em>Would LAWS be “precision weapons”?</em>) – that one day LAWS might somehow reach human or even “higher-than-human” performance with respect to adherence to IHL, this would not “humanize” future armed conflicts, for at least three reasons:</p>
<p>(i) delegating the power to take life-or-death decisions to machines blatantly denies the human dignity of the recipients of lethal force and their intrinsic worth as human beings;</p>
<p>(ii) LAWS trivialize the decision to take someone else’s life by relieving war-fighters from the moral burden inevitably associated with it;</p>
<p>(iii) while it is true that machines’ decision-making will never be influenced by negative human emotions, it is equally true that LAWS are also immune to compassion and empathy, which in certain situations could compel a human to refrain from using lethal force even when she or he would legally be entitled to do so.</p>
<p>&nbsp;</p>
<p><strong>Would LAWS proliferate?</strong></p>
<p><strong>Yes. </strong>LAWS need not take the shape of one specific weapon system akin to, for instance, a drone. LAWS also do not require a very specific military technology development path, the way nuclear weapons do, for example. As AI software and robotic hardware mature and continue to pervade the civilian sphere, militaries will feel prompted to increasingly adopt them (however, see above: <em>Do civilian and military applications of AI differ?</em>) in continuation of a dual-use trend that is already observable in, for instance, armed drones.</p>
<p>Research and development for LAWS-related technology is thus already well underway and distributed over countless university laboratories and commercial enterprises, making use of economies of scale and the forces of the free market to spur competition, lower prices and shorten innovation cycles. This renders the military research and development effort in the case of LAWS different from those of past hi-tech conventional weapon systems. So (without even taking exports into account) it is easy to see that LAWS would be comparably easy to obtain (as well as reverse-engineer) and thus prone to quickly proliferate to a wide range of state and non-state actors.</p>
<p>&nbsp;</p>
<p><strong>Would LAWS threaten global stability?</strong></p>
<p><strong>Yes. </strong>LAWS promise a military advantage inter alia because they are expected to perform certain tasks much faster than a human could do. We argued above that IHL does not allow for relinquishing meaningful human control. In addition, there are considerations from a strategic perspective that also suggest restraining ourselves and keeping meaningful human control intact. Without meaningful human control, the actions and reactions of individual LAWS as well as swarms of LAWS would have to be controlled by software alone.</p>
<p>Consider the example of adversarial swarms deployed in close proximity to each other. Their respective control software would have to react to signs of an attack within a split-second timeframe – by evading or, possibly, counter-attacking in a use-them-or-lose-them situation. Indications of an attack – sun glint interpreted as a rocket flame, sudden and unexpected moves of the adversary, or just some malfunction – could trigger escalation. It is in the nature of military conflict that these kinds of interactions between two adversarial systems or swarms would obviously not be tested or trained beforehand. In addition, it is, technically speaking, impossible to fathom all possible outcomes in advance. In other words, the interaction of LAWS, if granted full autonomy, would be unpredictable and take place at operational speeds far beyond human fail-safe capabilities.</p>
<p>Comparable runaway interactions between algorithms are already observable in financial markets. Hence it is a real possibility that LAWS interactions could result in an unwanted escalation from crisis to war, or, within armed conflict, to unintended higher levels of violence. This means an increase in global instability and is unpleasantly reminiscent of Cold War scenarios of “accidental war”.</p>
<p>&nbsp;</p>
<p><strong>Would banning LAWS stifle technology?</strong></p>
<p><strong>No. </strong>On the contrary. Global governance for LAWS would not mean a prohibition or control of specific technologies as such. The wide diffusion and dual-use potential of AI software and robotics mean that this would not only be a completely futile, Luddite endeavor. It would also be severely misguided in light of the various benefits potentially flowing from the maturation of these technologies with regard to civilian applications.</p>
<p>What is more, a number of recent developments in fact suggest that technology companies would welcome a ban on LAWS, since they do not want their products to be associated with “Killer Robots”. Google, for instance, stated years ago that it is not interested in military robotics. The Canadian robot manufacturer Clearpath Robotics even officially joined forces with the Campaign to Stop Killer Robots in 2014; it “ask[s] everyone to consider the many ways in which this technology would change the face of war for the worse” and creates robotic products solely “for the betterment of humankind” instead. And in 2017, 160 high-profile CEOs of companies developing artificial intelligence technologies signed an open letter calling for the CCW to act.</p>
<p>So preventive arms control for LAWS would not mean the regulation or prohibition of specific technologies. Instead, it would give tech entrepreneurs and manufacturers guidance and assurance that their inventions and products cannot be misused. Hence arms control for LAWS is not about listing or counting (stockpiles of) individual weapon systems. Rather, it is about drawing a line regarding the use of autonomy in weapon systems, a line to retain meaningful human control and prohibit the application of autonomy in specific (especially the “critical”) functions of weapon systems.</p>
<p>The CCW has drawn a comparable line and established a strong norm like that before, with the preventive prohibition of blinding laser weapons in 1995. This prohibition protects a soldier’s eyes on the battlefield; it is, obviously, not a blanket ban on laser technology in all its other uses, be they military or, especially, civilian in nature. In other words, just as we got to keep our CD players and laser pointers back then, we will get to keep our smartphones and self-driving cars this time.</p>
<p><strong> </strong></p>
<p><strong>Further reading:</strong></p>
<p>Altmann, Jürgen/Sauer, Frank (2017): <a href="http://www.tandfonline.com/eprint/qnJKjAUPXWPhmyMjZ6cD/full">Autonomous Weapon Systems and Strategic Stability</a>, in: Survival 59: 5, 117–142.</p>
<p>Amoroso, Daniele/Tamburrini, Guglielmo (2017): The Ethical and Legal Case Against Autonomy in Weapons Systems, in: Global Jurist. Online first.</p>
<p>Asaro, Peter (2012): On Banning Autonomous Weapon Systems. Human Rights, Automation, and the Dehumanization of Lethal Decision-Making, in: International Review of the Red Cross 94: 886, 687–709.</p>
<p>Garcia, Denise (2016): Future Arms, Technologies, and International Law: Preventive Security Governance, in: European Journal of International Security 1: 1, 94-111.</p>
<p>Sauer, Frank (2016): <a href="https://www.armscontrol.org/ACT/2016_10/Features/Stopping-Killer-Robots-Why-Now-Is-the-Time-to-Ban-Autonomous-Weapons-Systems">Stopping “Killer Robots”. Why Now Is the Time to Ban Autonomous Weapons Systems</a>, in: Arms Control Today 46: 8, 8–13.</p>
<p>Sharkey, Noel (2012): The Evitability of Autonomous Robot Warfare, in: International Review of the Red Cross 94: 886, 787–799.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3344</post-id>	</item>
		<item>
		<title>Autonomous Weapon Systems and Strategic Stability</title>
		<link>https://www.icrac.net/autonomous-weapon-systems-and-strategic-stability/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Wed, 20 Sep 2017 16:28:00 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[ICRAC News]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php56-3.dfw3-2.websitetestlink.com/?p=3227</guid>

					<description><![CDATA[Building on arguments previously developed for a blog post, ICRAC’s Juergen Altmann and Frank Sauer discuss the strategic implications of autonomy in weapon systems in more depth in a recently published article in Survival. Here’s an excerpt from the introduction: In July 2015, an open letter from artificial-intelligence experts and roboticists called for a ban on autonomous weapon [&#8230;]]]></description>
										<content:encoded><![CDATA[<p>Building on arguments previously developed for a <a href="https://icrac.net/2016/04/speed-kills-why-we-need-to-hit-the-brakes-on-killer-robots/">blog</a> <a href="http://duckofminerva.com/2016/04/speed-kills-why-we-need-to-hit-the-brakes-on-killer-robots.html">post</a>, ICRAC’s Juergen Altmann and Frank Sauer discuss the strategic implications of autonomy in weapon systems in more depth in a recently published article in Survival. Here’s an excerpt from the introduction:</p>
<blockquote><p>In July 2015, an <a href="https://futureoflife.org/open-letter-autonomous-weapons/">open letter from artificial-intelligence experts and roboticists</a> called for a ban on autonomous weapon systems (AWS), comparing their revolutionary potential to that of gunpowder and nuclear weapons.</p>
<p>According to a <a href="https://icrac.net/2012/11/dod-directive-on-autonomy-in-weapon-systems/">2012 Pentagon directive</a>, AWS are weapon systems which, ‘once activated … can select and engage targets without further intervention by a human operator’. Proponents of AWS have suggested that they could offer various benefits, from reducing military expenditure to ringing in a new era of more humane and less atrocious warfare. By contrast, critics – some characterising AWS as <a href="http://www.stopkillerrobots.org/">‘killer robots’</a> – expect the accompanying political, legal and ethical risks to outweigh these benefits, and thus argue for a preventive prohibition.</p>
<p>AWS are not yet operational, but decades of military research and development, as well as the growing technological overlap between the rapidly expanding commercial use of artificial intelligence (AI) and robotics, and the accelerating ‘spin-in’ of these technologies into the military realm, make autonomy in weapon systems a possibility for the very near future. Military programmes adapting key technologies and components for achieving autonomy in weapon systems, as well as the development of prototypes and doctrine, are well under way in a number of states.</p>
<p>Accompanying this work is a rapidly expanding body of literature on the various technical, legal and ethical implications of AWS. However, one particularly crucial aspect has – <a href="http://duckofminerva.com/2015/04/the-new-mineshaft-gap-killer-robots-and-the-un.html">with exceptions confirming the rule</a> – received comparably little systematic attention: the potential impact of autonomous weapon systems on global peace and strategic stability. By drawing on Cold War lessons and extrapolating insights from the current military use of remotely controlled unmanned systems, we argue that AWS are prone to proliferation and bound to foment an arms race resulting in increased crisis instability and escalation risks. We conclude that these strategic risks justify a critical stance towards AWS.</p></blockquote>
<p><a href="http://www.tandfonline.com/doi/full/10.1080/00396338.2017.1375263">Read the full article here: Altmann, Jürgen/Sauer, Frank 2017: Autonomous Weapon Systems and Strategic Stability, in: Survival 59 (5): 117-142.</a></p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3227</post-id>	</item>
		<item>
		<title>Reflections on the 2016 CCW Review Conference</title>
		<link>https://www.icrac.net/reflections-on-the-2016-ccw-review-conference/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Wed, 08 Feb 2017 16:19:36 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php56-3.dfw3-2.websitetestlink.com/?p=3225</guid>

					<description><![CDATA[This is a guest post by Anna Khalfaoui. Anna is currently pursuing an LLM at Harvard Law School, having previously studied at Cambridge University and King’s College London. She specialises in public international law and international human rights law. Click here to read this post in braille Reflections on the Review Conference as a newcomer [&#8230;]]]></description>
										<content:encoded><![CDATA[<p><em>This is a guest post by Anna Khalfaoui. Anna is currently pursuing a LLM at Harvard Law School, having previously studied at Cambridge University and King’s College London. She specialises in public international law and international human rights law.</em></p>
<p><a href="https://www.unibw.de/internationalepolitik/professur/team/Sauer/reflections-on-the-2016-ccw-review-conference.dxb/at_download/file">Click here to read this post in braille</a></p>
<p><strong>Reflections on the Review Conference as a newcomer to CCW</strong></p>
<p>The Fifth Review Conference of the Convention on Conventional Weapons (CCW) was a great success for advocates of a ban on fully autonomous weapons. Held at the United Nations in Geneva in December 2016, the Conference was also an opportunity for me to discover and reflect on the processes and challenges of the CCW, to which I was a newcomer.</p>
<div id="attachment_2865" style="width: 250px" class="wp-caption alignleft"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2865" class="wp-image-3231 size-medium alignleft" src="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2017/11/Anna_CCW-225x300.jpg?resize=225%2C300" alt="" width="225" height="300" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/Anna_CCW.jpg?resize=225%2C300&amp;ssl=1 225w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/Anna_CCW.jpg?resize=768%2C1024&amp;ssl=1 768w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/Anna_CCW.jpg?w=960&amp;ssl=1 960w" sizes="auto, (max-width: 225px) 100vw, 225px" /><p id="caption-attachment-2865" class="wp-caption-text">.</p></div>
<p>I became involved when I attended the Conference as part of Harvard Law School’s International Human Rights Clinic (IHRC). I also contributed to a report that IHRC co-published with Human Rights Watch the week before the Review Conference. <a href="https://www.hrw.org/report/2016/12/09/making-case/dangers-killer-robots-and-need-preemptive-ban"><em>Making the Case: The Dangers of Killer Robots and the Need for a Preemptive Ban</em></a> rebuts the major arguments against a prohibition on the development and use of fully autonomous weapons. These weapons, also known as killer robots and lethal autonomous weapons systems, would be able to select and engage targets without human intervention.</p>
<p>The Review Conference was a key step toward a ban because states parties agreed to formalise talks on killer robots by establishing a Group of Governmental Experts (GGE), which will meet for 10 days in 2017. This GGE creates the expectation of an outcome, as past GGEs have led to the negotiation of new or stronger CCW protocols. It provides a forum for states and experts to discuss the parameters of a possible protocol, which hopefully will take the form of a ban. The Review Conference also showed that support for a ban is gaining traction around the world. Argentina, Panama, Peru and Venezuela joined the call for the first time at the Conference, bringing to 19 the number of states in favour of a ban.</p>
<p>The establishment of a GGE was the news I had eagerly awaited the entire week. When the Review Conference opened on December 12, this result did not seem guaranteed. Decisions under the CCW are adopted on the basis of consensus. This means that any state can block progress, and the Russian delegation, from the beginning of the week, forcefully opposed the move to set up a GGE. All other countries that addressed killer robots during the Review Conference explicitly supported establishing such a group. There was something strange about the risk of a single state blocking efforts openly promoted by numerous countries, and I wondered whether, faced with the threat of isolation, it would actually do so. Ultimately, this opposition appears to have been overcome by overwhelming support for more formal discussions.</p>
<div id="attachment_2865" style="width: 250px" class="wp-caption alignright"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2865" class="size-medium wp-image-3230 alignright" src="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2017/11/IMG_20161216_160200-225x300.jpg?resize=225%2C300" alt="" width="225" height="300" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_160200.jpg?resize=225%2C300&amp;ssl=1 225w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_160200.jpg?resize=768%2C1024&amp;ssl=1 768w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_160200.jpg?w=2000&amp;ssl=1 2000w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_160200.jpg?w=3000&amp;ssl=1 3000w" sizes="auto, (max-width: 225px) 100vw, 225px" /><p id="caption-attachment-2865" class="wp-caption-text">.</p></div>
<p>I first heard about fully autonomous weapons when I joined IHRC in September. At the Review Conference, I realised how invested I had become in this issue and how relieved I was when, on Friday, it became clear that Russia was not going to block a GGE. Fully autonomous weapons are still only under development. Yet, because they have the potential to dramatically change the way that wars are fought, it is incumbent upon us to address the dangers they pose before they find their way into military arsenals and onto the battlefield.</p>
<p>Several other points caught my attention throughout the week.</p>
<p>Firstly, I joined the Review Conference as part of the Campaign to Stop Killer Robots, an international coalition of non-governmental organisations (NGOs) working towards a preemptive ban on these weapons. In this capacity, I found it interesting and encouraging to observe the role played by civil society at the Review Conference, including conducting advocacy, releasing research publications and making statements during the sessions. In their public remarks, state representatives often explicitly acknowledged the work of specific NGOs and experts and the importance of civil society engagement in the dialogue. Many diplomats also attended side events organised by the Campaign, such as one on the need to adopt a ban rather than a regulatory approach to deal with the dangers associated with killer robots. In the never-ending discussions about the correct balance to strike between military interests and humanitarian concerns, civil society has a vital role to play in emphasising the importance of humanitarian protection and pushing states to adopt ambitious goals. Civil society’s efforts are all the more important when it comes to killer robots, which have the potential to revolutionise warfare and raise deep ethical questions.</p>
<div id="attachment_2865" style="width: 250px" class="wp-caption alignleft"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2865" class="size-medium wp-image-3229 alignleft" src="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2017/11/IMG_20161216_122851-225x300.jpg?resize=225%2C300" alt="" width="225" height="300" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_122851.jpg?resize=225%2C300&amp;ssl=1 225w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_122851.jpg?resize=768%2C1024&amp;ssl=1 768w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_122851.jpg?w=2000&amp;ssl=1 2000w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/11/IMG_20161216_122851.jpg?w=3000&amp;ssl=1 3000w" sizes="auto, (max-width: 225px) 100vw, 225px" /><p id="caption-attachment-2865" class="wp-caption-text">.</p></div>
<p>Secondly, I was surprised and concerned by the limited media coverage of the Review Conference, especially given that a Review Conference happens only once every five years and addresses matters of global concern. Discussions about killer robots should take into account the views of the public at large because delegating decisions about the use of lethal force to machines raises fundamental moral and ethical questions, and international law prohibits weapons that run counter to the dictates of the public conscience. Media coverage is important to raise the public’s awareness and facilitate its involvement in the debate. Civil society can contribute by engaging with the media and disseminating information about emerging weapons technologies that have the potential to affect societies and the world we live in. In so doing, civil society can promote media scrutiny and public participation and thereby put greater pressure on states to be ambitious and adopt comprehensive solutions.</p>
<p>Finally, much of the debate at the Conference concentrated on the issue of finances. Financial constraints forced some discussions to take place in an informal setting without the use of official translators. Throughout the week, dozens of countries voiced concerns about the financial difficulties facing the CCW. Given that the Conference lasted only five days, it was regrettable that financial discussions took time away from the substantive issues. If this pattern continues, there is a risk that it will undermine the effectiveness and impact of the GGE in 2017 and the CCW as a whole. States parties should therefore take steps to resolve the situation by making their financial contributions as soon as possible.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3225</post-id>	</item>
		<item>
		<title>Arms Control for AWS: 2016 and beyond</title>
		<link>https://www.icrac.net/3219-2/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Wed, 07 Dec 2016 16:07:58 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[Opinion]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php56-3.dfw3-2.websitetestlink.com/?p=3219</guid>

					<description><![CDATA[After three informal meetings of experts, the Convention on Certain Conventional Weapons, during its Fifth Review Conference taking place from 12 to 16 December 2016 in Geneva, will decide on how to continue work on arms control for autonomous weapon systems. Below is a preview to an article published in the October 2016 issue of Arms Control [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p>After three informal meetings of experts, the Convention on Certain Conventional Weapons, during its <a href="http://www.unog.ch/80256EE600585943/(httpPages)/9F975E1E06869679C1257F50004F7E8C?OpenDocument">Fifth Review Conference taking place from 12 to 16 December 2016 in Geneva</a>, will decide on how to continue work on arms control for autonomous weapon systems. Below is a preview to an article published in the October 2016 issue of <a href="https://www.armscontrol.org/ACT/2016_10/Features/Stopping-Killer-Robots-Why-Now-Is-the-Time-to-Ban-Autonomous-Weapons-Systems">Arms Control Today</a>, outlining the perspectives for future AWS arms control.</p>
<p>Sauer, Frank 2016: Stopping ‘Killer Robots’: Why Now Is the Time to Ban Autonomous Weapons Systems, in: Arms Control Today 46 (8): 8-13.</p>
<p><a href="https://www.armscontrol.org/ACT/2016_10/Features/Stopping-Killer-Robots-Why-Now-Is-the-Time-to-Ban-Autonomous-Weapons-Systems">Click here to read the full article</a>.</p>
<p><a href="https://www.unibw.de/internationalepolitik/professur/team/Sauer/Why%20Now%20Is%20the%20Time%20to%20Ban%20AWS%20-braille.brf/at_download/file">NEW: Click here for the BRF file of the full article</a></p>
<blockquote><p>[F]our possible outcomes can be predicted for the CCW process. The first would be a legally binding and preventive multilateral arms control agreement derived by consensus in the CCW and thus involving the major stakeholders, the outcome referenced as “a ban.” Considering the growing number of states-parties calling for a ban and the large number of governments calling for meaningful human control and expressing considerable unease with the idea of autonomous weapons systems, combined with the fact that no government is openly promoting their development, this seems possible. It would require mustering considerable political will. Verification and compliance for a ban, as well as for weaker restrictions, would then require creative arms control solutions. After all, with full autonomy in a weapons system eventually coming down to merely flipping a software switch, how can one tell if a specific system at a specific time is not operating autonomously? A few arms control experts are already wrapping their heads around these questions.</p>
<blockquote class="twitter-tweet" data-lang="en">
<p dir="ltr" lang="en">Can <a href="https://twitter.com/hashtag/KillerRobots?src=hash&amp;ref_src=twsrc%5Etfw">#KillerRobots</a> (autonomous weapons systems) work as preventive arms control? More in October&#8217;s <a href="https://twitter.com/hashtag/ArmsControlToday?src=hash&amp;ref_src=twsrc%5Etfw">#ArmsControlToday</a> <a href="https://t.co/E7sDVzdmbn">https://t.co/E7sDVzdmbn</a> <a href="https://t.co/LwPSojH9Gr">pic.twitter.com/LwPSojH9Gr</a></p>
<p>— Arms Control Assoc (@ArmsControlNow) <a href="https://twitter.com/ArmsControlNow/status/786600390020194304?ref_src=twsrc%5Etfw">October 13, 2016</a></p></blockquote>
<p><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script></p>
<p>The second outcome would be restrictions short of a ban. The details of such an agreement are impossible to predict, but it is conceivable that governments could agree, for example, to limit the use of autonomous weapons systems, such as permitting their use against materiel only.</p>
<p>The third would be a declaratory, nonbinding agreement on best practices. Such a code of conduct would likely emphasize compliance with existing international humanitarian law and rigorous weapons review processes, in accordance with Article 36 of Additional Protocol I to the Geneva Conventions.</p>
<p>Finally, there may be no tangible result, perhaps with one of the technologically leading countries setting a precedent by fielding autonomous weapons systems. That would certainly prompt others to follow, fueling an arms race. In light of some of the most advanced standoff weapons, such as the U.S. Long Range Anti-Ship Missile or the UK Brimstone, each capable of autonomous targeting during terminal flight phase, one might argue that the world is already headed for such an autonomy arms race.</p>
<p>Implementing autonomy, which mainly comes down to software, in systems drawn from a vibrant global ecosystem of unmanned vehicles in various shapes and sizes is a technical challenge, but doable for state and nonstate actors, particularly because so much of the hardware and software is dual use. In short, autonomous weapons systems are extremely prone to proliferation. An unchecked autonomous weapons arms race and the diffusion of autonomous killing capabilities to extremist groups would clearly be detrimental to international peace, stability, and security.</p>
<blockquote class="twitter-tweet" data-lang="en">
<p dir="ltr" lang="en">The nascent social taboo against machines autonomously making kill decisions &#8211; Frank Sauer in <a href="https://twitter.com/ArmsControlNow?ref_src=twsrc%5Etfw">@ArmsControlNow</a> <a href="https://t.co/nBTGtXLT5R">https://t.co/nBTGtXLT5R</a> <a href="https://twitter.com/hashtag/CCWUN?src=hash&amp;ref_src=twsrc%5Etfw">#CCWUN</a></p>
<p>— Mary Wareham (@marywareham) <a href="https://twitter.com/marywareham/status/788723233709101056?ref_src=twsrc%5Etfw">October 19, 2016</a></p></blockquote>
<p><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script></p>
<p>This underlines the importance of the current opportunity for putting a comprehensive, verifiable ban in place. The hurdles are high, but at this point, a ban is clearly the most prudent and thus desirable outcome. After all, as long as no one possesses them, a verifiable ban is the optimal solution. It stops the currently commencing arms race in its tracks, and everyone reaps the benefits. A prime goal of arms control would be fulfilled by facilitating the diversion of resources from military applications toward research and development for peaceful purposes—in the fields of AI and robotics no less, two key future technologies.</p></blockquote>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3219</post-id>	</item>
		<item>
		<title>ICRAC leaflet on LAWS and global security</title>
		<link>https://www.icrac.net/icrac-leaflet-on-laws-and-global-security/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Fri, 18 Nov 2016 15:56:32 +0000</pubDate>
				<category><![CDATA[ICRAC News]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php56-3.dfw3-2.websitetestlink.com/?p=3217</guid>

					<description><![CDATA[Back in 2015, one week before the second informal meeting of experts at the CCW in Geneva, ICRAC released a leaflet with a succinct list of the ten most pressing  issues for global security raised by autonomous weapon systems: LAWS – 10 Problems for Global Security. The original announcement is below, and with the CCW’s 5th Review Conference right [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p>Back in 2015, one week before the second informal meeting of experts at the CCW in Geneva, ICRAC released a leaflet with a succinct list of the ten most pressing issues for global security raised by autonomous weapon systems: <a href="https://icrac.net/wp-content/uploads/2015/04/LAWS-10-Problems-for-Global-Security.pdf">LAWS – 10 Problems for Global Security.</a> The original announcement is below, and with the <a href="http://www.unog.ch/80256EE600585943/(httpPages)/9F975E1E06869679C1257F50004F7E8C?OpenDocument">CCW’s 5th Review Conference</a> right around the corner, it is worth another read. Today, we are happy to announce that the leaflet is now also available for easy access by braille readers.</p>
<p><a href="https://www.unibw.de/internationalepolitik/professur/team/Sauer/LAWS-10-Problems-for-Global-Security%20-braille.brf/at_download/file">Click here to read this post in braille</a>.</p>
<hr />
<p>Original 2015 post:</p>
<p>Next week nation states from around the world will meet at the United Nations in Geneva to discuss the problems raised by Lethal Autonomous Weapons Systems (LAWS): weapons that, once activated, will select targets and attack them with violent force without the benefit of human control. The Convention on Certain Conventional Weapons will host its second expert meeting on the topic from Monday 13 to Friday 17 April.</p>
<div id="attachment_2865" style="width: 330px" class="wp-caption alignleft"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2865" class="size-medium wp-image-2524 alignleft" src="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2015/06/dreamstime_xl_2942377-300x225.jpg?resize=300%2C225" alt="" width="300" height="225" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/dreamstime_xl_2942377.jpg?resize=300%2C225&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/dreamstime_xl_2942377.jpg?resize=1024%2C768&amp;ssl=1 1024w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/dreamstime_xl_2942377.jpg?w=2000&amp;ssl=1 2000w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/dreamstime_xl_2942377.jpg?w=3000&amp;ssl=1 3000w" sizes="auto, (max-width: 300px) 100vw, 300px" /><p id="caption-attachment-2865" class="wp-caption-text">.</p></div>
<p>Delegates from the International Committee for Robot Arms Control (ICRAC), an international not-for-profit association of scientists, technologists, lawyers, and policy experts committed to the peaceful use of robotics in the service of humanity and the regulation of robot weapons, will take an active role in the discussions.</p>
<p>As a founding member of the Campaign to Stop Killer Robots, ICRAC is pushing for an international legally binding treaty to prohibit the development, production and use of LAWS. Among the many concerns of the membership, ICRAC is worried about the destabilising impact that LAWS will have on the security of the planet.</p>
<p>As Professor Lucy Suchman from Lancaster University puts it, ‘LAWS take the automation of weapon systems a step too far, undermining the conditions necessary for meaningful human control.’</p>
<p>A new information leaflet from ICRAC will be issued to delegates at the CCW to raise awareness about ten of the most serious global issues for security that the use of LAWS will create: <a href="https://icrac.net/wp-content/uploads/2015/04/LAWS-10-Problems-for-Global-Security.pdf">LAWS – 10 Problems for Global Security</a></p>
<p>It concludes,</p>
<blockquote><p>We are at a critical juncture in the evolution of weapons. The end point of increasing weapons’ automation is full autonomy, where human beings have little control over the course of conflicts and events in battle. At this point in time, it is still within our power to stop the automation of the kill decision, by ensuring that every weapon remains meaningfully controlled by humans.</p>
<p>Both humans and computer systems have their strengths and weaknesses, and the aim of designing effective supervisory systems for weapons control must be to exploit the strengths of both. This way, it is possible not only to gain better legal compliance, but also to ensure that the partnership between human and machine best ensures the protection of civilians, their human dignity and our wider global security.</p></blockquote>
<p><strong>Dr Heather Roff</strong> from the University of Denver, an invited speaker at the CCW meeting said that, “Without careful consideration of the second and third order effects of developing and deploying LAWS, we risk destabilizing regional and global peace and security.”</p>
<p>These concerns were echoed by <strong>Dr. Denise Garcia</strong> from Northeastern University who said that, “the dangers to global security from the future of LAWs is too frightening to contemplate.”</p>
<p>“We are just witnessing the beginning of an automated arms race that will take battlefield decisions out of human hands,” said <strong>Professor Noel Sharkey</strong>, chair of ICRAC. “There is a limited time window to stop these weapons before they proliferate widely.”</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3217</post-id>	</item>
		<item>
		<title>ICRAC second statement on security to the 2016 UN CCW Expert Meeting</title>
		<link>https://www.icrac.net/icrac-second-statement-on-security-to-the-2016-un-ccw-expert-meeting/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Fri, 15 Apr 2016 17:45:53 +0000</pubDate>
				<category><![CDATA[Statements]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2881</guid>

					<description><![CDATA[On April 15, ICRAC’s Juergen Altmann delivered the following statement to the informal “Meeting of Experts“, gathered to discuss questions related to “lethal autonomous weapons systems” from April 11 to April 15 at the United Nations in Geneva, Switzerland. Thank you for allowing the International Committee for Robot Arms Control to provide a second statement [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<div class="entry">
<p>On April 15, ICRAC’s <a href="http://icrac.net/who/">Juergen Altmann</a> delivered the following statement to the informal “<a href="http://www.unog.ch/80256EE600585943/%28httpPages%29/37D51189AC4FB6E1C1257F4D004CAFB2?OpenDocument">Meeting of Experts</a>“, gathered to discuss questions related to “lethal autonomous weapons systems” from April 11 to April 15 at the <a href="http://www.unog.ch/80256EE600585943/%28httpPages%29/3CFCEEEF52D553D5C1257B0300473B77?OpenDocument">United Nations</a> in Geneva, Switzerland.</p>
<blockquote><p>Thank you for allowing the International Committee for Robot Arms Control to provide a second statement about the problems with AWS and regional security. We are speaking to further John Borrie’s notion of hidden interaction and crisis.</p>
<p>Yesterday ICRAC explained how the interaction between two opposing systems of autonomous weapons with secret combat algorithms would produce unpredictable and unanticipated consequences. This is cautious scientific wording, but there is more to say.</p>
<p>Imagine a severe crisis, with the swarms of adversaries operating in close proximity to each other. A coordinated attack by one could wipe out the other within missile flight time – that is, seconds. The control software would have to react fast to repel the attack.</p>
<p>The problem is that an erroneous “counter” attack could be triggered by a sun glint in visual data misinterpreted as a rocket flame, by sudden, unforeseen moves of the enemy swarm, or by a simple software bug. That could then lead to a counter-attack by the other side, with fast escalation from crisis to war.</p>
<p>If war is seen as unavoidable, then there would be strong pressures to attack the other swarm first. Such pre-emption and the fear of it would provide another mechanism of fast and uncontrolled escalation.</p>
<p>We cannot determine the mathematical probabilities of such scenarios or simulate them on a computer. But they are certainly possible and plausible. Considerations of this sort were a strong motive for the arms control treaties limiting nuclear weapons.</p>
<p>Crisis instability as described is not limited to interactions between big military powers – it can occur between regional powers as well. The best way to avoid such destabilisation would be to preventively prohibit autonomous weapon systems.</p>
<p>We ask those states seeing military advantages in AWS to ponder the mid- and long-term threats from autonomous escalation.</p></blockquote>
</div>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2881</post-id>	</item>
	</channel>
</rss>
