<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Uncategorized &#8211; ICRAC</title>
	<atom:link href="https://www.icrac.net/category/uncategorized/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.icrac.net</link>
	<description>International Committee for Robot Arms Control</description>
	<lastBuildDate>Mon, 23 Jun 2025 12:49:19 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.8.2</generator>
<site xmlns="com-wordpress:feed-additions:1">128339352</site>	<item>
		<title>Statement on Ethical Considerations in Open Informal Meeting at UNGA 1st Committee</title>
		<link>https://www.icrac.net/statement-on-ethical-considerations-in-open-informal-meeting-at-unga-1st-committee/</link>
		
		<dc:creator><![CDATA[Peter Asaro]]></dc:creator>
		<pubDate>Tue, 13 May 2025 20:45:00 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[Slider]]></category>
		<category><![CDATA[Statements]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Peter Asaro]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=19944</guid>

					<description><![CDATA[UNGA Informals on LAWS ICRAC Statement on Ethical Considerations Delivered by Prof. Peter Asaro on May 13, 2025 Thank you, Chair. I speak on behalf of the International Committee for Robot Arms Control, or ICRAC, a group of academics, experts, scholars and researchers in computer science, artificial intelligence, robotics, international law, political science, philosophy and [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Peter Asaro' src='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.peterasaro.org/">Peter Asaro</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Dr. Peter Asaro is a philosopher of science, technology and media. His work examines the interfaces between social relations, human minds and bodies, artificial intelligence and robotics, and digital media.

His current research focuses on the social, cultural, political, legal and ethical dimensions of military robotics and UAV drones, from a perspective that combines media theory with science and technology studies. He has written widely-cited papers on lethal robotics from the perspective of just war theory and human rights. Dr. Asaro's research also examines agency and autonomy, liability and punishment, and privacy and surveillance as it applies to consumer robots, industrial automation, smart buildings, and autonomous vehicles. His research has been published in international peer reviewed journals and edited volumes, and he is currently writing a book that interrogates the intersections between military robotics, interface design practices, and social and ethical issues.

Dr. Asaro has held research positions at the Center for Cultural Analysis at Rutgers University, the HUMlab of Umeå University in Sweden, and the Austrian Academy of Sciences in Vienna. He has also developed technologies in the areas of virtual reality, data visualization and sonification, human-computer interaction, computer-supported cooperative work, artificial intelligence, machine learning, robot vision, and neuromorphic robotics at the National Center for Supercomputer Applications (NCSA), the Beckman Institute for Advanced Science and Technology, and Iguana Robotics, Inc., and was involved in the design of the natural language interface for the Wolfram|Alpha computational knowledge engine (winner of the 2010 SXSW Web Interactive Award for Technical Achievement), for Wolfram Research.

He is currently working on an Oral History of Robotics project that is funded by the IEEE Robotics and Automation Society and the National Endowment for the Humanities Office of Digital Humanities.

Dr. Asaro received his PhD in the History, Philosophy and Sociology of Science from the University of Illinois at Urbana-Champaign, where he also earned a Master of Arts from the Department of Philosophy, and a Master of Computer Science from the Department of Computer Science.</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-image size-large"><img data-recalc-dims="1" decoding="async" width="1024" height="800" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2025/06/IMG-20250513-WA00251-e1750619884799-1024x800.jpg?resize=1024%2C800&#038;ssl=1" alt="Peter Asaro delivering ICRAC Statement on Ethics" class="wp-image-19940" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2025/06/IMG-20250513-WA00251-e1750619884799.jpg?resize=1024%2C800&amp;ssl=1 1024w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2025/06/IMG-20250513-WA00251-e1750619884799.jpg?resize=300%2C234&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2025/06/IMG-20250513-WA00251-e1750619884799.jpg?resize=768%2C600&amp;ssl=1 768w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2025/06/IMG-20250513-WA00251-e1750619884799.jpg?w=1536&amp;ssl=1 1536w" sizes="(max-width: 1000px) 100vw, 1000px" /></figure>



<p><strong>UNGA Informals on LAWS <br>ICRAC Statement on Ethical Considerations <br>Delivered by Prof. Peter Asaro on May 13, 2025</strong> </p>



<p><br>Thank you, Chair. I speak on behalf of the International Committee for Robot Arms Control, or ICRAC, a group of academics, experts, scholars and researchers in computer science, artificial intelligence, robotics, international law, political science, philosophy and ethics. ICRAC is a co-founding member of the Stop Killer Robots Campaign.</p>



<p>We appreciate that the organizers of this Informal Meeting included a Session on Ethical Considerations. It has been many years since ethics was the primary focus of substantive discussion within the CCW GGE meetings. Yet ethics and morality have provided a valuable basis for international law in the past, and they are precisely where we must ground new laws to prohibit and regulate AWS in the near future: in our common, shared humanity and in principles which transcend human laws, particularly human dignity in a deep sense, as discussed by Prof. Chengeta, and ethical decisions, as discussed by the Representative of the Holy See.</p>



<p>Whenever violent force is used, there are risks involved. But merely managing those risks is not sufficient to meet the requirements for morally justifiable killing. Understanding the reasons and the potential consequences for the use of force is required for its justification. It has been argued that AWS may be highly accurate and precise in their use of force, but accuracy and precision are not sufficient to meet the requirements for the ethically discriminate use of force, and they do not begin to address the requirements of the proportionate use of force.</p>



<p>Following the outlines of the two-tiered approach advanced by the ICRC, regulated AWS would be permitted to target autonomously only in limited cases, specifically where the target is a military object by nature, such as military vehicles and installations. Even in those cases, automated targeting must still be carefully regulated to ensure that humans can safely supervise those systems.</p>



<p>But as soon as we start considering civilian objects, even those which might be used for military purposes and might be lawfully targeted under IHL, we must not permit their targeting by automated processes. The moral argument that leads to this conclusion is clear. It may be tempting to think that we can automate proportionality decisions–how much force is needed, or how much risk is acceptable, or how much collateral harm to civilians might be acceptable relative to a military objective. But the nature of proportionality judgments is fundamentally moral.</p>



<p>These decisions are inherently about values–the value of a target to a military objective, the value of a military objective to an operation and an overall strategy; the value of civilian infrastructure to a family, a community, a country; the value of a natural environment; and above all the value of human lives and the cost of taking those lives. They are also about duties, our duties to protect, our duties to each other.</p>



<p>These values are not intrinsically numerical or quantitative in nature, and assigning them such values in a computer program is arbitrary at best. Computers do not “understand” in any meaningful sense. They represent the world through mathematical abstractions that we design and understand, and from which we assign and seek meaning. Worse, training an algorithm to “learn” these values from a dataset is to abdicate any human responsibility in establishing the values represented in the systems, including the value of human life and the necessary conditions of human flourishing.</p>



<p>These are moral values, only understood through the lived experience of human life, moral reflection, and ethical development. In those limited cases where the decision to end a human life can be morally justified, it must be made by a moral agent who truly understands these values. Any life lost by the decision of an algorithm is, by definition, taken arbitrarily. ICRAC appreciates the work of the CCW GGE and this section of the latest draft of the Chair’s Rolling Text:</p>



<p><em>States should ensure context-appropriate human judgement and control in the use of<br>LAWS, through the following measures &#8230; [which] &#8230; includes ensuring assessment of legal<br>obligations and ethical considerations by a human, in particular, with regard to the effects<br>of the selection and engagement functions.</em></p>



<p>The ethical considerations of the use of force must remain a matter of human judgement. We must not eliminate ethical considerations altogether by delegating them to machines wholly incapable of grasping such considerations. Human dignity requires that we consider a human as human–no machine can do this for us.</p>



<p>Similarly, for anti-personnel AWS: in order to design systems to autonomously target people, it would be necessary to create digital representations of people, or target profiles. The same moral logic applies here.</p>



<p>From a legal perspective, it could be argued that unmounted infantry are military objects by nature and can pose a threat just as a tank does. But there is an important moral difference between targeting people directly, versus targeting a tank and accepting that people inside it may be killed. People are not to be treated as objects, but always as moral subjects.</p>



<p>The aim of war, and the moral justification of killing in war, depends critically on using force to diminish the ability of your adversary to use force against you. The ultimate aim is not to harm or kill the enemy directly; that is only a means to an end, namely the end of hostilities. Targeting a human directly is to make the destruction of a human a goal in itself, rather than the true goal of eliminating the threat they pose. This might sound like a minor distinction, but by making the targeting and killing of humans the goal of a machine, rather than the elimination of military threats, we stand to vastly undermine human dignity.</p>



<p>By designing systems to target people directly, we essentially and effectively “pre-authorize” the moral judgement to take their lives. By pre-authorizing the killing of humans, and making personnel the targets of autonomous weapons, we would fundamentally violate and diminish human dignity. If we accept that a soldier on the battlefield can be directly targeted, without a human moral judgement or moral justification, then we make it more acceptable to do so in other contexts as well.</p>



<p>When we violate human dignity, it is not just the immediate victim who loses their dignity. All of humanity suffers from this loss. This is why we feel such moral disgust at the injustices of slavery, and torture, and the dropping of bombs on children–these atrocities undermine our collective dignity as human beings and offend our moral sensibility.</p>



<p>While the use of violent force against unjust aggression is sometimes necessary, it is our moral responsibility to ensure that force is used justly. The only way to ensure that force is used justly is through moral judgement, and this requires a moral agent. Machines and automated algorithms, however sophisticated they may appear, are not moral agents, and are not capable of moral judgements–only thin and arbitrary approximations. We must not delegate our morality to machines, as doing so threatens the very essence of our human dignity.</p>



<p>To quote the wise words of Christof Heyns, “War without reflection is mechanical slaughter.”</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Peter Asaro' src='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.peterasaro.org/">Peter Asaro</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Dr. Peter Asaro is a philosopher of science, technology and media. His work examines the interfaces between social relations, human minds and bodies, artificial intelligence and robotics, and digital media.

His current research focuses on the social, cultural, political, legal and ethical dimensions of military robotics and UAV drones, from a perspective that combines media theory with science and technology studies. He has written widely-cited papers on lethal robotics from the perspective of just war theory and human rights. Dr. Asaro's research also examines agency and autonomy, liability and punishment, and privacy and surveillance as it applies to consumer robots, industrial automation, smart buildings, and autonomous vehicles. His research has been published in international peer reviewed journals and edited volumes, and he is currently writing a book that interrogates the intersections between military robotics, interface design practices, and social and ethical issues.

Dr. Asaro has held research positions at the Center for Cultural Analysis at Rutgers University, the HUMlab of Umeå University in Sweden, and the Austrian Academy of Sciences in Vienna. He has also developed technologies in the areas of virtual reality, data visualization and sonification, human-computer interaction, computer-supported cooperative work, artificial intelligence, machine learning, robot vision, and neuromorphic robotics at the National Center for Supercomputer Applications (NCSA), the Beckman Institute for Advanced Science and Technology, and Iguana Robotics, Inc., and was involved in the design of the natural language interface for the Wolfram|Alpha computational knowledge engine (winner of the 2010 SXSW Web Interactive Award for Technical Achievement), for Wolfram Research.

He is currently working on an Oral History of Robotics project that is funded by the IEEE Robotics and Automation Society and the National Endowment for the Humanities Office of Digital Humanities.

Dr. Asaro received his PhD in the History, Philosophy and Sociology of Science from the University of Illinois at Urbana-Champaign, where he also earned a Master of Arts from the Department of Philosophy, and a Master of Computer Science from the Department of Computer Science.</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">19944</post-id>	</item>
		<item>
		<title>Statement on Technical Considerations in Open Informal Meeting at UNGA 1st Committee</title>
		<link>https://www.icrac.net/statement-on-technical-considerations-in-open-informal-meeting-at-unga-1st-committee/</link>
		
		<dc:creator><![CDATA[Peter Asaro]]></dc:creator>
		<pubDate>Tue, 13 May 2025 20:40:00 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[statements]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=19938</guid>

					<description><![CDATA[UNGA LAWS Informals ICRAC Statement on Technical Considerations Delivered by Prof. Peter Asaro, 13 May 2025 Thank you Chair. I speak on behalf of the International Committee for Robot Arms Control, or ICRAC, a co-founding member of the Stop Killer Robots Campaign. ICRAC has many concerns about the development and use of autonomous weapons and [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Peter Asaro' src='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.peterasaro.org/">Peter Asaro</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Dr. Peter Asaro is a philosopher of science, technology and media. His work examines the interfaces between social relations, human minds and bodies, artificial intelligence and robotics, and digital media.

His current research focuses on the social, cultural, political, legal and ethical dimensions of military robotics and UAV drones, from a perspective that combines media theory with science and technology studies. He has written widely-cited papers on lethal robotics from the perspective of just war theory and human rights. Dr. Asaro's research also examines agency and autonomy, liability and punishment, and privacy and surveillance as it applies to consumer robots, industrial automation, smart buildings, and autonomous vehicles. His research has been published in international peer reviewed journals and edited volumes, and he is currently writing a book that interrogates the intersections between military robotics, interface design practices, and social and ethical issues.

Dr. Asaro has held research positions at the Center for Cultural Analysis at Rutgers University, the HUMlab of Umeå University in Sweden, and the Austrian Academy of Sciences in Vienna. He has also developed technologies in the areas of virtual reality, data visualization and sonification, human-computer interaction, computer-supported cooperative work, artificial intelligence, machine learning, robot vision, and neuromorphic robotics at the National Center for Supercomputer Applications (NCSA), the Beckman Institute for Advanced Science and Technology, and Iguana Robotics, Inc., and was involved in the design of the natural language interface for the Wolfram|Alpha computational knowledge engine (winner of the 2010 SXSW Web Interactive Award for Technical Achievement), for Wolfram Research.

He is currently working on an Oral History of Robotics project that is funded by the IEEE Robotics and Automation Society and the National Endowment for the Humanities Office of Digital Humanities.

Dr. Asaro received his PhD in the History, Philosophy and Sociology of Science from the University of Illinois at Urbana-Champaign, where he also earned a Master of Arts from the Department of Philosophy, and a Master of Computer Science from the Department of Computer Science.</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[
<p id="block-5c43bd10-ca39-48af-98ad-e1748d15cd4c"><img data-recalc-dims="1" loading="lazy" decoding="async" width="600" height="450" class="wp-image-19939" style="width: 600px;" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2025/06/IMG_20250512_101911990_HDR_AE-scaled.jpg?resize=600%2C450&#038;ssl=1" alt="" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2025/06/IMG_20250512_101911990_HDR_AE-scaled.jpg?w=2560&amp;ssl=1 2560w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2025/06/IMG_20250512_101911990_HDR_AE-scaled.jpg?resize=300%2C225&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2025/06/IMG_20250512_101911990_HDR_AE-scaled.jpg?resize=1024%2C768&amp;ssl=1 1024w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2025/06/IMG_20250512_101911990_HDR_AE-scaled.jpg?resize=160%2C120&amp;ssl=1 160w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2025/06/IMG_20250512_101911990_HDR_AE-scaled.jpg?resize=768%2C576&amp;ssl=1 768w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2025/06/IMG_20250512_101911990_HDR_AE-scaled.jpg?resize=1536%2C1152&amp;ssl=1 1536w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2025/06/IMG_20250512_101911990_HDR_AE-scaled.jpg?resize=2048%2C1536&amp;ssl=1 2048w" sizes="auto, (max-width: 600px) 100vw, 600px" /></p>



<p id="block-5c43bd10-ca39-48af-98ad-e1748d15cd4c"><strong>UNGA LAWS Informals <br>ICRAC Statement on Technical Considerations <br>Delivered by Prof. Peter Asaro, 13 May 2025</strong> </p>



<p id="block-5c43bd10-ca39-48af-98ad-e1748d15cd4c"><br>Thank you Chair. I speak on behalf of the International Committee for Robot Arms Control, or ICRAC, a co-founding member of the Stop Killer Robots Campaign.</p>



<p id="block-5c43bd10-ca39-48af-98ad-e1748d15cd4c">ICRAC has many concerns about the development and use of autonomous weapons and the accelerated production and promotion of these systems by private technology companies. Far from being a technically inevitable and practically necessary, autonomous weapons pose a considerable risk to global stability and security, and are likely to cause more civilian harm, rather than less. As a group of scholars with expertise in relevant domains, including robotics, AI, and digital information systems, we strongly urge caution. </p>



<p id="block-5c43bd10-ca39-48af-98ad-e1748d15cd4c">We are concerned that the technology that underpins the functionalities of AWS is dangerously unsuitable for the complex and dynamic contexts of conflict. Specifically, the AI element in AWS poses considerable risks. Testing such systems is difficult and time-consuming, and the tools and methods for the verification and validation of AI systems do not yet exist, if they are possible at all. The questionable reliability of prediction based on historical data when applied to dynamically unfolding situations in conflict raises further questions regarding the validity and legality of using AI supported AWS.</p>



<p id="block-5c43bd10-ca39-48af-98ad-e1748d15cd4c">At best, AI supported systems are only as good as the data on which they are trained on and appropriate, comprehensive and up-to-date data is hard to come by in contested conflict spaces. AI systems need frequent updates to remain relevant and functional, but with each substantial update, vital systems-aspects may become compromised requiring further verification and validation.</p>



<p id="block-5c43bd10-ca39-48af-98ad-e1748d15cd4c">As we heard from these presenters, it is a well-known fact in technology and industry circles that AI systems remain unproven in terms of reliability for safety-critical situations and complex situations such as armed conflict. They are known to give inaccurate outputs, and newer generative AI systems, which are likely to find their way into the wider AWS environment, are known to hallucinate – that is they give false or misleading output which is difficult to distinguish from accurate results. In the case of generative AI, this behavior is guaranteed by its technical architecture and these types of errors can only be managed not eliminated. When AI experts and those that make the technologies used in AWS raise alarms about the inadequacies of AWS for conflict, we should listen.</p>



<p id="block-5c43bd10-ca39-48af-98ad-e1748d15cd4c">We are concerned that the technical characteristics of AWS pose a considerable risk in enabling uncontrolled escalation and conflict at speed. Escalation from crisis to war, or escalating a conflict to a higher level of violence, could come about due to erroneous indications of attack or a simple sensor or computer error. Unpredictable systems, and systems which operators cannot understand or explain, will give leaders false impressions of their capabilities, leading to overconfidence or encouraging pre-emptive<br>attacks. This will lead to greater global instability and insecurity. </p>



<p id="block-5c43bd10-ca39-48af-98ad-e1748d15cd4c">Finally, there are operational risks posed by AWS in that they give the illusion that such weapons are more precise and accurate, and will therefore inflict less harm. The extensive use of AI in current conflicts has given us an indication that the contrary might be the case. This is particularly so for database-driven systems that generate targeting lists faster than humans can evaluate and verify the lawfulness of targets. The technical capacity for precision or accuracy is not a warrant for discrimination or proportionality in use. Unless we establish clear legally binding limitations on AWS, there is no safeguard that systems that prioritize speed and scale are not used in an indiscriminate and disproportional manner, either intentionally or because humans have abdicated their judgement to a machine.</p>



<p id="block-5c43bd10-ca39-48af-98ad-e1748d15cd4c">Thank you.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Peter Asaro' src='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.peterasaro.org/">Peter Asaro</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Dr. Peter Asaro is a philosopher of science, technology and media. His work examines the interfaces between social relations, human minds and bodies, artificial intelligence and robotics, and digital media.

His current research focuses on the social, cultural, political, legal and ethical dimensions of military robotics and UAV drones, from a perspective that combines media theory with science and technology studies. He has written widely-cited papers on lethal robotics from the perspective of just war theory and human rights. Dr. Asaro's research also examines agency and autonomy, liability and punishment, and privacy and surveillance as it applies to consumer robots, industrial automation, smart buildings, and autonomous vehicles. His research has been published in international peer reviewed journals and edited volumes, and he is currently writing a book that interrogates the intersections between military robotics, interface design practices, and social and ethical issues.

Dr. Asaro has held research positions at the Center for Cultural Analysis at Rutgers University, the HUMlab of Umeå University in Sweden, and the Austrian Academy of Sciences in Vienna. He has also developed technologies in the areas of virtual reality, data visualization and sonification, human-computer interaction, computer-supported cooperative work, artificial intelligence, machine learning, robot vision, and neuromorphic robotics at the National Center for Supercomputer Applications (NCSA), the Beckman Institute for Advanced Science and Technology, and Iguana Robotics, Inc., and was involved in the design of the natural language interface for the Wolfram|Alpha computational knowledge engine (winner of the 2010 SXSW Web Interactive Award for Technical Achievement), for Wolfram Research.

He is currently working on an Oral History of Robotics project that is funded by the IEEE Robotics and Automation Society and the National Endowment for the Humanities Office of Digital Humanities.

Dr. Asaro received his PhD in the History, Philosophy and Sociology of Science from the University of Illinois at Urbana-Champaign, where he also earned a Master of Arts from the Department of Philosophy, and a Master of Computer Science from the Department of Computer Science.</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">19938</post-id>	</item>
		<item>
		<title>New Book Shows Catastrophic Folly of Automating Warfare</title>
		<link>https://www.icrac.net/new-book-shows-catastrophic-folly-of-automating-warfare/</link>
		
		<dc:creator><![CDATA[mbolton]]></dc:creator>
		<pubDate>Sun, 20 Sep 2020 10:40:20 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">http://www.icrac.net/?p=6475</guid>

					<description><![CDATA[Despite the challenges of COVID-19, 21-25 September the world’s governments will discuss at the UN in Geneva the humanitarian and security threat posed by “lethal autonomous weapons systems” – high-tech killer robots that could target people without meaningful human control over the use of violence. But despite the media clichés that often accompany these discussions [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='mbolton' src='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://matthewbreaybolton.com">mbolton</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Matthew Bolton is professor of political science at Pace University in New York City. He is an expert on global peace and security policy, focusing on multilateral disarmament and arms control policymaking processes. He has a PhD in Government and Master's in Development Studies from the London School of Economics and a Master's from SUNY Environmental Science and Forestry. Since 2014, Bolton has worked on the UN and New York City advocacy of the International Campaign to Abolish Nuclear Weapons (ICAN), recipient of the 2017 Nobel Peace Prize. Bolton has published six books, including Political Minefields (I.B. Tauris) and Imagining Disarmament, Enchanting International Relations (Palgrave Pivot).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-image size-full"><img data-recalc-dims="1" loading="lazy" decoding="async" width="191" height="300" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2023/04/Political-Minefields-cover.jpg?resize=191%2C300&#038;ssl=1" alt="" class="wp-image-19759"/></figure>



<p>Despite the challenges of COVID-19, from 21 to 25 September the world’s governments will <a href="https://www.stopkillerrobots.org/2020/09/maintaining-momentum-during-the-pandemic/">discuss</a> at the UN in Geneva the humanitarian and security threat posed by “lethal autonomous weapons systems” – high-tech killer robots that could target people without meaningful human control over the use of violence.</p>



<p>But despite the media clichés that often accompany these discussions – journalists seem unable to resist references to Terminator and Skynet – killer robots are not some sci-fi fantasy. They are the result of a long trend in weapons development of using technology to disembody killing.</p>



<p>In my new <a href="https://www.bloomsburycollections.com/book/political-minefields-the-struggle-against-automated-killing/">book</a>, <em>Political Minefields </em>(I.B. Tauris), I trace the history of efforts to avoid responsibility for violence by using remote and automated weapons like landmines, cluster munitions, armed drones and, now, killer robots.</p>



<p>“Weapons developers are seeking to pervert the power of information and communications technology for deadly ends, taking humans entirely out of the decision to kill,” writes Jody Williams, who was awarded the 1997 Nobel Peace Prize along with the International Campaign to Ban Landmines (<a href="http://www.icbl.org/">ICBL</a>), in her foreword to my book. “In the autonomous, weaponized robot, they are essentially designing mines that actively seek out their targets, that can follow you, that can fly.”</p>



<p>From the WWII minefields of North Africa and US automated bombing of Laos to armed drones targeting makers of improvised explosive devices, military planners have tried, as US Vietnam War General William Westmoreland put it, to “replace wherever possible the man with the machine.”</p>



<p>But in my research, I’ve learned that the fever dream of algorithmic warfare never delivers on its promise of victory by remote control. People are too messy, unpredictable, clever, and tricky to meet the assumptions programmed into military technology.</p>



<p>Allied troops just marched through the Nazi minefields at El Alamein, taking the casualties and repurposing the mines they found for their own uses. Vietnamese communist soldiers spoofed the various electronic detectors dropped from US warplanes onto their pathways through the jungle. They sent animals down the trail, placed bags of urine next to so-called “people sniffers”, and played tapes of vehicle noises next to microphones – prompting computerized bombers to unload explosives onto phantom guerillas.</p>



<p>As I have travelled in and around the world’s minefields and cluster munition strike zones, I have heard the story over and over again, in Afghanistan, Bosnia, Cambodia, Iraq, Laos and South Sudan. In each place, it is civilians who have borne the consequences of turning warfare over to automated devices. Decades after the soldiers who placed and armed them have gone, landmines in Afghanistan continue to maim children. Lao farmers still risk setting off unexploded cluster bomblets when they plough their rice fields.</p>



<p>The final chapter of my book highlights the far-sighted work of the International Committee for Robot Arms Control (<a href="https://www.icrac.net/">ICRAC</a>) and <a href="https://www.stopkillerrobots.org/">Campaign to Stop Killer Robots</a>, which have sounded the alarm on the emerging militarization of artificial intelligence and robotics. They are not anti-technology Luddites. “It’s OK for a plane to fly itself,” Dr. Peter Asaro, co-founder of ICRAC and New School professor, told me. “It’s not OK for a plane to decide who to shoot at.” We have, he says, “the right not to be killed by a machine.”</p>



<p>For the last six years, governments have gathered in Geneva to consider how to address lethal autonomous weapons systems, under the auspices of the Convention on Certain Conventional Weapons (CCW). Despite pressure from a wide range of countries and civil society, the big military powers have dragged their feet, abusing the rule of consensus decision-making and running down the clock to avoid constraining their high-tech weapons R&amp;D.</p>



<p>But the diplomatic patience of the <a href="https://www.stopkillerrobots.org/2019/03/minority-of-states-delay-effort-to-ban-killer-robots/">majority</a> of nations in the CCW is running out. A global <a href="https://www.stopkillerrobots.org/2019/01/global-poll-61-oppose-killer-robots/">poll</a> found that 61% of people in 26 countries opposed killer robots. And UN Secretary-General António Guterres has directly <a href="https://www.stopkillerrobots.org/2018/11/unban/">called</a> on governments “to ban these weapons, which are politically unacceptable and morally repugnant.”</p>



<p>Just as human reality is more complex than software programs, society is not the passive recipient of technological change. In writing my book, I have been inspired to meet people who have mobilized to clear minefields, support survivors of cluster munitions and pressure diplomats to negotiate treaties banning inhumane weapons. We must follow their example and demand new international law ensuring that the use of force remains under meaningful human control.</p>



<p><em>Matthew Bolton, ICRAC member and associate professor of political science at Pace University, is author of </em>Political Minefields: The Struggle against Automated Killing<em>. If you can’t find a copy in your local bookstore, it is available with a 35% discount from </em><a href="https://www.bloomsburycollections.com/book/political-minefields-the-struggle-against-automated-killing/"><em>Bloomsbury</em></a><em> with the code GLR TW5</em></p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='mbolton' src='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://matthewbreaybolton.com">mbolton</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Matthew Bolton is professor of political science at Pace University in New York City. He is an expert on global peace and security policy, focusing on multilateral disarmament and arms control policymaking processes. He has a PhD in Government and Master's in Development Studies from the London School of Economics and a Master's from SUNY Environmental Science and Forestry. Since 2014, Bolton has worked on the UN and New York City advocacy of the International Campaign to Abolish Nuclear Weapons (ICAN), recipient of the 2017 Nobel Peace Prize. Bolton has published six books, including Political Minefields (I.B. Tauris) and Imagining Disarmament, Enchanting International Relations (Palgrave Pivot).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">6475</post-id>	</item>
		<item>
		<title>Campaign event with Jody Williams in Barcelona</title>
		<link>https://www.icrac.net/acto-de-campana-con-jody-williams-en-barcelona/</link>
		
		<dc:creator><![CDATA[Joaquin Rodriguez]]></dc:creator>
		<pubDate>Mon, 08 Apr 2019 13:59:03 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=6181</guid>

					<description><![CDATA[El próximo Viernes día 12 de Abril a las 16:30h celebraremos en compañía de Jody Williams, premio Nobel de la Paz 1997, y presidenta de la Iniciativa de mujeres Nobel desde 2006, un acto enmarcado en la campaña Stop Killer Robots en el que además participaran los miembros del ICRAC, la Dra Roser Martinez Quirante [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Joaquin Rodriguez' src='https://secure.gravatar.com/avatar/06ecb23ad24f0d156d3caac752517f80dd8e7c6b4010b903914036f205332d98?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/06ecb23ad24f0d156d3caac752517f80dd8e7c6b4010b903914036f205332d98?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong>Joaquin Rodriguez</strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-image"><img data-recalc-dims="1" loading="lazy" decoding="async" width="930" height="1024" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2019/04/Asset-18.png?resize=930%2C1024&#038;ssl=1" alt="" class="wp-image-6180" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2019/04/Asset-18.png?resize=930%2C1024&amp;ssl=1 930w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2019/04/Asset-18.png?resize=273%2C300&amp;ssl=1 273w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2019/04/Asset-18.png?resize=768%2C845&amp;ssl=1 768w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2019/04/Asset-18.png?w=2000&amp;ssl=1 2000w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2019/04/Asset-18.png?w=3000&amp;ssl=1 3000w" sizes="auto, (max-width: 930px) 100vw, 930px" /></figure>



<p>Next Friday, 12 April, at 16:30 we will hold, in the company of Jody Williams, 1997 Nobel Peace Prize laureate and chair of the Nobel Women's Initiative since 2006, an event within the framework of the Stop Killer Robots campaign, which will also feature ICRAC members Dr. Roser Martinez Quirante and Dr. Joaquín Rodríguez Álvarez, as well as a representative of the Centre Delàs d&#8217;estudis per la Pau, Dr. Pere Brunet.</p>



<p>The event will take place in the Sala de Actos of the Rectorado of the Universidad Autónoma de Barcelona, and its main objective is to set out the arguments of the Stop Killer Robots Campaign in favour of a binding international treaty prohibiting autonomous weapon systems.</p>



<p>For more information, contact: Joaquín Rodríguez, email: <a href="mailto:joaquin.rodriguez@uab.cat">joaquin.rodriguez@uab.cat</a></p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Joaquin Rodriguez' src='https://secure.gravatar.com/avatar/06ecb23ad24f0d156d3caac752517f80dd8e7c6b4010b903914036f205332d98?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/06ecb23ad24f0d156d3caac752517f80dd8e7c6b4010b903914036f205332d98?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong>Joaquin Rodriguez</strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">6181</post-id>	</item>
		<item>
		<title>Presentation of the &#8220;Stop Killer Robots&#8221; Campaign in Madrid</title>
		<link>https://www.icrac.net/presentacion-de-la-campana-stop-killer-robots-en-madrid/</link>
		
		<dc:creator><![CDATA[Joaquin Rodriguez]]></dc:creator>
		<pubDate>Mon, 04 Mar 2019 15:20:58 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=6052</guid>

					<description><![CDATA[El próximo lunes día 11 de Marzo a las 18h presentaremos en Madrid la Campaña &#8220;Stop Killer Robots&#8221;. El acto se celebrará en el Circulo de Bellas Artes (Sala Nueva) y contará con la participación de ICRAC, FundiPau, Centre Delàs d&#8217;estudis per la Pau, CEIPAZ y Demospaz. El acto se enmarca dentro del ciclo electoral [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Joaquin Rodriguez' src='https://secure.gravatar.com/avatar/06ecb23ad24f0d156d3caac752517f80dd8e7c6b4010b903914036f205332d98?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/06ecb23ad24f0d156d3caac752517f80dd8e7c6b4010b903914036f205332d98?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong>Joaquin Rodriguez</strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[
<p>Next Monday, 11 March, at 18:00 we will present the &#8220;Stop Killer Robots&#8221; Campaign in Madrid. The event will take place at the Círculo de Bellas Artes (Sala Nueva) and will feature the participation of ICRAC, <a href="http://fundipau.org/">FundiPau</a>, <a href="http://www.centredelas.org/es/">Centre Delàs d&#8217;estudis per la Pau</a>, <a href="http://www.ceipaz.org/">CEIPAZ</a> and <a href="http://www.demospaz.org/">Demospaz.</a></p>



<p>The event is part of the Spanish electoral cycle, and its main objective is to raise awareness of the importance of working urgently towards a binding international treaty prohibiting autonomous weapons, and to persuade the political parties to include the prohibition of fully autonomous weapons in their electoral programmes.</p>



<h2 class="wp-block-heading"><a href="https://www.icrac.net/wp-content/uploads/2019/03/SKR_madrid.pdf">P</a><a href="https://www.icrac.net/wp-content/uploads/2019/03/SKR_Madrid.pdf">rograma del acto</a></h2>



<figure class="wp-block-image"><img data-recalc-dims="1" loading="lazy" decoding="async" width="662" height="853" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2019/03/PresentacionMadridlogos.png?resize=662%2C853&#038;ssl=1" alt="" class="wp-image-6153" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2019/03/PresentacionMadridlogos.png?w=662&amp;ssl=1 662w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2019/03/PresentacionMadridlogos.png?resize=233%2C300&amp;ssl=1 233w" sizes="auto, (max-width: 662px) 100vw, 662px" /></figure>



<figure class="wp-block-image"><img data-recalc-dims="1" loading="lazy" decoding="async" width="768" height="1024" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2019/03/skrmadrid_invite_2-1.png?resize=768%2C1024&#038;ssl=1" alt="" class="wp-image-6154" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2019/03/skrmadrid_invite_2-1.png?resize=768%2C1024&amp;ssl=1 768w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2019/03/skrmadrid_invite_2-1.png?resize=225%2C300&amp;ssl=1 225w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2019/03/skrmadrid_invite_2-1.png?w=2048&amp;ssl=1 2048w" sizes="auto, (max-width: 768px) 100vw, 768px" /></figure>



<p>For more information, contact Joaquín Rodríguez, e-mail <a href="mailto:joaquin.rodriguez@uab.cat">joaquin.rodriguez@uab.cat</a></p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Joaquin Rodriguez' src='https://secure.gravatar.com/avatar/06ecb23ad24f0d156d3caac752517f80dd8e7c6b4010b903914036f205332d98?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/06ecb23ad24f0d156d3caac752517f80dd8e7c6b4010b903914036f205332d98?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong>Joaquin Rodriguez</strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">6052</post-id>	</item>
		<item>
		<title>ICRAC Statement at the 2017 CCW GGE Meeting</title>
		<link>https://www.icrac.net/icrac-statement-at-the-2017-ccw-gge-meeting/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Wed, 15 Nov 2017 20:11:20 +0000</pubDate>
				<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[Statements]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php56-3.dfw3-2.websitetestlink.com/?p=3350</guid>

					<description><![CDATA[ICRAC Statement to the 2017 UN CCW GGE Meeting Delivered by Noel Sharkey, Chair, on 15 November 2017 I speak on behalf of the International Committee for Robot Arms Control, a founding member of the Campaign to Stop Killer Robots. We would like to thank Ambassador Gill for his preparations of this important meeting. And [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><strong>ICRAC Statement to the 2017 UN CCW GGE Meeting</strong></p>
<p>Delivered by Noel Sharkey, Chair, on 15 November 2017</p>
<p>I speak on behalf of the International Committee for Robot Arms Control, a founding member of the Campaign to Stop Killer Robots. We would like to thank Ambassador Gill for his preparations of this important meeting. And we also thank all of the States Parties for their lively participation and their interesting points of view.</p>
<p>ICRAC has many concerns about the use and development of autonomous weapons systems but in this statement we are going to concentrate on three points which have come up in the discussions here: these are the dual use of autonomous systems, where we are now with autonomous weapons systems development, and finally the issue of definitions and human control.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-medium wp-image-3352 alignright" src="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2017/12/signal-2017-11-15-122318-300x225-300x225.jpg?resize=300%2C225" alt="" width="300" height="225" />First, the question of dual use: Will a prohibition on LAWS inhibit innovation in autonomous systems that has a practical purpose for good in society? The answer is clearly NO! Remember: We are not calling for a prohibition on the development or use of autonomous robots or autonomous functioning in the military or in the civilian sphere – except in one instance. We wish to prohibit only the development and use of autonomy in the critical functions of target selection and the application of violent force. Let us be totally clear here that restricting these critical functions in weapon systems will have absolutely no impact on civilian or even other military applications.</p>
<p>Second, it also became evident in the discussions over the last couple of days that some of the delegates believe that no one is as yet developing autonomous weapons systems, that they are a long way off and that there are not even any working prototypes. The announcements from a number of companies in the hi-tech nations tell a very different story. In recent years we have heard about the development of fully autonomous fighter jets, tanks, submarines, naval ships, border protection systems and swarms of small drones. These have not been deployed as yet, but that will not take long. For example, Kalashnikov has announced this year that <a href="http://tass.com/defense/954894">it was developing a</a> “fully automated combat module” based on neural networks that could allow a weapon to “identify targets and make decisions.” We cannot verify the truth of such claims, but nonetheless it is clear that the underlying technology that will enable self-targeting is here and could be deployed soon.</p>
<p>Finally, we at ICRAC are very concerned that we are already beginning to see the emergence of an arms race towards an ever increasing level of autonomy in weapon systems. There are often token efforts to say that there is a human somewhere in the control loop or on the control loop exercising some form of human judgement or planning. This human control of weapons systems is the key component of what we should be focussing on in these discussions – not artificial intelligence, not different levels of autonomy for vehicles or semi-autonomous functions. It is sufficient to define autonomous weapons systems in a simple way, such as “weapons systems that once launched can select targets and apply violent force without meaningful human control” – or something similar. It would be MOST valuable here to debate what kinds of human control states find acceptable. There are 30 years of research on human supervisory control of computing machinery, and we have never heard it mentioned here. So let us settle on a simple definition of autonomous weapons systems without delay and get down to the really important question of what is an acceptable level of human control. Why are we not discussing this here, to find out what state experts think is acceptable and what is acceptable ethically and under IHL and IHRL?</p>
<p>ICRAC would like to recommend that States Parties schedule at least four weeks of time for talks in 2018 to discuss the human control of weapons systems and to start a process toward negotiating a legally binding instrument that ensures meaningful human control over weapon systems.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3350</post-id>	</item>
		<item>
		<title>Arms Control for AWS: 2016 and beyond</title>
		<link>https://www.icrac.net/3219-2/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Wed, 07 Dec 2016 16:07:58 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[Opinion]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php56-3.dfw3-2.websitetestlink.com/?p=3219</guid>

					<description><![CDATA[After three informal meetings of experts, the Convention on Certain Conventional Weapons, during its Fifth Review Conference taking place from 12 to 16 December 2016 in Geneva, will decide on how to continue work on arms control for autonomous weapon systems. Below is a preview to an article published in the October 2016 issue of Arms Control [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
					<content:encoded><![CDATA[<p>After three informal meetings of experts, the Convention on Certain Conventional Weapons, during its <a href="http://www.unog.ch/80256EE600585943/(httpPages)/9F975E1E06869679C1257F50004F7E8C?OpenDocument">Fifth Review Conference taking place from 12 to 16 December 2016 in Geneva</a>, will decide on how to continue work on arms control for autonomous weapon systems. Below is a preview of an article published in the October 2016 issue of <a href="https://www.armscontrol.org/ACT/2016_10/Features/Stopping-Killer-Robots-Why-Now-Is-the-Time-to-Ban-Autonomous-Weapons-Systems">Arms Control Today</a>, outlining the perspectives for future AWS arms control.</p>
<p>Sauer, Frank 2016: Stopping ‘Killer Robots’: Why Now Is the Time to Ban Autonomous Weapons Systems, in: Arms Control Today 46 (8): 8-13.</p>
<p><a href="https://www.armscontrol.org/ACT/2016_10/Features/Stopping-Killer-Robots-Why-Now-Is-the-Time-to-Ban-Autonomous-Weapons-Systems">Click here to read the full article</a>.</p>
<p><a href="https://www.unibw.de/internationalepolitik/professur/team/Sauer/Why%20Now%20Is%20the%20Time%20to%20Ban%20AWS%20-braille.brf/at_download/file">NEW: Click here for the BRF file of the full article</a></p>
<blockquote><p>[F]our possible outcomes can be predicted for the CCW process. The first would be a legally binding and preventive multilateral arms control agreement derived by consensus in the CCW and thus involving the major stakeholders, the outcome referenced as “a ban.” Considering the growing number of states-parties calling for a ban and the large number of governments calling for meaningful human control and expressing considerable unease with the idea of autonomous weapons systems, combined with the fact that no government is openly promoting their development, this seems possible. It would require mustering considerable political will. Verification and compliance for a ban, as well as for weaker restrictions, would then require creative arms control solutions. After all, with full autonomy in a weapons system eventually coming down to merely flipping a software switch, how can one tell if a specific system at a specific time is not operating autonomously? A few arms control experts are already wrapping their heads around these questions.</p>
<div class="SandboxRoot env-bp-350" data-twitter-event-id="0">
<div id="twitter-widget-1" class="EmbeddedTweet EmbeddedTweet--edge EmbeddedTweet--mediaForward media-forward js-clickToOpenTarget js-tweetIdInfo tweet-InformationCircle-widgetParent" lang="en" data-click-to-open-target="https://twitter.com/ArmsControlNow/status/786600390020194304" data-iframe-title="Twitter Tweet" data-dt-full="%{hours12}:%{minutes} %{amPm} - %{day} %{month} %{year}" data-dt-explicit-timestamp="12:12 PM - Oct 13, 2016" data-dt-months="Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec" data-dt-am="AM" data-dt-pm="PM" data-dt-now="now" data-dt-s="s" data-dt-m="m" data-dt-h="h" data-dt-second="second" data-dt-seconds="seconds" data-dt-minute="minute" data-dt-minutes="minutes" data-dt-hour="hour" data-dt-hours="hours" data-dt-abbr="%{number}%{symbol}" data-dt-short="%{day} %{month}" data-dt-long="%{day} %{month} %{year}" data-scribe="page:tweet" data-tweet-id="786600390020194304" data-twitter-event-id="3">
<article class="MediaCard MediaCard--mediaForward customisable-border" dir="ltr" data-scribe="component:card">
<div class="MediaCard-media"></div>
</article>
<div class="tweet-InformationCircle--top tweet-InformationCircle--topEdge tweet-InformationCircle" data-scribe="element:notice">
<p>&nbsp;</p>
<blockquote class="twitter-tweet" data-lang="en">
<p dir="ltr" lang="en">Can <a href="https://twitter.com/hashtag/KillerRobots?src=hash&amp;ref_src=twsrc%5Etfw">#KillerRobots</a> (autonomous weapons systems) work as preventive arms control? More in October&#8217;s <a href="https://twitter.com/hashtag/ArmsControlToday?src=hash&amp;ref_src=twsrc%5Etfw">#ArmsControlToday</a> <a href="https://t.co/E7sDVzdmbn">https://t.co/E7sDVzdmbn</a> <a href="https://t.co/LwPSojH9Gr">pic.twitter.com/LwPSojH9Gr</a></p>
<p>— Arms Control Assoc (@ArmsControlNow) <a href="https://twitter.com/ArmsControlNow/status/786600390020194304?ref_src=twsrc%5Etfw">October 13, 2016</a></p></blockquote>
<p><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script></p>
<p>The second outcome would be restrictions short of a ban. The details of such an agreement are impossible to predict, but it is conceivable that governments could agree, for example, to limit the use of autonomous weapons systems, such as permitting their use against materiel only.</p>
<p>The third would be a declaratory, nonbinding agreement on best practices. Such a code of conduct would likely emphasize compliance with existing international humanitarian law and rigorous weapons review processes, in accordance with Article 36 of Additional Protocol I to the Geneva Conventions.</p>
<p>Finally, there may be no tangible result, perhaps with one of the technologically leading countries setting a precedent by fielding autonomous weapons systems. That would certainly prompt others to follow, fueling an arms race. In light of some of the most advanced standoff weapons, such as the U.S. Long Range Anti-Ship Missile or the UK Brimstone, each capable of autonomous targeting during terminal flight phase, one might argue that the world is already headed for such an autonomy arms race.</p>
<p>Implementing autonomy, which mainly comes down to software, in systems drawn from a vibrant global ecosystem of unmanned vehicles in various shapes and sizes is a technical challenge, but doable for state and nonstate actors, particularly because so much of the hardware and software is dual use. In short, autonomous weapons systems are extremely prone to proliferation. An unchecked autonomous weapons arms race and the diffusion of autonomous killing capabilities to extremist groups would clearly be detrimental to international peace, stability, and security.</p>
<div class="SandboxRoot env-bp-350" data-twitter-event-id="1">
<div id="twitter-widget-2" class="EmbeddedTweet EmbeddedTweet--edge js-clickToOpenTarget tweet-InformationCircle-widgetParent" lang="en" data-click-to-open-target="https://twitter.com/marywareham/status/788723233709101056" data-iframe-title="Twitter Tweet" data-dt-full="%{hours12}:%{minutes} %{amPm} - %{day} %{month} %{year}" data-dt-explicit-timestamp="8:47 AM - Oct 19, 2016" data-dt-months="Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec" data-dt-am="AM" data-dt-pm="PM" data-dt-now="now" data-dt-s="s" data-dt-m="m" data-dt-h="h" data-dt-second="second" data-dt-seconds="seconds" data-dt-minute="minute" data-dt-minutes="minutes" data-dt-hour="hour" data-dt-hours="hours" data-dt-abbr="%{number}%{symbol}" data-dt-short="%{day} %{month}" data-dt-long="%{day} %{month} %{year}" data-scribe="page:tweet" data-twitter-event-id="4">
<div class="EmbeddedTweet-tweet">
<blockquote class="Tweet h-entry js-tweetIdInfo subject expanded is-deciderHtmlWhitespace" cite="https://twitter.com/marywareham/status/788723233709101056" data-tweet-id="788723233709101056" data-scribe="section:subject">
<div class="Tweet-header u-cf">
<div class="Tweet-brand u-floatRight"></div>
<div class="TweetAuthor js-inViewportScribingTarget " data-scribe="component:author">
<blockquote class="twitter-tweet" data-lang="en">
<p dir="ltr" lang="en">The nascent social taboo against machines autonomously making kill decisions &#8211; Frank Sauer in <a href="https://twitter.com/ArmsControlNow?ref_src=twsrc%5Etfw">@ArmsControlNow</a> <a href="https://t.co/nBTGtXLT5R">https://t.co/nBTGtXLT5R</a> <a href="https://twitter.com/hashtag/CCWUN?src=hash&amp;ref_src=twsrc%5Etfw">#CCWUN</a></p>
<p>— Mary Wareham (@marywareham) <a href="https://twitter.com/marywareham/status/788723233709101056?ref_src=twsrc%5Etfw">October 19, 2016</a></p></blockquote>
<p><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script></p>
<p>This underlines the importance of the current opportunity for putting a comprehensive, verifiable ban in place. The hurdles are high, but at this point, a ban is clearly the most prudent and thus desirable outcome. After all, as long as no one possesses them, a verifiable ban is the optimal solution. It stops the currently commencing arms race in its tracks, and everyone reaps the benefits. A prime goal of arms control would be fulfilled by facilitating the diversion of resources from military applications toward research and development for peaceful purposes—in the fields of AI and robotics no less, two key future technologies.</p></blockquote>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3219</post-id>	</item>
	</channel>
</rss>
