<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>nsharkey &#8211; ICRAC</title>
	<atom:link href="https://www.icrac.net/author/nsharkey/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.icrac.net</link>
	<description>International Committee for Robot Arms Control</description>
	<lastBuildDate>Mon, 28 Dec 2020 16:15:01 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.8.2</generator>
<site xmlns="com-wordpress:feed-additions:1">128339352</site>	<item>
		<title>ICRAC statement on the human control of weapons systems at the April 2018 CCW GGE</title>
		<link>https://www.icrac.net/icrac-statement-on-the-human-control-of-weapons-systems-at-the-april-2018-ccw-gge/</link>
		
		<dc:creator><![CDATA[nsharkey]]></dc:creator>
		<pubDate>Wed, 11 Apr 2018 13:06:43 +0000</pubDate>
				<category><![CDATA[Front Page]]></category>
		<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Statements]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=4006</guid>

					<description><![CDATA[International Committee for Robot Arms Control Statement to the UN GGE Meeting 2018 Delivered by Prof. Noel Sharkey, on 11 April 2018 Mr Chairperson, We have been very pleased with this morning&#8217;s session as states begin to contemplate a move towards policies on the human control of weapons systems. On a pedantic note: we cannot [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><img data-recalc-dims="1" decoding="async" class="alignnone size-medium wp-image-4007" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/04/IMG_5870-e1523451941135-300x300.jpg?resize=300%2C300&#038;ssl=1" alt="" width="300" height="300" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/04/IMG_5870-e1523451941135.jpg?resize=300%2C300&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/04/IMG_5870-e1523451941135.jpg?w=640&amp;ssl=1 640w" sizes="(max-width: 300px) 100vw, 300px" /></p>
<p>International Committee for Robot Arms Control<br />
Statement to the UN GGE Meeting 2018<br />
Delivered by Prof. Noel Sharkey, on 11 April 2018</p>
<p>Mr Chairperson,</p>
<p>We have been very pleased with this morning&#8217;s session as states begin to contemplate a move towards policies on the human control of weapons systems. On a pedantic note: we cannot talk about the meaningful human control of LAWS as that would make them no longer an autonomous weapon.</p>
<p>In the view of ICRAC, the control of weapons systems is more nuanced than can be captured by terms such as in-the-loop, on-the-loop, the broader loop, looping-the-loop, human oversight, and appropriate human judgement. In this way we agree strongly with the statement made by Brazil and several others in this session who believe that the devil is in the detail.</p>
<p>For human control to be meaningful we need to examine how humans interact with machines and understand the types of human-machine biases that can occur in the selection of legitimate targets. Lessons should be learned from 30 years of research on human supervisory control of machinery and more than 100 years of research on the psychology of human reasoning. The combination of this work can help us to design human-machine interfaces that allow weapons to be controlled in a manner that is fully compliant with international law and the principle of humanity.</p>
<p>First, there should be a focus on what the human operator <strong>MUST</strong> do in the targeting cycle. This is control by use, which is governed by targeting rules under International Humanitarian Law and International Human Rights Law. Further, international law rules that apply after the use of weapons – such as those that relate to human responsibility – must be satisfied.</p>
<p>Second, the design of weapon systems must render them <strong>INCAPABLE</strong> of operating without meaningful human control. This is control by design, which is governed by international weapons law: if a weapon system, by its design, is incapable of being sufficiently controlled in terms of the law, then such a weapon is illegal <em>per se</em>.</p>
<p>Ideally, the following three conditions should be met for the control of weapons systems:</p>
<ol>
<li>a human commander (or operator) will have full contextual and situational awareness of the target area for each and every attack and is able to perceive and react to any change or unanticipated situations that may have arisen since planning the attack.</li>
<li>there will be active cognitive participation in every attack with sufficient time for deliberation on the nature of any target, its significance in terms of the necessity and appropriateness of attack, and likely incidental and possible accidental effects of the attack; and</li>
<li>there will be a means for the rapid suspension or abortion of every attack.</li>
</ol>
<p>Thank you Mr Chairperson</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4006</post-id>	</item>
		<item>
		<title>ICRAC Working Paper #3 (CCW GGE April 2018): Guidelines for the human control of weapons systems</title>
		<link>https://www.icrac.net/icrac-working-paper-3-ccw-gge-april-2018-guidelines-for-the-human-control-of-weapons-systems/</link>
		
		<dc:creator><![CDATA[nsharkey]]></dc:creator>
		<pubDate>Tue, 10 Apr 2018 11:33:10 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Working Papers]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=3998</guid>

					<description><![CDATA[Guidelines for the human control of weapons systems [PDF] Authored by Noel Sharkey, chair of ICRAC[1] Since 2014, high contracting parties to the CCW have expressed interest and concern about the meaningful human control of weapons systems. There is an extensive scientific and engineering literature on the dynamics of human-machine interaction and human supervisory control [&#8230;]]]></description>
										<content:encoded><![CDATA[<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="alignnone size-medium wp-image-4001" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/04/ICRAC-WP3_CCW-GGE-April-2018-1.jpg?resize=300%2C263&#038;ssl=1" alt="" width="300" height="263" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/04/ICRAC-WP3_CCW-GGE-April-2018-1.jpg?resize=300%2C263&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/04/ICRAC-WP3_CCW-GGE-April-2018-1.jpg?w=667&amp;ssl=1 667w" sizes="auto, (max-width: 300px) 100vw, 300px" /></p>
<p><strong>Guidelines for the human control of weapons systems [<a href="https://www.icrac.net/wp-content/uploads/2018/04/Sharkey_Guideline-for-the-human-control-of-weapons-systems_ICRAC-WP3_GGE-April-2018.pdf">PDF</a>]<br />
</strong></p>
<p>Authored by Noel Sharkey, chair of ICRAC<a href="#_ftn1" name="_ftnref1">[1]</a></p>
<p>Since 2014, high contracting parties to the CCW have expressed interest and concern about the meaningful human control of weapons systems. There is an extensive scientific and engineering literature on the dynamics of human-machine interaction and human supervisory control of machinery. A short guide is presented here consisting of two parts. Part 1 is a simple primer on the psychology of human reasoning. Part 2 outlines different levels for the control of weapons systems, adapted from human-machine interaction research, and discusses them in terms of the properties of human reasoning. This outlines which of the levels can ensure the legality of human control of weapons systems and guarantee that precautionary measures are taken to assess the significance of potential targets, their necessity and appropriateness, as well as the likely incidental and possible accidental effects of the attack.</p>
<ol>
<li><strong> A short primer on human reasoning for the control of weapons</strong></li>
</ol>
<p>A well-established distinction in human psychology, backed by over 100 years of substantial research, divides human reasoning into two types: (i) fast <em>automatic</em> processes needed for routine and/or well-practised tasks like riding a bicycle or playing tennis and (ii) slower <em>deliberative</em> processes needed for thoughtful reasoning such as making a diplomatic decision.</p>
<p>The drawback of deliberative reasoning is that it requires attention and memory resources, and so it can easily be disrupted by stress or by pressure to make a quick decision.</p>
<p>Automatic processes kick in first, but we can override them if we are operating in novel circumstances or performing tasks that require active control or attention. Automatic processes are essential to our normal functioning, but they have a number of liabilities when it comes to making important decisions such as those required to determine the legitimacy of a target.</p>
<p>Four of the known properties of automatic reasoning<a href="#_ftn2" name="_ftnref2">[2]</a> illustrate why it is problematic for the supervisory control of weapons.</p>
<ul>
<li><strong>neglects ambiguity and suppresses doubt</strong>. Automatic processes jump to conclusions. An unambiguous answer pops up instantly without question. There is no search for alternative interpretations or uncertainty. If something looks like it might be a legitimate target, in ambiguous circumstances, automatic reasoning will be certain that it is legitimate.</li>
<li><strong>infers and invents causes and intentions.</strong> Automatic reasoning rapidly invents coherent causal stories by linking fragments of available information. Events that include people are automatically attributed with intentions that fit a causal story. For example, people loading muckrakes onto a truck could initiate a causal story that they were loading rifles. This is called <em>assimilation bias</em> in the human supervisory control literature.<a href="#_ftn3" name="_ftnref3">[3]</a></li>
<li><strong>is biased to believe and confirm. </strong>Automatic reasoning favours uncritical acceptance of suggestions and maintains a strong bias. If a computer suggests a target to an operator, automatic reasoning alone would make it highly likely to be accepted. This is <em>automation bias</em>.<a href="#_ftn4" name="_ftnref4">[4]</a> <em>Confirmation bias</em><a href="#_ftn5" name="_ftnref5">[5]</a> selects information that confirms a prior belief.</li>
<li><strong>focuses on existing evidence and ignores absent evidence. </strong>Automatic reasoning builds coherent explanatory stories without consideration of evidence or contextual information that might be missing. What You See Is All There Is (WYSIATI)<a href="#_ftn6" name="_ftnref6">[6]</a>. It facilitates the feeling of coherence that makes us confident to accept information as true. For example, a man firing a rifle may be deemed to be a hostile target with WYSIATI when a quick look around might reveal that he is shooting a wolf hunting his goats.</li>
</ul>
<p>&nbsp;</p>
<ol start="2">
<li><strong> Levels of human control and how they impact on human decision-making</strong></li>
</ol>
<p>We can look at levels of human control for weapons systems by adapting research from the human supervisory control literature as shown in Table 1.<a href="#_ftn7" name="_ftnref7">[7]</a></p>
<table>
<tbody>
<tr>
<td width="594">A classification for levels of human supervisory control of weapons</td>
</tr>
<tr>
<td width="594">
<ol>
<li><strong>a human deliberates about a target before initiating any attack </strong></li>
<li><strong>program provides a list of targets and a human chooses which to attack</strong></li>
<li><strong>program selects target and a human must approve before attack</strong></li>
<li><strong>program selects target and a human has restricted time to veto </strong></li>
<li><strong>program selects target and initiates attack without human involvement</strong></li>
</ol>
</td>
</tr>
</tbody>
</table>
<p>&nbsp;</p>
<p><strong>Level 1 control is the ideal</strong>. A human commander (or operator) has full contextual and situational awareness of the target area at the time of a specific attack and is able to perceive and react to any change or unanticipated situations that may have arisen since planning the attack. There is active cognitive participation in the attack and sufficient time for deliberation on the nature of the target, its significance in terms of the necessity and appropriateness, and likely incidental and possible accidental effects. There must also be a means for the rapid suspension or abortion of the attack.</p>
<p><strong>Level 2 control could be acceptable</strong> if it is shown to meet the requirement of deliberating on potential targets. The human operator or commander should deliberatively assess necessity and appropriateness and whether any of the suggested alternatives are permissible objects of attack. Without sufficient time or in a distracting environment the illegitimacy of a target could be overlooked.</p>
<p>A rank ordered list of targets is particularly problematic as automation bias could create a tendency to accept the top ranked target unless sufficient time and attentional space is given for deliberative reasoning.</p>
<p><strong>Level 3 is unacceptable.</strong> This type of control has been experimentally shown to create what is known as <em>automation bias</em>, in which human operators come to trust computer-generated solutions as correct and disregard, or do not search for, contradictory information. Cummings experimented with automation bias in a study on an interface designed for supervision and resource allocation of in-flight GPS guided Tomahawk missiles.<a href="#_ftn8" name="_ftnref8">[8]</a> She found that when the computer recommendations were wrong, operators using Level 3 control had significantly decreased accuracy.</p>
<p><strong>Level 4 is unacceptable</strong> because it does not promote target validation and a short time to veto would reinforce automation bias and leave no room for doubt or deliberation. As the attack will take place <em>unless</em> a human intervenes, this undermines well-established presumptions under international humanitarian law that promote civilian protection.</p>
<p>The time pressure will result in operators neglecting ambiguity and suppressing doubt, inferring and inventing causes and intentions, being biased to believe and confirm, and focusing on existing evidence while ignoring absent but needed evidence. An example of the errors caused by the demands of a fast veto came in the 2003 Iraq war, when the U.S. Army&#8217;s Patriot missile system shot down a British Tornado and an American F/A-18, killing three pilots.</p>
<p><strong>Level 5 control</strong> <strong>is unacceptable</strong> as it describes weapons that are autonomous in the critical functions of target selection and the application of violent force.</p>
<p>It should be clear from the above that there are lessons to be drawn both from the psychology of human reasoning and from the literature on human-machine interaction. An understanding of this research is urgently needed to ensure that human-machine interaction is designed to get the best level of human control needed to comply with the international law in all circumstances.</p>
<p><strong>Conclusion: Necessary conditions for meaningful human control of weapons.</strong></p>
<p>A commander or operator should</p>
<ul>
<li>have full contextual and situational awareness of the target area at the time of initiating a specific attack;</li>
<li>be able to perceive and react to any change or unanticipated situations that may have arisen since planning the attack, such as changes in the legitimacy of the targets;</li>
<li>have active cognitive participation in the attack;</li>
<li>have sufficient time for deliberation on the nature of targets, their significance in terms of the necessity and appropriateness of an attack and the likely incidental and possible accidental effects of the attack; and</li>
<li>have a means for the rapid suspension or abortion of the attack.</li>
</ul>
<p>&#8212;</p>
<p><a href="#_ftnref1" name="_ftn1">[1]</a> Special thanks to Lucy Suchman, Frank Sauer and Amanda Sharkey and members of ICRAC for helpful comments.</p>
<p><a href="#_ftnref2" name="_ftn2">[2]</a> D. Kahneman 2011: Thinking, Fast and Slow, Penguin Books. He refers to the two processes as System 1 and System 2; these are exactly the same as the terms automatic and deliberative, used here for clarity and consistency.</p>
<p><a href="#_ftnref3" name="_ftn3">[3]</a> J.M. Carroll and M.B. Rosson, ‘Paradox of the active user’, in J.M. Carroll (eds.), Interfacing Thought: Cognitive Aspects of Human-Computer Interaction (MIT Press, 1987), 80–111.</p>
<p><a href="#_ftnref4" name="_ftn4">[4]</a> K.L. Mosier and L.J. Skitka 1996: Human decision makers and automated decision aids: made for each other?, in: Mouloua, M. (Eds.): Automation and Human Performance: Theory and Applications, Lawrence Erlbaum Associates, 201–220.</p>
<p><a href="#_ftnref5" name="_ftn5">[5]</a> C.G. Lord, L. Ross and M. Lepper 1979: ‘Biased assimilation and attitude polarization: the effects of prior theories on subsequently considered evidence’, Journal of Personality and Social Psychology, 37, 2098–2109.</p>
<p><a href="#_ftnref6" name="_ftn6">[6]</a> Kahneman ibid.</p>
<p><a href="#_ftnref7" name="_ftn7">[7]</a> For a more in-depth understanding of these analyses and references see N. Sharkey 2016: Staying in the Loop. Human Supervisory Control of Weapons, in: Bhuta, Nehal et al. (Eds.): Autonomous Weapons Systems. Law, Ethics, Policy. Cambridge University Press, 23-38.</p>
<p><a href="#_ftnref8" name="_ftn8">[8]</a> M.L. Cummings 2006: Automation and Accountability in Decision Support System Interface Design, in: Journal of Technology Studies 32: 1, 23–31.</p>
<p>&nbsp;</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3998</post-id>	</item>
		<item>
		<title>Short ICRAC Statement at the April 2018 CCW GGE</title>
		<link>https://www.icrac.net/short-icrac-statement-at-the-april-2018-ccw-gge/</link>
		
		<dc:creator><![CDATA[nsharkey]]></dc:creator>
		<pubDate>Tue, 10 Apr 2018 11:06:41 +0000</pubDate>
				<category><![CDATA[Front Page]]></category>
		<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Statements]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=3993</guid>

					<description><![CDATA[International Committee for Robot Arms Control Statement to the UN GGE Meeting 2018 Delivered by Prof. Noel Sharkey, on 10 April 2018 Mr. Chairperson, There have been very useful and interesting discussions this morning. I speak here as chair of an academic NGO: the International Committee for Robot Arms Control (ICRAC) and as a member [&#8230;]]]></description>
										<content:encoded><![CDATA[<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="alignnone size-medium wp-image-3994" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/04/DaaaODtW4AEjomI.jpg-large.jpg?resize=300%2C225&#038;ssl=1" alt="" width="300" height="225" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/04/DaaaODtW4AEjomI.jpg-large.jpg?resize=300%2C225&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/04/DaaaODtW4AEjomI.jpg-large.jpg?resize=160%2C120&amp;ssl=1 160w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/04/DaaaODtW4AEjomI.jpg-large.jpg?resize=768%2C576&amp;ssl=1 768w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/04/DaaaODtW4AEjomI.jpg-large.jpg?resize=1024%2C768&amp;ssl=1 1024w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/04/DaaaODtW4AEjomI.jpg-large.jpg?w=2048&amp;ssl=1 2048w" sizes="auto, (max-width: 300px) 100vw, 300px" /></p>
<p>International Committee for Robot Arms Control<br />
Statement to the UN GGE Meeting 2018<br />
Delivered by Prof. Noel Sharkey, on 10 April 2018</p>
<p>Mr. Chairperson,</p>
<p>There have been very useful and interesting discussions this morning.</p>
<p>I speak here as chair of an academic NGO: the International Committee for Robot Arms Control (ICRAC) and as a member of the scientific community in the field of Artificial Intelligence and Robotics with specialty in Machine Learning.</p>
<p>We stress again that it would be confusing to broaden the discussion of LAWS into issues about Artificial Intelligence or weapons with emerging intelligence. By chasing definitions of LAWS down the rabbit hole of AI, we remove ourselves from the key issues that need to be urgently discussed here. The definition drawn from the ICRC, and echoed by a number of states this morning, is concerned with weapons that have autonomy in the critical functions of target selection and the application of force. This is sufficient for our definitional purposes here: decisions about target selection and the application of force are delegated to a machine. <strong>Let me highlight that it does not matter what techniques or computing methods are used to create autonomy in these critical functions.</strong></p>
<p>What is important here are questions about the nature of human control required and acceptable to ensure compliance with international law. <strong>It is key that we get this right.</strong> You can read more about this in ICRAC’s new working paper <strong>Guidelines for the Human Control of Weapons Systems</strong> that will be delivered at Wednesday’s side event. We support those states who have stated that the focus of this meeting should be on human control of weapons systems and human-machine interaction. In this way we can make real progress this week and protect our future no matter what the technological developments.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3993</post-id>	</item>
		<item>
		<title>ICRAC Podcast Series</title>
		<link>https://www.icrac.net/podcast-2/</link>
		
		<dc:creator><![CDATA[nsharkey]]></dc:creator>
		<pubDate>Fri, 25 Dec 2015 17:13:05 +0000</pubDate>
				<category><![CDATA[Media]]></category>
		<category><![CDATA[Podcast]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php56-3.dfw3-2.websitetestlink.com/?p=3373</guid>

					<description><![CDATA[]]></description>
										<content:encoded><![CDATA[<p><meta http-equiv="refresh" content="0; url=https://www.icrac.net/podcast/" /></p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3373</post-id>	</item>
		<item>
		<title>Icelandic research institute shuns autonomous weapons</title>
		<link>https://www.icrac.net/icelandic-research-institute-shuns-autonomous-weapons/</link>
		
		<dc:creator><![CDATA[nsharkey]]></dc:creator>
		<pubDate>Wed, 02 Sep 2015 17:31:22 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2864</guid>

					<description><![CDATA[The Icelandic Institute for Intelligent Machines (IIIM) has issued an ethical policy that makes it the first Artificial Intelligence research and development group to reject the development of technologies intended for military operations. IIIM is an independent research institute affiliated with Reykjavik University in Reykjavik, Iceland “It is only fitting that a research center in Iceland [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<div id="attachment_2865" style="width: 310px" class="wp-caption alignleft"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2865" class="wp-image-2865" style="margin-right: 5px;" src="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2016/05/IIIM-outside-sm-300x167.png?resize=300%2C167" alt="IIIM-outside-sm" width="300" height="167" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2016/05/IIIM-outside-sm.png?resize=300%2C167&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2016/05/IIIM-outside-sm.png?w=413&amp;ssl=1 413w" sizes="auto, (max-width: 300px) 100vw, 300px" /><p id="caption-attachment-2865" class="wp-caption-text">The Icelandic Institute for Intelligent Machines.</p></div>
<p>The Icelandic Institute for Intelligent Machines (IIIM) has issued an ethical policy that makes it the first Artificial Intelligence research and development group to reject the development of technologies intended for military operations. IIIM is an independent research institute affiliated with Reykjavik University in Reykjavik, Iceland.</p>
<p>“It is only fitting that a research center in Iceland should field such a policy,” says Kristinn R. Thórisson, Managing Director of IIIM. “A nation without a standing army and virtually no history of war in its 1100 years.”</p>
<p>Thórisson believes that Artificial Intelligence has great potential for the immediate benefit of society. He asks, “Why should the taxpayers’ money fund autonomous weapons meant for killing humans when they could be funding applications for civilian uses, with enormous immediate benefits to society? Why not spend the large sums of money poured into weapons development instead for a peaceful, civilian agenda?”</p>
<p>He adds a strong note of caution, “When people think of a war waged with machines — autonomous killing machines — they may imagine armies of robots fighting other armies of robots. And no one gets hurt, right? But evidence so far tells us that the reality may in fact be very different — much more likely is the scenario of armies of killer robots set against individuals, groups and even nation states who are not in a good position to defend themselves, or even worse, largely at a loss to do so.”</p>
<p>It is certainly heartwarming to see an AI R&amp;D institute showing such a strong sense of social responsibility, and we hope others will follow its lead.</p>
<p>This comes after Canada’s Clearpath Robotics became the <a href="http://icrac.net/2014/08/canadas-biggest-robot-company-rejects-killer-robots/">first robotics company</a> to issue a policy against autonomous weapons systems. The cards are stacking up against autonomous weapons with support for a ban this year from Stephen Hawking, Elon Musk, Steve Wozniak, Bill Gates and Noam Chomsky. The Future of Life Institute <a href="http://icrac.net/2015/07/laws-an-open-letter-from-ai-robotics-experts/">issued an open letter</a> against AI weapons last month with thousands of signatures from prominent figures in the field.</p>
<p>The full ethics policy of IIIM is given below and can also be found at <a href="http://www.iiim.is/about-iiim/ethics-policy/">http://www.iiim.is/about-iiim/ethics-policy/</a>. For more information about the IIIM visit <a href="http://iiim.is/">http://iiim.is</a> or email <a href="mailto:info@iiim.is">info@iiim.is</a>.</p>
<p><em>The Board of Directors of IIIM believes that the freedom of researchers to explore and uncover the principles of intelligence, automation, and autonomy, and to recast these as the mechanized runtime principles of man-made computing machinery, is a promising approach for producing advanced software with commercial and public applications, for solving numerous difficult challenges facing humanity, and for answering important questions about the nature of human thought.</em></p>
<p><em>A significant part of all past artificial intelligence (AI) research in the world is and has been funded by military authorities, or by funds assigned various militaristic purposes, indicating its importance and application to military operations. A large portion of the world’s most advanced AI research is still supported by such funding, as opposed to projects directly and exclusively targeting peaceful civilian purposes. As a result, a large and disconcerting imbalance exists between AI research with a focus on hostile applications and AI research with an explicitly peaceful agenda. Increased funding for military research has a built-in potential to fuel a continual arms race; reducing this imbalance may lessen chances of conflict due to international tension, distrust, unfriendly espionage, terrorism, undue use of military force, and unjust use of power.</em></p>
<p><em>Just as AI has the potential to enhance military operations, the utility of AI technology for enabling perpetration of unlawful or generally undemocratic acts is unquestioned. While less obvious at present than the military use of AI and other advanced technologies, the falling cost of computers is likely to make highly advanced automation technology increasingly accessible to anyone who wants it. The potential for all technology of this kind to do harm is therefore increasing.</em></p>
<p><em>For these reasons, and as a result of IIIM’s sincere goal to focus its research towards topics and challenges of obvious benefit to the general public, and for the betterment of society, human livelihood and life on Earth, IIIM’s Board of Directors hereby states the Institute’s stance on such matters clearly and concisely, by establishing the following Ethical Policy for all current and future activities of IIIM:</em></p>
<p><strong><em>1</em></strong><em> – IIIM’s aim is to advance scientific understanding of the world, and to enable the application of this knowledge for the benefit and betterment of humankind.</em></p>
<p><strong><em>2</em></strong><em> – IIIM will not undertake any project or activity intended to (2a) cause bodily injury or severe emotional distress to any person, (2b) invade the personal privacy or violate the human rights of any person, as defined by the United Nations Declaration of Human Rights, (2c) be applied to unlawful activities, or (2d) commit or prepare for any act of violence or war.</em></p>
<p><strong><em>2.1</em></strong><em> – IIIM will not participate in projects for which there exists any reasonable evidence of activities 2a, 2b, 2c, or 2d listed above, whether alone or in collaboration with governments, institutions, companies, organizations, individuals, or groups.</em></p>
<p><strong><em>2.2</em></strong><em> – IIIM will not accept military funding for its activities. ‘Military funding’ is defined as any and all funds designated to support the activities of governments, institutions, companies, organizations, and groups, explicitly intended for furthering a military agenda, or to prepare for or commit to any act of war.</em></p>
<p><strong><em>2.3</em></strong><em> – IIIM will not collaborate with any institution, company, group, or </em><em>organization whose existence or operation is explicitly, whether in part or in whole, sponsored by military funding as described in 2.2 or controlled by military authorities. For civilian institutions with a history of undertaking military-funded projects a 5-15 rule will be applied: if for the past 5 years 15% or more of their projects were sponsored by such funds, they will not be considered as IIIM collaborators.</em></p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2864</post-id>	</item>
		<item>
		<title>Podcast #6: Interviews on Social Responsibility in Robotics and AI</title>
		<link>https://www.icrac.net/podcast-6-interviews-on-social-responsibility-in-robotics-and-ai/</link>
		
		<dc:creator><![CDATA[nsharkey]]></dc:creator>
		<pubDate>Mon, 04 May 2015 19:33:52 +0000</pubDate>
				<category><![CDATA[Podcast]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=3919</guid>

					<description><![CDATA[Download file &#124; Play in new window &#124; Duration: 9:53 Interviews with: Ryan Gariepy, Chief Technical Officer, Clearpath Robotics Stuart Russell, Professor of Artificial Intelligence, University of California at BerkeleyThis is episode 6 of 6 in the CCW Special that kicks off the new official ICRAC Podcast series.In this episode we discuss the social responsibility [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<div class="podcast_player"><!-- [if lt IE 9]><script>document.createElement('audio');</script><![endif]--><br />
<audio class="wp-audio-shortcode" id="audio-3919-1" preload="none" style="width: 100%;" controls="controls"><source type="audio/mpeg" src="https://www.icrac.net/wp-content/uploads/2018/01/06-socialrepponsibility.mp3?_=1" /><a href="https://www.icrac.net/wp-content/uploads/2018/01/06-socialrepponsibility.mp3">https://www.icrac.net/wp-content/uploads/2018/01/06-socialrepponsibility.mp3</a></audio>
</div>
<div class="podcast_meta">
<aside><a class="podcast-meta-download" title="Podcast #6: Interviews on Social Responsibility in Robotics and AI " href="https://www.icrac.net/podcast-download/3669/podcast-6.mp3?ref=download">Download file</a> | <a class="podcast-meta-new-window" title="Podcast #6: Interviews on Social Responsibility in Robotics and AI " href="https://www.icrac.net/podcast-download/3669/podcast-6.mp3?ref=new_window" target="_blank" rel="noopener noreferrer">Play in new window</a> | <span class="podcast-meta-duration">Duration: 9:53</span></aside>
<aside></aside>
<aside><img data-recalc-dims="1" loading="lazy" decoding="async" class="wp-image-3614" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/12/gariepy-ryan_Clearpath.jpg?resize=305%2C201&#038;ssl=1" alt="" width="305" height="201" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/12/gariepy-ryan_Clearpath.jpg?w=850&amp;ssl=1 850w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/12/gariepy-ryan_Clearpath.jpg?resize=300%2C198&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/12/gariepy-ryan_Clearpath.jpg?resize=768%2C507&amp;ssl=1 768w" sizes="auto, (max-width: 305px) 100vw, 305px" /> <img data-recalc-dims="1" loading="lazy" decoding="async" class="alignnone wp-image-3615 size-medium" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/12/russell-stairs-300x150.jpg?resize=300%2C150&#038;ssl=1" alt="" width="300" height="150" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/12/russell-stairs.jpg?resize=300%2C150&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/12/russell-stairs.jpg?resize=768%2C384&amp;ssl=1 768w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/12/russell-stairs.jpg?w=934&amp;ssl=1 934w" sizes="auto, (max-width: 300px) 100vw, 300px" /></aside>
<aside><p>Interviews with:<br />
<b>Ryan Gariepy</b>, Chief Technical Officer, Clearpath Robotics<br />
<b>Stuart Russell</b>, Professor of Artificial Intelligence, University of California at Berkeley</p>
<p><i>This is episode 6 of 6 in the CCW Special that kicks off the new official ICRAC Podcast series.</i></p>
<p>In this episode we discuss the social responsibility of robotics manufacturers with Ryan Gariepy, CTO of the Canadian company <a href="https://www.clearpathrobotics.com/">Clearpath Robotics</a>&nbsp;and with Stuart Russell, Professor of Artificial Intelligence at UC Berkeley in the United States. Both spoke during the CCW.</p>
<p>Clearpath Robotics took the bold step of becoming the first robotics company with defense contracts to come forward and&nbsp;support a ban on Lethal Autonomous Weapons Systems. Ryan tells us why.</p>
<p>Prof. Russell points out that there had been little interest&nbsp;within the AI community in accepting social responsibility for its creations – but that is changing. A bit more than two years after this conversation, Stuart Russell presented the &#8220;<a href="https://www.youtube.com/watch?v=HipTO_7mUOw">Slaughterbots video</a>&#8221;.</p>
</aside>
</div>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		<enclosure url="https://www.icrac.net/wp-content/uploads/2018/01/06-socialrepponsibility.mp3" length="0" type="audio/mpeg" />
<enclosure url="https://www.icrac.net/podcast-download/3669/podcast-6.mp3?ref=download" length="14248201" type="audio/mpeg" />
<enclosure url="https://www.icrac.net/podcast-download/3669/podcast-6.mp3?ref=new_window" length="0" type="audio/mpeg" />

		<post-id xmlns="com-wordpress:feed-additions:1">3919</post-id>	</item>
		<item>
		<title>Podcast Episode #5: Interviews on Human Rights</title>
		<link>https://www.icrac.net/podcast-episode-5-interviews-on-human-rights/</link>
		
		<dc:creator><![CDATA[nsharkey]]></dc:creator>
		<pubDate>Sun, 03 May 2015 19:08:12 +0000</pubDate>
				<category><![CDATA[Podcast]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=3927</guid>

					<description><![CDATA[&#160; Interviews with: Rasha Abdul Rahim, Amnesty International, Advocate/Advisor in Arms Control, Security Trade, and Human Rights Steve Wright, Reader in the School of Social Science at Leeds Beckett University, and ICRAC Member This is episode 5 of 6 in the CCW Special that kicks off the new official ICRAC Podcast series. Rasha Abdul Rahim [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<audio class="wp-audio-shortcode" id="audio-3927-2" preload="none" style="width: 100%;" controls="controls"><source type="audio/mpeg" src="https://www.icrac.net/wp-content/uploads/2018/01/05-humanrights.mp3?_=2" /><a href="https://www.icrac.net/wp-content/uploads/2018/01/05-humanrights.mp3">https://www.icrac.net/wp-content/uploads/2018/01/05-humanrights.mp3</a></audio>
<p>&nbsp;</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="alignnone wp-image-3616" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/12/RashaAbdulRahim.jpg?resize=152%2C205&#038;ssl=1" alt="" width="152" height="205"> <img data-recalc-dims="1" loading="lazy" decoding="async" class="alignnone wp-image-3194" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2017/10/steve.jpg?resize=174%2C205&#038;ssl=1" alt="" width="174" height="205"></p>
<p>Interviews with:<br />
<b>Rasha Abdul Rahim</b>, Amnesty International, Advocate/Advisor in Arms Control, Security Trade, and Human Rights<br />
<b>Steve Wright</b>, Reader in the School of Social Science at Leeds Beckett University, and ICRAC Member</p>
<p><i>This is episode 5 of 6 in the CCW Special that kicks off the new official ICRAC Podcast series.</i></p>
<p>Rasha Abdul Rahim is part of the Arms Control, Security Trade and Human Rights Team at Amnesty International and an expert in the human rights implications of autonomous weapons in military and policing applications.</p>
<p>Dr. Steve Wright is a professor at Leeds Beckett University who has more than 30 years under his belt&nbsp;working on the proliferation of weapon technologies. Steve is also ICRAC’s specialist on Less Lethal Autonomous Weapons (LLAWS).</p>
<p>We talk about how LLAWS could come back from conflict zones to haunt civil society and we discuss the inherent dangers for policing, border control and the suppression of populations.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		<enclosure url="https://www.icrac.net/wp-content/uploads/2018/01/05-humanrights.mp3" length="15615023" type="audio/mpeg" />

		<post-id xmlns="com-wordpress:feed-additions:1">3927</post-id>	</item>
		<item>
		<title>Podcast Episode #4: Interview with Nehal Bhuta, International Law Professor</title>
		<link>https://www.icrac.net/podcast-episode-4-interview-with-nehal-bhuta-international-law-professor/</link>
		
		<dc:creator><![CDATA[nsharkey]]></dc:creator>
		<pubDate>Sun, 03 May 2015 18:51:08 +0000</pubDate>
				<category><![CDATA[Podcast]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=3929</guid>

					<description><![CDATA[This is episode 4 of 6 in the CCW Special that kicks off the new official ICRAC Podcast series. Nehal Bhuta is a highly respected Professor of International Law at the European University Institute in Florence. He was one of the experts chosen to speak at the 2015 CCW Informal Meeting of Experts on LAWS. [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-full wp-image-3641 alignnone" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/01/bhuta.jpeg?resize=275%2C183&#038;ssl=1" alt="" width="275" height="183" /></p>
<p><i>This is episode 4 of 6 in the CCW Special that kicks off the new official ICRAC Podcast series.</i></p>
<p>Nehal Bhuta is a highly respected Professor of International Law at the European University Institute in Florence. He was one of the experts chosen to speak at the 2015 CCW Informal Meeting of Experts on LAWS.</p>
<p>In this interview we probe some of the legal complexities of the relationship between the law and the weapons review process. We also discuss the concept of &#8220;meaningful human control&#8221;.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3929</post-id>	</item>
		<item>
		<title>Podcast Episode #3: Interview with Jody Williams, Chairwoman of the Nobel Women’s Initiative</title>
		<link>https://www.icrac.net/podcast-episode-3-interview-with-jody-williams-chairwoman-of-the-nobel-womens-initiative/</link>
		
		<dc:creator><![CDATA[nsharkey]]></dc:creator>
		<pubDate>Mon, 27 Apr 2015 18:44:45 +0000</pubDate>
				<category><![CDATA[Podcast]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=3931</guid>

					<description><![CDATA[This is episode 3 of 6 in the CCW Special that kicks off the new official ICRAC Podcast series. Jody Williams has been a humanitarian campaigner for most of her adult life. In 1991 she began coordinating the successful Campaign to Ban Landmines and received the Nobel Prize for Peace in 1997. But she didn’t [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="alignnone size-medium wp-image-3643" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/01/Jody.jpg?resize=300%2C201&#038;ssl=1" alt="" width="300" height="201" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/01/Jody.jpg?w=2048&amp;ssl=1 2048w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/01/Jody.jpg?resize=300%2C201&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/01/Jody.jpg?resize=768%2C516&amp;ssl=1 768w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/01/Jody.jpg?resize=1024%2C688&amp;ssl=1 1024w" sizes="auto, (max-width: 300px) 100vw, 300px" /></p>
<p><i>This is episode 3 of 6 in the CCW Special that kicks off the new official ICRAC Podcast series.</i></p>
<p>Jody Williams has been a humanitarian campaigner for most of her adult life. In 1991 she began coordinating the successful Campaign to Ban Landmines and received the Nobel Prize for Peace in 1997. But she didn’t stop there. In 2004 Jody took the lead in establishing the Nobel Women’s Initiative with other women Nobel peace laureates and has been the chairwoman ever since. Their aim is to support and promote the work of women around the world working for peace with justice and equality.</p>
<p>In this interview Jody talks about the importance of the Campaign to Stop Killer Robots and tells us what is most annoying about its opponents.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel SharkeyPhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and  was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3931</post-id>	</item>
		<item>
		<title>Podcast Episode #2: Interview with Stephen Goose, Director of Human Rights Watch Arms Division</title>
		<link>https://www.icrac.net/3914-2/</link>
		
		<dc:creator><![CDATA[nsharkey]]></dc:creator>
		<pubDate>Mon, 27 Apr 2015 15:42:52 +0000</pubDate>
				<category><![CDATA[Podcast]]></category>
		<guid isPermaLink="false">https://www.icrac.net/?p=3914</guid>

					<description><![CDATA[&#160; This is episode 2 of 6 in the CCW Special that kicks off the new official ICRAC Podcast series. Steve Goose has been heavily involved in the prohibition of three nasty weapons systems. In this podcast he tells us how the discussions at the CCW are panning out and compares the pace with previous [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel Sharkey, PhD, DSc, FIET, FBCS, CITP, FRIN, FRSA, is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<audio class="wp-audio-shortcode" id="audio-3914-3" preload="none" style="width: 100%;" controls="controls"><source type="audio/mpeg" src="https://www.icrac.net/wp-content/uploads/2018/01/02-SteveGoose.mp3?_=3" /><a href="https://www.icrac.net/wp-content/uploads/2018/01/02-SteveGoose.mp3">https://www.icrac.net/wp-content/uploads/2018/01/02-SteveGoose.mp3</a></audio>
<p>&nbsp;</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="alignnone size-medium wp-image-3637" src="https://i0.wp.com/www.icrac.net/wp-content/uploads/2018/01/goose1.jpg?resize=300%2C200&#038;ssl=1" alt="" width="300" height="200"></p>
<p><i>This is episode 2 of 6 in the CCW Special that kicks off the new official ICRAC Podcast series.</i></p>
<p>Steve Goose has been heavily involved in the prohibition of three nasty weapons systems. In this podcast he tells us how the discussions at the CCW are panning out and compares the pace with previous campaigns. He then clearly explains the steps at the Convention on Certain Conventional Weapons (CCW), from where we are now right through to a new protocol to prohibit the development, production and use of fully autonomous weapons systems. <a href="http://www.unog.ch/80256EDD006B8954/(httpAssets)/59A5B2704D850B49C1257E2F004FF9EA/$file/2015_LAWS_MX_HRW_WA.pdf">Steve Goose CCW intervention</a>.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel Sharkey, PhD, DSc, FIET, FBCS, CITP, FRIN, FRSA, is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		<enclosure url="https://www.icrac.net/wp-content/uploads/2018/01/02-SteveGoose.mp3" length="13658239" type="audio/mpeg" />

		<post-id xmlns="com-wordpress:feed-additions:1">3914</post-id>	</item>
	</channel>
</rss>
