<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>YouTube video &#8211; ICRAC</title>
	<atom:link href="https://www.icrac.net/category/youtube-video/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.icrac.net</link>
	<description>International Committee for Robot Arms Control</description>
	<lastBuildDate>Tue, 15 Dec 2020 17:40:47 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.8.2</generator>
<site xmlns="com-wordpress:feed-additions:1">128339352</site>	<item>
		<title>ICRAC Video: Peaceful Uses of Robotics and Banning LAWS</title>
		<link>https://www.icrac.net/new-icrac-video-on-peaceful-uses-of-robotics-and-banning-laws/</link>
		
		<dc:creator><![CDATA[Peter Asaro]]></dc:creator>
		<pubDate>Thu, 12 Nov 2015 17:36:45 +0000</pubDate>
				<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[Media]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Slider]]></category>
		<category><![CDATA[Statements]]></category>
		<category><![CDATA[YouTube video]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2869</guid>

					<description><![CDATA[Stop the Killer Robots from Kamille Rodriguez on Vimeo. The video explains that a ban on killer robots would not have negative impacts on the development of other robotics applications and research. It was created for ICRAC by digital animation artist Kamille Rodriguez: http://www.kamillerodriguez.com/ &#160;<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Peter Asaro' src='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.peterasaro.org/">Peter Asaro</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Dr. Peter Asaro is a philosopher of science, technology and media. His work examines the interfaces between social relations, human minds and bodies, artificial intelligence and robotics, and digital media.

His current research focuses on the social, cultural, political, legal and ethical dimensions of military robotics and UAV drones, from a perspective that combines media theory with science and technology studies. He has written widely cited papers on lethal robotics from the perspective of just war theory and human rights. Dr. Asaro's research also examines agency and autonomy, liability and punishment, and privacy and surveillance as they apply to consumer robots, industrial automation, smart buildings, and autonomous vehicles. His research has been published in international peer-reviewed journals and edited volumes, and he is currently writing a book that interrogates the intersections between military robotics, interface design practices, and social and ethical issues.

Dr. Asaro has held research positions at the Center for Cultural Analysis at Rutgers University, the HUMlab of Umeå University in Sweden, and the Austrian Academy of Sciences in Vienna. He has also developed technologies in the areas of virtual reality, data visualization and sonification, human-computer interaction, computer-supported cooperative work, artificial intelligence, machine learning, robot vision, and neuromorphic robotics at the National Center for Supercomputer Applications (NCSA), the Beckman Institute for Advanced Science and Technology, and Iguana Robotics, Inc., and was involved in the design of the natural language interface for the Wolfram|Alpha computational knowledge engine (winner of the 2010 SXSW Web Interactive Award for Technical Achievement), for Wolfram Research.

He is currently working on an Oral History of Robotics project that is funded by the IEEE Robotics and Automation Society and the National Endowment for the Humanities Office of Digital Humanities.

Dr. Asaro received his PhD in the History, Philosophy and Sociology of Science from the University of Illinois at Urbana-Champaign, where he also earned a Master of Arts from the Department of Philosophy, and a Master of Computer Science from the Department of Computer Science.</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><iframe src="https://player.vimeo.com/video/117102411" width="550" height="300" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p><a href="https://vimeo.com/117102411">Stop the Killer Robots</a> from <a href="https://vimeo.com/user21751690">Kamille Rodriguez</a> on <a href="https://vimeo.com">Vimeo</a>.</p>
<p>The video explains that a ban on killer robots would not have negative impacts on the development of other robotics applications and research.</p>
<p>It was created for ICRAC by digital animation artist Kamille Rodriguez:<br />
<a href="http://www.kamillerodriguez.com/">http://www.kamillerodriguez.com/</a></p>
<p>&nbsp;</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Peter Asaro' src='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/730c6c6178743fb0e7fdfc64686309f4701c6a1cfb57d66242717d43b57b746b?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.peterasaro.org/">Peter Asaro</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Dr. Peter Asaro is a philosopher of science, technology and media. His work examines the interfaces between social relations, human minds and bodies, artificial intelligence and robotics, and digital media.

His current research focuses on the social, cultural, political, legal and ethical dimensions of military robotics and UAV drones, from a perspective that combines media theory with science and technology studies. He has written widely cited papers on lethal robotics from the perspective of just war theory and human rights. Dr. Asaro's research also examines agency and autonomy, liability and punishment, and privacy and surveillance as they apply to consumer robots, industrial automation, smart buildings, and autonomous vehicles. His research has been published in international peer-reviewed journals and edited volumes, and he is currently writing a book that interrogates the intersections between military robotics, interface design practices, and social and ethical issues.

Dr. Asaro has held research positions at the Center for Cultural Analysis at Rutgers University, the HUMlab of Umeå University in Sweden, and the Austrian Academy of Sciences in Vienna. He has also developed technologies in the areas of virtual reality, data visualization and sonification, human-computer interaction, computer-supported cooperative work, artificial intelligence, machine learning, robot vision, and neuromorphic robotics at the National Center for Supercomputer Applications (NCSA), the Beckman Institute for Advanced Science and Technology, and Iguana Robotics, Inc., and was involved in the design of the natural language interface for the Wolfram|Alpha computational knowledge engine (winner of the 2010 SXSW Web Interactive Award for Technical Achievement), for Wolfram Research.

He is currently working on an Oral History of Robotics project that is funded by the IEEE Robotics and Automation Society and the National Endowment for the Humanities Office of Digital Humanities.

Dr. Asaro received his PhD in the History, Philosophy and Sociology of Science from the University of Illinois at Urbana-Champaign, where he also earned a Master of Arts from the Department of Philosophy, and a Master of Computer Science from the Department of Computer Science.</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2869</post-id>	</item>
		<item>
		<title>LAWS: An Open Letter from AI &#038; Robotics Experts</title>
		<link>https://www.icrac.net/laws-an-open-letter-from-ai-robotics-experts/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Wed, 29 Jul 2015 08:15:26 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[ICRAC in the media]]></category>
		<category><![CDATA[YouTube video]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2858</guid>

					<description><![CDATA[Thousands of experts in artificial intelligence, robotics and related professions have signed an open letter, hosted by the Future of Life Institute, calling for a ban on autonomous weapons that select and engage targets without human intervention. You can read more on the importance of this letter to the current global effort of banning lethal [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p>Thousands of experts in artificial intelligence, robotics and related professions have signed an <a href="http://futureoflife.org/AI/open_letter_autonomous_weapons#signatories">open letter</a>, hosted by the <a href="http://futureoflife.org/">Future of Life Institute</a>, calling for a ban on autonomous weapons that select and engage targets without human intervention.</p>
<p>You can read more on the importance of this letter to the current global effort to ban lethal autonomous weapon systems (LAWS) on the <a href="http://www.stopkillerrobots.org/2015/07/aicall/">Campaign to Stop Killer Robots website</a>.</p>
<p>ICRAC is committed to the peaceful uses of robotics and the regulation of robot weapons. We welcome this open letter, and numerous members of ICRAC have not only signed it but also commented on this latest development in various media over the past few days. Below is a selection of audio and video content in English, Italian and German.</p>
<p><iframe loading="lazy" src="http://www.cnet.com/videos/share/id/kje1bJWKBsCFql0OOXKhKB1Rtqf46pvn/" width="600" height="350" frameborder="0" seamless="seamless" allowfullscreen="allowfullscreen"></iframe></p>
<p><strong>CNET – Ban autonomous weapons, urge AI experts including Hawking, Musk and Wozniak</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Noel Sharkey and Thomas Nash</a>)<br />
Release date: 27 Jul 2015</p>
<p><iframe loading="lazy" src="https://s.embed.live.huffingtonpost.com/HPLEmbedPlayer/?segmentId=55aef80bfe34445cec000164&amp;autoPlay=false" width="570" height="321" frameborder="0"></iframe></p>
<p><strong>Huffpost Live – A.I. Experts Push For Military Robot Ban</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Heather Roff and Ian Kerr</a>)<br />
Hundreds of artificial intelligence experts and revered thinkers, including Stephen Hawking and Elon Musk, are calling for a global ban on military robots. We explore the issue and whether these autonomous weapons could lower the threshold for war.<br />
Release date: 28 Jul 2015</p>
<p><iframe loading="lazy" src="https://www.youtube.com/embed/XIpmztX1X68" width="560" height="315" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p><strong>CTV – Will the growth of killer robots set off a global arms race?</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Ian Kerr</a>)<br />
In this 5-minute interview with Canada AM host Beverly Thompson, Ian Kerr discusses the difference between semi-autonomous and autonomous weapons, the call to ban killer robots, why he is a signatory to the open letter, and why efficacy is not the only consideration in deciding whether to ban a dangerous use of technology.<br />
Release date: 04 Aug 2015</p>
<p><iframe loading="lazy" src="http://www.bbc.co.uk/programmes/p02y817k/player" width="400" height="500" frameborder="0"></iframe></p>
<p><strong>BBC World Service – Fighting off ‘Killer Robots’</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Heather Roff</a>)<br />
More than 1,000 tech experts, scientists and researchers have written a letter warning about the dangers of autonomous weapons, cautioning that a ‘military AI race is a bad idea’. One signatory, Heather Roff Perkins from the University of Denver, spoke to the BBC’s Dominic Laurie.<br />
Release date: 28 Jul 2015</p>
<p><iframe loading="lazy" src="https://w.soundcloud.com/player/?url=https%3A//api.soundcloud.com/tracks/217772360&amp;color=ff5500" width="100%" height="166" frameborder="no" scrolling="no"></iframe></p>
<p><strong>Killer robots: the coming arms race?</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Peter Asaro</a>)</p>
<p>Are you worried about killer robots? Last week, some of the most prominent thinkers in science and technology signed an open letter that warned of the coming arms race should militaries pursue the development and deployment of artificially intelligent weaponry. The letter was written by The Campaign to Stop Killer Robots, an international coalition of NGOs, and was signed by almost 14,000 people, including Stephen Hawking, Elon Musk, and Steve Wozniak. Today, we discuss the threat of fully autonomous weapons and artificially-intelligent warfare with PETER ASARO, professor of media studies at The New School and spokesperson for The Campaign to Stop Killer Robots, roboticist and Georgia Tech professor RONALD ARKIN, as well as GEORGE ZARKADAKIS, writer and AI architect.</p>
<p>Release date: 6 Aug 2015</p>
<p><iframe loading="lazy" src="https://w.soundcloud.com/player/?url=https%3A//api.soundcloud.com/tracks/218308595&amp;color=ff5500" width="100%" height="166" frameborder="no" scrolling="no"></iframe></p>
<p><strong>Ban Killer Robots Interview – Calgary Newstalk 770</strong><br />
(<a href="http://icrac.net/who/">with ICRAC’s Ian Kerr</a>)<br />
In this 25-minute interview with Roger Kinkade and Rob Breakenridge of Newstalk770, CHQR Radio, Ian Kerr talks at length about autonomous weapons and the AI community’s call to ban “killer robots”. The conversation covers what a killer robot is, why they are likely to be developed, what the dangers are if we don’t ban them, and a number of broader issues regarding the future of artificial intelligence.<br />
Release date: 8 Aug 2015</p>
<p>&nbsp;</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2858</post-id>	</item>
		<item>
		<title>Our robotic future: The hopes and worries of 10 year old Bethany</title>
		<link>https://www.icrac.net/our-robotic-future-the-hopes-and-worries-of-10-year-old-bethany/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Sat, 21 Mar 2015 20:26:09 +0000</pubDate>
				<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[YouTube video]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2502</guid>

					<description><![CDATA[Ten year old Bethany Clifford-Tait is concerned about a future that includes autonomous weapons systems aka ‘killer robots’. Such were her worries that she wrote to the producers of the oldest and most popular BBC children’s programme Blue Peter to ask them to spread the word to other children across the country in one of [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<div id="attachment_2515" style="width: 234px" class="wp-caption alignleft"><a href="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2015/03/Beth.jpg" target="_blank" rel="noopener noreferrer"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2515" class="wp-image-2515 size-medium" style="margin-top: 0px; margin-right: 30px; margin-bottom: 0px;" src="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2015/03/Beth-224x300.jpg?resize=224%2C300" alt="" width="224" height="300" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/03/Beth.jpg?resize=224%2C300&amp;ssl=1 224w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/03/Beth.jpg?w=717&amp;ssl=1 717w" sizes="auto, (max-width: 224px) 100vw, 224px" /></a><p id="caption-attachment-2515" class="wp-caption-text">Bethany Clifford-Tait.</p></div>
<p>Ten-year-old Bethany Clifford-Tait is concerned about a future that includes autonomous weapons systems, aka ‘killer robots’. Such were her worries that she wrote to the producers of the oldest and most popular BBC children’s programme <a href="http://www.bbc.co.uk/cbbc/shows/blue-peter">Blue Peter</a> to ask them to spread the word to other children across the country in one of their TV shows. The producers were so impressed that Bethany was awarded the much coveted Blue Peter badge (see picture).</p>
<p>We at ICRAC are delighted with Bethany’s work and applaud her request. ICRAC chairman Professor Noel Sharkey said, “It is wonderful to see such a sense of responsibility in one so young. This is the very generation that is most likely to suffer from the use of these weapons if we cannot get them prohibited. The power of one so young speaking out for her generation should not be underestimated.”</p>
<p>We at ICRAC work for a future where robots serve peaceful purposes rather than being weaponized, inter alia by providing expertise and engaging with the media, the general public and the international community at the <a href="http://www.stopkillerrobots.org/2015/03/ccwexperts2015/">United Nations</a>. We strongly support Bethany’s request to the BBC to let other children know about the future dangers that they could be facing with the automation of war.</p>
<p>With Bethany’s (and her dad’s) friendly permission we reproduce her request here. Bethany writes:</p>
<p style="padding-left: 30px;"><em>For a new episode I would like for you to do it about “The Campaign To Stop Killer Robots”. It is a campaign to stop people building robots programed to kill someone without human intervention. The tin robot David Wreckham was on Blue Peter in 2003 as a waiter but now this robot has a more serious mission.</em></p>
<p>&nbsp;</p>
<p style="padding-left: 30px;"><iframe loading="lazy" src="https://www.youtube.com/embed/XD7N9GGGWT4" width="420" height="315" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p style="padding-left: 30px;"><em>He was made by my Uncle Ray. My dad and my uncle make robots and one of them called “Skeletron” won an award in BBC’s Techno Games where they met Professor Noel Sharkey and became friends.</em></p>
<p style="padding-left: 30px;"><em>Ten years later, Noel asked for some help with an event in London. This was the “Campaign To Stop Killer Robots”, so last year my Dad, Matt Tait, and my uncle, Raymond Tait, went off to the House of Commons with David Wreckham.</em></p>
<p style="padding-left: 30px;"><iframe loading="lazy" src="https://www.youtube.com/embed/A5F8KK1L-yc" width="418" height="235" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p style="padding-left: 30px;"><em>This is a kind of boring interview but it tells you about the campaign. Grown ups might understand it but it isn’t child-friendly!</em></p>
<p style="padding-left: 30px;"><em><strong>That’s what I want you to do.</strong></em></p>
<div id="attachment_2503" style="width: 310px" class="wp-caption alignright"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2503" class="wp-image-2503 size-medium" style="margin: 5px 5px 5px 10px;" src="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2015/06/wreckham2-300x198.jpg?resize=300%2C198" alt="" width="300" height="198" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/wreckham2.jpg?resize=300%2C198&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/wreckham2.jpg?w=494&amp;ssl=1 494w" sizes="auto, (max-width: 300px) 100vw, 300px" /><p id="caption-attachment-2503" class="wp-caption-text">David Wreckham at the House of Parliament</p></div>
<p style="padding-left: 30px;"><em>The “Campaign To Stop Killer Robots” is to stop humans building robots programmed to kill someone without a human there to take responsibility for “the kill”.</em></p>
<p style="padding-left: 30px;"><em>At the moment drones have to be flown by remote control so a human is there and can stop it. But we are very close to being able to program the drone to work independently without a human there, which means that soon we can let the drone make the decision. The problem is that they might make the wrong decision and kill the wrong person or innocent people in the area and no-one will be held responsible.</em></p>
<p style="padding-left: 30px;"><em>The campaign wants politicians and world leaders to think about this and put laws in place to make people responsible for the machines they create or use.</em></p>
<div id="attachment_2505" style="width: 310px" class="wp-caption alignright"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2505" class="wp-image-2505 size-medium" style="margin: 5px 10px;" src="https://i0.wp.com/www.icrac.net.php56-3.dfw3-2.websitetestlink.com/wp-content/uploads/2015/06/Andy-noel-scuttle-zoe-300x225.jpg?resize=300%2C225" alt="" width="300" height="225" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/Andy-noel-scuttle-zoe.jpg?resize=300%2C225&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/Andy-noel-scuttle-zoe.jpg?resize=1024%2C768&amp;ssl=1 1024w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/Andy-noel-scuttle-zoe.jpg?w=2000&amp;ssl=1 2000w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/Andy-noel-scuttle-zoe.jpg?w=3000&amp;ssl=1 3000w" sizes="auto, (max-width: 300px) 100vw, 300px" /><p id="caption-attachment-2505" class="wp-caption-text">ICRAC Chairman Noel Sharkey demonstrating Scuttle, the 8-legged robot, on Blue Peter in 2008</p></div>
<p style="padding-left: 30px;"><em><strong>Why is this important for children to understand?</strong></em></p>
<p style="padding-left: 30px;"><em>Children are being taught computer programming in schools and should be made aware of the morals and consequences of their actions instead of just believing their work can only be used for games and apps. It’s our world and our future and we ought to have a say in how it’s run.</em></p>
<p style="padding-left: 30px;"><em>By Bethany Clifford-Tait</em></p>
<p style="padding-left: 30px;"><em>Aged 10</em></p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2502</post-id>	</item>
		<item>
		<title>The Campaign and the Media</title>
		<link>https://www.icrac.net/the-campaign-and-the-media/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Mon, 27 Oct 2014 20:20:16 +0000</pubDate>
				<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[YouTube video]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2494</guid>

					<description><![CDATA[On Monday, October 20th, The School of Media Studies at The New School hosted a discussion chaired by ICRAC’s Dr. Peter Asaro with Nobel Peace Prize Laureate Ms. Jody Williams and Mary Wareham of Human Rights Watch, looking at the Campaign to Stop Killer Robots, the involvement of ICRAC and many other NGOs worldwide and [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><iframe loading="lazy" width="560" height="315" src="https://www.youtube.com/embed/rWC5ZfDqZB4" frameborder="0" allowfullscreen></iframe></p>
<p>On Monday, October 20<sup>th</sup>, The School of Media Studies at <a href="http://www.newschool.edu">The New School</a> hosted a discussion chaired by <a href="http://blogs.newschool.edu/news/2014/10/the-media-behind-killer-robots-a-panel-event/#.VD_latR4ohQ">ICRAC’s Dr. Peter Asaro</a> with Nobel Peace Prize Laureate Ms. Jody Williams and Mary Wareham of Human Rights Watch, looking at the <a href="http://www.stopkillerrobots.org/">Campaign to Stop Killer Robots</a>, the involvement of ICRAC and many other NGOs worldwide, and particularly the evolving nature of media outreach and advocacy for humanitarian disarmament. Clearpath Robotics Co-Founder and CTO Ryan Gariepy makes an appearance, explaining the promises and dangers of robotics and his company’s stance against robotic weapon systems.</p>
<div class="entry">
<p>In particular, the panelists consider how the media have reacted to and covered the challenges posed by autonomous weapons and the call for a ban, as well as the similarities and differences to media outreach and advocacy in the campaigns against landmines and cluster munitions. They discuss how the Internet and social media have changed the landscape, and also how new media has changed the campaign’s approaches to mainstream media and journalists.</p>
</div>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2494</post-id>	</item>
		<item>
		<title>Canada’s leading robot company rejects ‘killer robots’ — updated!</title>
		<link>https://www.icrac.net/canadas-leading-robot-company-rejects-killer-robots-updated/</link>
		
		<dc:creator><![CDATA[nsharkey]]></dc:creator>
		<pubDate>Thu, 14 Aug 2014 20:15:59 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[YouTube video]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2490</guid>

					<description><![CDATA[Hi-tech Canadian robotics company Clearpath today issued a statement pledging not to manufacture autonomous weapons systems despite their commercial advantage, and urged other companies to follow suit, encouraging “those who might see business opportunities in this technology to seek other ways to apply their skills and resources for the betterment of humankind.” ICRAC applauds the [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel Sharkey PhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><iframe loading="lazy" src="https://www.youtube.com/embed/tI8kAc2vHVY" width="560" height="315" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<div class="entry">
<p>Hi-tech Canadian robotics company Clearpath today issued a statement pledging not to manufacture autonomous weapons systems despite their commercial advantage, and urged other companies to follow suit, encouraging “those who might see business opportunities in this technology to seek other ways to apply their skills and resources for the betterment of humankind.”</p>
<p>ICRAC applauds the integrity of a company that says, “despite our continued involvement with Canadian and international military research and development, Clearpath Robotics believes that the development of killer robots is unwise, unethical, and should be banned on an international scale.”</p>
<p>Putting morally right action before profit is an unusual and bold move in today’s hi-tech world, and ICRAC wishes them great success in their future business ventures, with the added prestige this will give them.</p>
<p>Clearpath is the first company as a whole to join the growing number of robotics professionals in pointing out that autonomous weapons would be unable to comply with the laws of war for the foreseeable future.</p>
<p>Last year nearly 300 roboticists and computer professionals signed an ICRAC petition calling for a ban on autonomous robot weapons systems: <a title="ban on autonomous robot weapons" href="http://icrac.net/call/">http://icrac.net/call/</a>.</p>
<p>In an open letter, Ryan Gariepy, Co-Founder and CTO of Clearpath Robotics, wrote: “would a robot have the morality, sense, or emotional understanding to intervene against orders that are wrong or inhumane? No. Would computers be able to make the kinds of subjective decisions required for checking the legitimacy of targets and ensuring the proportionate use of force in the foreseeable future? No.” And further, “In our eyes, no nation in the world is ready for killer robots – technologically, legally, or ethically.”</p>
<p>Surely now the Canadian government will have more reason to think about following Mines Action Canada’s call to <a title="&quot;keep killer robots fiction&quot;" href="http://killerrobots-minesactioncanada.nationbuilder.com">“keep killer robots fiction”</a>.</p>
<p>Here is the open letter to the public in full:</p>
<blockquote><p>The Campaign to <a title="Stop Killer Robots" href="http://www.stopkillerrobots.org/">Stop Killer Robots</a> (http://www.stopkillerrobots.org/) was launched in April 2013, bringing the topic of “killer robots” under public scrutiny – and for good reason.</p>
<p>To the people against killer robots: we support you.</p>
<p>This technology has the potential to kill indiscriminately and to proliferate rapidly; early prototypes already exist. Despite our continued involvement with Canadian and international military research and development, Clearpath Robotics believes that the development of killer robots is unwise, unethical, and should be banned on an international scale.</p>
<p><em>The Context</em><br />
How do we define “killer robot”? Is it any machine developed for military purposes? Any machine which takes actions without human direction? No. We’re referring specifically to “lethal autonomous weapons systems (LAWS)”: systems where a human does not make the final decision for a machine to take a potentially lethal action.</p>
<p>Clearpath Robotics is an organization that engineers autonomous vehicles, systems, and solutions for a global market. As current leaders in the research and development space for unmanned vehicles, making this kind of statement is a risk. However, given the potentially horrific consequences of allowing development of lethal autonomous robots to continue, we are compelled to insist upon the strictest regulation of this technology.</p>
<p><em>The Double-Edged Sword</em><br />
There are, of course, pros and cons to the ethics of autonomous lethal weapons and our team has debated many of them at length. In the end, however, we, as a whole, feel the negative implications of these systems far outweigh any benefits.</p>
<p>Is a computer paired with the correct technology less likely to make rash, stress-driven decisions while under fire? Possibly. Conversely, would a robot have the morality, sense, or emotional understanding to intervene against orders that are wrong or inhumane? No. Would computers be able to make the kinds of subjective decisions required for checking the legitimacy of targets and ensuring the proportionate use of force in the foreseeable future? No. Could this technology lead those who possess it to value human life less? Quite frankly, we believe this will be the case.</p>
<p>This is an incredibly complex issue. We need to have this discussion now and take a stance; the robotics revolution has arrived and is not going to wait for these debates to occur.</p>
<p><em>Clearpath’s Responsibility</em><br />
Clearpath Robotics strives to improve the lives of billions by automating the world’s dull, dirty, and dangerous jobs. This belief does not preclude the use of autonomous robots in the military; we will continue to support our military clients and provide them with autonomous systems – especially in areas with direct civilian applications such as logistics, reconnaissance, and search and rescue.</p>
<p>In our eyes, no nation in the world is ready for killer robots – technologically, legally, or ethically. More importantly, we see no compelling justification that this technology needs to exist in human hands. After all, the development of killer robots isn’t a necessary step on the road to self-driving cars, robot caregivers, safer manufacturing plants, or any of the other multitudes of ways autonomous robots can make our lives better. Robotics is at a tipping point, and it’s up to all of us to decide what path this technology takes.</p>
<p><em>Take Action</em><br />
As a company which continues to develop robots for various militaries worldwide, Clearpath Robotics has more to lose than others might by advocating entire avenues of research be closed off. Nevertheless, we call on anyone who has the potential to influence public policy to stop the development of killer robots before it’s too late.</p>
<p>We encourage those who might see business opportunities in this technology to seek other ways to apply their skills and resources for the betterment of humankind. Finally, we ask everyone to consider the many ways in which this technology would change the face of war for the worse. Voice your opinion and take a stance. #killerrobots</p>
<p>Ryan Gariepy<br />
Co-Founder &amp; CTO, Clearpath Robotics<br />
Twitter: @clearpathrobots<br />
Facebook: <a title="www.facebook.com/ClearpathRobotics" href="https://www.facebook.com/ClearpathRobotics">www.facebook.com/ClearpathRobotics</a></p></blockquote>
</div>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='nsharkey' src='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/e6cd227594f64421151214d3d51a2a80df88e84aa4bd648da1116ba45dffc7e0?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/">nsharkey</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Noel Sharkey PhD, DSc FIET, FBCS CITP FRIN FRSA is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield and was an EPSRC Senior Media Fellow (2004-2010).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2490</post-id>	</item>
		<item>
		<title>Lethal Autonomous Robots and the UN Convention on Conventional Weapons (CCW)</title>
		<link>https://www.icrac.net/lethal-autonomous-robots-and-the-un-convention-on-conventional-weapons-ccw/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Fri, 14 Mar 2014 05:11:35 +0000</pubDate>
				<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[YouTube video]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2443</guid>

					<description><![CDATA[The Campaign to Stop Killer Robots, of which ICRAC is one of the founding members, shared a video explaining how the issue of lethal autonomous robots has been picked up by the international community at the United Nations Convention on Conventional Weapons in Geneva. The purpose of the Convention is to ban or restrict the [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><iframe loading="lazy" src="https://www.youtube.com/embed/tCGidyqwHWk" width="560" height="315" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<div class="entry">
<p>The <a href="http://www.stopkillerrobots.org/">Campaign to Stop Killer Robots</a>, of which ICRAC is one of the founding members, shared a video explaining how the <a href="http://icrac.net/2013/11/campaign-to-stop-killer-robots-takes-significant-step-forward-at-un/">issue of lethal autonomous robots has been picked up by the international community</a> at the United Nations <a href="http://www.unog.ch/80256EE600585943/%28httpPages%29/4F0DEF093B4860B4C1257180004B1B30?OpenDocument">Convention on Conventional Weapons</a> in Geneva.</p>
<blockquote><p>The purpose of the Convention is to ban or restrict the use of specific types of weapons that are considered to cause unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately. The structure of the CCW – a chapeau Convention and annexed Protocols – was adopted in this manner to ensure future flexibility. The Convention itself contains only general provisions. All prohibitions or restrictions on the use of specific weapons or weapon systems are the object of the Protocols annexed to the Convention.</p></blockquote>
<p>The video also provides a look at the forthcoming <a href="http://www.unog.ch/80256EE600585943/%28httpPages%29/3CFCEEEF52D553D5C1257B0300473B77?OpenDocument">experts meeting on May 13-16, 2014</a>. In the run-up to that meeting, the issue of LARs is often compared to blinding laser weapons. These are prohibited by the CCW in <a href="http://www.unog.ch/80256EDD006B8954/%28httpAssets%29/8463F2782F711A13C12571DE005BCF1A/$file/PROTOCOL+IV.pdf">Protocol IV</a>, which came into force in 1998. ICRAC and the Campaign to Stop Killer Robots work towards a similar prohibition or <a href="http://www.stopkillerrobots.org/the-problem/">“ban” of lethal autonomous robots</a>. The expert meeting report will be discussed at the next CCW meeting of states parties in November.</p>
</div>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2443</post-id>	</item>
		<item>
		<title>Geneva Academy Debate: Matthew Waxman vs. ICRAC’s Peter Asaro</title>
		<link>https://www.icrac.net/geneva-academy-debate-matthew-waxman-vs-icracs-peter-asaro/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Tue, 26 Nov 2013 05:06:41 +0000</pubDate>
				<category><![CDATA[YouTube video]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2439</guid>

					<description><![CDATA[Autonomous Weapon Systems: Dangerous Killer Robots or Smarter and Less-Harmful Warfare?, Geneva Academy Debate, Wednesday 20 November 11am &#8211; 1pm, Maison de la Paix (Chemin Eugène-Rigot 2), Geneva. The motion under debate will be: “Should there be an absolute ban on autonomous systems capable of using lethal force?” Two key speakers will argue for and [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><iframe loading="lazy" src="https://player.vimeo.com/video/80056029" width="500" height="277" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<div class="entry">
<blockquote><p>Autonomous Weapon Systems: Dangerous Killer Robots or Smarter and Less-Harmful Warfare?, <a href="http://www.geneva-academy.ch/the-academy/events/events-a-news-2013/1011-autonomous-weapons-should-there-be-an-absolute-ban-on-autonomous-systems-capable-of-using-lethal-force">Geneva Academy Debate</a>, Wednesday 20 November 11am – 1pm, Maison de la Paix (Chemin Eugène-Rigot 2), Geneva.</p>
<p>The motion under debate will be: “Should there be an absolute ban on autonomous systems capable of using lethal force?” Two key speakers will argue for and against the motion, and respond to each other’s presentation. This will be followed by a discussion session with the audience, and a public vote.</p>
<p>Moderator:<br />
<a href="http://www.essex.ac.uk/law/staff/profile.aspx?ID=2358">Noam Lubell</a>, Professor of Law, School of Law, University of Essex<br />
Swiss Chair of International Humanitarian Law, The Geneva Academy of International Humanitarian Law and Human Rights</p>
<p>With the participation of:<br />
Arguing for a ban: <a title="Who We Are" href="http://icrac.net/who/">Peter Mario Asaro</a>, PhD, Affiliate Scholar, The Center for Internet and Society Stanford Law School, Director of Graduate Programs School of Media Studies<br />
Arguing against a ban: <a href="http://www.law.columbia.edu/fac/Matthew_Waxman">Prof. Matthew Waxman</a>, Columbia Law School; Council on Foreign Relations; co-author of ‘Law and Ethics for Autonomous Weapon Systems’.</p></blockquote>
</div>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2439</post-id>	</item>
		<item>
		<title>Georgia Tech “TechDebate”: Ron Arkin vs. ICRAC’s Rob Sparrow</title>
		<link>https://www.icrac.net/georgia-tech-techdebate-ron-arkin-vs-icracs-rob-sparrow/</link>
		
		<dc:creator><![CDATA[Frank Sauer]]></dc:creator>
		<pubDate>Tue, 26 Nov 2013 05:04:14 +0000</pubDate>
				<category><![CDATA[YouTube video]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2435</guid>

					<description><![CDATA[“Published on 21 Nov 2013 – This video is of the inaugural debate in the TechDebates on Emerging Technologies series, which focused on Lethal Autonomous “Killer” Robots. LARs are machines that can decide to kill. Such technology has the potential to revolutionize modern warfare and more. The need for understanding LARs is essential to decide [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p><iframe loading="lazy" src="https://www.youtube.com/embed/nO1oFKc_-4A" width="560" height="315" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<div class="entry">
<blockquote><p>“Published on 21 Nov 2013 –</p>
<p>This video is of the inaugural debate in the TechDebates on Emerging Technologies series, which focused on Lethal Autonomous “Killer” Robots. LARs are machines that can decide to kill. Such technology has the potential to revolutionize modern warfare and more. The need for understanding LARs is essential to decide whether their development and possible deployment should be regulated or banned. This TechDebate centers on the question: are LARs ethical?</p>
<p>Debaters:<br />
<a href="http://www.cc.gatech.edu/aimosaic/faculty/arkin/">Ron Arkin</a>, Robotics Professor at Georgia Tech’s College of Computing<br />
<a title="Who We Are" href="http://icrac.net/who/">Rob Sparrow</a>, Professor of Philosophy, School of Philosophical, Historical and International Studies, Monash University, Australia</p>
<p>The TechDebates on Emerging Technologies is a debate series presented by the Center for Ethics and Technology (CET) at the Georgia Institute of Technology. Please visit <a href="http://www.ethics.gatech.edu">http://www.ethics.gatech.edu</a> to learn more about CET and its initiatives. CET invites you to participate in public deliberation on the ethics of LARs in the AGORA-net, a web-based and interactive argument visualization software. You can access the AGORA-net at <a href="http://www.agora.gatech.edu">http://www.agora.gatech.edu</a>“.</p></blockquote>
</div>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='Frank Sauer' src='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/7367abd54bcccab11252f513db7ac0ab9bd9b726dcc720fe7e55b9f594fdda9d?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://www.unibw.de/frank.sauer">Frank Sauer</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em"></div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2435</post-id>	</item>
		<item>
		<title>ICRAC Member and Campaign to Stop Killer Robots Deliver Statements at the UN General Assembly First Committee</title>
		<link>https://www.icrac.net/icrac-member-and-campaign-to-stop-killer-robots-deliver-statements-at-the-un-general-assembly-first-committee/</link>
		
		<dc:creator><![CDATA[mbolton]]></dc:creator>
		<pubDate>Wed, 30 Oct 2013 21:06:20 +0000</pubDate>
				<category><![CDATA[ICRAC in the media]]></category>
		<category><![CDATA[ICRAC News]]></category>
		<category><![CDATA[Statements]]></category>
		<category><![CDATA[YouTube video]]></category>
		<category><![CDATA[Arms Trade Treaty]]></category>
		<category><![CDATA[Article 36]]></category>
		<category><![CDATA[Campaign to Stop Killer Robots]]></category>
		<category><![CDATA[First Committee]]></category>
		<category><![CDATA[Human Rights Watch]]></category>
		<category><![CDATA[ICRAC]]></category>
		<category><![CDATA[Matthew Bolton]]></category>
		<category><![CDATA[NGOs]]></category>
		<category><![CDATA[nuclear weapons]]></category>
		<category><![CDATA[Pace University]]></category>
		<category><![CDATA[robotic weapons]]></category>
		<category><![CDATA[small arms and light weapons]]></category>
		<category><![CDATA[United Nations]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2413</guid>

					<description><![CDATA[On behalf of global civil society organizations, International Committee for Robot Arms Control member Matthew Bolton calls for disarmament and arms control “driven by the needs and rights of people most affected by armed violence.” The Campaign to Stop Killer Robots also spoke, calling for fully autonomous weapons to “be prohibited through an international treaty, [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='mbolton' src='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://matthewbreaybolton.com">mbolton</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Matthew Bolton is professor of political science at Pace University in New York City. He is an expert on global peace and security policy, focusing on multilateral disarmament and arms control policymaking processes. He has a PhD in Government and Master's in Development Studies from the London School of Economics and a Master's from SUNY Environmental Science and Forestry. Since 2014, Bolton has worked on the UN and New York City advocacy of the International Campaign to Abolish Nuclear Weapons (ICAN), recipient of the 2017 Nobel Peace Prize. Bolton has published six books, including Political Minefields (I.B. 
Tauris) and Imagining Disarmament, Enchanting International Relations (Palgrave Pivot).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<div id="attachment_2416" style="width: 310px" class="wp-caption alignleft"><a href="https://i0.wp.com/www.icrac.net.php53-3.dfw1-2.websitetestlink.com/wp-content/uploads/2015/06/NGO-Statement-to-UNGA-1.jpg"><img data-recalc-dims="1" loading="lazy" decoding="async" aria-describedby="caption-attachment-2416" class="size-medium wp-image-2416" src="https://i0.wp.com/www.icrac.net.php53-3.dfw1-2.websitetestlink.com/wp-content/uploads/2015/06/NGO-Statement-to-UNGA-1-300x200.jpg?resize=300%2C200" alt="ICRAC member Dr. Matthew Bolton, presenting a statement on disarmament at the UN General Assembly’s First Committee on Tuesday. Photo by Shant Alexander for Control Arms." width="300" height="200" srcset="https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/NGO-Statement-to-UNGA-1.jpg?resize=300%2C200&amp;ssl=1 300w, https://i0.wp.com/www.icrac.net/wp-content/uploads/2015/06/NGO-Statement-to-UNGA-1.jpg?w=1024&amp;ssl=1 1024w" sizes="auto, (max-width: 300px) 100vw, 300px" /></a><p id="caption-attachment-2416" class="wp-caption-text">ICRAC member Dr. Matthew Bolton, presenting a statement on disarmament at the UN General Assembly’s First Committee on Tuesday. Photo by Shant Alexander for Control Arms.</p></div>
<p><i>On behalf of global civil society organizations, <a href="http://icrac.net" target="_blank">International Committee for Robot Arms Control </a>member Matthew Bolton calls for disarmament and arms control “driven by the needs and rights of people most affected by armed violence.” The <a href="http://www.stopkillerrobots.org/" target="_blank">Campaign to Stop Killer Robots </a>also spoke, calling for fully autonomous weapons to </i><em>“be prohibited through an international treaty, as well as through national laws and other measures.” To watch <a href="http://new.livestream.com/accounts/5796840/NGOSpeeches?cat=event&amp;query=control" target="_blank">video footage of the NGO speeches, click here.</a></em></p>
<p>Dr. Matthew Bolton, a member of the International Committee for Robot Arms Control (ICRAC), addressed the <a href="http://www.un.org/en/ga/first/">United Nations General Assembly First Committee</a> Tuesday afternoon, on behalf of <a href="http://www.article36.org/" target="_blank">Article 36</a> and other international non-governmental organizations (NGOs) working on disarmament, peacebuilding and humanitarian issues.</p>
<p>“We call for an approach to disarmament that is driven by the needs and rights of people most affected by armed violence, not by the discretion of states and organizations most responsible for it,” said Dr. Bolton to representatives of the 193 UN member states, as well as UN agencies and NGOs. The First Committee has responsibility for disarmament and international security.</p>
<p>The <a href="http://www.reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com13/statements/29Oct_NGO-ways-of-work.pdf">NGO statement</a>, read by Dr. Bolton and endorsed by 11 organizations, congratulated states on “some noteworthy progress” in <a href="http://www.reachingcriticalwill.org/disarmament-fora/others/hlm-nuclear-disarmament">recent international discussions on the elimination of nuclear weapons</a>, the <a href="http://www.un.org/News/Press/docs/2013/sc11131.doc.htm">recent Security Council resolution on small arms and light weapons</a> as well as the <a href="http://www.un.org/disarmament/ATT/">Arms Trade Treaty</a>, signed by over 100 states since June.</p>
<p>Despite these developments in global policymaking on controlling weapons, however, Dr. Bolton asserted that “now is not the time for resting on laurels.” The NGO statement identified numerous concerns, including the abuse of the consensus rule in disarmament forums, exclusion of meaningful civil society participation, lack of equal opportunities for women in decision-making, and the marginalization of the voices of victims and survivors of armed violence.</p>
<p>“Creativity and new human-centered approaches must be a requirement for all states advocating nuclear disarmament, conventional arms control and reduced military expenditure,” said Dr. Bolton, reading the NGO statement. “We can and must replace stalemate and watered-down outcomes with alternatives that advance human security and social and economic justice.”</p>
<p>The Campaign to Stop Killer Robots also <a href="http://www.stopkillerrobots.org/wp-content/uploads/2013/10/KRC_StatementUNGA1_29Oct2013_delivered.pdf" target="_blank">delivered a statement </a>in the same session, calling for a prohibition on fully autonomous weapons.</p>
<p>“Our campaign believes that human control is essential to ensure the protection of civilians and to ensure compliance with international law,” said Mary Wareham of <a href="http://www.hrw.org/" target="_blank">Human Rights Watch</a>, delivering the statement on behalf of the campaign. “We seek a comprehensive and preemptive ban on weapons systems that would be able to select and attack targets without meaningful human intervention. These fully autonomous weapons or ‘lethal autonomous robots’ must be prohibited through an international treaty, as well as through national laws and other measures.”</p>
<p>Dr. Bolton is an expert on global disarmament policy and assistant professor of political science at <a href="http://pace.edu" target="_blank">Pace University</a>. He is author of <a href="http://us.macmillan.com/foreignaidandlandmineclearance/MatthewBolton"><i>Foreign Aid and Landmine Clearance: Governance, Politics and Security in Afghanistan, Bosnia and Sudan</i></a> (I.B. Tauris, 2010) and a forthcoming travelogue <a href="http://www.ibtauris.com/Books/Society%20%20social%20sciences/Politics%20%20government/International%20relations/Arms%20negotiation%20%20control/Political%20Minefields%20The%20Hidden%20Agendas%20Behind%20Clearing%20the%20Worlds%20Landmines.aspx?menuitem=%7BF66D6451-D7DF-403F-A234-82FAE9B3F795%7D"><i>Political Minefields</i></a> (I.B. Tauris, 2014). He has written widely on the politics of <a href="http://www.theguardian.com/commentisfree/cifamerica/2009/nov/26/obama-landmine-ban-treaty">landmines</a>, <a href="http://www.theguardian.com/commentisfree/2008/dec/06/armstrade-obama-white-house">cluster munitions</a>, <a href="http://thehill.com/blogs/congress-blog/foreign-policy/293325-arms-trade-treaty-keeping-weapons-from-terrorists-and-human-rights-abusers">the Arms Trade Treaty</a> and <a href="http://thehill.com/blogs/congress-blog/technology/295807-us-must-impost-moratorium-and-seek-global-ban-on-killer-robots">fully autonomous military robotics</a> (“killer robots”). He recently co-<a title="Futureproofing Is Never Complete: Ensuring the Arms Trade Treaty Keeps Pace with New Weapons Technology" href="http://icrac.net/2013/10/futureproofing-is-never-complete-ensuring-the-arms-trade-treaty-keeps-pace-with-new-weapons-technology/">authored an ICRAC Working Paper </a>on regulating robotic weapons with the Arms Trade Treaty.</p>
<p><a href="http://icrac.net/who/">ICRAC</a> is an international committee of experts in robotics technology, robot ethics, international relations, international security, arms control, international humanitarian law, human rights law, and public campaigns, concerned about the pressing dangers that military robots pose to peace and international security and to civilians in war.</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='mbolton' src='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://matthewbreaybolton.com">mbolton</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Matthew Bolton is professor of political science at Pace University in New York City. He is an expert on global peace and security policy, focusing on multilateral disarmament and arms control policymaking processes. He has a PhD in Government and Master's in Development Studies from the London School of Economics and a Master's from SUNY Environmental Science and Forestry. Since 2014, Bolton has worked on the UN and New York City advocacy of the International Campaign to Abolish Nuclear Weapons (ICAN), recipient of the 2017 Nobel Peace Prize. Bolton has published six books, including Political Minefields (I.B. Tauris) and Imagining Disarmament, Enchanting International Relations (Palgrave Pivot).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2413</post-id>	</item>
		<item>
		<title>The Politics of Killer Robots: Experts Consider Political, Legal and Ethical Implications of Drones and Other Robotic Weapons at Pace University Symposium</title>
		<link>https://www.icrac.net/the-politics-of-killer-robots-experts-consider-political-legal-and-ethical-implications-of-drones-and-other-robotic-weapons-at-pace-university-symposium/</link>
		
		<dc:creator><![CDATA[mbolton]]></dc:creator>
		<pubDate>Fri, 30 Aug 2013 19:37:55 +0000</pubDate>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[YouTube video]]></category>
		<guid isPermaLink="false">http://www.icrac.net.php53-3.dfw1-2.websitetestlink.com/?p=2108</guid>

					<description><![CDATA[Not all conduct is justified in war. Centuries of tradition – from religious texts to chivalry and honor codes to modern international humanitarian and human rights law – have limited what weapons armed groups can use, who and what they can target and where and when they may fight. Each new innovation in military technologies and techniques [&#8230;]<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='mbolton' src='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://matthewbreaybolton.com">mbolton</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Matthew Bolton is professor of political science at Pace University in New York City. He is an expert on global peace and security policy, focusing on multilateral disarmament and arms control policymaking processes. He has a PhD in Government and Master's in Development Studies from the London School of Economics and a Master's from SUNY Environmental Science and Forestry. Since 2014, Bolton has worked on the UN and New York City advocacy of the International Campaign to Abolish Nuclear Weapons (ICAN), recipient of the 2017 Nobel Peace Prize. Bolton has published six books, including Political Minefields (I.B. 
Tauris) and Imagining Disarmament, Enchanting International Relations (Palgrave Pivot).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></description>
										<content:encoded><![CDATA[<p>Not all conduct is justified in war. Centuries of tradition – from religious texts to chivalry and honor codes to modern <a title="War and International Laq" href="http://www.icrc.org/eng/war-and-law/index.jsp" target="_blank" rel="noopener noreferrer">international humanitarian and human rights law </a>– have limited what weapons armed groups can use, who and what they can target and where and when they may fight. Each new innovation in military technologies and techniques strains the old limits and prompts new conversations about the norms of war.</p>
<p>Today, there is vigorous debate among scholars, activists, soldiers and journalists about how to govern the use of robotic weapons systems like aerial drones, in order to limit civilian casualties and avoid undermining global regulations on the use of violence.</p>
<p>On 5 June, scholars from <a title="Pace University New York City Political Science" href="http://www.pace.edu/dyson/academic-departments-and-programs/political-science/" target="_blank" rel="noopener noreferrer">Pace University </a>and beyond joined this debate with a ‘Robotic Weapons Control Symposium’ at the Downtown Campus, organized by the Political Science department in the <a href="http://www.pace.edu/dyson" target="_blank" rel="noopener noreferrer">Dyson College of Arts and Sciences</a>, funded by a $7,500 Thinkfinity Grant from the <a title="Verizon Foundation" href="http://www.verizonfoundation.org/" target="_blank" rel="noopener noreferrer">Verizon Foundation</a>, in partnership with <a href="http://www.pace.edu/ctlt/" target="_blank" rel="noopener noreferrer">Pace University’s Center for Teaching Learning and Technology</a>.</p>
<p>“We are on the cusp of a technological revolution in the way wars are waged,” said <a title="Matthew Bolton Faculty Profile" href="http://www.pace.edu/dyson/academic-departments-and-programs/political-science/faculty/matthew-bolton" target="_blank" rel="noopener noreferrer">Dr. Matthew Bolton, Assistant Professor of Political Science at Pace University </a>and organizer of the symposium. “Intellectuals have a responsibility to contribute to conversations about how we will reinterpret and renew traditional constraints on killing for a digital age.”</p>
<p>The conference drew together around 20 thinkers from a variety of disciplines, including robotics, computer science, political science, philosophy, physics and the law. Presenters included a mix of Pace University professors and students from various schools and departments, as well as participants from the <a href="http://www.newschool.edu/public-engagement/school-of-media-studies/" target="_blank" rel="noopener noreferrer">New School</a>, <a href="http://www.nyu.edu/" target="_blank" rel="noopener noreferrer">NYU</a>, <a href="http://www.princeton.edu/sgs/" target="_blank" rel="noopener noreferrer">Princeton</a>, <a href="http://www.ri.cmu.edu/" target="_blank" rel="noopener noreferrer">Carnegie Mellon</a>, <a href="http://www.du.edu/korbel/" target="_blank" rel="noopener noreferrer">University of Denver</a>, <a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/" target="_blank" rel="noopener noreferrer">Sheffield University in the UK</a> and the <a href="http://www.mind-and-brain.de/home/" target="_blank" rel="noopener noreferrer">Berlin School of Mind and Brain, Humboldt-Universität zu Berlin </a>in Germany.</p>
<p>Participants considered three key questions:</p>
<ul>
<li><i>What are the ways that robotics is changing weapons technology?</i></li>
<li><i>What will be the military, political, ethical and humanitarian impacts of the growing roboticization of warfare?</i></li>
<li><i>What, if any, legal and normative restrictions should be placed on armed robots?</i></li>
</ul>
<p><a title="2013 Robotic Weapons Control Symposium at Pace University" href="http://www.youtube.com/playlist?list=PL5_KbrpCavYPIPmcELPChx70J2HIGIFlA" target="_blank" rel="noopener noreferrer">To view videos of several of the lectures from this event, click here.</a></p>
<p>The conference started with an introduction to robotic ethics by Dr. <a href="http://www.cs.cmu.edu/~illah/" target="_blank" rel="noopener noreferrer">Illah Nourbakhsh</a>, Professor of Robotics at Carnegie Mellon University. Dr. Nourbakhsh argued that the disciplines of computer science, robotics and artificial intelligence needed to think carefully about social responsibility and the implications of their research, particularly in the context of weapons development.</p>
<p>“We, the practitioners of robotics, need to embrace ethical analysis if we are to understand the consequences of our research decisions,” said Dr. Nourbakhsh, author of the critically acclaimed book <a href="http://mitpress.mit.edu/books/robot-futures-0" target="_blank" rel="noopener noreferrer"><i>Robot Futures</i></a>. “The technology of robotics has progressed so rapidly in the last decade that researchers’ innovations are emerging in the battlefield, hospital and home without proper strategic thinking about the ethical consequences of robotic design and downstream impact.”</p>
<p>He was followed by several presenters looking at different ethical questions in the context of using robotic weapons in war. Dr. <a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/" target="_blank" rel="noopener noreferrer">Noel Sharkey</a>, professor of artificial intelligence and robotics, and professor of public engagement at the University of Sheffield, argued for a ban on the emerging class of fully autonomous armed robots – ‘killer robots’ – which select and fire upon targets without any direct human control.</p>
<p>“The continued automation of killing will have disastrous consequences for humanity and we should stop it now,” said Dr. Sharkey, chair of the <a href="http://icrac.net/" target="_blank" rel="noopener noreferrer">International Committee for Robot Arms Control </a>(ICRAC), one of the organizations leading the new <a href="http://www.stopkillerrobots.org/" target="_blank" rel="noopener noreferrer">Campaign to Stop Killer Robots</a>.</p>
<p>“Robots cannot comply with the Laws of War in discriminating civilians from combatants and they lack the reasoning necessary to make judgments about the proportional use of force. Unlike a human, they cannot be held accountable for their errors. We must not cross the line where machines are delegated the decision to kill humans. To do otherwise is both morally repugnant and unthinkable.”</p>
<p>The implications of fully autonomous weapons were discussed in depth with presentations by Dr. <a href="http://www.peterasaro.org/" target="_blank" rel="noopener noreferrer">Peter Asaro </a>of the New School, Dr. <a href="http://www.linkedin.com/pub/michal-klincewicz/40/753/37b" target="_blank" rel="noopener noreferrer">Michał Klincewicz </a>of Humboldt University, <a href="http://csis.pace.edu/~benjamin/" target="_blank" rel="noopener noreferrer">Dr. D. Paul Benjamin </a>of Pace University’s Seidenberg School of Computer Science and <a href="http://www.google.com/url?sa=t&amp;rct=j&amp;q=cayman%20mitchell%20pace&amp;source=web&amp;cd=3&amp;cad=rja&amp;ved=0CDMQFjAC&amp;url=http%3A%2F%2Fwww.linkedin.com%2Fpub%2Fcayman-mitchell%2F70%2F121%2Faab&amp;ei=HUcXUqSZMqfC4AP9_YAw&amp;usg=AFQjCNHEHD3WLBT2nzvKXrOpeuEK-eIvAg&amp;bvm=bv.51156542,d.dmg" target="_blank" rel="noopener noreferrer">Cayman Mitchell </a>‘14, a Pace University undergraduate.</p>
<p>“Preparing for and participating in this symposium was a highlight of my undergraduate career at Pace,” said Mitchell, an honors student who studies computer science and peace and justice studies on the New York City campus. “The interdisciplinary nature of the conference was exciting because it allowed me to listen to and converse with scholars from many different fields, almost all of whom overlapped to some extent with my diverse academic interests.”</p>
<p>In the afternoon, participants began to explore the legal and political ramifications of the roboticization of war. <a href="http://www.law.pace.edu/faculty/thomas-m-mcdonnell" target="_blank" rel="noopener noreferrer">Professor Thomas McDonnell </a>of the <a href="http://www.law.pace.edu/" target="_blank" rel="noopener noreferrer">Pace University Law School </a>considered the legalities of the US use of armed drones to attack targets in Pakistan, Yemen and Somalia, while <a href="http://appsrv.pace.edu/lubin/faculty/departments/showFacultyDetail.cfm?Name=Robert%20Wiener" target="_blank" rel="noopener noreferrer">Professor Robert Wiener</a> of the <a href="http://pace.edu/lubin/" target="_blank" rel="noopener noreferrer">Lubin School of Business </a>offered reflections on how Jewish law and the Golem myth might offer insights into the regulation of ‘unmanned’ weapons.</p>
<p>“Today’s conference was a unique and exciting opportunity to learn from a breadth of scholars engaging in this important and timely issue. It is reassuring to see so many from such diverse fields coming together on this,” said <a href="http://socsci.colorado.edu/~roff/Site/Welcome.html" target="_blank" rel="noopener noreferrer">Dr. Heather M. Roff</a>, Visiting Associate Professor at the University of Denver and Research Associate at the <a href="http://www.usafa.af.mil/" target="_blank" rel="noopener noreferrer">United States Air Force Academy</a>. “The moral, political and legal questions pertaining to the use of autonomous machines in war are extremely pressing. This symposium was extremely interesting in this respect, as it engaged each type of question from such bright scholars.”</p>
<p>In her presentation, Dr. Roff offered insights into how robotic weapons are reshaping the political structure of decisionmaking about military strategy. She was followed by recent Pace graduate <a href="http://www.google.com/url?sa=t&amp;rct=j&amp;q=cassandra%20stimpson%20&amp;source=web&amp;cd=6&amp;cad=rja&amp;ved=0CDsQFjAF&amp;url=http%3A%2F%2Fwww.linkedin.com%2Fpub%2Fcassandra-stimpson%2F50%2F751%2F771&amp;ei=30YXUs3UErGs4AOo7YCgDg&amp;usg=AFQjCNF9MkXkS9qm4N0KgCPBu6ofpuM1LA&amp;bvm=bv.51156542,d.dmg">Cassandra Stimpson </a>‘13, who used scapegoating theory to explain the socio-political functions of extrajudicial killings by armed drones.</p>
<p>“The symposium was a rapid exchange of ideas from some of the most educated and passionate on the subject of autonomous and drone warfare. I was able to contribute overarching themes in my presentation about the general inefficiency and danger of targeted killing in a technological age where this act can be done with such ease,” said Stimpson, an honors political science major and peace and justice studies minor. “To be able to interact and present ideas on a new scholarly plane was vital to my growth, and it is the type of event I want to continue to contribute to in the future.”</p>
<p>The conference concluded with an open discussion about how to take the conversation forward and communicate with the broader public, governments and armed groups about the urgency of reinvigorating humanitarian and human rights norms in digitized warfare.</p>
<p>“We are very grateful to <a href="http://www.verizonfoundation.org/" target="_blank" rel="noopener noreferrer">Verizon Foundation </a>and the <a href="http://www.pace.edu/ctlt/" target="_blank" rel="noopener noreferrer">Center for Teaching, Learning and Technology </a>for supporting this crucial conversation, all the participants for their thought-provoking contributions and, in particular, the work of Cayman and Cassandra in arranging logistics for the conference,” said Dr. Bolton.</p>
<p>The <a href="http://www.pace.edu/dyson/academic-departments-and-programs/political-science/" target="_blank" rel="noopener noreferrer">Pace University Political Science Department </a>is home to one of the most popular and fastest-growing majors on the New York City campus. Capitalizing on its location in the heart of the Financial District, opposite City Hall and two express subway stops from the United Nations, the department encourages students to reflect on how power works at various levels of government and society in addressing major local, national and global challenges. Political Science faculty and students apply their lessons outside the classroom in international policymaking processes, including on peace, security, disarmament, human rights and humanitarian issues.</p>
<p>[Reposted from <a href="http://politicalminefields.com/2013/08/23/the-politics-of-killer-robots/">http://politicalminefields.com/2013/08/23/the-politics-of-killer-robots/</a>]</p>
<h3>Author information</h3><div class="ts-fab-wrapper" style="overflow:hidden"><div class="ts-fab-photo" style="float:left;width:64px"><img alt='mbolton' src='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=64&#038;d=retro&#038;r=g' srcset='https://secure.gravatar.com/avatar/a830bf59e0364ba33f24fb19a6c29ea5bb8c95259e3ffa70dcbad0d35df1b295?s=128&#038;d=retro&#038;r=g 2x' class='avatar avatar-64 photo' height='64' width='64' loading='lazy' decoding='async'/></div><!-- /.ts-fab-photo --><div class="ts-fab-text" style="margin-left:74px"><div class="ts-fab-header"><div style="font-size: 1.25em;margin-bottom:0"><strong><a href="http://matthewbreaybolton.com">mbolton</a></strong></div></div><!-- /.ts-fab-header --><div class="ts-fab-content" style="margin-bottom:0.5em">Matthew Bolton is professor of political science at Pace University in New York City. He is an expert on global peace and security policy, focusing on multilateral disarmament and arms control policymaking processes. He has a PhD in Government and Master's in Development Studies from the London School of Economics and a Master's from SUNY Environmental Science and Forestry. Since 2014, Bolton has worked on the UN and New York City advocacy of the International Campaign to Abolish Nuclear Weapons (ICAN), recipient of the 2017 Nobel Peace Prize. Bolton has published six books, including Political Minefields (I.B. Tauris) and Imagining Disarmament, Enchanting International Relations (Palgrave Pivot).</div><div class="ts-fab-footer"></div><!-- /.ts-fab-footer --></div><!-- /.ts-fab-text --></div><!-- /.ts-fab-wrapper -->]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2108</post-id>	</item>
	</channel>
</rss>
