The Road to Geneva: ICRAC and the Campaign Head to the CCW Expert Meeting

Posted on 24 April 2014 by Frank Sauer

In its 2009 mission statement, ICRAC urgently called upon the international community to commence a discussion about arms control in the realm of military robotics. Thanks to the work of the NGOs forming the “Campaign to Stop Killer Robots”, a coalition that has been growing steadily since April 2013 and of which ICRAC is one of the founding members, this call has been heard. The issue of fully autonomous weapon systems has been picked up by the international community at the United Nations Convention on Certain Conventional Weapons (CCW) in Geneva. This CCW mandate is a huge success for ICRAC and the Campaign to Stop Killer Robots.

What exactly is the issue?

Fully autonomous weapon systems or “lethal autonomous robots” (LARs) are systems that, “once activated, can select and engage targets without further intervention by a human operator”, according to U.S. Department of Defense Directive 3000.09. There is widespread agreement within the pertinent scientific and expert communities that these systems (which are currently under development and for which, as of now, only precursors exist, most commonly stationary or ship-based terminal-defense C-RAM systems) will pose a set of new – potentially troubling – political, legal and ethical challenges once developed and deployed.

In a nutshell, the three most important issues are (1) doubts about these systems’ capability to comply with key requirements of international humanitarian law, namely applying lethal force in a discriminate and proportionate manner (i.e. that LARs might cause unnecessary harm to civilians in conflict) whilst ensuring human accountability; (2) concerns regarding global peace and security, i.e. that deploying LARs might make war more likely or accidentally trigger conflict escalation; and (3) worries that fundamental principles of humanity might be infringed upon, i.e. that delegating the decision to kill a human to a machine – which is not accountable for its actions in any meaningful legal or ethical sense – is simply wrong in itself (Gubrud 2014; Altmann 2013; Wagner 2013; Sharkey/Suchman 2013; Asaro 2012; Sauer/Schörnig 2012; StopKillerRobots).

Some scholars propose that the potential dangers currently discussed in connection to LARs can be reined in (1) by fitting LARs with an “ethical governor” (Arkin 2010) to regulate their lethal behavior on the battlefield; (2) by “enslaving” the machine to keep the chain of accountability intact (Lin et al. 2008); (3) by adapting existing legal norms to deal with the new, supposedly “inevitable” technology (Anderson/Waxman 2013).

However, among many other things, it remains wholly unclear (1) if and how a technical solution for a problem of international humanitarian law can ever be devised; (2) if a system that is autonomous and thus potentially “creative” in its decision-making can at the same time be enslaved, with humans accountable for its potentially unforeseeable actions; (3) if the technology is in fact as inevitable as some take it to be, with international law having to adapt – rather than the other way around.

It is worth noting that the current debate on LARs is not limited to scholarly or diplomatic discourse. The first representative polling data from the U.S. suggest that a majority (55%) of the general population in fact opposes their use. 40% of the U.S. population even “strongly oppose” autonomous weapon systems, while only 26% “somewhat support” the notion that killing by algorithm is an acceptable course of action for the military. A majority (53%) favors drawing a red line instead, for instance by banning LARs via a binding international treaty (Carpenter 2013).

What happens in Geneva?

Against the background of these recent controversies and developments, on 15 November 2013, States Parties to the CCW adopted a report (CCW 2013: para 32) that includes a mandate to hold a four-day “Meeting of Experts” to discuss questions related to “lethal autonomous weapons systems” on 13-16 May 2014 in Geneva, Switzerland; a report will then be discussed at the Conference of the States Parties in November 2014.

A possible outcome of this process would be a decision on a negotiation mandate that, via negotiations in 2015, leads to a new CCW Protocol VI at the 2016 CCW Review Conference, banning the development and deployment of LARs. ICRAC hopes for such a future, in which robotics technology is limited to peaceful uses.

For now, however, ICRAC is pleased that its call for a serious international discussion has been heard. In record time, the Campaign to Stop Killer Robots has managed to push the issue of LARs from the sidelines of debates on technology and warfare to the center stage of United Nations arms control diplomacy. This is a fantastic success in itself. It is now clear that fully autonomous weapons are neither a sci-fi meme that no one needs to worry about seriously nor simply an inevitability we just have to live with. Quite the opposite: they are now at the center of attention of governments, militaries, scientists, NGOs and activists worldwide.

As part of the Campaign to Stop Killer Robots, numerous members of ICRAC will be present at the CCW in Geneva, many of them as speakers at the Expert Meeting sessions or at side events. Our goal is to engage in a fruitful discussion with the States Parties, offering our combined expertise on robotics technology, robot ethics, international relations, international security, arms control, international humanitarian law and human rights law. Since the CCW includes all the key countries involved in LARs research and development, ICRAC is looking forward to exchanging views and witnessing the political process unfolding in Geneva regarding future policies on LARs.
