Abstract

Much attention has been paid during the last couple of years to the emergence of autonomous weapons systems (AWS), weapon systems that allow computers, as opposed to human beings, to exercise increased control over decisions to use force. These discussions have largely centered on the use of such systems in armed conflict. However, it is increasingly clear that AWS are also becoming available for use in domestic law enforcement. This article explores the implications of international human rights law for this development. There are even stronger reasons to be concerned about the use of fully autonomous weapons systems (AWS without meaningful human control) in law enforcement than in armed conflict. Police officers, unlike their military counterparts, have a duty to protect the public. Moreover, the judgments involved in the use of force under human rights standards require more personal involvement than those in the conduct of hostilities. Particularly problematic is the potential impact of fully autonomous weapons on the rights to bodily integrity (such as the right to life) and on the right to dignity. Where meaningful human control is retained, machine autonomy can enhance human autonomy, but this also means that higher standards of responsibility for the use of force should apply, because there is a higher level of human control. Fully autonomous weapons, by contrast, entail no meaningful human control and should therefore have no role to play in law enforcement.
