https://www.eff.org/deeplinks/2021/07/dont-let-police-arm-autonomous-or-remote-controlled-robots-and-drones

Don't Let Police Arm Autonomous or Remote-Controlled Robots and Drones

Deeplinks Blog by Matthew Guariglia | July 16, 2021

It's no longer science
fiction or unreasonable paranoia. Now, it needs to be said: no, police must not arm land-based robots or aerial drones. That's true whether these mobile devices are remote-controlled by a person or autonomously controlled by artificial intelligence, and whether the weapons are maximally lethal (like bullets) or less lethal (like tear gas).

Police currently deploy many different kinds of moving and task-performing technologies, including flying drones, remote-controlled bomb-defusing robots, and autonomous patrol robots. While these devices serve different functions and operate differently, none of them, absolutely none of them, should be armed with any kind of weapon.

Mission creep is very real. Time and time again, technologies given to police for use only in the most extreme circumstances make their way onto streets during protests or in response to petty crime. For example, cell-site simulators (often called "Stingrays") were developed for use on foreign battlefields, brought home in the name of fighting "terrorism," and then used by law enforcement to catch immigrants and a man who stole $57 worth of food. Likewise, police have targeted BLM protesters with face surveillance and Amazon Ring doorbell cameras.

Today, scientists are developing an AI-enhanced autonomous drone designed to find people during natural disasters by locating their screams. How long until police use this technology to find protesters shouting chants? What if these autonomous drones were armed? We need a clear red line now: no armed police drones, period.

The Threat is Real

There are already law enforcement robots and drones of all shapes, sizes, and levels of autonomy patrolling the United States as we speak.
These range from autonomous Knightscope robots prowling for "suspicious behavior" and collecting images of license plates and phone-identifying information, to Boston Dynamics robotic dogs accompanying police on calls in New York or checking the temperature of unhoused people in Honolulu, to Predator surveillance drones flying over BLM protests in Minneapolis.

We are moving quickly towards arming such robots and letting autonomous artificial intelligence determine whether or not to pull the trigger. According to a Wired report earlier this year, the U.S. Defense Advanced Research Projects Agency (DARPA) in 2020 hosted a test of autonomous robots to see how quickly they could react in a combat simulation and how much human guidance they would need. News of this test comes only weeks after the federal government's National Security Commission on Artificial Intelligence recommended that the United States not sign international agreements banning autonomous weapons. "It is neither feasible nor currently in the interests of the United States," asserts the report, "to pursue a global prohibition of AI-enabled and autonomous weapon systems."

In 2020, the Turkish military deployed Kargu, a fully autonomous armed drone, to hunt down and attack Libyan battlefield adversaries. Autonomous armed drones have also been deployed (though not necessarily used to attack people) by the Turkish military in Syria and by the Azerbaijani military in Armenia.

While we have yet to see autonomous armed robots or drones deployed in a domestic law enforcement context, wartime tools used abroad often find their way home. The U.S. government has become increasingly reliant on armed drones abroad. Many police departments seem to purchase every expensive new toy that hits the market. The Dallas police have already killed someone by strapping a bomb to a remote-controlled bomb-disarming robot. So activists, politicians, and technologists need to step in now, before it is too late.
We cannot allow a time lag between the development of this technology and the creation of policies governing whether police may buy, deploy, or use armed robots. Rather, we must ban police from arming robots: whether in the air or on the ground, whether automated or remotely controlled, whether lethal or less lethal, and in any other yet-unimagined configuration.

No Autonomous Armed Police Robots

Whether they're armed with a taser, a gun, or pepper spray, autonomous robots would make split-second decisions about taking a life, or inflicting serious injury, based on a set of computer programs. But police technologies malfunction all the time. For example, face recognition technology, audio gunshot detection, and automatic license plate readers frequently generate false positives. When this happens, the technology sends armed police to a situation where they may not be needed, often leading to wrongful arrests and excessive force, especially against people of color erroneously identified as criminal suspects. If the malfunctioning police technology were armed and autonomous, that would create a far more dangerous situation for innocent civilians.

When, inevitably, a robot unjustifiably injures or kills someone, who would be held responsible? Holding police accountable for wrongfully killing civilians is already hard enough. In the case of a bad automated decision, who gets the blame? The person who wrote the algorithm? The police department that deployed the robot? Autonomous armed police robots might become one more way for police to skirt or redirect blame for wrongdoing and avoid making any actual changes to how police function. Debate might bog down in whether to tweak the artificial intelligence guiding a killer robot's decision-making.

Further, technology deployed by police is usually created and maintained by private corporations.
A transparent investigation into a wrongful killing by an autonomous machine might be blocked by assertions of the company's supposed need for trade secrecy in its proprietary technology, or by finger-pointing between the police and the company. Meanwhile, nothing would be done to make people on the streets any safer.

MIT professor and Future of Life Institute cofounder Max Tegmark told Wired that AI weapons should be "stigmatized and banned like biological weapons." We agree. Although its mission is much more expansive than the concerns of this blog post, you can learn more about what activists have been doing on this issue by visiting the Campaign to Stop Killer Robots.

No Remote-Controlled Armed Police Robots, Either

Even where police have remote control over armed drones and robots, the grave dangers to human rights are far too great. Police routinely over-deploy powerful new technologies in already over-policed Black, Latinx, and immigrant communities. Police also use them too often as part of the United States' immigration enforcement regime, and to monitor protests and other First Amendment-protected activities. We can expect more of the same with any armed robots.

Moreover, armed police robots would probably increase the frequency of excessive force against suspects and bystanders. A police officer on the scene generally will have better information about unfolding dangers and opportunities to de-escalate than an officer miles away looking at a laptop screen. A remote officer might also have less empathy for the human target of mechanical violence.

Further, hackers will inevitably try to commandeer armed police robots. They have already succeeded at taking control of police surveillance cameras. The last thing we need is foreign governments or organized criminals seizing command of armed police robots and aiming them at innocent people.

Armed police robots are especially menacing at protests.
The capabilities of police to conduct crowd control by force are already too great. Just look at how the New York City Police Department had to pay out hundreds of thousands of dollars to settle a civil lawsuit over officers' punitive use of a Long Range Acoustic Device (LRAD) against protesters. Police must never deploy taser-equipped robots or pepper-spray-spewing drones against a crowd. Armed robots would discourage people from attending protests. We must de-militarize our police, not further militarize them.

We need a flat-out ban on armed police robots, even if their use might at first appear reasonable in uncommon circumstances. In Dallas in 2016, police strapped a bomb to an explosive-defusing robot in order to kill a gunman hiding inside a parking garage who had already killed five police officers and shot seven others. Normalizing armed police robots poses too great a threat to the public to allow their use even in extenuating circumstances. Police have proven time and time again that technologies meant only for the most extreme circumstances inevitably become commonplace, even at protests.

Conclusion

Whether controlled by an artificial intelligence or a remote human operator, armed police robots and drones pose an unacceptable threat to civilians. It is far harder to remove a technology from the hands of police than to prevent it from being purchased and deployed in the first place. That's why now is the time to push for legislation banning police deployment of these technologies. The ongoing revolution in the field of robotics requires us to act now to prevent a new era of police violence.

Related Issues: Privacy
Tags: Street Level Surveillance, Drones