
Robot Police Dogs Are Here. Should We Be Worried?

A semi-autonomous robot dog walking on a sidewalk.
The deployment of advanced technologies like robots often happens faster than our legal systems can adjust. We need clear policies, transparency, and democratic debate and input to ensure these technologies do not threaten civil liberties.
Jay Stanley,
Senior Policy Analyst,
ACLU Speech, Privacy, and Technology Project
March 2, 2021

The New York Police Department is receiving a lot of attention for testing robot "dogs," which it has deployed in several situations, including to deliver food in a hostage situation and to scout out a location where the police feared a dangerous gunman might be lurking. The state police in Massachusetts have also experimented with these robots, as our ACLU colleagues there uncovered, and police in at least one other city have acquired one. What are we to think of these robots from a civil liberties perspective?

There's definitely something spooky about all the robots made by Boston Dynamics, a company that has become famous for videos of its increasingly agile humanoid and animal-like robots. Add to that spookiness an awareness of our nation's levels of police violence, racism, brutality, and unnecessary killings (and the shameful state of the law that enables those abuses), and this robot police dog, dubbed "Digidog," definitely evokes some primal fears.

But behind that visceral uneasiness are a number of very real issues.

One of the things that makes these robots so unnerving is that everybody implicitly understands that the possibility of weaponizing them will continue to hang out there like a tempting forbidden fruit for law enforcement. (We've written in more detail elsewhere about the concerns with weaponized police robots.) There is also the fear that they could evolve from a remote-controlled tool to an autonomous decision-maker that makes actual law enforcement decisions of some kind. That would bring up the many issues around bias and inaccuracy in AI decision-making. The ultimate nightmare, of course, would be robots that are armed and made autonomous.

Those kinds of futuristic concerns give us good reason to feel uneasy about these robots. But there are other more immediate concerns around the costs and benefits of these high-tech devices. Anybody can come up with a Hollywood scenario where a new technology saves the day. But the real questions are how frequently such scenarios come up, how dangerous they are, how effective the solution is, how expensive it is, what negative side effects its adoption might bring, and whether there are alternative, less invasive solutions that would work just as well.

Communities should ask those questions. Viewed narrowly, there's nothing wrong with using a robot to scout a dangerous location or deliver food to hostages. But communities should take a hard look at expensive, rare-use technologies at a time when the nation is increasingly recognizing the need to invest in solving our social problems in better ways than just empowering police.

Finally, full transparency and meaningful limitations are crucial. When a powerful new surveillance or other police technology is introduced, especially one that is probably flexible enough to be used in many ways, good and bad, that we haven't even thought of, it's important that there be public conversations every step of the way. It's vital for the public to know the answers to questions such as: What are the plans and policies around how it will be used? How does it actually end up being used? And none of those conversations can happen without transparency and independent oversight.

One of the ways communities can make sure that happens is by joining the growing list of cities that have passed "Community Control Over Police Surveillance" (CCOPS) legislation, which requires police to be transparent about their use of surveillance technologies and ensures that community members, through their city council representatives, are empowered to decide if and how such technologies are used.

Last year, New York City joined the cities that have passed that kind of legislation with its Public Oversight of Surveillance Technology (POST) Act. Unfortunately, the information that the department has released about Digidog has been inadequate. Instead of publishing detailed policies for the robot dog, for example, the NYPD swept the technology into an overbroad category with other camera technologies. The policy is vague about when the technology can be used, lacks deployment oversight and documentation requirements, and contains weak data protection and training sections. The health and safety sections completely disregard Boston Dynamics' own recommendation of a minimum two-meter distance from the robot due to the risk that "fingers may break or get amputated if caught in joints while Spot's motors are active." And the policy even makes false statements about the machine learning and video analytics capabilities of the Digidog.

The deployment of advanced technologies like robots all too often happens faster than our social, political, and legal systems can adjust. This kind of robotics technology threatens to veer off in all manner of spooky directions; clear and forthright policies, overall transparency, and democratic debate and input are vital.

