Robot police dogs are on patrol, but who’s holding the leash?

In late May, after months of debate, the Los Angeles City Council approved the donation of a four-legged, doglike robot to the nation’s third-largest police department. The approval was granted at a public meeting that was interrupted at times by shouting, applause, banners such as “No Robot Dogs,” and the ejection of disruptive protesters, according to The Los Angeles Times.

In the end, the council voted 8 to 4 to accept the nearly $280,000 in-kind gift from the Los Angeles Police Foundation of the robot manufactured by Boston Dynamics, a Massachusetts-based robotics firm that is the global leader in developing quadruped robots for policing and surveillance.

The Boston Dynamics model given to the LAPD — named “Spot” by its manufacturer — is roughly the size of a golden retriever, weighing about 70 pounds and standing about 2 feet tall when walking. The robot is designed to be either remote controlled or fully autonomous. It can climb stairs and open doors. The robot can be customized to detect hazardous substances like carbon monoxide or some combustible gases. The various payloads available include sensors, cameras, and microphones, and can be customized with thermal imaging, among other features.

As a condition of accepting the donation, the Los Angeles City Council will require quarterly reports on the robot’s deployment and use. The council’s sign-off was necessary because of a recent state law — Assembly Bill 481 — that requires police departments to seek approval and outline use policies before acquiring military-grade hardware.

But that kind of public review apparently does not happen in other cities, including New York, where the police department also announced plans this spring to deploy two Boston Dynamics robots, paid for with asset forfeiture funds, as well as a new surveillance robot known as the K5. A number of law enforcement agencies around the country are acquiring such robots with little transparency, critics say.

There does not appear to be any publicly available data on the overall number of robots deployed by law enforcement agencies in the United States. “I haven’t seen any numbers that speak to how many are out there,” said Howard Henderson, a criminologist based at Texas Southern University in Houston. There also appears to be little independent academic or scientific data on the effectiveness of these units.

Boston Dynamics says there were more than 1,000 of its Spot robots operating in 35 countries in 2022, according to a blog post on the company’s website. In an email message, Renee Sieli, the head of a public relations firm that represents Boston Dynamics, told Undark that the company does not track use in police departments specifically, but that “a handful” of the robots are currently used for public safety purposes.

The lack of statistics is concerning, said Alondra Nelson, the Harold F. Linder professor of Social Science at the Institute for Advanced Study, and, until recently, the acting director of the White House Office of Science and Technology Policy for the Biden administration. “We don’t have any data” on the technology’s safety, effectiveness, failure rate, and human interactions, said Nelson. “Fundamentally, we just don’t have basic notice and explanation on how these things are being used.”

Ghost Robotics, a Philadelphia-based robotics firm that manufactures the Vision 60, a quadruped robot marketed towards military and homeland security use, among other applications, also did not reply to an email requesting usage data.

Surveillance and privacy researchers say there are few restrictions on robotic surveillance in many communities. There are also concerns that lower-income areas and people of color will be overpoliced and over-surveilled by the robots.

It’s crucial to establish guardrails when implementing new surveillance technologies because there is a tendency to “mission creep,” said Nicol Turner Lee, a sociologist who researches the intersections of technology, race, and policy at the Brookings Institution. “There is a propensity to improve those technologies for greater accuracy,” she said. “And in the criminal justice system, greater accuracy can almost always amount to higher levels of incarceration.”

There are also concerns about the potential for arming robots with Tasers and other weapons. Critics say this could eventually lay the groundwork for the armed and fully autonomous killing machines seen in dystopian movies and television shows such as “Black Mirror.” In recent months several viral videos have appeared to show robotic dogs armed with guns in Russia and China. (Undark has not independently verified the authenticity of the videos.)

Boston Dynamics, which was recently acquired by the South Korea-based automotive company Hyundai, says it has a strict policy against the unauthorized weaponization of its robotic products and will deactivate any units that have been armed. Sieli, on behalf of Boston Dynamics, reiterated that policy in a statement emailed to Undark.

“Spot helps keep people out of harm’s way and aids first responders in assessing dangerous situations,” the statement said.

“Additionally,” the statement continued, “we want to point out that any attempted weaponization of Boston Dynamics’ robots is strictly prohibited, as clearly outlined in our Terms and Conditions, our ethical principles, as well as in an open letter against weaponization, which was spearheaded by Boston Dynamics and co-signed by five other leading robotics companies.”


Robots developed for policing duties were initially introduced for situations considered too dangerous for human intervention. The first bomb disposal robot, for example, was created by the British Army in 1972 to detonate suspected car bombs in Northern Ireland. But these robots are increasingly deployed in a wide variety of applications, from crowd control to video capture and wireless surveillance.

Almost 1,000 robots of all types were in use by domestic law enforcement agencies in 2016, according to publicly available data analyzed by the Center for the Study of the Drone at Bard College, which ceased operating in 2020. The center noted that the figure was only “a partial accounting,” and surveillance and criminal justice experts believe the total is much higher today.

The exact number is unknown because law enforcement agencies may or may not self-report on robots and other emerging technologies, according to Matthew Guariglia, a senior policy analyst at the San Francisco-based Electronic Frontier Foundation and a specialist in the history of policing and surveillance.

The Los Angeles Police Department patrols an area of about 470 square miles inhabited by 4 million people. The department has been a pioneer in the use of emerging technologies for policing and surveillance, from helicopters in the 1950s to drones and police body cameras in the last decade. The significant public opposition to the department’s drone acquisition in 2017 was similar to the more recent pushback on robots.

The department’s use policy for the Boston Dynamics robot, released in May, limits the device to specific tactical situations and prohibits facial recognition, weaponization, and the robot’s use during routine patrols. The department says it plans to use the robot “in the coming months” but did not respond to a request for information about when and where it has been deployed, or to a request for comment on critics’ concerns about its use.

In New York, the police department’s initial plan to deploy the Boston Dynamics robot it acquired in 2020 was cut short after the device was dispatched to a Bronx home invasion and a Manhattan public housing project. Critics at the time saw “the device as emblematic of how overly aggressive the police can be when dealing with poor communities,” according to The New York Times.

The Honolulu Police Department, meanwhile, was criticized for deploying the robot during the Covid-19 pandemic in 2021 to check the temperatures of unhoused people living in a tent camp. That case was particularly concerning because it displayed a lack of care to people facing “profound vulnerability,” said Nelson of the Institute for Advanced Study, whose research focuses on the intersection of science, technology, politics, and race.

“By that time, we knew enough about the pandemic to have basic precautions to actually send human beings” to engage unhoused people around their health care, said Nelson.

Special weapons and tactics teams have also used the Boston Dynamics robots to confront barricaded suspects in cities such as Houston and St. Petersburg, Florida. Police officials in metropolitan Detroit have been considering acquiring doglike robots as well.

Looking abroad, police departments have deployed the units in Australia and the Netherlands, among other locations. The Boston Dynamics robot was also positioned by authorities in Singapore to encourage — or enforce — social distancing at the height of the pandemic.


Both the Boston Dynamics and Ghost Robotics robots can climb stairs and navigate uneven terrain. Their maximum speeds are about 3.5 and 7 miles per hour, respectively.

The robots can follow a predetermined route or be controlled remotely with an application on a tablet. The Boston Dynamics model — popular for policing and surveillance — can also open doors with an arm attachment. The all-weather, amphibious Ghost Robotics Vision 60 is favored by homeland security agencies and armed forces, such as the Japanese Ministry of Defense and U.S. Customs and Border Protection, which is considering deploying the V60 to patrol the U.S.-Mexico border.

The Boston Dynamics model comes with an extensive package of audio-video analytics, including five pairs of cameras that provide black-and-white images and video.

With a payload add-on, this robot — and many others — uses a technique known as Light Detection and Ranging, or LIDAR, to map features as far as 120 meters, or nearly 400 feet, away. LIDAR uses sensors to emit light pulses that bounce off nearby objects to develop a three-dimensional map. The sensors then calculate the distance to each object by determining how long it took each light pulse to return.
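
That distance estimate boils down to simple time-of-flight arithmetic. The short Python sketch below is a simplified illustration, not code from any robot vendor: it converts a pulse’s measured round-trip time into a one-way distance by multiplying by the speed of light and halving the result.

```python
# Minimal time-of-flight sketch: estimate how far away an object is from
# how long a LIDAR light pulse takes to bounce off it and return.
# Illustrative simplification only, not code from any robot vendor.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Return the one-way distance in meters for a measured round-trip time."""
    # The pulse travels out to the object and back, so halve the total path.
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A pulse returning after about 800 nanoseconds corresponds to roughly
# 120 meters, the approximate maximum mapping range cited above.
print(round(distance_from_round_trip(800e-9)))  # ~120
```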

Developing navigation algorithms for quadruped robots in an urban environment can be challenging, said Jia Pan, an associate professor in computer science at the University of Hong Kong who focuses on AI, sensors, and autonomous robotics.

Two of the most important considerations, he said in a Zoom interview from Hong Kong, are safety and enabling the robot to move efficiently through crowds. “These are actually two related problems,” said Pan. “If you want to guarantee safety, you can make the robot move very slowly. Whenever it meets some people or an obstacle, it can stop. But this is very inefficient.”

Pan and a team of researchers from China, Hong Kong, and the United States developed an experimental quadruped robot to encourage social distancing among pedestrians during the pandemic. According to a 2021 paper in IEEE Access, a peer-reviewed scientific journal, their robot used a “crowd-aware routing algorithm to effectively promote social distancing by using human-friendly verbal cues to send suggestions to over-crowded pedestrians.”
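
To make the “crowd-aware” idea concrete, here is a minimal Python sketch of one ingredient such a system needs: checking whether any detected pedestrians are standing closer together than a distancing threshold. The function, threshold, and audio prompt are hypothetical illustrations of the concept, not the routing algorithm described in the paper.

```python
# Hypothetical sketch of a social-distancing check: flag pairs of detected
# pedestrians standing closer than a chosen threshold. Names, threshold,
# and structure are illustrative assumptions, not the paper's algorithm.
import math

DISTANCE_THRESHOLD_M = 2.0  # assumed social-distancing threshold, in meters

def too_close_pairs(pedestrians):
    """Return index pairs of pedestrians closer than the threshold.

    `pedestrians` is a list of (x, y) positions in meters, e.g. as
    estimated from the robot's cameras or LIDAR.
    """
    pairs = []
    for i in range(len(pedestrians)):
        for j in range(i + 1, len(pedestrians)):
            xi, yi = pedestrians[i]
            xj, yj = pedestrians[j]
            if math.hypot(xi - xj, yi - yj) < DISTANCE_THRESHOLD_M:
                pairs.append((i, j))
    return pairs

# Example: two people 1.5 meters apart trigger a (hypothetical) verbal cue.
positions = [(0.0, 0.0), (1.5, 0.0), (10.0, 10.0)]
if too_close_pairs(positions):
    print("Please keep a safe distance.")  # stand-in for the robot's audio prompt
```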


A resident of the Studio City neighborhood of Los Angeles was among those who opposed the plan to accept the robot donation at the city council meeting in May.

“I think it’s very interesting that the robo-dogs are called Spot which has always been a nice little pet,” the resident testified. “And the fact that you’re voting on them today and they’re going to go into neighborhoods where a lot of surveillance already is.”

“Everything always gets tested in the same neighborhoods,” she added.

That comment echoed concerns about transparency, privacy, and mass surveillance voiced by scientists, researchers, and policy advocates. How will authorities collect, retain, and share surveillance data captured by these robots?

So far, there appear to be more questions than answers. Law enforcement agencies do not uniformly release information about the capabilities or operations of these robots and other military-grade hardware, said Henderson of Texas Southern University and Guariglia of the Electronic Frontier Foundation.

This spring, the New York Police Department announced plans to use its Boston Dynamics robots, in addition to the K5 manufactured by Knightscope of Mountain View, California. The K5, which was previously tested at the Lefrak City high-rise apartment complex in Queens for about two years, is currently being leased and used in at least one Manhattan subway station, where it is accompanied by a police officer.

The 5-foot, 400-pound device — described by its maker as a “fully autonomous outdoor security robot” — resembles the iconic Daleks of the television series “Doctor Who.” It’s “similar to like a Roomba,” said Jeffrey Maddrey, the NYPD’s chief of department, at the April press conference introducing both the Boston Dynamics and Knightscope devices.

But, unlike a robot vacuum, the K5 is a rolling surveillance unit that includes 16 microphones, speakers that can play back live or recorded messages, GPS, sonar, thermal imaging, and the capacity to detect nearby wireless signals. The K5 also boasts automatic license plate recognition technology (it is often deployed as a parking monitor), among other features.

Scientists, researchers, and policymakers have also raised questions around the potential for racial or ethnic bias. Law enforcement agencies have historically over-surveilled African American, Latino, and Indigenous communities with emerging technologies such as wiretaps and drones, critics say.

These communities tend to have higher rates of policing and incarceration, said Nicol Turner Lee of the Brookings Institution, and some emerging technologies often yield inaccurate or discriminatory outcomes. The inaccuracies and discriminatory outcomes of facial recognition technology, for example, have received considerable attention in recent years. Facial recognition is less accurate at identifying the faces of women and people with darker skin than those of White men, according to research pioneered by computer scientists Joy Buolamwini of the MIT Media Lab and Timnit Gebru, formerly of Microsoft Research.

Turner Lee is also among those concerned about whether police departments that acquire the robots will maintain adequate staffing levels. The robots should not become “quick fixes” for municipalities and police departments facing budget and staffing shortfalls, she said.

While the Los Angeles Police Department will be required to issue quarterly reports on the Boston Dynamics robot’s usage, that policy could go further, said Henderson, the criminologist. Police departments and local jurisdictions should be subject to third-party oversight and regular evaluations, he said. “It should almost be on a monthly basis to make sure too much time doesn’t go by between implementation and auto correcting for any errors that will be inevitable,” added Henderson.

Perhaps the most troubling issue, according to some researchers, policy analysts, and ethicists, is the possibility of equipping the robots with Tasers or other weapons.

Axon Enterprise, the Arizona-based technology and weapons manufacturer formerly known as Taser International, is reportedly researching proposals to develop a robot equipped with its well-known electroshock weapon. “Axon believes that the future of policing will include more robotic security and we will continue to innovate in this space,” Alex Engel, Axon’s global vice president of corporate communication, wrote in a statement to Undark. The statement also said the research is “still in early concept stages” and the company will identify “appropriate use cases and the right ethical measures that need to be in place prior to exploring further.”

Last December, the San Francisco Board of Supervisors unanimously voted to reverse a proposal that would have allowed the San Francisco Police Department to deploy robots for lethal use of force. “The police said they had no plans to arm the robots with guns but wanted the ability to put explosives on them in extraordinary circumstances,” according to NPR.

Nonetheless, the proposal to arm robots faced blistering criticism. Guariglia said it was brought to public attention because of Assembly Bill 481, the state law that requires police departments to outline use policies for military hardware. But he and other researchers are concerned that global events — such as the Russian invasion of Ukraine, where both sides are modifying drones to deliver bombs — could encourage more police departments to consider arming remote controlled or fully autonomous robots.

“We know from history that whenever governments deploy some kind of weapon or surveillance system in warfare,” Guariglia said, “that it’s only really a matter of time before those things travel back home.”
