We are surrounded by surveillance cameras that record us at every turn. But for the most part, while those cameras are watching us, no one is watching what those cameras observe or record because no one will pay for the armies of security guards that would be required for such a time-consuming and monotonous task.
But imagine that all that video were being watched: that millions of security guards were monitoring it all, 24/7. Imagine this army is made up of guards who don't need to be paid, who never get bored, who never sleep, who never miss a detail, and who have total recall for everything they've seen. Such an army of watchers could scrutinize every person they see for signs of "suspicious" behavior. With unlimited time and attention, they could also record details about all of the people they see: their clothing, their expressions and emotions, their body language, the people they are with and how they relate to them, and their every activity and motion.
That scenario may seem far-fetched, but it's a world that may soon be arriving. The guards won't be human, of course; they'll be AI agents.
Today we're publishing a report on a $3.2 billion industry building a technology known as "video analytics," which is starting to augment surveillance cameras around the world and has the potential to turn them into just that kind of nightmarish army of unblinking watchers.
The science behind this technology, built on cutting-edge, deep learning-based AI, is moving so fast that early versions of it are already starting to enter our lives. Some of our cars now come equipped with cameras that can sound alarms when a driver starts to look drowsy. Smart doorbell cameras today can alert us when a person appears on our doorstep. Some stores use AI-enabled cameras that monitor customers and automatically charge them when they pick items off the shelf.
In the report, we looked at where this technology has been deployed, and what capabilities companies are claiming they can offer. We also reviewed scores of papers by computer vision scientists and other researchers to see what kinds of capabilities are being envisioned and developed. What we found is that the capabilities that computer scientists are pursuing, if applied to surveillance and marketing, would create a world of frighteningly perceptive and insightful computer watchers monitoring our lives.
Cameras that collect and store video just in case it is needed are being transformed into devices that can actively watch us, often in real time. It is as if a great surveillance machine has been growing up around us, largely dumb and inert, and is now, in a meaningful sense, "waking up."
[Embedded video: https://www.youtube.com/embed/1dDhqX3txf4]
Computers are getting better and better, for example, at what is called simply "human action recognition." AI training datasets include thousands of actions that computers are being taught to recognize: things such as putting a hat on, taking glasses off, reaching into a pocket, and drinking beer.
Researchers are also pushing to create AI technologies that are ever-better at "anomaly detection" (sounding alarms at people who are "unusual," "abnormal," "deviant," or "atypical"), emotion recognition, the perception of our attributes, the understanding of the physical and social contexts of our behaviors, and wide-area tracking of the patterns of our movements.
Think about some of the implications of such techniques, especially when combined with other technologies like face recognition. For example, it's not hard to imagine some future corrupt mayor saying to an aide, "Here's a list of enemies of my administration. Have the cameras send us all instances of these people kissing another person, and the IDs of who they're kissing." Governments and companies could use AI agents to track who is "suspicious" based on such things as clothing, posture, unusual characteristics or behavior, and emotions. People who stand out in some way and attract the attention of such ever-vigilant cameras could find themselves hassled, interrogated, expelled from stores, or worse.
Many or most of these technologies will be somewhere between unreliable and utterly bogus. Based on experience, however, that often won't stop them from being deployed, and from hurting innocent people. And, like so many technologies, the weight of these new surveillance powers will inevitably fall hardest on the shoulders of those who are already disadvantaged: people of color, the poor, and those with unpopular political views.
We are still in the early days of a revolution in computer vision, and we don't know how AI will progress, but we need to keep in mind that progress in artificial intelligence may end up being extremely rapid. We could, in the not-so-distant future, end up living under armies of computerized watchers with intelligence at or near human levels.
These AI watchers, if unchecked, are likely to proliferate in American life until they number in the billions, representing an extension of corporate and bureaucratic power into the tendrils of our lives, watching over each of us and constantly shaping our behavior. In some cases, they will prove beneficial, but there is also a serious risk that they will chill the freedom of American life, create oppressively extreme enforcement of petty rules, amplify existing power disparities, disproportionately increase the monitoring of disadvantaged groups and political protesters, and open up new forms of abuse.
Policymakers must contend with this technology's enormous power. They should prohibit its use for mass surveillance, narrow its deployments, and create rules to minimize abuse.
Read the full report here.