This post was adapted from a talk given at an AI Now workshop held on July 10 at the MIT Media Lab. AI Now is a new initiative working, in partnership with the ACLU, to explore the social and economic implications of artificial intelligence.
It seems to me that this is an auspicious moment for a conversation about rights and liberties in an automated world, for at least two reasons.
The first is that there's still time to get this right. We can still have a substantial impact on the legal and policy debates that will shape the development and deployment of automated technologies in our everyday lives.
The second reason is Donald Trump. The democratic stress test of the Trump presidency has gotten everyone's attention. It's now much harder to believe, as Eric Schmidt once assured us, that "technology will solve all the world's problems." Technologists who have grown used to saying that they have no interest in politics have realized, I believe, that politics is very interested in them.
By contrast, consider how, over the last two decades, the internet became the engine of a surveillance economy.
Silicon Valley's apostles of innovation managed to exempt the internet economy from the standard consumer protections provided by other industrialized democracies by arguing successfully that it was "too early" for government regulation: It would stifle innovation. In almost the same breath, they told us that it was also "too late" for regulation: It would break the internet.
And by the time significant numbers of people came to understand that maybe they hadn't gotten such a good deal, the dominant business model had become so entrenched that meaningful reforms will now require Herculean political efforts.
When we place "innovation" within, or atop, a normative hierarchy, we end up with a world that reflects private interests rather than public values.
So if we shouldn't just trust the technologists (and the corporations and governments that employ the vast majority of them), then what should be our north star?
Liberty, equality, and fairness are the defining values of a constitutional democracy. Each is threatened by increased automation unconstrained by strong legal protections.
Liberty is threatened when the architecture of surveillance that we've already constructed is trained, or trains itself, to track us comprehensively and to draw conclusions based on our public behavior patterns.
Equality is threatened when automated decision-making mirrors the unequal world that we already live in, replicating biased outcomes under a cloak of technological impartiality.
And basic fairness, what lawyers call "due process," is threatened when enormously consequential decisions affecting our lives (whether we'll be released from prison, or approved for a home loan, or offered a job) are generated by proprietary systems that don't allow us to scrutinize their methodologies and meaningfully push back against unjust outcomes.
Since my own work is on surveillance, I'm going to devote my limited time to that issue.
When we think about the interplay between automated technologies and our surveillance society, what kinds of harms to core values should we be principally concerned about?
Let me mention just a few.
When we program our surveillance systems to identify suspicious behaviors, what will be our metrics for defining "suspicious"?
This is a brochure about the "8 signs of terrorism" that I picked up in an upstate New York rest area. (My personal favorite is number 7: "Putting people into position and moving them around without actually committing a terrorist act.")
How smart can our "smart cameras" be if the humans programming them are this dumb?
And of course, this means that many people are going to be logged into systems that will, in turn, subject them to coercive state interventions.
But we shouldn't just be concerned about "false positives." If we worry only about how error-prone these systems are, then more accurate surveillance systems will be seen as the solution to the problem.
I'm at least as worried about a world in which all of my public movements are tracked, logged, and analyzed accurately.
Bruce Schneier likes to say: Think about how you feel when a police car is driving alongside you. Now imagine feeling that way all the time.
There's a very real risk, as my colleague Jay Stanley has warned, that pervasive automated surveillance will:

"turn[] us into quivering, neurotic beings living in a psychologically oppressive world in which we're constantly aware that our every smallest move is being charted, measured, and evaluated against the like actions of millions of other people, and then used to judge us in unpredictable ways."
I also worry that in our eagerness to make the world quantifiable, we may find ourselves offering the wrong answers to the wrong questions.
The wrong answers because extremely rare events like terrorism don't map neatly onto hard predictive categories.
And the wrong question because it doesn't even matter what color the threat level is: Once we adopt this threat-level framework, we declare that terrorism is an issue of paramount national importance, even though that is a highly questionable proposition.
The question becomes "how alarmed should we be?" rather than "should we be alarmed at all?"
And once we're trapped in this framework, the only remaining question will be how accurate and effective our surveillance machinery is, not whether we should be constructing and deploying it in the first place.
If we're serious about protecting liberty, equality, and fairness in a world of rapid technological change, we have to recognize that in some contexts, inefficiencies can be a feature, not a bug.
Consider the Bill of Rights, written over 200 years ago. It is an anti-efficiency manifesto, created to add friction to the exercise of state power.
The Fourth Amendment: Government can't effect a search or seizure without a warrant supported by probable cause of wrongdoing.
The Fifth Amendment: Government can't force people to be witnesses against themselves; it can't take their freedom or their property without fair process; it doesn't get two bites at the apple.
The Sixth Amendment: Everyone gets a lawyer, and a public trial by jury, and can confront any evidence against them.
The Eighth Amendment: Punishments can't be cruel, and bail can't be excessive.
This document reflects a very deep mistrust of aggregated power.
If we want to preserve our fundamental human rights in the world that aggregated computing power is going to create, I would suggest that mistrust should remain one of our touchstones.