The Good Wife Tackles Algorithmic Discrimination. Meanwhile, in Real Life…

Still from 'The Good Wife'
Rachel Goodman,
Former Staff Attorney,
ACLU Racial Justice Program
December 3, 2015

This week’s episode of “The Good Wife” raised some important questions: Will Jason and Alicia sleep together? Will Cary and Lucca sleep together? Will Courtney and Eli sleep together? And one that is perhaps a little more important: Will the algorithms that increasingly govern our economic and personal lives exacerbate racial inequality in America?

The setup is this: A Black restaurateur, whose restaurant is in a predominantly Black neighborhood, wants to sue ChumHum, the series’ longtime Google-like tech behemoth, over its newly launched mapping app, ChummyMaps. The app directs users away from “unsafe” neighborhoods and hides businesses, like hers, located in those neighborhoods. But these “unsafe” neighborhoods seem to be all the places where people of color live. She enlists Diane and Cary to make the case.
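
Though ChumHum and ChummyMaps are fictional, the mechanism the episode gestures at is easy to sketch. A map filter never has to consult race to produce racially skewed results; it only needs a proxy signal that correlates with race. In the minimal Python sketch below, the neighborhoods, figures, and “incident report” threshold are all invented for illustration:

```python
# Toy sketch of "digital redlining." The filter below never sees race,
# yet its output tracks race, because the proxy it keys on
# (user-submitted incident reports) correlates with demographics.
# All neighborhoods, numbers, and the threshold are invented.

NEIGHBORHOODS = [
    # (name, share of residents of color, incident reports per 1,000 residents)
    ("Northside", 0.85, 42),
    ("Riverview", 0.78, 37),
    ("Lakeshore", 0.12, 9),
    ("Oak Hills", 0.08, 6),
]

REPORT_THRESHOLD = 20  # arbitrary cutoff for the "unsafe" label

def hides_businesses(reports_per_1000: float) -> bool:
    """The filter consults only the proxy signal, never race."""
    return reports_per_1000 > REPORT_THRESHOLD

for name, share_poc, reports in NEIGHBORHOODS:
    print(f"{name:9s}  residents of color: {share_poc:.0%}  "
          f"businesses hidden: {hides_businesses(reports)}")
```

Exactly the majority-minority neighborhoods vanish from the map, even though no line of the filter mentions race. That proxy problem is the thread connecting old-fashioned redlining to its digital successor.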

During the course of the episode, Alicia and Lucca, who are defending ChumHum, get an education in algorithmic discrimination, or “digital redlining” as it’s sometimes known. Redlining is the practice of excluding neighborhoods of color from mortgage credit, and it used to be the formal policy of banks and the federal government. Official maps outlined these neighborhoods in red. The episode’s opening sequence makes the comparison explicit with an image of one of the old redlined maps of New York City.

Home Owners' Loan Corporation (HOLC) redlined map of Manhattan from 1938. (LaDale Winling / urbanoasis.org)

Alicia and Lucca learn that ChumHum’s algorithms can produce other upsetting racialized results. Its automatic photo-tagging algorithm misidentified Black women as “animals,” much like Google’s real-life photo-tagging software labeled Black people as “gorillas.” Another attorney of color sees different ads (like one for a soul food restaurant) in her ChumHum account than does Cary, who is white. And a ChumHum user named Jamal complains that ChumHum’s auto-complete function suggested queries that associated him with criminal behavior. These last two problems echo the real-life research of Harvard professor Latanya Sweeney, who discovered that Google served ads related to arrest records in response to searches for Black-identified names, like Travon, but not for white-identified names, like Brad. (Google ultimately applied some fixes to address these particular findings.)
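
How does an ad system end up pairing arrest-themed copy with Black-identified names when no engineer wrote that rule? One mechanism Sweeney’s paper considers is click feedback: a system that learns which copy gets clicked for which names will absorb and amplify its audience’s biases. The Python sketch below is a minimal illustration under invented assumptions; the click rates, the ad copy, and the simple epsilon-greedy optimizer are all hypothetical, and the real ad system’s internals are not public:

```python
# Minimal sketch of a click-feedback loop. NOT how any real ad system
# works (those internals aren't public); every number here is invented.
# The point: an optimizer chasing clicks inherits its audience's biases.
import random

random.seed(0)

ADS = ["Name Arrested?", "Contact Info for Name"]
GROUPS = ["black-identified", "white-identified"]

# Hypothetical audience behavior: viewers click arrest-themed copy more
# often for Black-identified names. The bias lives in these clicks,
# not in any line of the optimizer below.
TRUE_CTR = {
    ("black-identified", "Name Arrested?"): 0.06,
    ("black-identified", "Contact Info for Name"): 0.03,
    ("white-identified", "Name Arrested?"): 0.02,
    ("white-identified", "Contact Info for Name"): 0.04,
}

stats = {key: [0, 1] for key in TRUE_CTR}  # key -> [clicks, impressions]

def serve(group: str) -> str:
    """Epsilon-greedy: usually show whichever ad has clicked best so far."""
    if random.random() < 0.05:  # occasional exploration
        return random.choice(ADS)
    return max(ADS, key=lambda ad: stats[(group, ad)][0] / stats[(group, ad)][1])

for _ in range(200_000):
    group = random.choice(GROUPS)
    ad = serve(group)
    stats[(group, ad)][1] += 1
    stats[(group, ad)][0] += random.random() < TRUE_CTR[(group, ad)]

for group in GROUPS:
    total = sum(stats[(group, ad)][1] for ad in ADS)
    print(group, {ad: f"{stats[(group, ad)][1] / total:.0%}" for ad in ADS})
# The arrest-themed copy ends up dominating impressions for
# Black-identified names and largely disappearing for white-identified
# ones: small individual biases, aggregated into a systematic pattern.
```

The optimizer itself contains nothing about race. The disparity comes entirely from the simulated users’ clicks, which is part of what makes this kind of bias hard to detect and hard to assign responsibility for.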

The Good Wife has become known for its ripped-from-the-headlines premises and plots, and this episode was no different. The potentially discriminatory reach of algorithms is becoming increasingly apparent. To that end, the ACLU has urged government regulators to enforce civil rights laws, like the Equal Credit Opportunity Act, online to make sure that algorithms don’t inflict real harms on people of color and others (like women and people with disabilities) whom those laws are designed to protect. We’ve also worked with a coalition of civil rights groups on a set of civil rights principles that corporations and the government should keep in mind in the use of big data.

True to form, the “Good Wife” episode wraps up the legal case, with ChumHum agreeing to fix certain aspects of ChummyMaps. But the real-world problem of algorithmic bias is complicated. As long as residential segregation and the racial wealth gap persist, and the algorithms dictating our online experience don’t comport with civil rights principles, machines analyzing patterns in big data risk reinforcing existing societal discrimination.
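
To make that concrete, consider how a pattern-learning system picks up a biased history. The sketch below is entirely synthetic and deliberately simple: the groups, ZIP codes, incomes, and approval rules are all invented assumptions, and the “model” is just a per-ZIP approval rate. Even so, it shows how a race-blind model can relearn a racially biased rule, because in a segregated city ZIP code carries the same information race did:

```python
# Entirely synthetic sketch: a "race-blind" model trained on biased
# historical decisions reproduces the bias, because ZIP code stands in
# for group membership in a segregated city. Every number is invented.
import random

random.seed(1)

def make_applicant():
    group = random.choice("AB")
    # Segregation assumption: each group lives mostly in its "own" ZIP.
    home, other = ("10001", "10002") if group == "A" else ("10002", "10001")
    zipcode = home if random.random() < 0.9 else other
    income = random.gauss(50, 10)  # identical income distributions
    return group, zipcode, income

def historical_decision(group: str, income: float) -> bool:
    # The biased past: group B faced a higher bar for approval.
    return income > (45 if group == "A" else 60)

# "Training": record per-ZIP approval rates. Race is never stored.
rates = {}
for _ in range(50_000):
    group, zipcode, income = make_applicant()
    counts = rates.setdefault(zipcode, [0, 0])
    counts[0] += historical_decision(group, income)
    counts[1] += 1

def model(zipcode: str) -> bool:
    approvals, total = rates[zipcode]
    return approvals / total > 0.5  # approve applicants from "good" ZIPs

# Evaluate the race-blind model on fresh applicants, by group.
outcomes = {"A": [0, 0], "B": [0, 0]}
for _ in range(50_000):
    group, zipcode, _ = make_applicant()
    outcomes[group][0] += model(zipcode)
    outcomes[group][1] += 1
for group, (approved, n) in outcomes.items():
    print(f"group {group}: approved {approved / n:.0%}")
# Prints roughly 90% for group A and about 10% for group B:
# yesterday's discrimination, learned back without a racial input.
```

Nothing in the model’s code or training data names race; the disparity rides in on the proxy. That is why civil rights principles have to be applied to outcomes, not just to whether a protected attribute appears in the inputs.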

The ongoing question of how we tackle this issue is even more crucial than what happens to Alicia and Jason.

Correction: This post initially suggested it was Lucca who saw different ads than those Cary saw. This was incorrect; it was actually another attorney named Monica.
