Are you a registered Democrat? You could be more likely to experience anxiety these days, causing you to need more mental health care. Have you lived in neighborhoods near industrial zones? That could increase your chance of chronic illness. Do you buy video or board games? You might be less likely to exercise, raising your medical costs in the long term.
According to an investigation that ProPublica and NPR released on Tuesday, health insurers have begun acquiring huge amounts of non-health-related data about the people they insure or will potentially insure. This data includes race, net worth, consumer behavior, criminal and civil court records, and prior addresses, among other things. Health insurers buy it from data brokers, who scoop up pretty much everything from the data trails we all leave behind as we move through the world. Those data brokers, as well as the health insurers themselves, also create algorithms to find relevant patterns in this data, like relationships between particular purchasing habits or life events and increased health care expenditures.
While health insurers claim they're not using these algorithms to set insurance costs for individuals, they're unable to cite any law that would prevent them from doing just that. And considering that the very purpose of insurance is to assess risk and charge customers accordingly, there's a very real concern that insurers will start using these algorithms to set their fees.
Existing health disparities mean that data will consistently show members of certain groups to be more likely to need more health care. What will happen, then, if this data starts being used against those groups? We know, for example, that Black women are more likely to experience serious complications from pregnancy than white women. So, health insurers might conclude that a woman who is Black and recently married is likely to cost them more money than a white woman in the same position. Even in cases where they don't have accurate race data, insurers might draw the same conclusion for women who purchase Black hair-care products or those who have tweeted about television shows like Atlanta or Scandal.
More broadly, people who live in poor neighborhoods and neighborhoods of color are more likely to have health problems than those in affluent neighborhoods. The ProPublica piece quotes one health data vendor joking, "God forbid you live on the wrong street these days ... You're going to get lumped in with a lot of bad things." Is it fair to make health care more expensive for people based on zip code or race?
The Affordable Care Act prohibits insurers from discriminating on the basis of pre-existing conditions or gender, but it doesn't say anything about race, religion, national origin, or anything else insurers can learn about you from data brokers. At the state level, where insurance in this country is largely regulated, more than half of states do not prohibit using race explicitly in pricing health insurance. That's a problem, especially in the age of big data, when it's extremely tempting for insurers to raise prices for customers they perceive to be risky, sometimes in order to drive them away. Actors in other lines of insurance, like auto or homeowners' insurance, have started to use digital data to raise prices for customers who they predict won't switch insurers if their rates go up. It's a big enough problem that 20 states have issued bulletins against the practice.
Historical and ongoing racial discrimination has created an enormous racial wealth gap, and because we continue to live in such a segregated country, almost all the data held by data brokers reflects and encodes racial disparities. When predictive models are built using this data, people of color are consistently disadvantaged: Black people whose credit scores are as good as or better than those of whites might not get a loan simply because of the neighborhood in which they live.
If that happens in the lending context, the federal Equal Credit Opportunity Act protects the borrower. When similar algorithmic discrimination occurs in the housing market, the Fair Housing Act provides protection, as does Title VII when there's a job at issue. Since, in addition to barring intentional discrimination, each of these statutes prohibits neutral policies that nonetheless have a disparate impact on members of protected groups, like people of color, they are vital in the era of algorithmic decision-making. (Although the Trump Administration is trying to get rid of this crucial "disparate impact" standard.)
The ProPublica report shows that the danger of discrimination in insurance is increasingly real. But there鈥檚 a big hole in civil rights law when it comes to insurance. State legislatures should explore new ways to prevent discrimination in health insurance, including requirements that insurers audit their own use of consumer data for discriminatory effects and publish the results. Consumers deserve no less.