
Weird Computer-Generated Quiz Produces Customer Service Fail

Jay Stanley,
Senior Policy Analyst,
ACLU Speech, Privacy, and Technology Project
June 22, 2012

I lost my credit card yesterday and had a very telling experience on the phone with American Express trying to get it replaced. After I gave them various pieces of information, the customer service agent said they would ship me a new card to the billing address on file. Just when I thought I was done, she then read something to the effect of, "For security purposes, I am going to ask you a question. The information this question is based on is not connected to your account, but was obtained from third-party information services."

She then asked, "Which of the following companies have you been associated with?" and named four companies, none of which I had ever heard of in my life, much less been "associated" with. I picked her choice #5, "none of these companies." The woman paused to wait for her computer and then said, "It says that answer is wrong." At my request she re-read me the list. I still didn't recognize any of these supposed companies. Then she told me that because it said my answer was wrong, she couldn't issue me a new card until I went through some other rigmarole involving calling them back from a certain phone number listed in their records.

I tried arguing with her about this very strange turn in the conversation, pointing out that they were already shipping the card to the billing address and that I had verified other information; wasn't that enough? She said the computer wouldn't allow her to issue me a card because I had failed the test. I told her the information the questions were based on was incorrect, but that made no difference.

Clearly what was going on was that she had been handed a computer-generated quiz about me, derived from information obtained from a third-party data broker such as ChoicePoint or LexisNexis. (The New York Times recently published a profile of one such company, Acxiom.)

Maybe others are familiar with this procedure, but it was certainly new to me. I thought the experience was interesting for several reasons:

  • American Express does not appear to be concerned about freaking out its customers by making it obvious to them just how much information the company is collecting that is unrelated to the provision of its service.
  • One possibility is that the stupid computer-generated quiz was giving some formal corporate or parent-company name, or a company name that had since been changed, which led me not to recognize it. But at least as likely is that the information was just wrong. The information collected by these companies has been found to be highly inaccurate; in one study, two out of 11 test subjects were reported to be corporate directors of companies they had never heard of (hmmm…).
  • The possibility that I was being denied a service (at least temporarily) and inconvenienced due to erroneous information held by a third party is an ominous indicator. It was a minor matter for me, but still: I never asked a bunch of companies to go out and start building a file of (quasi-accurate) information about me. I have no business relationship with those companies or leverage over them. The mindless bureaucratic "test" I was given was just a small peek into a whole secret world of data collection that is going on behind the scenes, and a reminder that that world can have consequences for our lives. If reliance on their information becomes more widespread and more serious, those consequences will increase.
  • Finally, note that in terms of my dealings with American Express, there was little difference between the computer agent and the human agent. I had earlier been bounced from the automated voicemail tree because it couldn't handle my situation (I didn't know my card number). Yet the fact that I was now interfacing with a human being made zero difference in my treatment; I was still effectively trapped in a computerized decision tree. This employee had no discretion to dispose of my case outside the parameters of what her computer allowed. Her computer was not a tool that extended her brain; her brain was merely a translator between me and the computer algorithm, which was very much in charge. No doubt the quiz given to me had also been prepared by a computer, and the information on which it was based compiled by a computer at Acxiom, ChoicePoint, or a competitor. This question of the boundaries between the human and the computer is an interesting and potentially consequential one, which recently came up in the context of Google antitrust issues.

In the end, the problem got resolved. While at first the agent said she could not ask me a new set of questions, eventually her computer seemed to relent, and after once again robotically reading the little speech about information "obtained from third-party information services," she presented me with a list of street addresses (no city or state). I had to pick the one where I had once lived. I dimly recognized one of the addresses as an apartment I once inhabited for nine months in grad school in the 1990s, and I passed the test! American Express's computer was happy at last.
