Last month, a web-based service called “Ghetto Tracker” was unveiled. The site’s creator touted it as a travel advice service where users could pin digital maps with safety ratings to enable those new to town to avoid dodgy neighborhoods. While crowd-sourced travel advice is not a particularly novel or noteworthy idea, the site’s suggestive use of the word ghetto to evoke neighborhoods of color and its intention to label certain areas categorically “good/bad” or “safe/unsafe,” in conjunction with its choice of stock photo on the homepage, resulted in an understandable backlash. After a storm of negative publicity, the operators quickly renamed the service “Good Part of Town,” replaced the stock photo with one depicting a black family, and dropped all references to “ghetto” as a way of describing a “bad” area. Despite this rebranding effort, the site’s operators decided to take the site down altogether just days after the launch.
While Ghetto Tracker/Good Part of Town attracted plenty of negative attention for the deeply problematic framing of its otherwise simple crowd-sourced travel advice service, a growing field of more sophisticated geo-navigational applications has quietly sprung up with far less press attention, many of which similarly incorporate safety judgments into navigational aids. This comes at a time when, according to a recent survey, seventy-four percent of adult smartphone owners use their phones to get information based on their current location (this directionally challenged blogger included).
In an informative interview with Brooke Gladstone on September 20, geographer and mobile geospatial applications scholar Jim Thatcher explained that some geo-navigational applications do many of the same things as Ghetto Tracker, yet in far less blatant, and arguably more pernicious, ways. Such applications abound, including Microsoft’s patented Pedestrian Route Production as well as Waze, a recent billion-dollar Google acquisition.
Looking more closely at Microsoft’s Pedestrian Route Production, which the company patented in 2012, provides insight into some of the potential civil liberties concerns these types of applications can raise. Pedestrian Route Production, which has been dubbed the “avoid ghetto” feature for GPS devices, was designed to provide navigational walking routes that factor in considerations such as the weather, crime statistics, and demographic information. Microsoft perceives these types of analytics as useful because, as the patent language states, “it can be more dangerous for a pedestrian to enter an unsafe neighborhood than a person in a vehicle since a pedestrian is more exposed and it is more difficult for her to leave an unsafe neighborhood quickly.” The patent also allows the company to factor advertising considerations into its routing algorithm, so that it could, for example, send a user down a street where a paid advertiser’s storefront or billboard is located, rather than a non-monetized street the user would otherwise choose if given enough information.
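To make the mechanism concrete, here is a minimal, purely hypothetical sketch of how such a system could work. This is not Microsoft’s actual algorithm; the graph, the `safety_penalty` and `ad_bonus` weights, and all parameter values are invented for illustration. The point is how easily opaque weights folded into a shortest-path cost can silently change which streets a pedestrian is sent down.

```python
import heapq

# Hypothetical street graph: (from, to, distance, safety_penalty, ad_bonus).
# The penalty and bonus values are made up; a real system's weights would
# be hidden inside a proprietary algorithm.
EDGES = [
    ("A", "B", 1.0, 2.0, 0.0),  # direct street through a "flagged" area
    ("A", "C", 1.2, 0.0, 0.5),  # detour past a paid advertiser
    ("C", "B", 1.2, 0.0, 0.0),
]

def edge_cost(dist, safety_penalty, ad_bonus, w_safety, w_ad):
    # A "safety" weight inflates the cost of flagged streets; an ad
    # weight discounts streets with paying advertisers.
    return dist + w_safety * safety_penalty - w_ad * ad_bonus

def shortest_path(edges, start, goal, w_safety=0.0, w_ad=0.0):
    """Dijkstra's algorithm over the weighted edge costs."""
    graph = {}
    for u, v, d, s, a in edges:
        graph.setdefault(u, []).append((v, edge_cost(d, s, a, w_safety, w_ad)))
    pq = [(0.0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, c in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(pq, (cost + c, nxt, path + [nxt]))
    return float("inf"), []

# With distance alone, the walker takes the direct street; once the
# hidden safety weight is switched on, the route silently changes.
print(shortest_path(EDGES, "A", "B"))               # direct route A -> B
print(shortest_path(EDGES, "A", "B", w_safety=1.0)) # rerouted via C
```

The user sees only a route in both cases; nothing reveals that the second route was chosen because of a penalty attached to a neighborhood rather than because it was shorter.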
These applications quickly run into issues of transparency. As Thatcher put it, “Microsoft may or may not ever put this technology into one of their products, but we have no way of knowing how these decisions are being made. How are they using demographic information? Are they saying, this area has a median income of X, therefore, this user with an income of Y would not like to go there?” These issues are particularly thorny because we as users are seldom given any information about the nature of the navigational and search algorithms that companies like Google and Microsoft vehemently shield under trade secret law.
Beyond transparency, we are concerned about the potential for these types of services to produce what Thatcher calls “teleological redlining,” whereby the software’s design ensures that its users rarely, if ever, encounter certain destinations. This redlining could have devastating and destabilizing effects on already marginalized and disenfranchised communities whose businesses would lose foot traffic, and it would likely reinforce existing harmful stereotypes about poor communities and communities of color. Discriminatory attitudes could become further entrenched by our reliance on applications that route around these neighborhoods. Imagine the negative feedback loop created as individuals rely on their devices to tell them where to go, which, in turn, further harms businesses and communities.
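The feedback loop described above can be sketched as a toy model. Everything here is invented for illustration (the parameters, the update rules, the very idea that a single “rating” drives rerouting); the sketch only shows how the dynamic compounds: lower ratings cause more rerouting, rerouting cuts foot traffic, and reduced traffic drags the rating down further.

```python
# Toy model of teleological redlining as a feedback loop. All parameters
# are made up for illustration; no real application is modeled here.
def simulate(rounds=5, traffic=100.0, rating=0.5, avoid_strength=0.6):
    history = []
    for _ in range(rounds):
        avoidance = avoid_strength * (1.0 - rating)  # lower rating -> more rerouting
        traffic *= (1.0 - avoidance)                 # rerouted walkers never arrive
        rating *= 0.9                                # fewer visitors erode the rating
        history.append(round(traffic, 1))
    return history

print(simulate())  # foot traffic declines every round, and the decline compounds
```

Even with mild starting assumptions, the loop never self-corrects: each round of avoidance manufactures the evidence that justifies the next round.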
The interweaving of social media and geo-navigational applications could exacerbate this problem. Applications like Waze rely on users and their social networking connections to share travel information and suggest particular routes. It’s easy to imagine subtle judgments circulating among loosely connected social groups having a large influence on where drivers and pedestrians are directed. Perhaps worst of all, technology masks the racial and socio-economic judgments at issue here. Individuals don’t have to question their own assumptions because it is not their own stereotyping at play, just the seeming impartiality of a mobile device.
All of this is not to urge the rejection of these technologies but rather a healthy skepticism about the masses of data that can be laid atop our maps, and the value judgments that underlie them.