2013-09-06

A service called "Ghetto Tracker" appeared online at the beginning of this week and quickly drew criticism for its racist and classist overtones. Shortly after, the site was renamed "Good Part of Town." Its creator, who would only identify himself as a 30-something-year-old in Tallahassee, told Gawker: "This was originally seriously developed as a travel tool and the name 'Ghetto Tracker' was meant to be something that people would remember."

The basic premise of Ghetto Tracker/Good Part of Town -- to crowdsource travel advice -- actually isn't so outrageous. But the framing and the intention -- to label whole geographic areas as "good" or "bad," "safe" or "unsafe" -- make the operation distasteful, even without the word "Ghetto" in the name.

Yet in the growing field of geo-web applications, incorporating safety judgments into navigational aids is becoming increasingly common. Accusations of reinforcing racist or classist stereotypes could be lobbed at any app of this kind. "In any form," writes Emily Badger at The Atlantic Cities, "this idea toes a touchy line between a utilitarian application of open data and a sly wink toward people who just want to steer clear of 'those kinds of neighborhoods.'"

So how should we think about these apps? When does technology step over that line from being merely useful to becoming insidiously stereotype-enforcing?

Anyone can investigate a neighborhood by looking up local crime rates, median income, and demographics online -- not to mention the information gleaned from word-of-mouth reports. To perform such research and then make a decision about traveling to a particular area involves critical thinking, which is hardly objectionable. The ethical problem occurs when your mobile device takes over that thinking for you.

Microsoft’s Pedestrian Route Production technology, patented in January 2012 -- and immediately dubbed "the avoid-ghetto app" by many in the media -- was designed to one day let Windows Phone users filter walking routes according to "weather information, crime statistics, [and] demographic information." According to the language of the patent, such filtering is useful because "if it is relatively cold outside, then a pedestrian is far more likely to feel an impact then [sic] if a vehicle equipped with a heating system protected her. Moreover, it can be more dangerous for a pedestrian to enter an unsafe neighborhood then [sic] a person in a vehicle since a pedestrian is more exposed and it is more difficult for her to leave an unsafe neighborhood quickly." It makes sense to keep safety in mind while navigating an unfamiliar area on foot, but letting a computer algorithm divert you from a particular neighborhood on account of statistics is problematic.
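The patent doesn't spell out how such filtering would actually be implemented, but a rough sketch shows how little it takes to bake that kind of judgment into a routing algorithm. Everything below is hypothetical -- the toy street grid, the per-block "crime score," and the penalty constant are invented for illustration, not drawn from Microsoft's design:

    import heapq

    # Toy street grid: node -> list of (neighbor, distance in meters).
    GRAPH = {
        "A": [("B", 200), ("C", 250)],
        "B": [("A", 200), ("D", 250)],
        "C": [("A", 250), ("D", 250)],
        "D": [("B", 250), ("C", 250)],
    }

    # Hypothetical per-block crime counts; a real system might pull these
    # from open crime data.
    CRIME_SCORE = {"A": 1, "B": 12, "C": 2, "D": 1}

    def safety_weight(distance, crime, penalty_per_incident=50):
        # The penalty constant is the crux: it decides how many extra
        # meters of walking one reported incident is "worth."
        return distance + crime * penalty_per_incident

    def shortest_path(graph, start, goal, weight_fn):
        # Plain Dijkstra over whatever weights the caller supplies.
        queue = [(0, start, [start])]
        visited = set()
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == goal:
                return cost, path
            if node in visited:
                continue
            visited.add(node)
            for neighbor, dist in graph[node]:
                if neighbor not in visited:
                    w = weight_fn(dist, CRIME_SCORE[neighbor])
                    heapq.heappush(queue, (cost + w, neighbor, path + [neighbor]))
        return None

    # By distance alone the app routes you A -> B -> D; once the crime
    # penalty is applied, it quietly reroutes you A -> C -> D instead.
    print(shortest_path(GRAPH, "A", "D", lambda dist, crime: dist))
    print(shortest_path(GRAPH, "A", "D", safety_weight))

The reroute happens silently: the penalty constant that caused it is a value judgment made by whoever wrote the code, and it never appears on the user's screen.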

There’s another feature mentioned in the Microsoft patent that deserves scrutiny: the ability to sell route directions. Corporations could pay to have the app send users through routes with carefully plotted advertising campaigns. If your GPS system directed you to turn down one street rather than a parallel one just so you’d encounter a specific poster, would it do so with your consent? 

Jim Thatcher, a geographer at Clark University, says our increasing reliance on mobile spatial technologies opens the door for something he calls "teleological redlining," in which applications "make it very easy to malign certain areas" and can even "obliterate" the possibility of our encountering certain people, places, and events. What’s more, these applications are generally presented to us as "neutral." We often forget to consider the motivations and biases at work behind the scenes.

Our experiences have always been mediated by technology, Thatcher says, but these days that technology is increasingly opaque to its users -- that is, few of us actually understand the mechanisms behind it. How many people know exactly how an email gets from one inbox to another? In contrast, the way the U.S. Postal Service transports letters is clear to us. Understanding the mechanisms is a key part of understanding the motivations driving these systems.

"If I go online and look up crime reports for a certain area and decide," Thatcher says, "that five murders on this street is too much for me," then he's aware of his own motivations for staying away from that street. On the other hand, when a mobile app performs a similar analysis for a user, the user cannot be aware of the motivations behind the final decision. Whether the app is incorrectly linking poverty to danger or instead choosing a route laid out by an advertiser, the user’s instinct is to follow the directions. Blind trust kicks in. 

As mobile devices get smarter and more ubiquitous, it is tempting to let technology make more and more decisions for us. But doing so will require us to sacrifice one of our favorite assumptions: that these tools are inherently logical and neutral. As "Good Part of Town"-née-"Ghetto Tracker" suggests, even innocuous information can send charged messages if it’s bundled or filtered in a certain way. And as Thatcher points out, the motivations driving the algorithms may not match the motivations of those algorithms' users. 
