The GroupMe Media Frenzy

Peter Murray of The Georgetowner. | CBS This Morning

Georgetown stores are in the business of racial profiling—at least according to a Washington Post report picked up by Fox 5, channel 9, Drudge, Gawker and the Daily Caller. The report details racial bias in a public safety reporting app set up in cooperation with police by the Georgetown Business Improvement District, but media discussion has left out a few crucial details.

The Washington Post story borrows heavily from the research, facts, and anecdotes in an investigative report I wrote for The Georgetowner’s Aug. 5 issue, all without proper (or any) attribution. While we are thrilled that the Post’s sway is causing more media outlets to pick up our story, we are disappointed that one of our city’s best-reputed media outlets has failed to recognize a basic tenet of journalism: citation.

Next, the app is not racist; some of its users are. Messages on it may contain racially charged or biased content, but the app itself is a simple, real-time group messaging app along the lines of iChat or WhatsApp. Looking through messages in the app, however, we saw an alarming trend: users reported African Americans 15 times as often as white people.

Many posts in the app have led to real arrests of real criminals. That much can’t be disputed. The problem, demonstrated by the catalog of GroupMe messages, is that users are watching blacks more closely than whites. Yet research shows that African Americans are no more likely to shoplift than white people. (Other studies have shown that retail employees are much more likely to steal from their employer than are store outsiders.) Even if African Americans in Georgetown were more likely to shoplift than whites—the implication of the BID’s recent statements to the press—the disparity between whites and blacks flagged in the app is too large to reflect that likelihood. More likely, white thefts aren’t getting caught.

Looking through the app, you’d think that all of the criminals in Georgetown are black. The constant stream of negative reinforcement about African Americans leads to confirmation bias. For example, research on police has found that when officers use race as a factor in criminal profiling based on presumed statistical probabilities, they contribute to the very statistics upon which they rely, which helps further justify the profiling of black people. A similarly self-fulfilling, circular phenomenon is likely occurring among GroupMe users in Georgetown.

Using GroupMe to increase communication has serious potential to further public safety. While most posts concern criminal or suspicious behavior, some provide alerts on weather and traffic, and the app as a whole keeps community members connected to each other and to the police. However, there must be training so that app users recognize their own biases before they broadcast them to the group. Since the story was published in The Georgetowner, the BID has indicated that it will take steps to better train users and eliminate bad actors from the group. We look forward to seeing those steps in action and reporting further on this important topic.

The issue raises questions not only for Georgetown but for other places where police, businesses and citizens are implementing new digital communication strategies for community policing with GroupMe, Facebook Chat or the next big messaging app. How do we make sure app users are sending accurate, unbiased, valuable information to the police? When messages demonstrate bias, is a potentially small reduction in crime (neither the police nor the BID has any hard data linking arrests to GroupMe posts) worth alienating an entire group of citizens? Perhaps we can begin to answer these questions once racial bias is exposed on more private message boards across the country and our society embraces a culture of honesty and transparency when it comes to the intersection of criminal justice and technology.
