TMCnet Feature
February 07, 2013

Racial Profiling on Google? Yes, Says Harvard Professor

By Jacqueline Lee, Contributing Writer

Latanya Sweeney, a professor of government and technology at Harvard who also specializes in online privacy, decided to find out whether Google searches exhibit racial bias.



Sweeney decided to conduct the experiment after a Google search of her own name triggered an ad for a background check service, implying that she had an arrest record.

Sweeney and her team of researchers conducted 2,184 searches for racially associated names. The team investigated both Google.com and Reuters.com, which displays ads from Google AdWords.

After conducting these searches, Sweeney determined that when she entered common African-American names like DeShawn, Jermaine and Darnell, both sites generated ads suggesting an arrest record. In fact, Sweeney reported that an arrest-related ad came up in 81 to 86 percent of searches on one site and in 92 to 95 percent of searches on the other.

In contrast, names predominantly given to white babies, such as Emma, Jill and Geoffrey, generated arrest-related ads in only 21 to 23 percent of searches on one site and in zero to 60 percent of searches on the other.

“There is discrimination in delivery of these ads,” Sweeney determined.

Google responded that AdWords does not conduct racial profiling. “We also have a policy which states that we will not allow ads that advocate against an organization, person or group of people,” a Google spokesperson said in a statement.

According to Google, the search engine isn’t matching African-American names to arrest-related advertising. The keywords that generate the ads are chosen by the advertisers themselves.

Sweeney conceded that she would have to conduct more research to determine whether Google's algorithms, individual advertisers or societal bias accounted for her results. However, she stands by her findings, noting that there is only about a 1 percent chance the observed differences occurred by chance.
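
As a rough illustration of what that kind of significance claim involves, the sketch below uses Python's SciPy library and purely hypothetical counts, not Sweeney's actual data, to run a chi-squared test of independence. The test asks how likely a gap this large in arrest-ad rates would be if ad delivery had nothing to do with which group of names was searched.

    # Illustrative sketch only: the counts below are hypothetical, not from the study.
    from scipy.stats import chi2_contingency

    # Hypothetical contingency table: rows = name group, columns = ad type shown.
    #                        arrest-related ad   other ad
    black_associated_names = [92,                 8]   # e.g., 92 of 100 searches
    white_associated_names = [23,                77]   # e.g., 23 of 100 searches

    chi2, p_value, dof, expected = chi2_contingency(
        [black_associated_names, white_associated_names]
    )

    print(f"chi-squared = {chi2:.1f}, p-value = {p_value:.4f}")
    # A p-value below 0.01 would mean less than a 1 percent chance of seeing a gap
    # this large if ad delivery were actually independent of the name group.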

“Notice that racism can result, even if not intentional,” Sweeney wrote in her report, “and that online activity may be so ubiquitous and intimately entwined with technology design that technologists may now have to think about societal consequences like structural racism in the technology they design.”




Edited by Braden Becker