The bias against black children is nowhere more clear than in the Google "three black teenagers" challenge.
Type “three black teenagers” into a Google image search, and a bunch of menacing police mug shots turn up. “Three white teenagers” results in smiling young folks who look to be selling Bibles door-to-door.
Kabir Alli, an 18-year-old from Midlothian, Va., heard about the “three black teenagers” challenge and posted a video clip on Twitter this week with his results.
"I had actually heard about this search from one of my friends and just wanted to see everything for myself. I didn’t think it would actually be true," Alli told USA Today. "When I saw the results, I was nothing short of shocked."
Alli’s Twitter post has been retweeted more than 70,000 times, and the hashtag #threeblackteenagers has been used to discuss the pervasive anti-black bias in American media and culture today.
From #IfIWereGunnedDown to the recent conversation about how convicted sexual assault felon Brock Turner had his high school yearbook photo shown during his trial (and not his mug shot), this is not a new issue.
"I understand it’s all just an algorithm based on most visited pages, but Google should be able to have more control over something like that," Alli said.
Google—which handles at least 2 trillion searches per year—says it’s merely reflecting back the biases that exist in society (i.e., how people tag and describe photos) and is not at fault. Alli says he does not believe that Google is racist.
"If someone is going to search for mug shots, we are not looking for Google to give back positive-looking mug shots. It does not mean you won’t find negative content if you go looking for it," said “longtime Google observer" Danny Sullivan to USA Today. "I do think Google needs to spend more time looking at what they are doing and asking themselves how they balance being a reflection of what’s across the Web with also making sure that they are not reinforcing things and that they are making sure people are fairly treated."
Last year Google apologized—and suspended people’s ability to submit edits to Google Maps—after searches using the n-word directed users to the White House.
And last July, Google apologized after its Photos app automatically labeled black people as "gorillas." Programmer Jacky Alciné tweeted a screenshot of photos he had uploaded in which the app had labeled Alciné and a friend, both African American, "gorillas."
"These companies are not doing the user test keyword searches that African Americans are doing. This is one of the reasons we consistently see this happening," Safiya Umoja Noble, a professor of information studies and African-American studies at UCLA who is writing a book that compiles and analyzes the social consequences of racially biased online searches, told USA Today.
Read more at USA Today.