(The Root) — Ever do a search in Google and get back some crazy result you weren't expecting?
A few years ago, I did a Google search on the term "black girls" to assist my nieces and stepdaughter in some stuff for school. What was the first search result? www.hotblackp—sy.com.
I was horrified. But I was also curious about why this kind of material was most representative of black girls. As a researcher and educator, I knew there had to be more to this phenomenon. I kept collecting searches. That was in 2010.
By 2011, I was conducting a study of Google search to figure out how and why women are hypersexualized in search engines. At that time, when I searched the term "black girls," the No. 1 hit was www.sugaryblackp—sy.com, followed by more porn sites and a rock band of white guys from the United Kingdom using the name "Black Girls." A not-safe-for-work partial screenshot from Sept. 18, 2011, documented those results.
The first page of search results is the most important, because research shows that most people don't go much further. In my study, only one resource out of 10 was an empowerment-oriented organization: Black Girls Rock! And the ads down the right side — from which Google makes most of its money — were sexually explicit. What was clearly missing from the first page of the search in 2011 was content focused on the educational achievements, interests or social success of black girls. Even fashion, games or entertainment targeted toward black girls did not get much priority on the first page.
And what about the two famous black girls in the White House for the past four years? Nope. Absent from the first page, too.
In other words, Google, in one algorithmic click, had made black girls synonymous with sex.
This bothered me. I wrote about this issue for Bitch magazine in the spring of 2012 because it was time to have a frank discussion about what happens to girls and women in search engines. Shortly after that, the results on searches for black girls changed.
Today, when I do the same search, www.blackgirlsareeasy.com is the first hit. (Try it.) While it's not a porn site, it plays off stereotypes of black women and girls for its readers' delight, and its content is consistent with old, derogatory ideas about black women that have often been used to justify their mistreatment.
It is true that results shift over time. Yet over the two years I studied searches on black girls, what was disturbing was how little the sexist content changed, and how it was consistently worse for black, Latina and Asian women. Some results are changing, but some ideas remain.
Today, when you search for girls of any type, you still often get sexist material. As I've written and talked about this phenomenon, I've noticed that hypersexualized representations of black girls have declined, but not so for Asian and Latina girls. We often get content that is derogatory or sexualized — not positive or empowering — aimed at an audience that is unlikely to be an actual girl. Racist and sexist misrepresentations are real.
So yes. Google is racist (and sexist), but it's complicated.
It's complicated because Google has a significant bottom line. Part of the content on the first page of searches is indexed Web pages, which are often biased toward content that Google owns and makes money from, as documented by consumerwatchdog.org (pdf). It's why you're likely to get video results from YouTube (owned by Google) before you get them from Vimeo, even if they both have the same content available.
To get the right ads — the ones that run down the right side or at the very top or bottom of the page — to show up, Google uses a computer algorithm for its AdWords product, which is different from the one it uses to generate search results (the indexed Web pages). This tool creates linkages between keyword searches and ads. In its 2011 Economic Impact report, Google touts that businesses receive $8 in profit for every $1 they spend on its ad programs.
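To make that keyword-to-ad linkage concrete, here is a deliberately simplified sketch in Python of how an ad system might match a search query against keyword phrases that advertisers have purchased. The domain names, bid amounts and function are invented for illustration; this is not Google's actual AdWords logic, only the general idea of ads being tied to bought keywords.

```python
# Hypothetical illustration: advertisers bid on keyword phrases, and the ad
# system shows the highest-bidding ads whose keywords appear in the query.
# This is NOT Google's actual AdWords logic, only a sketch of keyword-ad linkage.

ad_bids = {
    "black girls": [("example-adult-site.com", 1.20), ("example-retailer.com", 0.40)],
    "college scholarships": [("example-lender.com", 2.10)],
}

def ads_for_query(query, bids):
    """Return ads whose bid keywords appear in the query, highest bid first."""
    matches = []
    for keyword, ads in bids.items():
        if keyword in query.lower():
            matches.extend(ads)
    return sorted(matches, key=lambda ad: ad[1], reverse=True)

print(ads_for_query("cute black girls hairstyles", ad_bids))
# The query inherits whatever advertisers chose to buy against the phrase,
# regardless of what the searcher actually wanted.
```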
As for the search results we find on the first page, sometimes they are the result of popularity. Sites that get many visits or clicks are likely to surface near the top, but there are other factors, too, like the way advertising is linked to certain words. Google also makes money from its AdSense program, which helps websites link their ads to searches on specific terms.
Search engine optimization is also at play. Managed by a cottage industry of companies and consultants outside of Google, it is the practice of linking keywords to the pages of a website, often embedding them directly in the page code, to help drive a site to the top of the results.
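Putting the popularity signal and those embedded keywords together, here is a minimal sketch, again in Python and again hypothetical, of how an organic ranking might surface pages by counting clicks and checking keyword overlap with the query. Real search engines use far more signals; the sites, click counts and weights below are invented for illustration.

```python
# Hypothetical illustration of an "organic" ranking: a page rises if it gets
# many clicks and if its embedded SEO keywords match the query.
# The pages, click counts and weights below are invented for illustration.

pages = [
    {"url": "blackgirlsrock.org", "clicks": 5_000,
     "keywords": ["black girls", "empowerment", "education"]},
    {"url": "example-stereotype-site.com", "clicks": 90_000,
     "keywords": ["black girls", "hot", "dating"]},
]

def rank(query, pages, click_weight=1.0, keyword_weight=50_000.0):
    """Score pages by click popularity plus keyword overlap with the query."""
    q_terms = set(query.lower().split())
    scored = []
    for page in pages:
        kw_terms = set(" ".join(page["keywords"]).lower().split())
        overlap = len(q_terms & kw_terms)
        score = click_weight * page["clicks"] + keyword_weight * overlap
        scored.append((score, page["url"]))
    return [url for _, url in sorted(scored, reverse=True)]

print(rank("black girls", pages))
# A heavily clicked, keyword-optimized site outranks the empowerment site,
# even though it is a worse answer to what a young searcher actually needs.
```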
Whatever the Web mechanisms, sex is a lucrative industry online, so the relationship between ads and websites that appears when you search for women and girls is no surprise. Anyone can buy "black girls" or any other word combination. In this business model, communities are less in control of the ways in which they are represented. Identity is for sale to the highest bidder.
Latanya Sweeney of Harvard University just released a study (pdf) about Google's AdWords and racial identity. She found that searches on "black-sounding" names are more likely to bring up advertising for criminal-background checks than searches on "white-sounding" names. Sweeney's work points to the ways that racial bias happens in Google search.
What is worrisome about the commercial-search business model is that companies have a vested interest in our clicking on websites and ads that make them the most money. This model has little relationship to the best way for us to find knowledge or information that we can trust. It drives home the reason we cannot substitute search engines for great schools, competent teachers and well-funded libraries. It raises questions about how companies, not communities, control what is found in a search engine. For marginalized groups that need continued advocacy for social, economic or political justice, co-optation of identity by these processes is problematic.
But it's not just girls and women who should take note. In 2004, the Anti-Defamation League challenged Google when searches on "Jew" brought back anti-Semitic, neo-Nazi and Holocaust-denial websites. Google responded by issuing an explanation that signaled it could do little to affect search results. It claimed that its algorithm technology was neutral, and search results were a matter of how people use Google, rather than the technology itself.
Consider the website MartinLutherKing.org, dedicated to discrediting Dr. King. Stormfront, the site's owner, can game the algorithm to get first-page placement. Despite protests, Google will not demote the racist site. Another example: In November 2009, a search for Michelle Obama produced a photoshopped image of her as a monkey. Rather than remove the image from its database, Google ran a disclaimer about offensive search results. Many politicians and celebrities, including George W. Bush and Rick Santorum, have also been humiliated by losing control of their identities in search engines.
Again, as with other forms of hate speech protected under the First Amendment, Google's position supports the idea that racists can co-opt the name of Dr. King, Michelle Obama or the Jewish community under the auspices of freedom of speech, without regard to the social ramifications of their words. By this logic, Google bears no social responsibility for the results of its algorithm.
Many people are at a disadvantage when it comes to understanding how search works, while clear and obvious racist (and sexist) values are embedded in some of the digital technologies we use. What both Sweeney's research and mine show is how racism and sexism function on the Web. This is heavy stuff, but many of us are trying to call attention to Google's bias because we want to see change. Change is possible.
In the end, it's really not so complicated.
Google is a social technology whose effectiveness is determined by its use. It is a reflection of a profit model, and it is a reflection of our societal values, with one big exception: Google's algorithms are made by people, and hence, they can be made more socially responsible. An unwillingness to do so has real civic costs that we are just now beginning to uncover.
Racial bias is real on the Internet, and the search engines we use reflect it. The research raises the question: If Google can't do anything about this kind of sexism and racism, then what search engine can?
Innovation is possible. Google is the very company we turn to as the hallmark of innovation. In fact, it just released a new report about how search works, making the process more transparent than ever before.
Let's bring the leading minds to the table and explore the opportunities, rather than perpetuate old traditions in new media.
Safiya Umoja Noble is an assistant professor in the department of African-American studies and a faculty affiliate of the Graduate School of Library and Information Science, the department of gender and women's studies and the Center for Writing Studies at the University of Illinois at Urbana-Champaign. Follow her on Twitter.