Google Searches Prove Racism Is Embedded in the Algorithm

Updated Oct. 23, 2018, 4:10 a.m. ET


Racism existed before search engines, but Google has consistently failed to keep it out of its results.

Let’s look at some times Google’s algorithms could have done better.

1. The now classic example: the difference between searching for “three black teenagers” and “three white teenagers”

I didn't think it was true until I did the search myself... 😕😕😕😕 #googleisracist

A post shared by Annmarie Stubblefield (@marieluv_76) on Jun 9, 2016 at 7:19am PDT

2. That time Google insisted that white people do not steal cars.

#GoogleIsRacist

A post shared by Juan C Torres REALTOR® (@the_juan_and_only) on Apr 29, 2012 at 8:54pm PDT

3. This Instagram user’s child began to Google “Why are asteroids…” when Google decided to take over and ask “Why are asians bad drivers?” 

My daughter is starting to look up asteroids and she gets this when typing it in... Lol @nailsbykenny

A post shared by Sharona Lynn Sendecki (@sharonalynnsendecki) on Feb 1, 2014 at 6:59pm PST

4. And it's not the first time Google has reinforced racial stereotypes through its autocomplete suggestions.

Google is a little racist... #google #googleisracist #blackstereotypes #racist #faf

A post shared by Daniel Tybroski (@daniel_tybroski) on Apr 27, 2013 at 4:31pm PDT

5. Here are some claims Google makes about Jewish people.

#googleisracist pt.1

A post shared by Brendan Arve McCaffery (@bmccaf10) on Jun 17, 2014 at 12:06pm PDT

6. And here's a stereotype the search engine feels is worth perpetuating.

Lol! #googleisracist #wedontalllookalike

A post shared by @ wheezyhuy on Oct 23, 2012 at 2:31pm PDT

Now, let's dig a little deeper into the history of Google's racist results and the search engine's responses to complaints. Remember the example we used in #1?

In 2016, 18-year-old Kabir Alli Googled a simple phrase: “three black teenagers.” What appeared went viral: mugshots of black teenagers. When he Googled “three white teenagers,” the result was completely different: happy groups of white teenagers laughing and hanging out.

YOOOOOO LOOK AT THIS pic.twitter.com/uY1JysFm8w

— ❤️💫 heartthrob 💫❤️ (@iBeKabir) June 7, 2016

The outpouring of anger led to an apology from Google, but also a statement that there was little the company could do about the algorithm. Since then, people have tried to get Google to take responsibility for its algorithm and recognize that the search engine is not neutral.

Algorithms of Oppression is a great, socially urgent book about how ‘racism and sexism are part of the architecture and language of technology.’ I learned so much from it. Thank you @safiyanoble! pic.twitter.com/vx4vKF4MPV

— Tanya Horeck (@tanyahoreck) April 22, 2018

In her book, Algorithms of Oppression: How Search Engines Reinforce Racism, published this year by NYU Press, Safiya Umoja Noble explores how Google not only fails to combat racism in its search engine but actively reinforces racial stereotypes. Noble, an assistant professor at the University of Southern California, teaches classes on the intersection of race, class, and the internet.

And this important line about culpability, IT companies, and software engineers: "If Google software engineers are not responsible for the design of their algorithms, then who is?"

— Dr. Dhanashree Thorat (@shree_thorat) April 24, 2018

At Google in 2017, 91% of employees were white or Asian, and only 31% of the workforce identified as women. In the technical parts of the corporation, the numbers are even worse: just 20% of those employees are women.

Noble argues that algorithms are not neutral: they are created by people and filled with bias. Based on her extensive research, Noble believes that Google must take responsibility for the racism of the search engine. Algorithms are composed in computer code, and like all languages, this language reflects the culture it is created in. 

And this -> "I consider my work a practical project, the goal of which is to eliminate social injustice and change the ways in which people are oppressed with the aid of allegedly neutral technologies."

— Dr. Dhanashree Thorat (@shree_thorat) April 24, 2018

Noble found that Googling “black girls” quickly leads to porn. Likewise, Googling “asian girls” quickly surfaces sexualized images of women wearing little clothing. And when she Googled “successful woman,” she most often found images of white women.

When these instances emerge, Google always blames its “neutral” algorithm and makes small fixes to those specific searches. Noble argues that Google needs to take responsibility and undertake a larger reckoning, completely reworking its algorithms.

"I don’t think tech companies are equipped to self-regulate any more than the fossil fuel industry." @safiyanoble #CUNYGCITP pic.twitter.com/FJWxEbXlvj

— Jenna Freedman 🤖 (@zinelib) April 21, 2018

Instead of opening up portals of information and increasing understanding across difference, Google is reinforcing old stereotypes and giving them new life. To describe this, Noble coined the term “technological redlining,” echoing the racist housing practices of the second half of the 20th century. Today, tech companies obscure how their programs and algorithms make decisions, effectively hiding their biases.

mic drop @safiyanoble 👏👏👏👏👏 pic.twitter.com/1aDtGod2U5

— erin glass (@erinroseglass) April 25, 2018

As Safiya Umoja Noble recognizes, there are so many ways Google and other search engines could do better. 

First, tech companies need to recognize that algorithms are not inherently neutral. There is no easy fix for widespread bias, but a more diverse workforce could begin to address these issues.

Google is just the beginning; Noble hopes other tech companies, like Yelp, will learn similar lessons.

"We have many racist stereotypes in the United States that harm Asian Americans, and these kinds of derogatory notions are likely... circulating widely as key phrases in other online spaces outside of Yelp." @safiyanoble https://t.co/mO118fVtKa

— NYU Press (@NYUpress) April 26, 2018

