Consider the following scenario: you’ve forgotten the difference between a gorilla and a chimpanzee, so you run a quick Google image search for “gorilla” to refresh your memory. Instead of photographs of the animals, photographs of a Black couple appear in the search results.
Is this simply an error in the algorithm? Or is Google, an advertising corporation rather than an information company, repeating the biases that exist in the world in which it operates? Who is responsible for this discrimination, and what can be done to remedy it?
Google’s Algorithms of Oppression
According to Dr. Safiya Noble, professor at UCLA and author of the best-selling Algorithms of Oppression, the bigotry embedded in these networks is “programmed”: the logic is sexist and racist because it allows these kinds of deceptive, inaccurate outcomes to surface. Sadly, countless incidents of harm caused by algorithmic prejudice have been documented.
Dr. Noble joined the At Liberty podcast to discuss what she refers to as “algorithmic oppression,” and what can be done to combat such bias and dismantle systemic racism in software, predictive modelling, search platforms, surveillance equipment, and other technologies.
Google Filters Classify Homosexuality And Lesbianism With Rape, Racism, And Paedophilia
Google Instant screens terms such as “hate crime,” “homosexual,” and “lesbian,” placing them on the same level as “paedophilia,” “rape,” and racial slurs. Yet Instant results are still presented when you search for terms such as “paki jokes,” “coon,” “necrophilia,” “female genital mutilation,” and “kiddie fiddler.”
The filtering is inconsistent: some words or phrases that Google does not want you to search for still return results, while others are blocked entirely. Type the word “Paki,” for instance, and it appears in the search results despite falling under the hate category. Numerous racist phrases exist that Google arguably should not allow people to search for, yet it lets us read about them all the same.
Google Has To Address The Algorithmic Problem
I don’t mean to imply that Google’s filters are unnecessary or entirely ineffective. My point is that the way they have been constructed is not the right approach, and it sends the wrong message to the public. Google needs to address the algorithmic issues in its filtering service.