Don’t Be Evil

Student Perspectives is our series of guest posts written by current CityLIS students.

This post is written by current CityLIS student Hilary Jordan. She discusses the biases embedded in search algorithms and the importance of remembering the human factor when discussing how information is processed and disseminated. The original post can be found here.

***

On June 27, 2017, the European Commission handed Google a €2.42 billion (£2.1 billion) fine for breaching EU antitrust rules. It was slightly too late for Foundem, a price-comparison website which had consistently been marked as spam after an update to Google’s algorithm and pushed to the bottom of its search results. One day it ranked first or third; the next it was in the 70s or 80s. Now Foundem’s website exists as a relic, a record of its long battle with Google.

Incidentally, Google’s own price-comparison site, Froogle, sat at the top of the search results.

Being essentially invisible in Google searches was an existential threat to the company. Most people use Google as their gateway and guide to the Internet. Over two years Foundem pursued every conceivable avenue, but there was no reasoning with Google.

Google has repeatedly denied any wrongdoing, insisting that its algorithms were simply doing their job. It finally relented and “manually whitelisted” Foundem in December 2007.

Whilst in this instance Google was held (somewhat) to account, I believe some of the language used reveals a deeper problem in how we talk about algorithms. Google’s defence was that an update to the algorithm, designed to root out spam websites with little original content, penalised Foundem. But Google is its search algorithm. If it made a mistake, it was Google’s mistake. Whether the act was accidental or malicious, Google cannot somehow disown it.

The same problem crops up whenever we hear about “racist algorithms” in the news. Stories such as image-labelling software identifying a black woman as a gorilla can prompt questions about who is working on these algorithms and how they are doing it, but often we fail to probe deeper.

If it is an algorithm that makes mistakes, or an algorithm that collects our data, then it often appears to slide into our lives, ubiquitous and silent. If, though, there is a human on the other end, then we can challenge them. We need to remember that human, not necessarily so that we have specific people to punish, but so that we are active users of the Internet rather than passive ones.

Sources:

Pasquale, F. (2015). The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge, MA: Harvard University Press.

Vincent, J. Google ‘fixed’ its racist algorithm by removing gorillas from its image-labeling tech. Available: <https://www.theguardian.com/technology/2017/dec/04/racist-facial-recognition-white-coders-black-people-police>

Wilson, N. Google’s nemesis: meet the British couple who took on a giant, won… and cost it £2.1 billion. Available: <https://www.wired.co.uk/article/fine-google-competition-eu-shivaun-adam-raff>
