Student Perspectives is our series of guest posts written by current CityLIS students.
This post is written by current CityLIS student Stephanie McMullan. Inspired by the issues raised in the TV series ‘Bodyguard’, Stephanie considers the ethical implications of mass surveillance and grapples with the complex topic of digital rights and internet privacy. The original post can be read here.
A little late to the party, I sat down last Saturday night to binge-watch the BBC’s Bodyguard after a day of reading articles for my new postgraduate course. A drama about political double-dealing, murder and terrorism, it wasn’t something I expected to have any bearing on the articles I had been reading for my Information Sciences course. However, beyond the web of murder and espionage, another element of the drama caught my attention – RIPA 18, the fictitious “Snooper’s Charter” the Home Secretary is trying to pass.
The imaginary RIPA 18 is a bill that would give the government the power to view accounts and search histories, draw conclusions about people seeking out suspect information and prosecute them, even retrospectively. Understandably, people in the show are up in arms about this intrusion into their privacy.
Nevertheless, while Bodyguard sensationalizes the prevalence of murder and espionage in politics, it struck me how true to life the government’s access to our data is.
To many people, the prospect of a Big Brother society is something that lives on the edge of our world: in television, in fiction, or in faraway states like Russia, China or even India. And if people do think about the access the government has to our details, it is easy to fall back on the adage “if you have nothing to hide you have nothing to fear”.
However, this is becoming harder and harder to believe. A few weeks ago The Guardian published an investigation into the use of algorithms by local councils to predict child abuse in families. According to the article, councils used personal details such as school attendance, arrears data, housing repairs and police records to identify at-risk children. While done with the best of intentions, this highly controversial use of personal data brings “the risk of accidentally incorporating and perpetuating discrimination against minorities” as “systems inevitably incorporate the biases of their designers”. Did the people whose information was processed consent to the use it was put to? Even if they did, should our data selves really be able to bring the fear of prosecution and suspicion into our offline lives?
And it’s not just the government that makes use of our private lives. In a scandal that could belong on the BBC, the Cambridge Analytica exposé highlighted how much information is held by social media sites like Facebook, and how much of ourselves we can reveal with something as simple as a ‘like’ or a personality test. It also showed how this information can be used to target our political sensibilities and influence world events. With the Facebook accounts of 87 million people leaked again this week, we can only speculate as to what use this data will be put.
So if our data can be used to incriminate us and potentially change the outcome of democratic elections, shouldn’t we be more aware of it and take better care of it?
It seems the EU have finally tried to start answering these questions. With the newly enacted GDPR, individuals are being given greater rights over their data: the ability to request their information more quickly, to know what their data is being used for, and to apply for the “right to be forgotten”. Organisations, meanwhile, are being held responsible for the safekeeping of data, with huge fines hanging over their heads if data breaches aren’t properly handled.
Whilst this is a step in the right direction, “as it is often the case with complex legislation, the [GDPR] Articles leave grey areas of normative uncertainty uncovered” (Floridi, 2018). Therefore, the legislation is only as useful as the individuals who uphold and apply it. LIS professionals clearly sit within any framework that upholds data and information policies, so in many ways they can shape the ethical form data protection takes under GDPR. As the custodians and retrievers of information in businesses and libraries, they can help with the formation of information policies and ensure these are compliant with the law and ethically constructed to protect both the individuals and the companies they serve. We need only look at the CILIP and IFLA ethical frameworks to see that information professionals are already thinking about these issues. Furthermore, as educators, LIS professionals teach and instruct people at all levels – from schools and universities to businesses and public services – in a wide range of databases and systems, and can ensure that best practice is observed and that people are informed of the best ways to protect themselves.
So in the new world of big data can library and information professionals be our bodyguard? I think so; though to the best of my knowledge none of us look like Richard Madden…