Spotlight on our alumni… Maryam Negm

Welcome to the new blog section “Alumni Spotlight”!

Today we meet and celebrate Maryam Negm, who received the highest dissertation prize in our MA in International Politics and Human Rights (cohort 2021-22). Her dissertation was entitled ‘How Data is Breaking Democracy’.

This is what she says about her experience at City:

“I’m an Egyptian national who grew up in the Gulf, attending multicultural international schools before obtaining a BBA with a concentration in Finance from the American University in Cairo. I was quite fortunate to have professors who possessed a wealth of knowledge and experience that allowed for a high level of intellectual discussion, thus broadening my perspective and enhancing my critical thinking. Pursuing an MA in International Politics and Human Rights at City, University of London, was a challenging endeavour that would not have been successful without the quality of education and support I received. I am honoured to have earned the highest dissertation grade in my cohort. After completing my course, I started working as a project manager for Complete Projects, a boutique consulting firm, where I hope to continue to progress in my career, cooperating and knowledge-sharing with people from all walks of life.”


And here you can read an extract from the dissertation section entitled “Manifestations of Data Colonialism in Chinese Big Data”:

The goal of these state data and AI programs is not to enhance freedom. On the contrary, they damage the space of freedom and subjectivity, as did historical colonialism. My findings shed light on China’s lack of epistemic objectivity, in which it is firmly believed that this governance model is the most beneficial to society and the state. Thus, a theme or an ideology emerges to justify such an invasive and corrosive imposition, meeting the third feature of data colonialism. The imposition of specific social and economic standards upon society, to be enforced by incentivising and punishing modified individual behaviour through the datafication of all human life by way of near incessant monitoring, results in the utter reduction of autonomic individualism to mere data points. The 2014 Plan and the Guidelines on Combined Punishments illustrate the government’s belief that limiting a person’s autonomy is ultimately for their own good, and thus it justifies its practice of nudging. By appropriating human life and social structure and enabling data colonialism, China can manipulate and control its population into consuming, sharing, prohibiting, or any other behaviour it deems proper and true. This is what Richards warns of when governments use Big Data. As a result, self-autonomy and free choice are heavily impacted due to self-surveillance and self-censorship to improve quality of life, thus violating Dahl’s polyarchy.[1] The state’s concentration of power fulfils data colonialism’s fourth and final feature. This new world order is not based on freedom and democracy. Instead, it cleanses citizens of individuality, choice, and rationality, which are crucial democratic virtues.

My analysis emphasises the importance of algorithmic code and shows that already opaque governance structures call for transparency. If code is to set the rules that impact people’s lives, including insurance rates, employment opportunities, university admissions, and even immigration procedures, then that code must be revealed to prevent discrimination and keep regulators in check. Otherwise, algorithmic coding becomes the new digital informant in state surveillance programs. Data about the bio-self, such as race, age, sex, religious belief, marital status, employment history, family background, health, and nationality, create patterns for machine learning algorithms that make prediction possible (Williams et al., 2018). The correlations derived from these learned patterns are then used to impute missing data. This process may seem innocent, but it becomes problematic when individuals are not privy to the algorithms being used, because algorithms designed to find patterns also create biases that reflect what is currently, and more importantly previously, happening in society. Pasquale (2016) uses the term ‘black box’ to refer to these hidden algorithmic rules. The black box problem is more dire when we consider that it is upon these human biases that resources and risks are allocated to different social groups (Pasquale, 2016). If left unchecked, algorithmic governance exploits the marginalised by reducing them to malleable data points, and will continue to do so in the absence of transparency, contrary to Schumpeter’s minimal standard of democracy.[2]

The same biases can be weaponised against citizens who exercise free speech on social media, a pillar of democratic societies. This was made blatantly clear in my findings on the 2014 Plan. The Plan emphasises that traditional moral values of honesty and integrity are lacking in Chinese practices today and therefore outweigh respect for individual rights. As is common practice in authoritarian regimes, whispers of dissent are swiftly crushed to maintain order and control. With the creation of the digital panopticon, the introduction of new AI facial recognition software, and extensive online and physical surveillance, identifying citizens participating in anti-government speech has never been easier. It is, quite literally, built into the system, whereby critical expressions about the state are met with a lower rating, a punishment, and the possibility of imprisonment. The expansion of data colonialism alleviates much of this work, as citizens conduct their lives online and willingly relinquish their data to surveillance capitalism companies, which in turn offer it to the government on a silver platter. The government ‘nudges’ citizens with the threat of harsh consequences to modify their behaviour, self-censor, and vanquish safe spaces for exchanging ideas, maintaining order under absolute authoritarian rule, thus eradicating basic civil liberties and marking another violation of Schumpeter’s and Dahl’s standards of democracy.


Footnotes

[1] Refer to section 5 for Schumpeter’s and Dahl’s definitions of democracy.

[2] Ibid.


References

Baviskar, Siddhartha, and Mary Fran T. Malone (2004) “What Democracy Means to Citizens — and Why It Matters.” Revista Europea de Estudios Latinoamericanos y Del Caribe / European Review of Latin American and Caribbean Studies, no. 76: 3-23. http://www.jstor.org/stable/25676069.

Pasquale, Frank (2016) The Black Box Society: The Secret Algorithms Behind Money and Information. Cambridge: Harvard University Press.

Richards, Neil (2013) “The Dangers of Surveillance”. Harvard Law Review 126 (7): 1934-1965. http://www.jstor.org/stable/23415062.

Schumpeter, Joseph (2003) Capitalism, Socialism, and Democracy. USA: Taylor & Francis e-Library.

Williams, Betsy Anne, Catherine Brooks, and Yotam Shmargad (2018) “How Algorithms Discriminate Based on Data They Lack: Challenges, Solutions, and Policy Implications”. Journal of Information Policy 8: 78-115. doi:10.5325/jinfopoli.8.2018.0078.
