There is no doubt that the internet has drastically changed contemporary society by increasing global connection, productivity, and convenience for the majority of people around the world. On a surface level these all seem to be reasons to celebrate the internet, and they are; however, the perceived positive impacts overshadow the bigger picture, as the big-player corporations who dominate the boundary-less space of the online world are rarely challenged in their processes. In 2015 Frank Pasquale wrote a book titled The Black Box Society, which focused on the secret algorithms that he argues control most aspects of modern life, specifically money and information. He argued that “what we do and don’t know about the social (as opposed to the natural) world is not inherent in its nature, but is itself a function of social constructs” (Pasquale, 2015, pg. 2). He is challenging here the assumed objectivity of algorithms within the online sphere, suggesting that all meaning has been constructed through a particular perspective, and that the internet is far from a free, equal, democratic space. The internet is a domain that tends to reflect the material world that we all occupy, and vice versa. In a capitalist, patriarchal society that is vastly unequal, the online world reflects those same values, as exemplified by double standards and a lack of consequences for those who hold particular forms of power, among other discrepancies that often go ignored.
Pasquale recognises the importance of power, arguing that “knowledge is power” and that “to scrutinise others while avoiding scrutiny oneself is one of the most important forms of power” (Pasquale, 2015, pg. 3). We see this play out in the dynamics between tech corporations and the average citizen. As discussed in the chapter, the public are treated as ‘open books’: we voluntarily hand our personal data over to these corporations in order to make our lives easier. And while the convenience is easy to recognise in our daily lives on a personal level, the overall impact on our increasingly connected, globalised world is harder to track. Big data platforms get to dictate their own ‘community guidelines’ and policies around the type of content that is allowed on their sites; in an almost invisible way they are shaping culture by defining what is and isn’t acceptable in the online (and subsequently offline) world. In a 2016 article, Quartz found that particular groups were consistently targeted and censored online, including plus-sized women, mothers, women in general, sexual health organisations, indigenous groups, journalists (often those speaking out against particular political parties or entities), artists, museums and galleries (specifically those that included any sort of nudity), and LGBTQ groups and individuals (York, 2016). The continued policing of marginalised groups will be discussed in more detail below.
Due to our increasing reliance on digitised devices, it is easy to ignore the invasion of privacy and the ways in which these corporations have used algorithms and other new technologies to influence what we buy, where we go, or even who we vote for (The Great Hack, 2019). Influence over our everyday lives is slowly shifting from government to corporations; as Pasquale argues, “most major decisions about our lives are made in the private sector, not by a state bureaucracy” (Pasquale, 2015, pg. 17). Given that many people fail to interrogate the intentions of these large corporations, the assumption that they are public goods eclipses the reality that they prioritise profit over anything else. In 2012, one of the richest tech men in the world, Bill Gates, spoke at a media summit, stating that he felt he could probably have as much of an impact on society from outside the political sphere as within it (CBS, 2012). To make change from a political standpoint you need consensus, either within your party or across the population. For someone like Bill Gates to make a change or have an impact, he can simply use his large financial capital to make decisions under the guise of philanthropy. Another significant difference is the length of time you can hold that influence: in politics there are restrictions on the terms you can spend in office, whereas as a ‘philanthropist’ nobody sets restrictions or an expiry date on your contributions. Different forms of power are underestimated when it comes to the online world, and while this example focuses on the economic capital that Bill Gates holds, the internet also privileges other forms of capital, including the social and the cultural.
The title of Pasquale’s book is a metaphor for the secrecy built into the algorithmic system. Corporations like Facebook and Google use predictive algorithms, which take an analytical approach to online data and are concealed within ‘black boxes’. Pasquale calls them black boxes because we, the public, do not have access to the workings of these algorithms: they are protected by law as ‘trade secrets’. Such algorithms rely on existing data sets, which they analyse for patterns that they then replicate in order to predict the future. Historically, the ideology of humanism has prevailed, privileging the European white male as the default human against which all other identities were considered ‘other’ (Braidotti, 2013, pg. 15). So what does that mean for internet users today? It means that bias is built into the seemingly objective formulas, which continue to replicate the status quo. A more recent book on algorithms, written by Safiya Umoja Noble in 2018, is titled Algorithms of Oppression: How Search Engines Reinforce Racism. Her interest in the topic stemmed from her own personal experience using the largest search engine, Google. She describes her horror at the inequalities she saw online: there were significant differences between the results for ‘black girl’ and ‘white girl’, as well as in the auto-fill suggestions that appeared once she began typing into the search bar. As she used this ‘impartial’ tool, Noble noticed that sexism and racism (among other forms of discrimination) were built into a search engine that is presented to the public as democratic and truth-providing. Consequently, all those ‘non-default’ beings are disadvantaged by these invisible algorithms.
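The mechanism just described — a system trained on historical data replicating whatever patterns it finds there — can be illustrated with a deliberately simple sketch. Nothing here reflects Google’s actual (secret) systems; the corpus, the `autocomplete` function, and the phrases are all hypothetical, invented purely to show how frequency-based prediction echoes the biases of its training data:

```python
from collections import Counter

# Hypothetical stand-in for historical query logs (invented data, not
# real search logs). Whatever patterns exist here, the model repeats.
history = [
    "engineers are men", "engineers are men", "engineers are men",
    "engineers are skilled", "nurses are women", "nurses are women",
]

def autocomplete(prefix, corpus, k=2):
    """Suggest the k most frequent past queries starting with prefix.
    Pure frequency ranking: no notion of fairness or accuracy."""
    matches = Counter(q for q in corpus if q.startswith(prefix))
    return [q for q, _ in matches.most_common(k)]

print(autocomplete("engineers", history))
# → ['engineers are men', 'engineers are skilled']
# The most repeated phrase dominates: frequency in past data, not truth,
# determines what the 'objective' system suggests.
```

The point of the sketch is that nothing in the code is overtly discriminatory; the bias lives entirely in the data the formula is fed.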
The biases built into these online institutions are no accident; the existing data that is fed into the algorithmic formulations is never neutral, as it always comes from a particular perspective. This makes it extremely important to be aware of historical constructions of particular identities. In her book, Noble argues that much of the data that exists around minorities perpetuates stereotypes or carries negative connotations. Along with the auto-fill suggestions, Noble raises an interesting point about the page-ranking system that Google employs through its algorithmic calculations. The popularity of particular pages prioritises them over less frequented pages, implying that pages with more visits somehow deserve to be viewed more than others. Noble points out that “if the majority rules in search engine results, then how might those who are in the minority ever be able to influence the way they are represented in a search engine” (Noble, 2018, pg. 6). The inherent flaws in this page-ranking system go largely ignored, given the embedded nature of Google in contemporary (western) society.
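Noble’s point about majority rule in rankings can likewise be sketched as a feedback loop. In this toy model (the page names, click counts, and `rank` function are all invented for illustration; real search ranking is vastly more complex), an initially small gap in attention widens because the top-ranked result keeps attracting the most new clicks:

```python
# Minimal popularity-ranking feedback loop (illustrative only).
# The majority page starts just 10 clicks ahead of the minority page.
clicks = {"majority_view": 100, "minority_view": 90}

def rank(pages):
    # Order pages purely by accumulated clicks, most-clicked first.
    return sorted(pages, key=pages.get, reverse=True)

for _ in range(5):
    top = rank(clicks)[0]   # the current #1 result...
    clicks[top] += 10       # ...receives most of the new attention

print(rank(clicks), clicks)
# → ['majority_view', 'minority_view'] {'majority_view': 150, 'minority_view': 90}
```

After only five rounds the gap has grown from 10 to 60: a rich-get-richer dynamic in which the minority page can never influence how it is ranked, no matter its merit.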
Throughout her book Noble also asks the reader to be mindful of the interests of these platforms, reminding us that they are corporations whose economic goals are prioritised over ethical principles: “despite the widespread beliefs in the internet as a democratic space where people have the power to dynamically participate as equals, the internet is in fact organised to the benefit of powerful elites, including corporations that can afford to purchase and redirect searches to their own sites” (Noble, 2018, pg. 50). The ability to use economic power to influence page rankings is an obvious indicator of the lack of neutrality across multiple platforms. And while accessibility to the internet is undoubtedly increasing across the globe, accessibility in and of itself does not create democracy or equality. Noble suggests that “structural inequalities of society are being reproduced on the Internet, and the quest for a race-, gender-, and class-less cyberspace could only ‘perpetuate and reinforce current systems of domination’” (Noble, 2018, pg. 59). We must continue to challenge the motives of the dominant platforms and push for more transparency in their operations.
The increased reliance on the internet, especially in educational environments, has meant that many people around the world must engage with particular platforms and services in order to participate. This directly impacts the architecture of the internet, and the next part of my presentation will focus on an issue known as shadowbanning. Many argue that the currency of social networks is human attention. According to The Economist, a shadowban “in theory, curtails the ways in which that attention may be earned without blocking a user’s ability to post new messages or carry out typical actions on a network. Shadowbanned users are not told that they have been affected… The only hint that such a thing is happening would be a dip in likes, favourites or retweets [in the case of Twitter] – or an ally alerting them to their disappearance” (GF, 2018).
While shadowbanning occurs on a number of platforms, my focus will be on the photo-sharing app Instagram and its policing of queer people, people of colour, women, sex workers, disabled bodies, plus-size bodies, and most identities falling outside of a cis white male body. When it comes to online nudity, Instagram’s policy appears to be straightforward. Its community guidelines state that “photos of post-mastectomy scarring and women actively breastfeeding are allowed”, but that “photos, videos, and some digitally-created content that show sexual intercourse, genitals, and close-ups of fully nude buttocks” are not (Instagram, 2020). While this appears fairly clear cut, Instagram maintains a murky grey area in which ‘sexually suggestive’ content and its creators often fall victim to the perils of being shadowbanned.
An independent online newsletter and platform specifically for women, trans, and non-binary people, Salty describes its mission as follows:
“Legacy and mainstream media has failed women, trans and non-binary people. They assumed our straightness, our thinness, our frigidity and our fragility for far too long. They preyed on our insecurities in order to market products to us, and told us stories from one perspective, over and over again.
But Salty isn’t legacy media. We’re a radical new publishing platform with a mission to pass the mic to Salty babes across the world and amplify their voices. We’re fighting everyday to ensure the authentic stories of women, trans and non-binary people are not erased.” (Salty, 2020)
Salty recently undertook its own research into the ways that algorithms affect marginalised groups, concluding that plus-size profiles were often flagged for ‘excessive nudity’ and ‘sexual solicitation’, and that queer people and women of colour are policed far more than their white counterparts. When images of fully clothed plus-size or black women are removed for being ‘inappropriate’, the platform’s AI learns to adopt biases that reinforce misogyny and racism, creating barriers for certain groups in the digital realm. Again the irony becomes apparent: social media is assumed to be an equalising force in modern society, but in reality it serves to further suppress the communities who are most often discriminated against. Salty concluded that risqué content featuring cis white women seems less censored than content featuring plus-size, black, queer women, and that cis white men appear to have a free pass to behave and post in any way they please, regardless of the harm they inflict. A clear example of this double standard is the presence on Instagram of both PornHub and Brazzers, two of the biggest pornography companies in the world, with combined followings exceeding 18 million people. The illusion of freedom on the internet only serves to benefit those already at the top of the social hierarchy; the marginalised who challenge existing norms are constantly punished with no fair or due process. We as a global society must continue to challenge these corporations and push for more transparency if we realistically aim to eradicate different forms of discrimination in both the online and material worlds.
Braidotti, R, 2013, ‘Post-Humanism: Life beyond the Self’, in Posthuman, Polity Press, Oxford, pp. 13-54
BrazzersOfficial, 2020, <https://www.instagram.com/brazzersofficial/>
CBS, 2012, ‘Bill Gates Says He’ll Never Run For Office’, CBS News, 9 October 2012, viewed 30 March 2020, <https://www.cbsnews.com/news/bill-gates-says-hell-never-run-for-office/>
GF, 2018, ‘What is “shadowbanning”?’, The Economist, 1 August 2018, viewed 30 March 2020, <https://www.economist.com/the-economist-explains/2018/08/01/what-is-shadowbanning>
Instagram, 2020, ‘Community Guidelines’, <https://help.instagram.com/477434105621119/?helpref=hc_fnav&bc=368390626577968&bc=285881641526716>
Noble, SU, 2018, ‘A Society Searching’, Algorithms of Oppression: How Search Engines Reinforce Racism, New York University Press, New York, pp. 15-63
Pasquale, F, 2015, ‘Introduction: The Need to Know’, The Black Box Society: The Secret Algorithms That Control Money and Information, Harvard University Press, Cambridge, pp. 1-18
PornHub, 2020, <https://www.instagram.com/pornhub/>
Salty, 2020, ‘What We Stand For’, Salty, <https://saltyworld.net/whatwestandfor/>
The Great Hack, 2019, dir. Jehane Noujaim & Karim Amer, Netflix
York, J, 2016, ‘A Complete Guide To All The Things Facebook Censors Hate Most’, Quartz, June 29, 2016, viewed 30 March, 2020, <https://qz.com/719905/a-complete-guide-to-all-the-things-facebook-censors-hate-most/>