To what extent has a lack of diversity influenced the development of the internet? How does this lack of diversity harm societies and individuals?
“Technology is neither good nor bad; nor is it neutral”
The interplay between technology and the social ecology means that technical developments frequently have environmental, social, and human consequences that go far beyond the immediate purposes of the technical devices and practices themselves.
(Boyd & Crawford, 2012)
The Internet has its roots in the Third Industrial Revolution of the late 1970s, and as the technology has evolved, information systems have become progressively more sophisticated. Social networking sites and platforms are now essential to everyday practice, shaping both online behaviour and offline activity.
In 1998, Larry Page and Sergey Brin developed the PageRank algorithm that drives Google’s search results (Beer, 2016). It underpinned Google’s early Web 1.0 search functionality, giving users a platform to break the constraints of time and space and access information freely, and for free.
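The core idea of PageRank is simple: a page is important if important pages link to it. A minimal, illustrative power-iteration sketch (the four-page link graph below is invented for demonstration, and real PageRank involves many refinements this omits):

```python
# Minimal PageRank sketch: each page repeatedly distributes its score
# across its outgoing links, damped so every page keeps a baseline rank.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)
            for target in outgoing:
                new[target] += damping * share
        rank = new
    return rank

# A toy four-page web (every page must link somewhere in this sketch).
toy_web = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
ranks = pagerank(toy_web)
```

In this toy graph, page "c" ends up ranked highest because three pages link to it, while "d", which nothing links to, keeps only the damped baseline score.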
Since the beginning of the Web 2.0 era, algorithmic systems have proliferated, and regimes of information visibility have become closely linked to empowerment (Bucher, 2012). Big Data has redefined how knowledge is constituted and how research is conducted (Boyd & Crawford, 2012). Its technologies essentially provide a symbolic system that creates and communicates alternative facts, concepts, meanings and interpretations (Voiskounsky, 1999).
As a result, Web 2.0 has rapidly developed into a platform economy, with social networking sites attracting large numbers of networked publics who play different social roles, communicate with one another and thereby create meaning.
However, personalisation techniques in the age of big data, together with the monopoly of a few giant technology companies, have been the main contributors to this lack of diversity.
Unmeasurable power – algorithms and monopolies
The Internet connects individuals to the world and has become a medium for sharing knowledge. However, it has constructed a utopian world that appears to open up the world but, in fact, invisibly ‘encloses’ the user within it.
Media have become the critical mechanism for sorting, categorising and ranking social domains (Bucher, 2012), and critical decisions are now based on data as analysed by algorithms rather than on the data itself (Beer, 2016).
Algorithms are coded procedures that transform input data into desired outputs through specified calculations (Gillespie, 2014). They are meaningless machines in themselves; their logic, maintenance and redesign rest in the hands of information providers, who occupy a privileged position (Gillespie, 2014).
Moreover, the mechanism by which they work derives primarily from the financial benefit technology companies gain from transforming the data collected from users’ online digital traces into a complex set of information practices. As a result, the visibility of information is fundamentally not neutral; it is usually arranged to make certain content salient (Bucher, 2012).
Furthermore, the advent of search engines has changed how and what information is available to the masses, significantly increasing users’ ability to “find a needle in a haystack”. Thus, while algorithms silently and actively shape users’ everyday practices, users enjoy the convenience they bring and subconsciously ignore algorithms’ power.
The public sphere is where opinions are expressed and exchanged in society. It is usually constituted by power relations: the prevailing political order, economic ties, cultural heritage, and technological capabilities (Schlesinger, 2020). However, algorithms now exercise authority by searching, sorting, filtering, recommending, and deciding. ‘Media freedom’ rests mainly in the hands of groups that dominate social power, such as governments and technology giants. Today, several technology giants are structured as vertically integrated organisations, i.e., the process from production to distribution is owned by a single integrated company. This has concentrated the industry in the hands of a few global organisations, forming a media monopoly and limiting the diversity of the market.
Two representative examples follow.
- Microsoft – Corporate Giant Monopoly
Giant technology companies dominate the Internet because traffic is limited, and they compete to capture it. They care about building connections to gain traffic and, ultimately, financial gain. Where they cannot make money directly through their products, the users themselves become the ‘money-making product’.
Microsoft faced federal antitrust charges in the 1990s for extending its monopoly in computer operating-system software to the Internet through its web browser (Sarah et al., 2003). As an early technology company, Microsoft mastered almost every aspect of the software industry, obstructing the entry of other companies through a dominant barrier in the operating-system sector, thereby deterring new competitors and securing a profitable position.
The monopoly position itself was not illegal, but it needed to be regulated because US antitrust laws restrict anti-competitive conduct (Sarah et al., 2003). Exploiting a monopoly and a market position created through competition built an invisible wall of protection, resulting in an oligopolistic browser standard that significantly weakened the public’s right to choose alternatives. It also limited the diversity of the market, risking an unbalanced Internet dominated by a single company within the national economy, stifling technological innovation and carrying far-reaching economic implications.
- Google – the filter bubble hypothesis
Google is now one of the most indispensable platforms in the daily practice of the public, with a wide range of businesses and products such as browsers, navigation, office tools and social networking services.
Pariser defines the wave of online personalisation as the “filter bubble” phenomenon, which traps individuals and groups inside an information bubble (Rowland, 2011). The search results Google provides are the content a user is most likely to click on rather than the most popular results, so users with different positions may receive completely different information. Personalised results depend on individual preferences in daily practice: information is selected automatically on the user’s behalf, and the user, enjoying the convenience, neglects to seek or expand information beyond their own point of view, which tends to solidify opinions and produce information blindness.
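The mechanism can be caricatured in a few lines: a hypothetical re-ranker that orders the same result list differently per user, boosting whatever topics that user has clicked before. (The `personalised_ranking` function, topic tags, and sample data below are all invented for illustration; real personalisation systems are vastly more complex.)

```python
from collections import Counter

# Toy "filter bubble" sketch: re-rank identical results per user by how
# often the user has previously clicked items carrying the same topic tag.
def personalised_ranking(results, click_history):
    clicks = Counter(click_history)  # topic -> number of past clicks
    return sorted(results, key=lambda r: clicks[r["topic"]], reverse=True)

results = [
    {"title": "Tax cuts explained", "topic": "politics-right"},
    {"title": "Welfare policy primer", "topic": "politics-left"},
    {"title": "Local weather", "topic": "neutral"},
]

# Two users issue the same query but carry different click histories,
# so each sees a differently ordered result list.
user_a = personalised_ranking(results, ["politics-right"] * 5 + ["neutral"])
user_b = personalised_ranking(results, ["politics-left"] * 5)
```

Both users start from the same three results, yet each sees their own preferred topic first, and neither is told that the ordering was personalised: a miniature version of the invisibility Pariser describes.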
Furthermore, the filter is invisible and acts as a ‘segregator’. No one knows what information the filter is hiding, which can significantly hinder the exchange of diverse views and create political bias or social prejudice with detrimental effects on society, for example, racial discrimination.
Moreover, mass media should ensure that news is multifaceted and balanced to keep citizens well informed, especially in democratic decision-making (Haim et al., 2017).
However, research has shown that Google News does exhibit bias through its recommendation algorithm, with some news underrepresented, though other information is not completely filtered out (Haim et al., 2017). Furthermore, the benefits of personalisation for advertising revenue are clear: the more accurately websites can target consumers based on their personalised preferences, the more advertising dollars they can capture (Rowland, 2011). Algorithmic technology can therefore be considered a tool of platform monopoly, and giant companies are likely to collect user data without restriction, thus entrenching market dominance.
Impact of lack of diversity on the development of the Internet
The dominance of Google and Microsoft as sizeable international technology companies, coupled with the impact of algorithmic technology, is likely to produce social biases, political errors, and economic problems such as distorted market competition. The enormous power that technology confers has become a means of capital “plunder”, and users’ interests, market balance and a series of social-equity concerns can hardly be guaranteed.
During the 2016 US election, for example, fake news circulating on Facebook was used to steer voters, and data on as many as 87 million users was accessed and shared with an analytics company without permission (Solon, 2016; Lapowsky, 2018). Thus, as the Internet becomes a new public sphere, accountability and regulatory issues move to the forefront of its development. On the one hand, tech companies need to take responsibility for ensuring the legal rights of their participants, such as free speech and access to diverse information. On the other hand, governments should regulate the power of technology companies to avoid market monopolies and protect the interests of individuals and society.
Moreover, because high centralisation leads to a lack of diversity, a new Internet of decentralised technologies has been developed. Decentralisation reduces the role of intermediaries, so users no longer need to rely on third-party institutions and gain greater autonomy; blockchain systems, furthermore, support the trustworthiness and security of information, thus effectively protecting users’ privacy (Rennie et al., 2019).
The lack of diversity illustrates that the development of personalised technology has caused the Internet to lose its initial vision as a public domain. It alerts individuals to be sceptical about technology rather than simply accepting or rejecting it, raises questions about the responsibility of technology companies and governments to regulate the Internet’s development, and points towards new decentralised technologies as a way to develop the Internet more fairly and transparently.
Technology is a tool whose ultimate value depends on the struggle over its use (Rowland, 2011).
Beer, D. (2016). The social power of algorithms. Information, Communication & Society, 20(1), 1–13. https://doi.org/10.1080/1369118x.2016.1216147
Boyd, D., & Crawford, K. (2012). Critical Questions for Big Data. Information, Communication & Society, 15(5), 662–679. https://doi.org/10.1080/1369118x.2012.678878
Bucher, T. (2012). Want to Be on the top? Algorithmic Power and the Threat of Invisibility on Facebook. New Media & Society, 14(7), 1164–1180. https://doi.org/10.1177/1461444812440159
“Facebook: The Panopticon of Modern Age” by iLifeinicity is licensed under CC BY 2.0. To view a copy of this license, visit https://creativecommons.org/licenses/by/2.0/?ref=openverse.
Gillespie, T. (2014). Chapter 9: The Relevance of Algorithms. In Media Technologies (pp.167-193). The MIT Press. https://doi.org/10.7551/mitpress/9780262525374.003.0009
Haim, M., Graefe, A., & Brosius, H.-B. (2017). Burst of the Filter Bubble? Digital Journalism, 6(3), 330–343. https://doi.org/10.1080/21670811.2017.1338145
“ILSs advertising on Google” by 30 Lines is licensed under CC BY-SA 2.0. To view a copy of this license, visit https://creativecommons.org/licenses/by-sa/2.0/?ref=openverse.
Lapowsky, I. (2018, April 4). Facebook Exposed 87 Million Users to Cambridge Analytica. Wired. https://www.wired.com/story/facebook-exposed-87-million-users-to-cambridge-analytica/
McMullan, T. (2015, July 23). What does the panopticon mean in the age of digital surveillance? The Guardian. https://www.theguardian.com/technology/2015/jul/23/panopticon-digital-surveillance-jeremy-bentham
Noble, S. (2018, March 26). Google Has a Striking History of Bias Against Black Girls. Time. https://time.com/5209144/google-search-engine-algorithm-bias-racism/
Rennie, E., Potts, J., & Pochesneva, A. (2019, November 10). Blockchain and the creative industries: provocation paper. APO. https://apo.org.au/node/267131
Rowland, F. (2011). The Filter Bubble: What the Internet is Hiding from You (review). Portal: Libraries and the Academy, 11(4), 1009–1011. https://doi.org/10.1353/pla.2011.0036
Schlesinger, P. (2020). After the post-public sphere. Media, Culture & Society, 42(7-8). https://doi.org/10.1177/0163443720948003
Sarah, P., Kristin O, P., & Rob H, K. (2003). How regulation fails the free market: Antitrust and “successful” monopolies. In Allied Academies International Conference. Academy of Legal, Ethical and Regulatory Issues, Proceedings, 7(2), 37–42. https://www.proquest.com/docview/192409684
Solon, O. (2016, December 12). 2016: the year Facebook became the bad guy. The Guardian. https://www.theguardian.com/technology/2016/dec/12/facebook-2016-problems-fake-news-censorship
“Transcending Moore’s Law” by jurvetson is licensed under CC BY 2.0. To view a copy of this license, visit https://creativecommons.org/licenses/by/2.0/?ref=openverse.
The Science Elf. (n.d.). The Microsoft Monopoly. YouTube. Retrieved October 7, 2022, from https://www.youtube.com/watch?v=DN1ytVJcFds
Voiskounsky, A. E. (1999). Internet: Culture, Diversity and Unification. Javnost – the Public, 6(4), 53–65. https://doi.org/10.1080/13183222.1999.11008727