There is a clear problem with digital platforms operating under complete self-regulation, given their ability to profoundly influence the personal, commercial, and political relationships of society. Under self-regulation, social media companies internally produce the guidelines and terms of service that define their own responsibilities and capabilities as communicative platforms. This ecosystem gives corporations the power to make independent, exploitative decisions focused on financial gain, global market expansion, and political dominance, to the detriment of their users. Complete self-regulation is especially problematic in three areas: platforms' control over freedom of expression through their ability to silence unfavourable voices and opinions, their power to self-define ethical standards of practice, and their capacity to supply third-party applications with users' personal data. Given the current lack of viable trust in self-regulatory mechanisms, alternative structures have been explored to create an ecosystem fundamentally focused on upholding ethical standards and removing corporate bias. Ideally, under a system of self-regulated communication, digital platforms could take responsibility for upholding ethical standards of social communication; in practice, however, the ulterior self-interest of corporate financial gain creates an ecosystem troubled by unethical internal decision-making.
Data Monetisation – Cambridge Analytica
The powerful sphere of influence that self-regulated digital platforms hold within the current ecosystem has a particularly negative impact on individual data privacy and protection. This influence follows a common pattern: digital platforms impose their own terms of service, which enable the monetisation and repurposing of user data. In the unrestricted environment of self-regulation, platforms grant themselves the power to freely track user behaviour, record all forms of communication, and categorise individuals into dehumanised data points for advertising. While platforms such as Facebook and Google can claim transparency through the publication of these terms of service, there have been repeated instances of platforms disregarding the ethical protection of user data for monetary gain. The most notable case of user data being exploited despite seemingly transparent, self-regulated terms of service is the lawsuit against Meta Platforms known as the Cambridge Analytica scandal (Flew, 2020). Meta was reported to have sold access to personal Facebook profile data to a political consultancy firm (Cambridge Analytica) and other third parties for 'micro-targeted' political advertising during the United States election. The global criticism of this case was immense: the company clearly exploited its self-regulated position while publicly committing to data protection, as when Facebook Associate General Counsel Harry Kinmonth announced that "protecting people's information and privacy is a top priority for Facebook" (Zialcita, 2019), a statement emblematic of the false promises made by digital platforms. This is a direct consequence of monopolised digital companies presenting a facade of public commitment to data privacy protection while retaining a substantial internal ability to manipulate user data for financial gain and global expansion.
Regulators did eventually intervene: after years of investigation, Facebook was issued a $643,000 fine by the UK's Information Commissioner's Office and forced to present a public apology (Zialcita, 2019). However, such unethically monetised decisions driven by self-regulation will remain a consistent occurrence behind the private servers of digital platforms so long as the regulatory body remains the companies themselves, with state influence being only reactive, not preventative, in this case (Flew, 2020).
Facebook CEO Mark Zuckerberg apologises for data privacy failures (YouTube)
Data Privacy – TikTok
Similarly, the self-regulated ability of digital platforms to collect data is extremely problematic, as third parties can pay for access to information they are legally unable to obtain themselves. This is demonstrated by TikTok's autonomous data collection, examined in an investigation led by media journalist Josh Taylor. This public investigation reveals an internal strategy, driven by the platform's self-written terms of service, that effectively harvests personal information from digital communicators on behalf of affiliated third-party applications. The Guardian specifically points to TikTok's subjective regulations, which provide self-certification for acquiring "user geolocation and altitude information" as well as biometric identifiers derived from user-generated video, underlining the innate control digital platforms hold as their own primary regulators (Taylor, 2023). This publicised investigation into TikTok's regulations offers clear evidence of how digital platforms can autonomously implement self-regulated guidelines that are open to subjective interpretation, creating opportunities to sell access to personal data for hyper-targeted advertising, their largest revenue stream. This monopolisation of social communication is widely seen as unethical, given the societal expectation that digital platforms, as Professor of New Media Robin Mansell summarises, "provide a freedom to communicate within well-defined rules of conduct… with private access to information of educational value" (Mansell, 2020). These social expectations conclusively cannot be upheld under current self-regulatory systems led by corporations whose primary focus is revenue generation through untrustworthy data-driven advertising, relegating user data privacy to a secondary concern.
Beyond this strategic, profit-driven data collection lies the further implication of governments capitalising on the self-regulation process, with numerous accounts stating that TikTok data is directly accessible to Chinese government authorities for domestic interests, likely including the monitoring of Western and domestic civilians. Despite this apparent government interference, there remains an overarching argument in favour of digital self-regulation as a transparent method of removing governmental interference in digital freedom of communication. While there is clear evidence that a "purely state-governed regulatory ecosystem", in which freedom of speech can be directly regulated by the state behind official guidelines, would be more detrimental, the current autonomy granted to digital platforms through self-regulatory power offers no significant improvement in protecting user data, only independently motivated decision-making (Singh, 2023).
Removing Freedom of Speech
Another key example of self-regulated control is the ability to silence groups and individuals in accordance with platforms' self-beneficial guidelines. Because these methods of digital communication are monopolised, the majority of an individual's freedom of expression is suppressed if they are banned from a platform. While government officials are currently involved in silencing plainly unethical communicators who incite serious criminal action, self-written regulations grant digital platforms a similar power, including the ability to remove a user's freedom of speech on the platform. The most notable instance of self-regulated platforms impacting freedom of speech is the 2021 banning of Donald Trump from Twitter and Facebook, demonstrating that even a corporate and political powerhouse such as Trump can have his freedom of expression removed from a digital platform (Cusumano, 2021).
Anadolu Agency via Getty Images
To conclude, there is an evidence-based understanding of the self-regulated control that digital platforms maintain over the corporate landscape as well as over the communicative actions of users. The current ecosystem contains self-interested companies that exploit their control over social communication and data collection for the financial benefit of themselves and interested third parties. This flawed system is also exploited by governments seeking access to data that is not otherwise readily available. A partnership of independent and government bodies is needed to alter the problematic foundations of digital communication.
Flew, T. (2020). Guarding the gatekeepers. Griffith Review. https://www.griffithreview.com/articles/guarding-gatekeepers-trust-truth-digital-platforms/
Taylor, J. (2023). TikTok data collection could reveal what floor a user is on, cybersecurity firm says. The Guardian.
Singh, N. (2023, May 16). Newly regulated digital platforms and self-regulation: Exploring the feasibility of the mechanism. Law School Policy Review & Kautilya Society. https://lawschoolpolicyreview.com/2023/05/16/newly-regulated-digital-platforms-and-self-regulation-exploring-the-feasability-of-the-mechanism/
Zialcita, P. (2019). Facebook pays $643,000 fine for role in Cambridge Analytica scandal. NPR. https://www.npr.org/2019/10/30/774749376/facebook-pays-643-000-fine-for-role-in-cambridge-analytica-scandal
Mansell, R., & Steinmueller, W. E. (2020). Advanced introduction to platform economics (pp. 35-54). Edward Elgar Publishing.
Leetaru, K. (2018, December 15). What does it mean for social media platforms to "sell" our data? Forbes. https://www.forbes.com/sites/kalevleetaru/2018/12/15/what-does-it-mean-for-social-media-platforms-to-sell-our-data/?sh=67cd6f972d6c
Cusumano, M. A. (2021, January 19). Social media companies should self-regulate. Now. Harvard Business Review. https://hbr.org/2021/01/social-media-companies-should-self-regulate-now
YouTube. (2018). Facebook CEO Mark Zuckerberg apology [Video]. https://www.youtube.com/watch?v=-LBB-hNGU9Q