How can digital platforms effectively regulate the spread of bullying, harassment, violence, hate, pornography, and other harmful content on the Internet?

Regulation and mediation of the Internet

“Social media platforms arose out of the exquisite chaos of the web” (Gillespie, 2018a, p. 5). Many platform designers profited from the freedom the web promised, hosting and extending participation, expression, and social connection (Gillespie, 2018a, p. 5). As these platforms grew, chaos and controversy quickly came back to haunt them, because when individuals have something to say, whether inspiring or scolding, they want to say it where others can hear them. The whole point of social media platforms is to give more people broad opportunities to talk and interact with like-minded others. Although the advantages of these platforms arguably outweigh the disadvantages, the Internet utopia people imagine hides real dangers: it also has a painful side of pornographic, obscene, violent, illegal, abusive, and hateful content. In short, regulation and mediation are necessary to the Internet’s continued development (Gillespie, 2018a). This regulation requires cooperation among governments, media platforms, and commercial interests, and, of course, every citizen who uses the Internet should be held accountable for their own improper behavior online.

Image via iplocation
  • Internet security in the era of Web 2.0

Many of the social and political problems our society currently confronts have their roots in media and communication policy. Modern Web 2.0 media and communications frequently do not fall under traditional broadcasting, telecommunications, and media rules. To create policies that shape technological, economic, political, and social development and are effective, coordinated, and socially useful, Victor Pickard and Robert G. Picard propose the following key points: “First, to enhance social inclusion and equity by offering services to those with special needs or visual or hearing impairments. Second, to shield young people and other vulnerable populations from unpleasant and adult material. Third, guard against invasive corporate and government monitoring or misuse of users’ personal information. Fourth, support self-regulatory and legal structures that encourage media and communication accountability. Fifth, prohibit the expansion and misuse of monopolistic power in communications and the media” (Pickard & Picard, 2017).

“Children online” via Instagram

Because children’s brains and cognitive capacities are still developing, their ability to process information, understand consequences, draw comparisons, and reason is limited, and they therefore need protection when using the Internet. Media can harm children’s development and health. Lemish suggested that exposure to particular media types and excessive media use can have detrimental effects on children’s neurological, psychological, and social development (Lemish, 2013). The prevalence of sexually explicit content online, the ease with which pornography, drug use, and profanity can spread, and media representations of violence that can easily mislead young children are all causes for concern (Nichols, 2011). Online game addiction and cyberbullying raise further concerns about Internet safety. There are several ways to protect children, starting with the simplest forms of family control, such as parental-control technology that blocks specific websites or images from children’s view, alongside stronger oversight by media platforms and a greater willingness from governments to act (a minimal sketch of such filtering appears below).
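The following is a minimal sketch, in Python, of the kind of parental-control filtering mentioned above. The blocked domains and keywords are hypothetical placeholders, not a real product’s blocklist; an actual parental-control tool would sit at the network or operating-system level and rely on curated category lists.

```python
from urllib.parse import urlparse

# Hypothetical blocklists for illustration only.
BLOCKED_DOMAINS = {"adult-example.com", "gambling-example.net"}
BLOCKED_KEYWORDS = {"porn", "gore", "casino"}

def is_allowed(url: str) -> bool:
    """Return True if a child's browser should be allowed to open this URL."""
    parsed = urlparse(url)
    host = (parsed.hostname or "").lower()
    # Block listed domains and any of their subdomains.
    if any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS):
        return False
    # Block URLs whose path or query string contains a flagged keyword.
    text = (parsed.path + "?" + parsed.query).lower()
    return not any(k in text for k in BLOCKED_KEYWORDS)

if __name__ == "__main__":
    print(is_allowed("https://news-example.org/article"))   # True
    print(is_allowed("https://adult-example.com/video"))    # False
```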

 

Scrolling through our social media feeds feels like a harmless part of our daily lives. But is it actually as harmless as it seems? According to social media expert Bailey Parnell, our growing and unchecked obsession with social media has unintended long-term consequences for our mental health. As social media becomes part of the fabric of modern life – the “digital layer” – abstinence is becoming less of an option. Parnell thinks it is high time we learned to practice “safe social” before it is too late.

 

  • Protecting vulnerable people online

Vulnerable groups other than children include the elderly and adults who cannot live fully independently because of developmental or mental disorders and/or dementia (Pickard & Picard, 2017). These groups also need protection online.

Image via Twitter

Protection can include, for example, monitoring their finances and property to avoid losses caused by fraudulent clicks and scams. Because marketing and advertising information can be manipulated by the media, vulnerable groups are especially easy to exploit through Internet communication (Saleem, 2019). Examples include misleading depictions of suicide in news reports, the vulnerability of the elderly to financial fraud (Lau, 2015), and addiction to online games and gambling. When vulnerable people are in family care or supervised by caregivers, family members can monitor media use to reduce exposure to specific content.

 

 

  • How to protect user privacy and data from being leaked

The most commonly discussed and most important aspect of Internet security is safeguarding users’ privacy and data from invasive businesses, hackers, and state surveillance and abuse. Invasions of private life, disclosure of personal information, and image theft are all frequent occurrences in the media. The technological foundation of the modern communication environment opens up new possibilities for public surveillance (Pickard & Picard, 2017). Significant new privacy concerns have emerged as governments, media, digital businesses, other companies, and hackers grow ever more capable of collecting personal information by monitoring a person’s use of the Internet, social media, and mobile devices, as well as their purchases and other behaviors (Nissenbaum, 2009). Although such data collection is usually intended to provide better services to users, the collection process is not transparent to the public, and the exchange of data and the rights and protections of the users of these services are often unclear (Pickard & Picard, 2017).

Policy should therefore be designed to shield individuals from invasive media coverage and electronic data collection (Kirkham et al., 2013). Laws should limit how long personal data can be kept for governmental and commercial purposes, ensure transparency about what is collected and stored, and allow citizens to opt in to or out of collection. Policy should also ensure that hidden data cannot be collected, stored, or manipulated without the user’s permission, and that the person whose data is collected is shielded from the negative effects and risks associated with data collection.
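As one concrete illustration, the retention-limit and consent principles above could look something like the following Python sketch. The record fields, the 365-day retention window, and the example data are assumptions made for illustration, not requirements drawn from any specific law or platform.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

RETENTION_PERIOD = timedelta(days=365)  # hypothetical maximum retention window

@dataclass
class UserRecord:
    user_id: str
    collected_at: datetime  # when the data was collected (UTC)
    consented: bool         # did the user explicitly agree to collection?
    data: dict

def purge(records):
    """Keep only records the user consented to and that are still inside the retention window."""
    now = datetime.now(timezone.utc)
    return [r for r in records if r.consented and now - r.collected_at <= RETENTION_PERIOD]

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    records = [
        UserRecord("alice", now - timedelta(days=30), True, {"email": "a@example.com"}),
        UserRecord("bob", now - timedelta(days=500), True, {"email": "b@example.com"}),   # expired
        UserRecord("carol", now - timedelta(days=10), False, {"email": "c@example.com"}), # no consent
    ]
    print([r.user_id for r in purge(records)])  # ['alice']
```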

  • How do law and self-regulation work?

Via Wikimedia projects.

Judicial and self-regulatory systems should be used to advance media and communication accountability (Pickard & Picard, 2017). Accountability applies to everyone who engages in media and communication in society: every one of us is answerable to both society and individuals for unlawful and harmful online behavior. Without imposing excessive restrictions on the right to free speech, media and communication companies as well as content creators should be held accountable for actions that defame others, invade privacy, disseminate false or fraudulent information, or withhold accurate information in order to mislead the public or endanger national security. Accountability strives to address transgressions of personal ethics and of Internet usage standards (Shen, 2016). The level of accountability can be determined by the degree of responsibility involved and by the impact that creating and distributing harmful content has on society and individuals. Legal and regulatory frameworks should embody fairness and justice and carry real authority (Lunt & Livingstone, 2011). Serious offenses should result in legal action, while minor offenses might result in a platform banning comments, issuing a risk warning, or applying a label that signals the penalty; platforms themselves should also face appropriate penalties for inappropriate behavior. Of course, people should be able to appeal decisions made against them. A simple sketch of such a graduated response appears below.
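To make the graduated-response idea concrete, here is a minimal Python sketch of how a platform might map the severity of an offense, together with a user’s history, to an escalating set of responses. The severity tiers, the repeat-offender threshold, and the listed actions are illustrative assumptions, not any real platform’s policy.

```python
from enum import IntEnum

class Severity(IntEnum):
    MINOR = 1     # e.g., a first abusive comment
    MODERATE = 2  # e.g., repeated harassment or misinformation
    SERIOUS = 3   # e.g., defamation, fraud, or threats to safety

def platform_response(severity: Severity, prior_offenses: int) -> str:
    """Map an offense to a graduated response; repeat offenders are escalated one tier."""
    escalated = min(int(severity) + (1 if prior_offenses > 2 else 0), int(Severity.SERIOUS))
    tier = Severity(escalated)
    if tier is Severity.MINOR:
        return "risk warning and temporary comment restriction"
    if tier is Severity.MODERATE:
        return "content removal and a visible penalty label (appeal available)"
    return "account suspension and referral to the relevant legal authorities"

if __name__ == "__main__":
    print(platform_response(Severity.MINOR, prior_offenses=0))   # warning
    print(platform_response(Severity.MINOR, prior_offenses=3))   # escalated to removal
    print(platform_response(Severity.SERIOUS, prior_offenses=0)) # legal referral
```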

 

Admittedly, regulating the Internet has a role to play, but it also poses obvious challenges, as no regulatory regime is fully effective (Flew et al., 2019). However, the Internet is not inherently ungovernable (Flew, 2018). Today, much of the work of regulating the Internet is done by big technology and telecommunications companies, which must comply not only with the laws of their own countries but also with those of other countries where they hold substantial business interests and assets. As the influence of mass-media publishers has waned, they have been replaced by new, larger, and more powerful publishers in the Internet age (Flew et al., 2019). Regulating the Internet is therefore extremely complex and contested: national policies often clash, enforcement lags behind events, and content is uploaded and disseminated at great speed.

Image via Google
  • Looking ahead

In conclusion, growing expectations that digital platforms be accountable to citizens and subject to government oversight mean that global digital platform companies should be held responsible for the content on their sites and for the collection and storage of citizens’ data. They have a duty to manage the information on their sites in a way that satisfies public interests and concerns while safeguarding individual privacy, and, most crucially, to do so openly and transparently. There should be no unauthorized exploitation of the public’s data for private gain.

 

 

References

 

Big Tech is shaping remote working for today and tomorrow. (2022, January 3). Morning Future. https://www.morningfuture.com/en/2022/01/03/big-tech-is-shaping-remote-working-for-today-and-tomorrowbig-tech-smartworking-home-work/

 

Contributors to Wikimedia projects. (2022, August 10). Cyber-utopianism. Wikipedia. https://en.wikipedia.org/wiki/Cyber-utopianism

 

Data protection and privacy: 12 ways to protect user data. (2020, June 22). Cloudian. https://cloudian.com/guides/data-protection/data-protection-and-privacy-7-ways-to-protect-user-data/

 

Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50. https://doi.org/10.1386/jdmp.10.1.33_1

 

Flew, T. (2018). Digital platform regulation: Global perspectives on internet governance (pp. 18–23). Springer Nature.

 

Gillespie, T. (2018a). All platforms moderate. In Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1–23). Yale University Press. https://doi.org/10.12987/9780300235029

 

Kirkham, T., Winfield, S., Ravet, S., & Kellomaki, S. (2013). The personal data store approach to personal data security. IEEE Security & Privacy, 11(5), 12–19. https://doi.org/10.1109/msp.2012.137

 

Lau, L. Y.-C. (2015). Internet fraud in Hong Kong: An analysis of a sample of court cases. In Cybercrime Risks and Responses (pp. 81–102). Palgrave Macmillan UK. http://dx.doi.org/10.1057/9781137474162_6

 

Lunt, P., & Livingstone, S. (2011). Media regulation: Governance and the interests of citizens and consumers. SAGE.

 

Malyk, M. (2022, May 5). What is internet regulation? Your guide to the internet and its rules. EasyLlama. https://www.easyllama.com/blog/what-is-internet-regulation

 

Nichols, S. L. (2011). Media representations of youth violence. In Children Behaving Badly? (pp. 167–179). John Wiley & Sons, Ltd. http://dx.doi.org/10.1002/9780470976586.ch12

 

Nissenbaum, H. (2009). Privacy in context: Technology, policy, and the integrity of social life (1st ed.). Stanford University Press.

 

Saleem, M. (2019, June). Brexit impact on cyber security of United Kingdom. 2019 International Conference on Cyber Security and Protection of Digital Services (Cyber Security). http://dx.doi.org/10.1109/cybersecpods.2019.8885271

 

Strickland, D. (2020, July 23). Internet use policy template – Acceptable use policy (free download). CurrentWare. https://www.currentware.com/blog/internet-usage-policy/

 

Shen, H. (2016). China and global internet governance: Toward an alternative analytical framework. Chinese Journal of Communication, 9(3), 304–324. https://doi.org/10.1080/17544750.2016.1206028

 

Pickard, V., & Picard, R. G. (2017). Essential principles for contemporary media and communications policymaking. Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/our-research/essential-principles-contemporary-media-and-communications-policymaking

 

by Yiqiu Ivy Yang

1 October, 2022

TUT – RE09