Who is responsible for the platform?

[Image: “Digital Platform Strategy” by World Economic Forum is licensed under CC BY-NC-SA 2.0.]

In the digital media era, people’s lives have become inseparable from the Internet. It makes life easier in many ways, but as digital media has evolved rapidly, problems have appeared alongside it. “The Internet does not discriminate in the speed, reach, access and efficiency it provides to all users” (Farah and Cathy, 2020). This neutrality also gives some users the opportunity to cause harm: bullying, harassment, violent content, hate speech, pornography, and other problematic material are spread across platforms. Because Internet users span all age groups, including teenagers, the influence of harmful content is incalculable. The questions of who is responsible for this problem and how to stop it have therefore become a focus of attention.

Poell and de Waal (2018) stated, “Understanding how platform mechanisms reshape societies may in turn help us understand how societies can govern platforms.” Their account of the historical development of platforms helps explain how harmful content spreads, and how responsibility for it can be divided between users, governments, and platforms. Within this context, this essay critically analyzes the responsibility of users, governments, and platforms, and then discusses solutions to the spread of harmful content.

 

The responsibility of platforms, government, and users

  1. Platforms

[Image: “Social Media Break Time” by Intersection Digital is licensed under CC BY-NC 2.0.]

In the beginning, people used platforms for their technology, but platforms have in fact changed lifestyles and the organization of society. People use platforms to shop, hail taxis, and chat, and have gradually come to live lives they cannot manage without them. As platforms developed, the focus shifted to their economic relations. Poell and de Waal (2018) analyzed that “two particularly important ingredients of a platform’s architecture are its ownership status and business model.” Ownership status concerns whether a platform operates for profit; the business model refers to the ways in which a platform captures value. In the broader environment, people’s lives have shifted from an offline to an online model: according to Statista (2022), the average global social network penetration rate is 58.4%. At the same time, some platforms attract users through virtual tipping, taking a cut of the virtual currency rewarded and thereby reaping the benefits. Since this revenue can exceed a normal wage, it attracts large numbers of users who want to earn money on the Internet. The difficulty of attracting traffic to ordinary content leads some people to spread nude photos and violent articles to draw attention. Although platforms have rules and regulations in place to restrict the distribution of harmful content, such content still exists. This shows that platform regulation is insufficient and that platforms bear primary responsibility for the spread of harmful content.

 

  2. Government

When a platform society is formed from multiple platforms, it can no longer be analyzed as a single platform. There are two types of platforms in the platform society: infrastructural platforms and sectoral platforms. The social media platforms that people commonly use are infrastructural platforms, and the infrastructural core is basically made up of five major companies: Google, Facebook, Apple, Amazon, and Microsoft. “As of 2018, the core of the Western online infrastructure is completely privatized” (Poell and de Waal, 2018). Government has a responsibility to regulate platforms to prevent excessive privatization from disrupting society. Meanwhile, the geopolitics of the platform society shows that platforms are subject to government control to different degrees across countries and cultures: American platforms, for instance, are rarely constrained by the government, while Chinese platforms are regulated by the government, often indirectly.

  3. Users

Besides platforms and government, users themselves also bear responsibility for spreading harmful content. People originally adopted platforms for their technology and the convenience it brings to daily life; later, they found platforms could also bring profit. When users discover that they can benefit from a platform, defense of the public interest is often left behind. Producing good content can help people earn extra income and even lift themselves out of poverty. However, some users think only of gaining attention and profit by spreading violent and pornographic content, and other users support such behavior by rewarding it. Therefore, users are also responsible for the distribution of harmful content on platforms.

 

Solutions from government, platforms, and users

 

  1. Government

Since the issue has received widespread attention, governments have introduced relevant policies. The Online Safety Act (2021), enacted by the Australian government, defines illegal and restricted online content. Australia also has the world’s first government agency dedicated to keeping people safe online: eSafety. In addition, the Online Safety Act allows eSafety to require or request that internet service providers block material that promotes, incites, instructs in, or depicts abhorrent acts of violence, such as the video created by the perpetrator of the March 2019 Christchurch terrorist attack, whose viral spread eSafety stopped through the appropriate powers. This has strengthened online regulation and helped reduce the spread of harmful content. “Governing the platform society cannot simply be left to markets, if only because its infrastructure has come to penetrate all sectors, private and public. Governments have always played distinctive roles in the regulation of market sectors, locally and nationally” (Poell and de Waal, 2018). Digital platforms connect and control society and the economy, and governments need to pay attention and act on the issues arising from them.

 

  2. Platforms

[Image: “Facebook” by chriscorneschi is licensed under CC BY-SA 2.0.]

 

Under government regulation, platforms have introduced measures to address this issue. Facebook announced that it would launch a new feature on Instagram to push teenagers away from harmful content. Facebook’s vice-president of global affairs, Nick Clegg, suggested that this feature would make a big difference: when teenagers repeatedly view one type of content and that content is harmful, the system will push other, healthier content to them. A “break” function has also been added to prompt young people to reduce their usage time. However, this was challenged by Senator Klobuchar (cited in eSafety Commissioner, 2021), who argued that technology companies attract users precisely by pushing targeted content to them through algorithms, and that laws should give people the right to choose whether or not to disclose their private data. Digital platforms need to balance regulation and profitability: while the distribution of harmful content can attract many users, long-term development must take into account the physical and mental health of all users, as well as their experience of using the platform.

  3. Users

Users should actively report harmful content when using digital platforms. Parents can also use platform features to filter undesirable content for their children. At the same time, users need to abide by platform rules: do not spread or post unhealthy content, and work together to create a healthy platform community.

 

In the future, much more needs to be done to regulate platforms. Governments need to continually review platform content to ensure that platforms do not relax their efforts to keep content healthy. Platforms also need to increase their moderation efforts, regulate the scope of content that can be posted, and deal seriously with users who break the rules. Users, in turn, need to follow these policies and actively report harmful content. The regulation of digital platforms is the responsibility not only of the platforms but also of governments and users. Society is now largely platformed, and maintaining the content of platforms is a shared responsibility.

References

eSafety. (2022). Who we are. Retrieved 14 October 2022, from https://www.esafety.gov.au/about-us/who-we-are.

eSafety Commissioner. (2021). Illegal and restricted online content. Retrieved 13 October 2022, from https://www.esafety.gov.au/key-issues/Illegal-restricted-content.

ABC News. (2021). Facebook to ‘nudge’ teens away from harmful content. Retrieved 14 October 2022, from https://www.abc.net.au/news/2021-10-11/facebook-to-nudge-teens-away-from-harmful-content/100528882.

Farah, L., & Cathy, L. (2020). How to help slow the spread of harmful content online. World Economic Forum. Retrieved 13 October 2022, from https://www.weforum.org/agenda/2020/01/harmful-content-proliferated-online/.

Poell, T., & de Waal, M. (2018). The Platform Society as a Contested Concept. In van Dijck, J., Poell, T., & de Waal, M., The Platform Society (pp. 5–32). Oxford University Press.

Statista. (2022). Social media: Worldwide penetration rate 2022. Retrieved 13 October 2022, from https://www.statista.com/statistics/269615/social-network-penetration-by-region/.