INTRODUCTION
In the twenty-first century of rapidly advancing technology, the spread of problematic content on the internet has become an urgent problem. Many media platforms advertise themselves as wide-open fields of content in order to attract users (Gillespie, 2018), and governments condone this behaviour because the line between "platform" and "company" remains poorly defined. The dangers hidden behind this openness, however, are routinely ignored: the pluralism and tolerance of the internet allow such content to spread ever more widely. Users, media platforms, and governments all benefit from this technology, so they share an obligation to prevent the circulation of problematic content. Platforms therefore need to strengthen the moderation of user content, and governments should ensure that the supervision and regulation of platforms is balanced and strictly enforced; only then can the circulation of this content be curbed.
Most users communicate online with a simple impulse: if they want to say something, they do not stop to weigh whether it is positive or negative; they want to say it where others can hear them (Gillespie, 2018). The internet is thus a double-edged sword. On the one hand, it offers users ample freedom to interact with a wide range of people; on the other, the seemingly utopian environment it creates lets them produce problematic content in the name of free speech, and any user may become a creator of such content in the course of communication. For that reason, media platforms need to moderate their users' content. Nintendo, maker of some of the most family-friendly game consoles, offers an example. After the outbreak of COVID-19, it added a range of COVID-related terms to its in-game banned-word list, so that if users post something about "COVID" while playing, the content is moderated and removed by the platform (Copia Institute, 2021). Nintendo judged discussion of COVID to be controversial and likely to produce problematic content, such as abuse of players from the country where the coronavirus first appeared. Some users argue that being barred from touchy subjects makes them feel uncomfortable and restricted, and that it degrades their experience on the platform to some extent; Nintendo nevertheless insists that this moderation keeps users away from issues that can breed cyber-bullying and hatred, and lets them stay immersed in the game's story. Moreover, the information users are permitted to exchange on Nintendo's platform may differ from what players hoped to discuss with their online friends, but the playground belongs to Nintendo, and players must follow the rules set there (Copia Institute, 2021). Users often complain about this ambiguous openness, yet in practice the rules platforms craft for content moderation have effectively protected users from bullying, harassment, violent content, hate speech, pornography, and other problematic content. One of Nintendo's rules, for instance, bans all political topics in its games; although arguably excessive, it has blocked politicians who would use the platform to lobby or to incite hateful speech. It is admittedly hard for platforms to provide universal services and to craft rules without a clear written standard, so content moderation can never be perfect or accepted by everyone. Even so, these platforms are trying, as far as they can, to construct an open and safe online world suited to every type of user.
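To make the mechanism concrete, the following is a minimal sketch, in Python, of the kind of keyword blocklist described above. The banned terms and function names are hypothetical illustrations; Nintendo's actual filter is not public.

    # A minimal sketch of keyword-based moderation like the COVID blocklist
    # reported in the Copia Institute (2021) case study. The term list and
    # names below are illustrative assumptions, not Nintendo's real system.
    BANNED_TERMS = {"covid", "covid-19", "coronavirus"}

    def should_block(post: str) -> bool:
        """Return True if the post mentions any banned term."""
        text = post.lower()
        return any(term in text for term in BANNED_TERMS)

    # A message like this would be moderated and removed before posting.
    print(should_block("Anyone else stuck at home because of COVID?"))  # True

Real moderation systems are far more elaborate (word variants, context, appeals), but even this toy version illustrates the trade-off discussed above: a blunt blocklist suppresses abusive and innocuous chatter alike.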
Governments have lacked adequate supervision and regulation of media platforms, and some digital companies have exploited this gap by deliberately steering users to create problematic content and by shaping public opinion. Facebook, for example, is a powerful and non-neutral force in electoral politics: one small design change directly affected the result of the 2012 U.S. presidential election. Its "get-out-the-vote" prompt dramatically increased youth voter participation, and because most of those young voters supported the Democratic Party, the platform decided the outcome of the election to a great extent (Madrigal, 2017). The episode shows how the democratic choices of U.S. citizens can be eroded by manipulation controlled by the company (Meta) hiding behind the media platform (Facebook). One reason such incidents occur is that governments apply a slippery definition when categorising "platform" and "company", while many legal responsibilities and liabilities do treat companies as singular entities (Gillespie, 2018); this lets companies act at the margins of democracy without governance and claim that they have no duty to regulate the content of their websites (Gorwa, 2019). Moreover, when a digital company runs an operation that involves problematic content, multiple stakeholders are implicated: not only the capitalist firms but also users and other communities in society, so a government crafting regulation and supervision of platforms has to take several parties into account. The most workable solution is for government to give platforms broad leeway on content problems, provided they maintain self-constraining systems that backstage staff cannot alter, such as a notice-and-takedown system (Gorwa, 2019). This approach lets government strike a balance among "firms", "NGOs (non-governmental organisations)", and "states", and stop the spread of problematic content on these platforms without harming any party.
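As a rough illustration of such a self-constraining system, the sketch below models a notice-and-takedown flow in which every removal is written to an append-only audit log, so backstage staff cannot quietly erase the record. All class and field names are hypothetical assumptions; Gorwa (2019) describes the governance arrangement, not this code.

    # A hypothetical notice-and-takedown flow. Names are illustrative only.
    from dataclasses import dataclass, field

    @dataclass
    class Notice:
        content_id: str
        reason: str

    @dataclass
    class Platform:
        content: dict = field(default_factory=dict)
        audit_log: list = field(default_factory=list)  # append-only record

        def receive_notice(self, notice: Notice) -> None:
            # Take the flagged item down and log the action so that the
            # removal can later be audited by regulators or NGOs.
            if notice.content_id in self.content:
                removed = self.content.pop(notice.content_id)
                self.audit_log.append((notice.content_id, notice.reason, removed))

    platform = Platform(content={"post-1": "flagged material"})
    platform.receive_notice(Notice("post-1", "hate speech report"))
    print(platform.content)    # {} - the content has been taken down
    print(platform.audit_log)  # the takedown record remains for oversight

The design point is the log, not the removal: it is the persistent, staff-proof record that lets the "firms", "NGOs", and "states" of Gorwa's triangle verify that the platform's self-regulation actually happened.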
CONCLUSION
In conclusion, users, media platforms, and government should all be held responsible for the spread of problematic content, because each of them shapes the internet and benefits from it. Although measures to stop its circulation remain contested, content moderation and enforced systems such as notice-and-takedown are the most suitable ways to construct an open and safe cyberspace.
References
Madrigal, A. C. (2017, October 13). What Facebook did to American democracy: And why it was so hard to see it coming. The Atlantic.
https://www.theatlantic.com/technology/archive/2017/10/what-facebook-did/542502/
Copia Institute. (2021, December 15). Content Moderation Case Study: Nintendo Blocks Players From Discussing COVID, Other Subjects (2020). Techdirt.
Gorwa, R. (2019). The platform governance triangle: Conceptualising the informal regulation of online content. Internet Policy Review, 8(2).
https://doi.org/10.14763/2019.2.1407
Gillespie, T. (2018). Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press.
https://doi.org/10.12987/9780300235029
Gillespie, T. (2018). Governance by and through platforms. In J. Burgess, A. Marwick, & T. Poell (Eds.), The SAGE handbook of social media (pp. 254-278). SAGE Publications.
https://ebookcentral-proquest-com.ezproxy.library.sydney.edu.au/lib/usyd/detail.action?docID=5151795
MasterClass. (2021, September 4). What is GOTV (get out the vote)?
https://www.masterclass.com/articles/what-is-gotv-get-out-the-vote