The macro wave brought by Techlash
TUT05 Xiai Liu
social media by Sean MacEntee is licensed under a Creative Commons Attribution 4.0 International License.
It is hard for people today to imagine life without the Internet. Before it existed, finding information meant constantly flipping through books, buying something meant a trip to a physical store, and there was no way to rewatch a favorite TV show. After decades of rapid development, the Internet has changed everything. It has become a bank, a cinema, a school, a classroom, and an office, penetrating every corner of daily life. Facebook, Amazon, Netflix, and Google are now familiar platforms, but behind their vigorous growth lies deep dissatisfaction from the public, politicians, and governments. This negative reaction to the growth of large technology companies is called the "Techlash".
At the level of economic influence, the rise of large technology companies has led to business monopolies and unfair market competition. They use their strong market position to crowd out weaker companies: Facebook controls "almost 80% of mobile social traffic and Amazon about 75% of e-book sales" (Flew et al., 2019). As the influence of mass media publishers has diminished, they have been replaced by new, even larger and more powerful publishers of the Internet age (ibid.).
At the same time, the Internet is not as safe as we once thought. Firstly, from the public's point of view, users of a platform will often find recommendations of people they may like or know. For example, Facebook's news feed algorithm determines which people you will encounter, calculated from the activities of "friends" and "friends of friends" (van Dijck, 2018). Secondly, the Internet makes life easier, but it also creates more opportunities for crime (Berners-Lee, 2019), and users' privacy is very vulnerable. Protecting users' information deserves the attention of the public, technology companies, and governments alike. For the future of the Internet and the media, these situations must change: whenever a technology company relaxes its vigilance over users' privacy protection, the impact can be enormous. The following sections discuss big technology companies' economic impact, the protection of users' privacy, and hate speech.
Facebook by Book Catalog is licensed under a Creative Commons Attribution 4.0 International License.
The Facebook incident is a very intuitive example. On October 4th, 2021, at about 11:40 a.m. Eastern Time, Facebook and its family of applications, including WhatsApp, Messenger, Instagram, and Oculus, began showing system error messages, and the services stayed down for nearly five hours. Facebook controlled nearly 80% of mobile social traffic (Flew et al., 2019), and this brief outage forced 3.5 billion people to suspend their entertainment activities or business negotiations (Isaac & Frenkel, 2021). If an ordinary platform goes out of service for a few hours, users may experience it as nothing more than routine server maintenance; for a platform as large as Facebook, the impact is significant. The failure led some users to move to other platforms for communication, such as LinkedIn, Zoom, and Discord chat rooms (Isaac & Frenkel, 2021). Individual users may not be greatly affected, but for businesses, an outage of just a few hours can mean thousands of dollars in losses, a consequence of Facebook's monopoly on social platform traffic.
Self-regulation of technology companies
regulation by Mike Cohen is licensed under a Creative Commons Attribution 4.0 International License.
Capitalist profits are earned through users' data on technology companies' platforms. Facebook can even identify and target what users of a certain age are looking for (van Dijck, 2018), which improves the effectiveness of advertising. Many advertisers rely heavily on this service, which is why an outage of only five hours caused advertisers thousands of dollars in losses. Holding such a huge database of users' private data, the platform protects it out of its own corporate social responsibility, a form of self-regulation. Self-regulation avoids damage to users' privacy and protects the corporate image and institutional ethics (Flew et al., 2019).
However, their management of the content published by users is very loose. Generally speaking, platforms do not examine published content, because digital media and platform companies present themselves most of the time as intermediaries rather than media companies (Flew et al., 2019). Technology companies do not assume responsibility for monitoring users' comments; they simply provide services. What they really care about is content that affects their profits, and they pay little attention to anything that does not conflict with their own interests (Flew et al., 2019). Leaving technology companies to supervise themselves is therefore weak management, and it can lead to large amounts of fake news, extreme content, cyberbullying, and sexual harassment (Flew et al., 2019). The right to freedom of speech is no excuse for loose management of hate speech. Hence, the network currently needs to be restrained by laws and regulations: the government has a responsibility to protect both freedom of speech and a healthy network environment. Approaches to government regulation are discussed in the next section.
The government regulates the platform
government by Mike Lawrence is licensed under a Creative Commons Attribution 4.0 International License.
For a better network environment, strict management by the government is necessary. When the government supervises the network, it formulates detailed rules, conducts strict oversight, and enforces the law behind those rules. When large numbers of citizens flock to a public discussion, they help spread fake news (Cohen, 2019). Once conspiracy theories gain the upper hand, they not only easily lead to violent incidents but can also facilitate the recruitment of rebel groups (ibid.). Government supervision of the network can therefore clear harmful information more effectively and accurately.
Nevertheless, the government's implementation of content regulation can be very time-consuming. As a result, regulation often cannot keep up with the evolution of the network. Compared with a technology company, the government may be unfamiliar with the company's business, so it might be easier for the company to set its own rules and objectives for content regulation. But countries have different laws and regulations, while platforms host users from all around the world, so rules formulated by a single country are difficult to apply to all users. The next part therefore turns to the users themselves.
Citizens manage the platform
Colleagues looking at laptop by Make Me Local is licensed under a Creative Commons Attribution 4.0 International License.
Users play a very important role in platform supervision. The main body of a platform is its billions of users, so apart from the technology companies themselves, users are the people most familiar with the platform. Participation is contagious among users, and they are influenced by contacts who express political views (Halpern et al., 2017). The purpose of public supervision is therefore to make the public realize the importance of privacy protection. Public opinion, however, cuts both ways. On the good side, by supervising platforms, the public in essence also monitors technology companies and the government. For example, the cohesion of the Internet gives ordinary people the power to express themselves online. Such actions not only increase the pressure of public opinion on the government and technology companies, pushing them to pay more attention to platform governance, but also lead more people from all walks of life to recognize the importance of platform regulation and discuss it together.
But public opinion has a dark side as well. The proliferation of false information hinders people from obtaining real information. In Myanmar, Facebook became the main digital platform, and the explosion of hate speech on Facebook fueled anti-Muslim sentiment and incited violence (Plantin & Punathambekar, 2018). What citizens can do is voice more opinions and build more resonance; to actually implement a management system, they still need the cooperation of companies or the government, and must resort to the law for strict enforcement.
The Internet has indeed changed the world, but it has also triggered the Techlash; behind the beauty there is a dark side. To make money or study competitors, advertisers require digital platforms to track people's every move on the Internet (Karpf, 2018). This is an obvious invasion of privacy; big data benefits both parties only if the data is reasonably protected. Meanwhile, not all speech is worthy of protection: fake news, bullying, and extreme content should not be products of freedom of speech, and they need to be managed and brought to justice. If technology companies want to protect their current market position, they must pay attention to platform regulation. For a better Internet environment, the public, the government, and technology companies should unite and manage it together.
References
Cohen, S. B. (2019, November 22). Read Sacha Baron Cohen’s scathing attack on Facebook in full: ‘greatest propaganda machine in history’. The Guardian. Retrieved October 11, 2021, from https://www.theguardian.com/technology/2019/nov/22/sacha-baron-cohen-facebook-propaganda.
This work is licensed under a Creative Commons Attribution 4.0 International License.
Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50. https://doi.org/10.1386/jdmp.10.1.33_1
Halpern, D., Valenzuela, S., & Katz, J. E. (2017). We face, I tweet: How different social media influence political participation through collective and internal efficacy. Journal of Computer-Mediated Communication, 22(6), 320–336. https://doi.org/10.1111/jcc4.12198
Isaac, M., & Frenkel, S. (2021, October 4). Gone in minutes, out for hours: Outage shakes Facebook. The New York Times. Retrieved October 13, 2021, from https://www.nytimes.com/2021/10/04/technology/facebook-down.html?auth=link-dismiss-google1tap.
Karpf, D. (2018, September 18). 25 years of wired predictions: Why the future never arrives. Wired. Retrieved October 11, 2021, from https://www.wired.com/story/wired25-david-karpf-issues-tech-predictions/.
Berners-Lee, T. (2019, March 29). 30 years on, what’s next #ForTheWeb? World Wide Web Foundation. Retrieved October 11, 2021, from https://webfoundation.org/2019/03/web-birthday-30/.
Plantin, J.-C., & Punathambekar, A. (2018). Digital media infrastructures: Pipes, platforms, and politics. Media, Culture & Society, 41(2), 163–174. https://doi.org/10.1177/0163443718818376
van Dijck, J. (2018). The Platform Society as a contested concept. Oxford Scholarship Online. https://doi.org/10.1093/oso/9780190889760.003.0002