The Social Dilemma (2020) is a documentary-drama featuring technology experts and industry insiders who reveal the dark side of social media. The film blends expert interviews with narrative segments: executives and developers who held key positions at Google, Facebook, Instagram and Twitter discuss what they see as a fundamental crisis, while dramatized scenes follow a screen-dependent family and the effects of their online habits. The film highlights the main concerns in the digital media domain: addiction, rising rates of suicide and self-harm among teenagers, Covid-19 misinformation, fake news and polarization. It presents itself as an urgent warning that social media is changing our lives, though critics argue that it offers only a one-sided view of big tech and fails to tackle the real issues.
Social media is a drug
“There are only two industries that call their customers ‘users’: illegal drugs and software.” – Edward Tufte
The Social Dilemma focuses on our obsession with addictive devices and argues that tech companies deliberately design their products to be addictive. Tristan Harris, a former design ethicist at Google, explains that many seemingly innocuous interface elements are designed to subconsciously change user behavior and lock users into a cycle of addiction. The most common technique, the infinite-scroll mechanism, delivers intermittent rewards much like a slot machine (Montag et al., 2019). TikTok’s main feature, the “For You” page, acts like a drug: it provides a constant supply of fresh content to keep users on the platform (Montag, Yang & Elhai, 2021). Push notifications likewise drive addiction by exploiting psychology that encourages users to repeatedly check their screens and stay engaged. In the film, three fictional characters represent the behind-the-scenes puppet masters who manipulate users through features such as typing indicators and photo tagging to maximize user interaction.
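The slot-machine comparison rests on what psychologists call a variable-ratio reward schedule: rewards arrive unpredictably, so the user keeps scrolling in case the next post pays off. A toy simulation of the idea (purely illustrative; the function name and reward probability are my own assumptions, not any platform’s real code):

```python
import random

def variable_ratio_feed(num_scrolls, reward_prob=0.3, seed=42):
    """Simulate a variable-ratio reward schedule: each scroll delivers a
    'rewarding' post with some fixed probability, so the user can never
    predict which scroll will pay off."""
    rng = random.Random(seed)  # seeded for reproducibility
    return ["reward" if rng.random() < reward_prob else "filler"
            for _ in range(num_scrolls)]

feed = variable_ratio_feed(20)
print(f"{feed.count('reward')} rewarding posts out of {len(feed)} scrolls")
```

The point of the sketch is that rewards are interspersed at random rather than on a fixed schedule, which behavioral research associates with the most persistent checking behavior.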
The documentary also examines the influence of social media on teen depression and behavior. The Social Dilemma tells the story of a teenage girl, Isla, who seeks attention and validation through “likes” and filtered selfies. According to Justin Rosenstein, the inventor of Facebook’s “like” button, the feature was introduced to spread positivity, but for Generation Z it has become a proxy for approval. Twenge et al. (2018) found that time spent on screen activities was significantly associated with increases in depression and suicide.
We are ‘the product’
This film describes an emerging ‘surveillance capitalism’: the monetization of personal data (Zuboff, 2015), which has fueled widespread concern about privacy. As personal data has been commodified, people’s attention has become a scarce resource.
“If you aren’t paying for the product, then you are the product.” – Tristan Harris
The main product is not the services built and offered by the social media platforms but the consumers’ attention. For tech companies, the primary goal is to maximize profit, so they use psychological and algorithmic tricks to keep users on their sites for as long as possible. Big tech companies collect, store and sell records of users’ online behavior, monitoring their searches, likes and purchases to generate data for commercial purposes (Lawrence, 2018). Because advertising is the platforms’ main revenue driver, companies sell insights generated from behavioral data to advertisers, who target potential customers to maximize the probability that users click on advertisements.
The concept of the ‘panopticon’ can be applied to today’s ubiquitous digital surveillance. Bentham (1791) designed the panopticon so that an overseer could watch prisoners without being seen, leaving prisoners unable to know whether they were being watched at any given moment (McMullan, 2015). Digital surveillance and data collection work in a strikingly similar way: a one-way flow of information that lacks transparency (Lawrence, 2018).
In 2018, The New York Times reported that a company called Cambridge Analytica had harvested data from 50 million Facebook profiles of US voters without their knowledge during the 2016 election. The data was used to target voters with personalized political advertisements and thereby influence the outcome of elections. Cambridge Analytica also played a role in the EU referendum (Cadwalladr & Graham-Harrison, 2018). Facebook’s breach of its user agreements and violation of privacy rights raised widespread concern about personal privacy and security.
A threat to democracy
Social media enables people to access and consume more diverse sources of information, yet in today’s high-choice media environment, users tend to favor information that matches their preexisting attitudes and beliefs (Dubois & Blank, 2018). This has raised concerns about polarization and misinformation in democratic societies. Social media has clearly become a political weapon, especially through the use of fake news (Lazer et al., 2018).
Social media has changed the way people access information and limited their exposure to diverse perspectives. Feed algorithms in social networks, search engines and news aggregators contribute to increasingly personalized information streams and “filter bubbles” (Pariser, 2011): filtering mechanisms that present the content people are most likely to consume. According to the Digital News Report 2020, online sources have become the main source of news for a large audience, and most of these services rely on algorithms to deliver news (Newman et al., 2020). The report lays out the nature of the concern: algorithms and algorithmically driven services now play a significant part in people’s news consumption (Fletcher, 2020).
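The narrowing effect described above can be sketched in a few lines: if a feed ranks content by how often the user has already engaged with a topic, topics the user never clicked on sink to the bottom. This is a hypothetical toy ranking, not any real platform’s algorithm:

```python
from collections import Counter

def personalized_feed(history, catalog, k=3):
    """Rank catalog topics by how often they appear in the user's
    engagement history and return only the top-k.

    Topics the user has never engaged with score zero, so repeated
    personalization keeps surfacing the same familiar content."""
    counts = Counter(history)
    return sorted(catalog, key=lambda topic: counts[topic], reverse=True)[:k]

catalog = ["politics", "sports", "science", "celebrity", "health"]
history = ["politics", "politics", "sports"]
print(personalized_feed(history, catalog))  # favors already-seen topics
```

Even this crude sketch shows the feedback loop: whatever the user engaged with yesterday dominates what they are shown today, which is the mechanism behind the “filter bubble” concern.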
“How news feed algorithms supercharge confirmation bias | Eli Pariser | Big Think” by Big Think. Source: YouTube. https://www.youtube.com/watch?v=prx9bxzns3g
Social media is often accused of exacerbating political polarization and ideological segregation by creating “echo chambers” in which individuals are largely exposed to conforming political views (Sunstein, 2009). According to group polarization theory (Sunstein, 2002), an echo chamber lets people selectively expose themselves to information that reinforces views already held within a group, pushing the group toward more extreme positions. An echo-chamber environment therefore threatens democracy by limiting political information and discussion (Barberá et al., 2015).
As social networks play a predominant role in online political communication, online polarization may foster the spread of fake news and misinformation. The Social Dilemma argues that the main objective of a successful social network is to maximize engagement: “We created a system that biases towards false information. Not because we want to, but because false information makes the companies more money than the truth. The truth is boring.” The film points out that fake news on Twitter travels six times faster than real news (Vosoughi, Roy & Aral, 2018), and increased exposure to disinformation online has been linked to a range of detrimental effects.
Facebook issued an official response to the claims made in The Social Dilemma, saying the film “buries the substance in sensationalism.” Facebook argues that its product was not designed to be addictive; in fact, when the News Feed algorithm was changed in 2018, time spent on Facebook dropped by 50 million hours a day. Facebook also maintains that its business model keeps the service free for all users and that it does not exploit or sell user data to advertisers. Moreover, Facebook says it is taking steps to limit misinformation and content that could lead to polarization.
So what do we do?
Although The Social Dilemma paints a dystopian picture of social media and attempts to raise awareness of issues such as addiction, design ethics, data privacy and polarization, it treats technology as the sole cause of these problems. At the end of the film, industry insiders offer short-term fixes: turning off notifications, deleting apps, ignoring content recommendations, reducing screen time and seeking out diverse information sources. However, these individual remedies fail to address the real problem with big tech, because the companies themselves are responsible for the harmful effects of their products.
References
Barberá, P., Jost, J. T., Nagler, J., Tucker, J. A., & Bonneau, R. (2015). Tweeting From Left to Right: Is Online Political Communication More Than an Echo Chamber? Psychological Science, 26(10), 1531–1542. https://doi.org/10.1177/0956797615594620
Dubois, E., & Blank, G. (2018). The echo chamber is overstated: the moderating effect of political interest and diverse media. Information, Communication & Society, 21(5), 729–745. https://doi.org/10.1080/1369118X.2018.1428656
Fletcher, R. (2020). The truth behind filter bubbles: Bursting some myths. Reuters Institute. https://reutersinstitute.politics.ox.ac.uk/risj-review/truth-behind-filter-bubbles-bursting-some-myths
Cadwalladr, C., & Graham-Harrison, E. (2018, March 17). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian. https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election
GCGLearnFree.org. (2019, June 18). What is an Echo Chamber? [Video]. YouTube. https://www.youtube.com/watch?v=Se20RoB331w
Lawrence M. (2018). Control under surveillance capitalism: from Bentham’s panopticon to Zuckerberg’s ‘Like’. Political Economy Research Centre Blog. https://www.perc.org.uk/project_posts/control-surveillance-capitalism-benthams-panopticon-zuckerbergs-like/
Lazer, D. M. J., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., Metzger, M. J., Nyhan, B., Pennycook, G., Rothschild, D., Schudson, M., Sloman, S. A., Sunstein, C. R., Thorson, E. A., Watts, D. J., & Zittrain, J. L. (2018). The science of fake news. Science, 359(6380), 1094–1096. https://doi.org/10.1126/science.aao2998
McMullan, T. (2015, July 23). What does the panopticon mean in the age of digital surveillance? The Guardian. https://www.theguardian.com/technology/2015/jul/23/panopticon-digital-surveillance-jeremy-bentham
Montag, C., Lachmann, B., Herrlich, M., & Zweig, K. (2019). Addictive features of social media/messenger platforms and freemium games against the background of psychological and economic theories. International Journal of Environmental Research and Public Health, 16(14), 2612. https://doi.org/10.3390/ijerph16142612
Montag, C., Yang, H., & Elhai, J. D. (2021). On the Psychology of TikTok Use: A First Glimpse From Empirical Findings. Frontiers in Public Health, 9, 641673–641673. https://doi.org/10.3389/fpubh.2021.641673
Newman, N., Fletcher, R., Schulz, A., Andi, S., & Nielsen, R. K. (2020). Digital News Report 2020. Reuters Institute. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2020-06/DNR_2020_FINAL.pdf
Pariser, E. (2011). The Filter Bubble: What the internet is hiding from you. New York, NY: Penguin Press.
Pariser, E. (2011, March). Beware online “filter bubbles”. [Video]. Ted. https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles?language=en
Rhodes, L. (Producer), Orlowski, J. (Director). (2020). The Social Dilemma. Netflix. https://www.netflix.com/hk/title/81254224
Sunstein, C. R. (2002). The Law of Group Polarization. The Journal of Political Philosophy, 10(2), 175–195. https://doi.org/10.1111/1467-9760.00148
Sunstein, C. R. (2009). Republic.com 2.0. Princeton, NJ: Princeton University Press.
Twenge, J. M., Joiner, T. E., Rogers, M. L., & Martin, G. N. (2018). Increases in Depressive Symptoms, Suicide-Related Outcomes, and Suicide Rates Among U.S. Adolescents After 2010 and Links to Increased New Media Screen Time. Clinical Psychological Science, 6(1), 3–17. https://doi.org/10.1177/2167702617723376
Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151. https://doi.org/10.1126/science.aap9559
Zuboff, S. (2015). Big other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30(1), 75–89. https://doi.org/10.1057/jit.2015.5