What Motivates People to Use Social Media?
A recent survey by O’Day & Heimberg (2021) found that maintaining relationships with loved ones is the primary motivation for social media use, with about 47% of online adults citing it as one of their main reasons for being on these platforms. That is hardly surprising, since connection is what social media is built around: with profiles, messaging, commenting, milestone events, and day-to-day updates, little of an individual’s (or a company’s) life stays off social media for long.
Social media also offers enormous entertainment value: it is heavy on visual content, which makes it easy to consume, and it allows unlimited scrolling. Some experts even classify it as a news and entertainment aggregator. Additionally, Sedgwick et al. (2019) report that people use social media to find content (30.9% of internet users), to see what is being talked about (29.1%), and to get ideas for things to do and buy (28.1%).
Product research is another of the most common reasons people use these platforms. If you are an (aspiring) e-commerce entrepreneur, pay close attention to this figure: approximately 26.5% of internet users conduct product research on social media (Hunsaker & Hargittai, 2018). That is more than a quarter of all internet users, which shows just how effective social media marketing can be.
Negative Effects of Social Media
- Cyberbullying
Cyberbullying is harassment carried out over the Internet via electronic devices such as laptops, phones, and tablets. Social networking sites, online forums, and video games all provide venues where it can occur. Sending or posting material that is harmful, deceptive, or malicious with the intent to hurt another person counts as cyberbullying, as does publicly disclosing someone’s private information in order to humiliate or shame them. This can happen on the following platforms:
- Facebook, Instagram, Snapchat, and TikTok
- Communication via text message or mobile/tablet app
- Internet-based instant messaging, direct messages (DMs), and chat
- Social networking sites, blogs, and message boards like Reddit
- Messages sent within online gaming communities and guilds
Under Australia’s Criminal Code Act 1995, serious instances of cyberbullying can result in a sentence of up to three years in prison and/or a fine.
In Florida, certain forms of cyberbullying may be punishable under the state’s criminal cyberstalking law.
Stomp Out Bullying lists the following actions as criminal forms of cyberbullying:
- Sexting
- Committing hate crimes
- Making death threats
- Distributing child pornography
- Stalking a person
- Making violent threats
- Exploiting someone sexually
- Harassing someone on a racial or gender basis
- Taking and posting photos of someone where they expect privacy
What is the Future of Cyberbullying?
There has been a growing movement against cyberbullying that emphasizes the need for standardized responses. There have been calls for the federal government to issue clear, actionable guidelines to help schools combat bullying and harassment both online and offline.
- Violence on Digital Platforms
It is not uncommon to see graphic depictions of accidents, shootouts, murders, and the like on Facebook, Instagram, Twitter, TikTok, and other platforms. Creators resort to such content to grow their audiences and stay afloat in the cutthroat digital media industry. Video games accessible to children also contain a great deal of violence. One survey found that 97% of games rated for teenagers and 64% of games deemed appropriate for all ages contain some form of violent content (Thompson & Haninger, 2001). Thompson et al. (2006) add that violence in games has become so normalized that the most violent player is often the one rewarded and allowed to progress to the next level.
The consequences?
Evidence suggests that exposure to graphic violence in the media can desensitize young people to the realities of violence and thereby encourage more aggressive behaviour. Desensitization and aggression reinforce one another, growing stronger over time, so it is essential to break this cycle, whether at the level of beliefs or at the level of actions.
Although violence in the media has been studied extensively, little has been done to curb it or to establish a code of conduct. It is well documented that media exposure has a formative effect on children’s development, particularly on their worldview and subsequent actions. It is important to realize that younger audiences are being introduced to content that may shape their attitudes toward violence.
The Solution
According to the American Academy of Pediatrics, children’s screen time should be limited to 1–2 hours a day, and parents should watch alongside their children to keep an eye on the content and to encourage open dialogue.
Psychologists can advise governments on how best to apply censorship to games and other social media platforms. It is the duty of governments to ensure that digital media organizations are not motivated purely by financial gain but also act responsibly toward their communities.
Parents have a central role to play. They should monitor their children’s media use and advise them accordingly. For instance, children under the age of 17 should not play most M-rated games, and just because another parent has given their child permission to watch or play a violent video game does not mean it is appropriate for your child. Ultraviolent content has a negative impact on developing brains, especially when it is paired with sexual imagery.
Who Should Regulate Social Media?
The best candidates to oversee the content of social media platforms are the companies that run them, for several reasons. First, responsibility: they profit from what is effectively a public good, namely citizens’ personal information. Just as licensees of the broadcast spectrum must meet certain public interest obligations in exchange for the chance to monetise a public resource, social media businesses owe the public a duty to control the spread of disinformation, cyberbullying, violence, pornography, extremism, hate speech, and the like. Second, they have a compelling motive: staying in business. A platform that hosts large amounts of offensive or questionable content should expect a significant exodus of users, and the prospect of government regulation is a further incentive. Third, no one else has the tools or the expertise to deal with the issue as they do: post-hoc detection and removal of offensive material requires complex technologies that the platforms themselves are best placed to build. Fourth, they are already doing it. In 2018, YouTube employed 10,000 workers worldwide to monitor and remove offensive content; over one three-month period it eliminated 8 million videos, 81% of which were removed automatically and 74% of which were never viewed. Over a three-month period in 2018, Facebook, which employs more than 30,000 people dedicated to detecting and removing content that violates its user guidelines, removed more than 15 million pieces of violent content, 99.9% of it automatically.
Do governments have a role to play? Yes. Content standards and enforcement rules should be defined and updated by social media firms, ideally overseen by a neutral body with representation from the various interested parties, including the public and the police. In the interest of openness, the finalized standards and guidelines should be made available to the public, and this authority should also publish compliance data quarterly to encourage compliance. Furthermore, the government should hold social media companies accountable for blatantly illegal content that is not deleted within a reasonable time after being reported. In the event of a dispute, there should be a straightforward and quick means of resolution, with the burden of proof resting on the government, so that content removal remains the exception rather than the rule. Finally, the government’s attention should be directed primarily at fixing structural societal problems such as communalism, casteism, sexism, extremism, and weak law and order; online conversations, after all, only reflect societal trends. If the rule of law is strictly enforced, the need for internet speech restrictions diminishes.
References
Hunsaker, A., & Hargittai, E. (2018). A review of Internet use among older adults. New Media & Society, 20(10), 3937-3954.
O’Day, E. B., & Heimberg, R. G. (2021). Social media use, social anxiety, and loneliness: A systematic review. Computers in Human Behavior Reports, 3, 100070.
Sedgwick, R., Epstein, S., Dutta, R., & Ougrin, D. (2019). Social media, internet use and suicide attempts in adolescents. Current Opinion in Psychiatry, 32(6), 534.
Thompson, K., & Haninger, K. (2001). Violence in E-rated video games. JAMA: The Journal of the American Medical Association, 286, 591.
Thompson, K. M., Tepichin, K., & Haninger, K. (2006). Content and ratings of mature-rated video games. Archives of Pediatrics & Adolescent Medicine, 160, 402–410.