The Rise of Social Media in Politics: Misinformation and Radicalization

http://sites.bu.edu/cmcs/2018/11/26/fake-news-social-media-and-politics/

Ava Kavanagh, A & E Editor

Since the Covid-19 pandemic hit, the use of social media has been on the rise, filling the gaps in our free time. For much of the younger generation, social media has become the primary tool for viewing news. As a result, Twitter, Instagram, Facebook, and TikTok have all become politically charged. In just the past year, we have seen the tangible power of social media in cancel culture, organized political protests and raids, and a new reliance on social media for social connection.

Controversial social and political topics such as the Black Lives Matter movement and the 2021 elections have had a massive social media presence. The Pew Research Center found that 23% of adult social media users in the United States have changed their views on a political or social issue because of something they saw on social media. As our reliance on social media grows, is there a way to keep misinformation from becoming part of the daily news?

The problem is that social media is not designed to be moral; it is designed to make a profit. Sites such as Facebook, Twitter, Instagram, and TikTok are all built around algorithms that maximize engagement, and therefore profit. Social media is usually thought of as something that serves the public. While that can be true when it is used carefully and fact-checked often, it can also spread misinformation and fuel political radicalization.

Now let's return to algorithms and why they play such an important role in polluting political discussion on social media. The algorithm places the posts it predicts will be most relevant to you at the top of your feed, and relevance is judged largely by prior engagement. Tweets and posts that provoke an emotional reaction or confirm your existing beliefs are more likely to be interacted with, and therefore show up on more feeds. It is a cycle that feeds our confirmation bias and inevitably spreads misinformation everywhere.
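To make that cycle concrete, here is a minimal, hypothetical sketch of engagement-based ranking. The weights, post fields, and example posts are assumptions for illustration only, not any platform's actual formula, but they show how a feed sorted purely by engagement will surface the most provocative content first regardless of accuracy.

```python
# Toy illustration of engagement-based feed ranking.
# The weights and post fields below are hypothetical; real platforms
# use far more complex, proprietary signals.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Higher-effort reactions (shares, comments) are weighted more
    # heavily than passive likes; this weighting is an assumption.
    return post.likes + 3 * post.shares + 5 * post.comments

def rank_feed(posts: list[Post]) -> list[Post]:
    # Posts with the most prior engagement rise to the top of the feed,
    # whether or not they are accurate.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Calm, well-sourced correction", likes=40, shares=2, comments=5),
    Post("Outrage-bait claim with no source", likes=300, shares=120, comments=80),
])
print([p.text for p in feed])  # the outrage-bait post ranks first
```

Notice that nothing in the scoring function asks whether a post is true; the only signal is how much reaction it provoked.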

Another factor is the sheer volume of information we see on social media. Even when misinformation is corrected, the same people are unlikely to see the correction because of the infamous algorithm: a correction does not push the same emotional and confirmation-bias buttons, so it does not spread as far. This is a major problem, and one that can lead to mistrust of politics in the media. Fact-checking programs on social media, notably Facebook's, catch only a minority of misinformation because of flaws in the system itself. Social media sites make money from advertising interaction, selling our attention as a commodity; as a result, misinformation and even deliberate disinformation become an incentive for more ad interaction, and thus more profit for the platforms.

When misinformation passes as information and social media users are fed content designed to confirm their initial beliefs, it can easily lead to radicalization. Your feed consists mainly of people who share your views, so it becomes harder to hear perspectives that differ from your own. When certain posts are never fact-checked while others simply confirm what you have already seen, a dangerous cycle begins, one that can lead down a rabbit hole.

What can we do to fix this problem? Although social media can be helpful, without balance it can quickly lead to misinformation and radicalization. In the short term, fixing individual habits, fact-checking claims, and monitoring the sources you read always helps. The longer-term solution is to push for structural change at social media companies: a stronger fact-checking program across all social media sites would greatly reduce misinformation.