Researchers from USC believe they have identified the primary force driving the spread of false news: social media platforms’ incentive systems, which reward frequent information sharing. The team’s findings, published Monday in Proceedings of the National Academy of Sciences, upend the widespread belief that misinformation spreads because users lack the critical thinking needed to distinguish truth from falsehood, or because strong political views cloud their judgement. Just 15% of the study’s most frequent news sharers were responsible for spreading about 30% to 40% of the fake news.

What drives these users, wondered the study team from the USC Marshall School of Business and the USC Dornsife College of Letters, Arts, and Sciences? As it turns out, social media features a rewards system that, much like a video game, motivates users to stay on their accounts and keep posting and sharing. Users who post and share frequently are more likely to gain attention, especially when they share sensational or eye-catching content.

“Due to the reward-based learning systems on social media, users form habits of sharing information that gets recognition from others,” the researchers wrote. “Once habits form, information sharing is automatically activated by cues on the platform without users considering critical response outcomes, such as spreading misinformation.”

Why does fake news spread?

In all, 2,476 active Facebook users between the ages of 18 and 89 volunteered for the study after seeing online advertisements. They were paid to complete a “decision-making” survey that took about seven minutes. Surprisingly, the researchers found that users’ social media habits doubled or even tripled the amount of fake news they spread. Those habits had a stronger influence on fake-news sharing than other factors, such as political beliefs or a lack of critical thinking. Frequent, habitual users shared six times more fake news stories than occasional or new users.

“This type of behaviour has been rewarded in the past by algorithms that prioritize engagement when selecting which posts users see in their news feed, and by the structure and design of the sites themselves,” said second author Ian A. Anderson, a behavioural scientist and doctoral candidate at USC Dornsife. “Understanding the dynamics behind misinformation spread is important given its political, health and social consequences.”

What did the study conclude?

1) Misinformation is not always spread out of habit.
2) Users could be encouraged to develop sharing habits that make them more sensitive to sharing verifiable material.
3) Effectively reducing misinformation requires restructuring the online ecosystems that encourage and enable its spread.
