Facebook’s Algorithm Encourages Misinformation, Toxicity, Internal Report Says

The Wall Street Journal has published several internal Facebook documents this week. The first concerned looser moderation for certain “VIP” accounts on Instagram and Facebook, the next Instagram’s harmful effects on teenage girls. On Wednesday, it was the social network’s recommendation algorithm that the documents put in the crosshairs, BFMTV reports.

In 2018, Facebook changed the recommendation algorithm that populates users’ news feeds. Mark Zuckerberg’s teams wanted to emphasize “meaningful social interactions”, meaning interactions between friends, family, and loved ones. The aim was to reduce users’ exposure to “professional” content, which can harm their mental health.
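To make the mechanism concrete, here is a minimal, purely hypothetical Python sketch of an interaction-weighted feed score. The `Post` fields, the `msi_score` function, and all of the weights are assumptions invented for this illustration; the leaked documents describe the general idea of weighting interactions, not these exact numbers.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_is_friend: bool  # friends/family vs. "professional" pages
    likes: int
    comments: int
    reshares: int

def msi_score(post: Post) -> float:
    """Hypothetical 'meaningful social interactions' score: active
    interactions (comments, reshares) are assumed to weigh more than
    passive ones (likes). All weights are illustrative, not Facebook's."""
    score = 1.0 * post.likes + 5.0 * post.comments + 15.0 * post.reshares
    # Assumed boost for friend/family content, matching the stated goal
    # of favoring loved ones over professional publishers.
    if post.author_is_friend:
        score *= 2.0
    return score

# A provocative post that racks up reshares can outrank a benign one
# even without the friend/family boost, letting outrage-driven content
# rise to the top of the feed.
benign = Post(author_is_friend=True, likes=200, comments=10, reshares=2)
outrage = Post(author_is_friend=False, likes=150, comments=80, reshares=120)
print(msi_score(benign))   # 2 * (200 + 50 + 30) = 560.0
print(msi_score(outrage))  # 150 + 400 + 1800 = 2350.0
```

Under weights like these, heavily reshared posts dominate, which is one plausible reading of the side effects described below.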

“Our approach had unhealthy side effects”

But in an internal report obtained by the Wall Street Journal, a group of researchers within the American company showed that the algorithm actually behaves quite differently. Rather than protecting users, it pushes content deemed violent, toxic, or even fake news.

“Disinformation, toxicity and violent content are abnormally prevalent in re-shared content,” the researchers wrote in one of the memos published by the American daily. “Our approach has had unhealthy side effects on important parts of the content, especially in politics and news. Our responsibility is growing,” they added.

According to the study’s findings, certain political parties and media outlets have consequently steered their editorial line toward sensationalism and outrage. The objective: to drive “engagement”, a term covering the shares, “likes”, and comments attached to a post. “Many people we spoke with told us they feared the long-term negative effects this algorithm could have on democracy,” a researcher said in an internal report.
