Facebook quietly changed its algorithm in 2018 to prioritise reshared material – then kept it in place despite realising it encouraged the spread of toxicity, misinformation, and violent content, leaked internal documents reveal

Daily Mail

Facebook quietly changed its algorithm in 2018 to prioritise reshared material, only for it to backfire and cause misinformation, toxicity and violent content to become ‘inordinately prevalent’ on the platform, leaked internal documents have revealed.

The company’s CEO Mark Zuckerberg said the change was made in an attempt to strengthen bonds between users — particularly family and friends — and to improve their wellbeing.

But the documents show the opposite happened: Facebook became an angrier place because the tweaked algorithm rewarded outrage and sensationalism.

Researchers for the company discovered that publishers and political parties were deliberately posting negative or divisive content because it racked up likes and shares and was spread to more users’ news feeds, according to the Wall Street Journal.

The Journal has seen a series of internal documents revealing that Zuckerberg was warned about the problem in April 2020 but kept the change in place regardless.

Read the full story in Daily Mail
