Nearly two years ago, public health experts blamed social media platforms for contributing to a measles outbreak by allowing false claims about the risks of vaccines to spread.
Facebook pledged to take tougher action on anti-vaccine misinformation, including making it less prominent in the news feed and not recommending related groups. But shortly after, Facebook-owned Instagram continued to serve up posts from anti-vaccine accounts and hashtags to anyone searching for the word “vaccines.” Despite actions against anti-vaccine content since then — some as recent as last month — Facebook has failed to totally quash the movement on its platforms.
Now, with Covid-19 vaccines potentially making their way to some Americans as soon as this month, the tech companies will face their biggest test on this front yet. The stakes for them to get it right, after years of struggling to combat vaccine misinformation, couldn’t be higher.
“To beat this pandemic, we also have to defeat the parallel pandemic of distrust,” Francesco Rocca, president of the International Federation of Red Cross and Red Crescent Societies, said on Monday.
Some social networks have already put policies in place specifically against Covid-19 vaccine misinformation; others are still deciding on the best approach or are leaning on existing policies for Covid-19 and vaccine-related content. But making a policy is the easy part — enforcing it consistently is where platforms often fall short.
Facebook, Twitter and other platforms have their work cut out for them: The coronavirus and pending vaccines have already been the subject of numerous conspiracy theories, which platforms have taken action on or created policies about. These include false claims about the effectiveness of masks and baseless assertions that microchips will be implanted in people who get the vaccine.
Earlier this month, Facebook booted a large private group dedicated to anti-vaccine content. But many groups dedicated to railing…