3,000 Facebook accounts deleted for spreading false news about vaccines
Facebook claims that its actions have reduced vaccine hesitancy by about 50 percent in the United States alone.
Since the start of the coronavirus pandemic, Facebook has taken a much stricter stance against posts containing falsehoods about health topics. The scale of that effort has now been quantified in a new report: according to the company, more than 20 million pieces of content were removed between the start of the pandemic and June of this year, and more than 190 million warning labels were applied.
Facebook also detailed that 3,000 accounts have been removed from the site since the outbreak began in 2020 for spreading disinformation about vaccines. That number may not seem like much, but according to Engadget, research recently found that the vast majority of vaccine fake news can be traced back to only a handful of super-spreaders. One of them was recently named.
In a phone call with journalists, Facebook also shared that the removed posts varied widely. Some of them, for example, claimed that the coronavirus vaccine makes the skin magnetic. This, incidentally, is one of the most popular fake news stories, even though it has been demonstrated repeatedly that the theory has no scientific basis whatsoever.
According to Facebook, it is not uncommon for fake news distributors to use coded language to evade detection on the site.