In a move to fight fake news, Facebook yesterday said it deleted more than 800 pages and accounts — specifically, 559 Pages and 251 accounts — for spreading disinformation, political spam, and inauthentic activity.
Nathaniel Gleicher, Facebook’s head of cybersecurity policy, said:
People need to be able to trust the connections they make on Facebook. It’s why we have a policy banning coordinated inauthentic behavior — networks of accounts or Pages working to mislead others about who they are, and what they are doing.
Oscar Rodriguez, a Product Manager at Facebook, said that the upcoming US midterm elections played a big part in the decision to remove these pages and accounts.
The company said many of these networks create fake pages and accounts to post misleading content and funnel victims to ad farms — websites intentionally flooded with ads and clickbait pieces. Facebook noted that in the past, these networks relied on celebrity gossip and natural disasters as clickbait material.
Now, however, they are turning to political content to drive traffic to their websites, posting the same fake content across multiple pages and groups to garner clicks. Facebook added that it is getting better at identifying politically and economically motivated campaigns that violate its policies.
Facebook also removed 66 accounts and pages run by a Russian company that allegedly sold users’ data. The company runs several programs aimed at curbing fake news around elections worldwide, including the US midterms and India’s 2019 general election.
It’ll be interesting to see whether this move has any impact on regular users who are inclined to engage in political discussions.