Social media giant Facebook has removed thousands of groups from its platforms over the trading of fake and misleading reviews.
The cull occurred after two separate interventions by Britain’s competition watchdog, the Competition and Markets Authority (CMA).
In January 2020, Facebook committed to improving its identification, investigation, and removal of groups and other pages where misleading and fake reviews were being traded, and to preventing their return. Four months later, Facebook gave a similar pledge in relation to its Instagram business.
During a follow-up investigation, the CMA found evidence that fake reviews were still being traded on both platforms, prompting the watchdog to intervene for a second time.
The CMA said on Friday that Facebook had taken down a further 16,000 groups that were dealing in fake and misleading reviews. The company has also changed the way it identifies, removes, and blocks paid content on its platforms that could mislead Facebook and Instagram users.
“We have engaged extensively with the CMA to address this issue,” said a spokesperson for Facebook.
“Fraudulent and deceptive activity is not allowed on our platforms, including offering or trading fake reviews.”
Facebook has committed to suspending or banning users who repeatedly create Facebook groups and Instagram profiles that promote, encourage, or facilitate fake and misleading reviews. New automated processes will be introduced to improve the detection and removal of this content.
Changes will be made to Facebook’s search tools to make it harder for people to find fake and misleading review groups and profiles on Facebook and Instagram.
“The pandemic has meant that more and more people are buying online, and millions of us read reviews to enable us to make informed choices when we shop around. That’s why fake and misleading reviews are so damaging,” said CMA chief executive Andrea Coscelli.
Facebook’s sluggish response to the problem of fake reviews does not sit well with Coscelli.
Coscelli said: “Facebook has a duty to do all it can to stop the trading of such content on its platforms. After we intervened again, the company made significant changes—but it is disappointing it has taken them over a year to fix these issues.”