- Facebook's key UK fact-checker Full Fact only has two primary fact-checkers combatting misinformation in the country.
- There are around 40 million monthly active users of Facebook in the UK.
- The disparity between the small size of Facebook's fact-checking team and the scale of its UK userbase raises questions about the company's ability to effectively police misinformation in the UK.
- Facebook's other UK fact-checking partner, FactCheckNI, focuses on Northern Ireland specifically.
- The California social networking giant has faced scrutiny in recent weeks over the size of its fact-checking squads across the globe, from the US to Australia.
Facebook's only fact-checking partner covering all of the UK has just two primary fact-checkers dedicated to tackling hoaxes and misinformation — raising questions about Facebook's ability to police its platform for fake news in a country of around 40 million users.
Following swathes of criticism in recent years over the proliferation of hoaxes, misinformation, and malicious propaganda on its platform, Facebook has partnered with 50 fact-checking organisations around the globe.
If one of these third-party organisations identifies a piece of content as false, Facebook attaches a warning to it, and the company's algorithms curtail its ability to spread in the News Feed.
In the United Kingdom, Facebook works with two partners: Full Fact and FactCheckNI. FactCheckNI specialises in fact-checking for Northern Ireland — meaning Full Fact is Facebook's only partner covering the entire country.
A spokesperson for Full Fact told Business Insider that it has two key employees working on Facebook fact-checking. "There are currently two staff members who primarily focus on Facebook work," they wrote in an email. "They do not exclusively work on Facebook fact-checks, and other members of staff will sometimes work on Facebook checks."
Full Fact has conducted 250 fact-checks for Facebook thus far, they said.
FactCheckNI didn't immediately respond to Business Insider's request for comment, and it's not clear how many fact-checkers it has dedicated to Facebook, but the number seems unlikely to be significantly higher than Full Fact's.
There are 28 people on LinkedIn listed as working for Full Fact, and four listed as working for FactCheckNI. (LinkedIn records are an imperfect way to measure company sizes, but are directionally accurate.)
Facebook has an estimated 40 million monthly active users across the United Kingdom — 78% of all internet users in the country.
Meanwhile, Facebook has faced a wave of scrutiny over the size of its fact-checking operations in other countries.
In Australia — a country of 17 million Facebook users — there are seven fact-checkers, who have collectively produced 220 fact-checks, according to a report from BuzzFeed News. They come from Agence France-Presse and Australian Associated Press, Facebook's two partners in the country.
And in the United States, Facebook's core market, The Hill reported in January that Facebook has 26 full-time fact checkers. Supplied from six different partner companies, they carry out fact-checks on "roughly 200 pieces of content per month."
Like other countries, the UK has had to grapple with a wave of online disinformation in recent years, on subjects ranging from terror attacks and the environment to Brexit and local politics. Just this week, for example, Full Fact highlighted a misleading Facebook post about whether antibacterial sprays work against the novel coronavirus.
Full Fact started working with Facebook as a third-party fact-checker in January 2019, and in June released a report on how it was progressing. It welcomed the program as "worthwhile" and suggested other internet platforms consider implementing similar efforts — but raised concerns about the "scale" of the program. There is, it wrote, "a need to scale up the volume of content and speed of response."
In contrast, Facebook now has tens of thousands of content moderators policing the social network for objectionable and illegal content, and more than 35,000 people working on "safety and security" issues in total.
A source close to Facebook said the firm's fact-checking program was working: demoted posts received fewer views, and their demotion resulted in similar posts also being demoted. The company feeds its fact-checkers' decisions into a machine-learning model to identify potentially fake content more quickly and easily. The person added that there was no easy fix in the fight against misinformation.
Do you work at Facebook? Got a tip? Contact this reporter using a nonwork device via encrypted messaging app Signal (+1 650-636-6268), encrypted email (robaeprice@protonmail.com), standard email (rprice@businessinsider.com), Telegram/Wickr/WeChat (robaeprice) or Twitter DM (@robaeprice). PR pitches by standard email only, please.