Facebook says it has made “significant changes” to how it aims to protect the integrity of elections and fight misinformation as the United Kingdom prepares for a General Election next month.
Polling day takes place on December 12 and there are fears that the election could be hit by attempts at interference via the spread of misinformation using Facebook, as has been the case in a number of election campaigns across the globe in recent years, including the 2016 United States Presidential election.
Facebook has been criticised for its response to the spread of misinformation – as well as its role in the Cambridge Analytica scandal – but the social network claims to be prepared for the general election.
“Helping protect elections is one of our top priorities and over the last two years we’ve made some significant changes,” said Rebecca Stimson, head of UK public policy at Facebook.
“We’ve introduced greater transparency so that people know what they are seeing online and can scrutinize it more effectively. We have built stronger defenses to prevent things like foreign interference and we have invested in both people and technology to ensure these new policies are effective,” she added.
With the election campaign officially underway, Facebook has created an ‘elections taskforce’ involving staff from the UK, Europe and the US to ensure the integrity of Facebook’s platforms, including WhatsApp and Instagram. Those employees will be brought together inside ‘operations centres’ as the election date gets closer.
One area Facebook is particularly focused on is preventing misinformation and influence operations that manipulate or corrupt public debate for strategic goals, with the groups behind these accounts and pages often posting in support of both sides of a controversial issue in order to polarise audiences.
Several coordinated misinformation campaigns have been taken down over the course of the past year, with Facebook linking fake accounts and ‘inauthentic’ behaviour to operations based in countries including Iran, Russia, Macedonia and Kosovo. Facebook says it will continue to look out for this type of coordinated activity during the course of the election campaign.
“My team leads all our efforts across our apps to find and stop what we call influence operations, coordinated efforts to manipulate or corrupt public debate for a strategic goal,” said Nathaniel Gleicher, head of cybersecurity policy at Facebook.
“This team has not seen evidence of widespread foreign operations aimed at the UK. But we are continuing to search for this and we will remove and publicly share details of networks of CIB [coordinated inauthentic behavior] that we identify on our platforms,” said Gleicher.
“We know bad actors use fake accounts as a way to mask their identity and inflict harm on our platforms. That’s why we’ve built an automated system to find and remove these fake accounts. And each time we conduct one of these takedowns, or any other of our enforcement actions, we learn more about what fake accounts look like and how we can have automated systems that detect and block them,” he added.
However, that focus is on inauthentic behaviour by fake or manipulated accounts; Facebook takes a different approach to political advertising and how political parties deploy it. In a call with journalists, the company admitted that it would allow a controversial doctored video of Labour MP Keir Starmer to run as an advert.
Mozilla, privacy campaigners, academics and others recently published an open letter to Facebook and Google, asking them to suspend political advertising for the duration of the general election to prevent the spread of disinformation – but Facebook doesn’t appear to have plans to do this.