Teams gather to fight election manipulation

“We’ve essentially done a lot of scenario planning and ‘war games’ internally within the war room to plan out different types of problems that we may see,” says Samidh Chakrabarti, who oversees Facebook’s elections and civic engagement team. “We’ve practiced and we’ve done drills to see how we can detect that, how we can come to quick decisions, and how we can take quick action.”

The War Room is staffed from 4 am until midnight and, as of next week, will be buzzing 24/7 with representatives from every corner of the company. WhatsApp, Instagram, Operations, Software Engineering, Data Science, Research Operations, Legal, Policy, Communications — they’re all represented in the room. Monitors around the room display charts of user behavior on Facebook and its other apps. Facebook uses machine learning and artificial intelligence to watch for spikes that could point to hate speech, viral fake news, or voter suppression efforts.
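Facebook hasn’t published the details of its monitoring system, but the kind of spike detection described here can be illustrated with a simple rolling-baseline check: flag any hour whose activity count jumps far above the recent average. The function and thresholds below are hypothetical, not Facebook’s actual code.

```python
# Illustrative sketch of spike detection on hourly activity counts.
# All names and thresholds are assumptions for the example.
from statistics import mean, stdev

def find_spikes(hourly_counts, window=24, threshold=3.0):
    """Return indices where a count exceeds the rolling mean of the
    previous `window` hours by more than `threshold` standard deviations."""
    spikes = []
    for i in range(window, len(hourly_counts)):
        baseline = hourly_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (hourly_counts[i] - mu) / sigma > threshold:
            spikes.append(i)
    return spikes

# A flat baseline with one sudden surge at hour 30:
counts = [100] * 24 + [102, 98, 101, 99, 100, 97, 900, 101]
print(find_spikes(counts))  # → [30]
```

A real system would run many such detectors per content category and language, with human reviewers deciding what the flagged spike actually represents — as the Brazil example below shows.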

Nathaniel Gleicher, Facebook’s head of cybersecurity, says the company’s goal is that the election be fair, and that “debate around the election be authentic. … The biggest concern is any type of effort to manipulate that.”

Ahead of the Brazilian presidential election, the company identified an effort to suppress voter turnout and was able to shut it down quickly, thanks in part to the proximity of so many teams in a single room.

“Content that was telling people that, due to protests, the election would be delayed a day,” says Chakrabarti. “This was not true, completely false. So we were able to detect that using AI and machine learning. The War Room was alerted to it. Our data scientists looked into what was behind it and then they passed it to our engineers and operations specialists to be able to remove this at scale from our platform before it could go viral.”

Facebook is combining its teams focused on the US and Brazilian elections because fighting what the company calls “bad actors” is a global problem that never ends. The idea is that these teams can share information about the latest tactics they’re seeing and share best practices for blocking them.

Gleicher warns that Facebook is seeing growing efforts to manipulate the public debate as we get closer to the US midterms.