Facebook will use AI to find extremist posts
— 2017 New York Times
Artificial intelligence will largely be used in conjunction with human moderators who review content on a case-by-case basis.
But developers hope its use will be expanded over time, said Monika Bickert, the head of global policy management at Facebook.
In a blog post published Thursday, Facebook described how an artificial-intelligence system would teach itself to identify key phrases being used to bolster a known terrorist group.
The same system, they wrote, could learn to identify Facebook users who associate with clusters of pages or groups that promote extremist content, or who return to the site again and again, creating fake accounts in order to spread such content online.
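The blog post does not describe Facebook's actual model, but the idea of learning which key phrases distinguish flagged content from ordinary content can be illustrated with a toy sketch. Everything below — the function names, the frequency-ratio heuristic, and the `min_ratio` threshold — is a hypothetical simplification, not Facebook's system:

```python
from collections import Counter

def key_phrases(flagged_posts, benign_posts, min_ratio=2.0):
    """Toy illustration: surface words that appear far more often in
    flagged posts than in benign ones. Hypothetical, not Facebook's method."""
    flagged = Counter(w for p in flagged_posts for w in p.lower().split())
    benign = Counter(w for p in benign_posts for w in p.lower().split())
    # Keep words whose flagged-to-benign frequency ratio clears the threshold;
    # the +1 avoids division by zero for words never seen in benign posts.
    return {w for w, count in flagged.items()
            if count / (benign.get(w, 0) + 1) >= min_ratio}

def score(post, phrases):
    """Count how many learned key phrases a new post contains."""
    return len(set(post.lower().split()) & phrases)
```

A production system would use far richer signals (phrases rather than single words, multilingual models, account behavior), but the core pattern — learn discriminative terms from labeled examples, then score new content — is the same.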
"Ideally, one day our technology will address everything," Bickert said. "It's in development right now." But human moderators, she added, are still needed to review content for context.
Brian Fishman, Facebook's lead policy manager for counterterrorism, said the company had a team of 150 specialists working in 30 languages doing such reviews.
Facebook has been criticized for not doing enough to monitor its site for content posted by extremist groups. Last month, Prime Minister Theresa May of Britain announced that she would challenge internet companies — including Facebook — to do more to monitor extremist content and stop its spread.
"We cannot allow this ideology the safe space it needs to breed," May said after the bombing of a concert in Manchester that killed 22 people.
J.M. Berger, a fellow with the International Centre for Counter-Terrorism at The Hague, said a large part of the challenge for companies like Facebook is figuring out what qualifies as terrorism — a definition that might apply to more than statements in support of groups like Islamic State.