Facebook to combat hate crimes

Angus Livingston
(Australian Associated Press)


Facebook will direct white supremacist and terrorist searches to websites urging people to leave hate behind.

It is also using first-person military videos to train artificial intelligence to identify terror attacks, like the live-streamed Christchurch massacre, more quickly.

The company came under international pressure following the March mass shooting, in which 51 people were killed and which was broadcast live on Facebook.

On Wednesday Facebook announced a range of measures aimed at preventing terrorists from taking advantage of its platforms.

“This is a fight that happens on a daily basis,” Facebook’s Global Director for Counterterrorism Policy Brian Fishman told AAP.

“We see bad actors trying to circumvent the techniques that we’ve put in place to identify them, and we change what we’re doing as a response to those techniques.”

The new techniques include directing searches for white supremacist and terrorist terms to organisations that encourage people to leave hate groups.

When asked if Facebook had considered switching off its live-streaming option, Mr Fishman said Facebook Live still served a valuable purpose.

“We’ve seen it used in recent weeks and months by protesters in Hong Kong fighting for democracy,” he said.

“(But) it’s very clear that bad actors want to exploit services like that, and we are trying to be responsive to that while making sure that the services can be used.”

Facebook also tracks people when they’re not using the app or website, but Mr Fishman said there were difficulties in using that information to flag potential threats.

He said that was a conversation to have with governments about laws allowing that sort of information to be collected and used.

“For example, when is it appropriate to utilise signals from different platforms to identify potential threats?” he said.

“When there is regulation that prevents that or it is ambiguous, that puts us in a tricky place.”

Mr Fishman said Facebook accepts social media should be regulated, but he argued that given the "daily" changing tactics of terrorists and other bad actors, the laws should allow for dynamic action.

Facebook is also using first-person battle videos from the US and UK armies to train its machine learning on how to spot similar videos.

“I don’t know if that would have happened in the same way previously,” Mr Fishman said.

“(But) it’s not something where you can just flip a switch and there’s a silver bullet and all of a sudden every technique is functioning at full force against every organisation.”
