BY PATRICK TUCKER
JUNE 17, 2016
A new software tool could help social-media companies shut down the distribution of violent ISIS recruiting videos.
The Islamic State recruits supporters and fellow travelers from around the world largely by spreading photos and videos of its violent exploits online. What if social-media companies could automatically detect and delete such imagery?
The Counter Extremism Project, working with Dartmouth College computer scientist Hany Farid and with funding from Microsoft, has developed a new method for doing just that. The group hopes to provide the software to help companies like Twitter, Facebook, and Google stop extremist groups from distributing such material on social media.
The tool is based on a concept called robust hashing, which Farid developed in 2008 while trying to help stop the flow of child pornography. But telling a machine to recognize a picture of child abuse “is not possible,” he told reporters on a conference call on Friday. “We have not gotten to the stage where we can reason about fairly high-level things having to do with content. And so they were stuck.”
Consider that Google and Stanford researchers needed a neural network running on some 16,000 processors, fed more than 10 million images taken from YouTube videos, before the system could learn to recognize a cat.
“Deep learning and other related modern-day learning algorithms are not capable of quickly and reliably distinguishing between appropriate and inappropriate content,” Farid told Defense One in an email. “The speed and error rate for these approaches is simply prohibitive for dealing with Internet-scale image uploads” — billions of new images per day.
So rather than teach a network of machines to understand the concept of abuse, Farid sought to teach software to recognize images that were spreading across the Internet after someone had tagged them as inappropriate. This way, the computer must simply recognize the image itself, not the idea it depicts, and stop it from proliferating further.
Farid describes robust hashing as a method of extracting a “distinct signature” from an image. The signature is “an abstract numerical representation that is both distinct and stable to modifications to the underlying medium,” meaning that it remains useful even if the image is modified a bit. He further described the signature as “a string of numbers that embody that actual, underlying content,” like a photo’s DNA.
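Farid has not published his algorithm, which is considerably more sophisticated, but the general idea can be sketched with a toy perceptual hash known as an “average hash.” Everything in the sketch below (the function names, the 8x8 signature size, the use of the Pillow library) is an illustrative assumption, not Farid’s actual method:

```python
# A minimal sketch of the robust-hashing idea using a toy "average hash."
# Illustrative only -- not Farid's proprietary algorithm.
from PIL import Image  # pip install Pillow

def average_hash(path: str, hash_size: int = 8) -> int:
    """Reduce an image to a 64-bit signature that survives small edits.

    Shrinking to an 8x8 grayscale thumbnail throws away the detail that
    resizing, recompression, or minor recoloring would change; each pixel
    then becomes one bit: 1 if brighter than the mean, 0 otherwise.
    """
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1: int, h2: int) -> int:
    """Count the bits where two signatures differ."""
    return bin(h1 ^ h2).count("1")
```

Two copies of the same photo, one lightly cropped or recompressed, will typically land within a few bits of each other, while unrelated photos differ in roughly half their bits. That stability under modification is what makes the signature “robust.”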
Farid donated the software to the National Center for Missing & Exploited Children, which licenses it for free to social media companies to police their sites for pictures of child pornography. It has an error rate of about one in five billion, he says.
About eight months ago, Farid teamed up with the Counter Extremism Project to apply robust hashing to a different challenge: recognizing jihadist content. They trained a system on the Project’s ever-expanding database of propaganda images and set it to work.
“The technology will be running on any company that chooses to participate in this process. Associated with the technology is a database of hashes of known content. That hash table can be distributed easily through any number of different means and so every time there is new content, the hash table gets expanded, that hash table gets distributed to the companies that are participating and then the process is fully automatic at that point,” he explained.
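Farid did not describe the implementation, but the workflow he lays out (a shared table of hashes of known content, distributed to participating companies and checked automatically against each new upload) might look something like the sketch below, which reuses average_hash and hamming_distance from the earlier sketch. The names and the match threshold here are assumptions:

```python
# Hypothetical sketch of the upload-screening loop Farid describes.
# The threshold and all names are illustrative assumptions, not part
# of CEP's actual system.

MATCH_THRESHOLD = 5  # max differing bits to still count as "the same image"

known_hashes: set[int] = set()  # local copy of the shared hash table

def update_hash_table(new_hashes: list[int]) -> None:
    """Merge a newly distributed batch of hashes into the local table."""
    known_hashes.update(new_hashes)

def screen_upload(path: str) -> bool:
    """Return True if a new upload matches known flagged content."""
    h = average_hash(path)  # signature function from the sketch above
    return any(hamming_distance(h, known) <= MATCH_THRESHOLD
               for known in known_hashes)
```

At billions of uploads per day, a linear scan over the table would be replaced in practice with an indexed nearest-neighbor lookup, but the logic is the same: hash the upload, compare against known signatures, act on a match.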
While social media companies will be able to license the software at no cost, Farid doesn’t intend to give away the intellectual property underlying it. The Counter Extremism Project is proposing a new center, called the National Office for Reporting Extremism, or NORex, to host the database and handle licensing and logistics for using the new technology. NORex would be modeled, in part, on the National Center for Missing and Exploited Children.
Social media companies have been stepping up efforts to combat the spread of violent jihadist imagery online. In February, Twitter announced that it had suspended more than 120,000 accounts used to spread ISIS content.
Mark Wallace, the Counter Extremism Project’s CEO and a former U.S. ambassador, said that he had had “collegial discussions” with social media companies about the proposed NORex center.
One tech company representative told Defense One that many in the industry were skeptical of Wallace and the proposed plan. The representative said that on April 29, a Facebook executive organized a conference call with Wallace and several key players in the social media space. The Facebook executive described a framework for an industry-funded center like NCMEC, but for extremist content. The tech companies were not sold. “The skepticism from participants focused primarily on the effectiveness of such a group. Someone raised privacy concerns, but the conversation was more about whether it would be able to accomplish what [the Facebook executive] described. Child pornography is very different from terrorist content. The NCMEC database is known, illegal, child sexual exploitation images and the definition of extremist content varies by country,” the representative told Defense One.
That suggests that CEP still has some work to do in reassuring industry that the process of tagging content going into the database as “extremist” will be fair, since the definition of extremism is not uniform. CEP would play an important role in deciding whether tagged content was actually extremist in nature or simply controversial. Wallace said that he anticipated future debate and disagreement about some content and its tagging. In the meantime, some of the video and audio footage that CEP already had in its possession met an obvious, common-sense definition of extremism. “I think we can all agree that there is going to be a set of images, video and audio that should be removed expeditiously,” he said.
The effort has also won some important backing from the White House. “We welcome the launch of initiatives such as the Counter Extremism Project’s National Office for Reporting Extremism (NORex) that enable companies to address terrorist activity on their platforms and better respond to the threat posed by terrorists’ activities online. The innovative private sector that created so many technologies our society enjoys today can also help create tools to limit terrorists from abusing these technologies in ways their creators never intended,” Assistant to the President for Homeland Security and Counterterrorism Lisa Monaco said on Sunday.