Prof. Farid discusses online extremism on Science Friday

Submitted by Wojciech K. Jarosz on Thu, 02/02/2017 - 16:32

Back in the early 2000s, the internet had a problem with child pornography. The United States hadn't anticipated the explosion of illegal images that came online in the internet's early days. Tracking these illegal activities became far more difficult, and removing all trace of the images from the World Wide Web seemed nearly impossible. So government officials turned to Silicon Valley for help.

But technology companies dragged their feet. By 2008 little had been done to address online child pornography until one tech giant, Microsoft, contacted Dartmouth College computer scientist Hany Farid. Farid is an expert in photo forensics, techniques most often used to identify fake images. Together, Farid and Microsoft built a tool that identifies any image by a unique signature, like a photo fingerprint. With that signature, Microsoft could compare images, before they were posted to websites, against a database of nearly 30,000 images of child pornography catalogued by the National Center for Missing and Exploited Children. Farid tested the tool out on just 10 images, guessing that it would take several weeks, if not months, to find a match. It took just four days. Technology companies began to adopt the tool over the next decade; Google finally signed on in 2014.
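To give a flavor of how such a "photo fingerprint" works: the actual tool (PhotoDNA) is proprietary, but the general idea of robust image hashing can be sketched with a simple average hash. The function names, the 8x8 signature size, and the use of plain 2D grayscale lists are illustrative assumptions, not details of Farid's system. The key property is that the signature is compared by Hamming distance, so a slightly altered copy of an image still matches.

```python
# Illustrative average-hash sketch (NOT PhotoDNA, which is proprietary).
# Idea: reduce an image to a compact bit signature, then compare
# signatures by Hamming distance instead of exact byte equality,
# so re-encoded or lightly edited copies still match.

def average_hash(pixels, size=8):
    """Compute a 64-bit signature from a grayscale image.

    `pixels` is a 2D list of grayscale values (0-255). The image is
    downsampled to size x size by block averaging; each cell becomes
    a 1 bit if it is brighter than the overall mean, else 0.
    """
    h, w = len(pixels), len(pixels[0])
    cells = []
    for r in range(size):
        for c in range(size):
            # Average the block of source pixels mapped to this cell.
            r0, r1 = r * h // size, (r + 1) * h // size
            c0, c1 = c * w // size, (c + 1) * w // size
            block = [pixels[i][j] for i in range(r0, r1) for j in range(c0, c1)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    bits = 0
    for v in cells:
        bits = (bits << 1) | (1 if v > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two signatures; small = near-duplicate."""
    return bin(a ^ b).count("1")

# A 16x16 gradient "image", a uniformly brightened copy, and an
# unrelated (reversed) image.
original = [[i * 8 + j * 4 for j in range(16)] for i in range(16)]
brightened = [[v + 10 for v in row] for row in original]
unrelated = [[180 - v for v in row] for row in original]

h1, h2, h3 = map(average_hash, (original, brightened, unrelated))
print(hamming(h1, h2))  # brightened copy: identical signature, distance 0
print(hamming(h1, h3) > 0)  # unrelated image: signatures differ
```

In a matching pipeline, an incoming upload's signature would be compared against a database of known signatures, flagging any within a small Hamming-distance threshold for review.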

Now Farid is ready to use this same technology to fight another internet spectre: terrorist messaging. According to the Counter Extremism Project (CEP), online terrorist videos and images play an important role in radicalizing extremists. But unlike images of child exploitation, terrorist messaging has no clear rules governing what does and doesn't qualify. Such messages aren't illegal under U.S. law, and some worry that an arbitrary definition of "terrorist messaging" would pave the way for censorship. Silicon Valley is again dragging its feet, even as evidence mounts that content hosted on its platforms is partly responsible for recent acts of terrorism. Indeed, earlier this month two women filed a lawsuit against Twitter for allegedly assisting in the terrorist attacks that killed their loved ones in 2015 and 2016.

Farid joins Ira to discuss how photo forensics could curb terrorist messaging online. He is joined by Jillian York, the director for International Freedom of Expression at the Electronic Frontier Foundation, for a discussion about technology and the boundaries of censorship.
