Larry Greenemeier in Scientific American:
Social media companies have long used sophisticated algorithms to mine users’ words, images, videos and location data to improve search results and to finely target advertising. But efforts to apply similar technology to root out videos that promote terrorists’ causes, recruit new members and raise funding have been less successful. Video, which makes up well over half of mobile online traffic, is particularly problematic because it can spread extremists’ messages virally in minutes, is difficult to track and even harder to eliminate. Despite these high-profile challenges, Facebook, Google and Twitter face a growing backlash—including advertiser boycotts and lawsuits—pushing them to deal more effectively with the darker elements of the platforms they have created. New video “fingerprinting” technologies are emerging that promise to flag extremist videos as soon as they are posted. Big questions remain, however: Will these tools work well enough to keep terrorist videos from proliferating on social media? And will the companies that have enabled such propaganda embrace them?
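The video "fingerprinting" idea mentioned above can be illustrated with a minimal perceptual-hash sketch. Everything here is an assumption for illustration: frames are modeled as 2D lists of grayscale values, and the function names (`dhash`, `matches_known`) are hypothetical. A real system would decode and downscale actual video, and production hash-matching infrastructure is far more sophisticated.

```python
# Sketch of perceptual "fingerprinting": each frame is reduced to a
# compact hash, and a re-upload of known footage still matches even
# after re-encoding, because the hash tolerates small pixel changes.
# Frames here are 2D lists of grayscale values (illustrative only).

def dhash(frame):
    """Difference hash: one bit per horizontal neighbor comparison."""
    bits = 0
    for row in frame:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(frame, known_hashes, threshold=3):
    """Flag a frame whose hash is within `threshold` bits of a known one."""
    h = dhash(frame)
    return any(hamming(h, k) <= threshold for k in known_hashes)
```

Because the hash depends only on which neighboring pixel is brighter, a slightly re-compressed copy of a known frame still produces a near-identical hash and is flagged, which is what lets such systems catch re-uploads "as soon as they are posted."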
ISIS has a well-established playbook for using social media and other online channels to attract new recruits and encourage them to act on the terrorist group's behalf, according to J. M. Berger, a former nonresident fellow in The Brookings Institution's project on U.S. Relations with the Islamic World. "The average age of an [ISIS] recruit is about 26," says Seamus Hughes, deputy director of The George Washington University's Program on Extremism. "These young people aren't learning how to use social media—they already know it because they grew up with it." ISIS videos became such a staple on YouTube a few years ago that the site's automated advertising algorithms were inserting advertisements for Procter & Gamble, Toyota and Anheuser-Busch in front of videos associated with the terrorist group. Despite assurances at the time that Google was removing the ads and in some cases the videos themselves, the problem is far from solved. In March the president of Google's EMEA (Europe, Middle East and Africa) business and operations, Matthew Brittin, apologized to large advertisers—including Audi, Marks & Spencer, and McDonald's U.K.—that had pulled their online ads after discovering they had appeared alongside content from terrorist groups and white supremacists. AT&T, Johnson & Johnson and other major U.S. advertisers have boycotted YouTube for the same reason.