Media Call to Ban Attack Footage and Manifesto

March 15th, 2019 5:18 PM

Social media outlets raced to remove content in the wake of the New Zealand mosque shooting. In the aftermath, members of the media criticized the tech companies for their inability to restrict the spread of the shooter’s footage and manifesto.

Political commentators and journalists condemned tech companies as somehow being complicit in the rise of terrorism. WIRED magazine observed that there is a growing sense that “both YouTube and Facebook have grown too big to moderate.”

Liberal commentators like Jill Filipovic blasted Silicon Valley tech companies, saying that “Twitter, Facebook and Instagram allow white supremacists accounts on their sites, which is exactly how this shooter was able to upload his video. If they took their own rules seriously, they would ban bigots. They choose not to.”

New York Times opinion writer Charlie Warzel wrote a piece headlined “The Massacre in New Zealand Was Made to Go Viral,” noting that this terror attack “marks a grim new age of social media-fueled terrorism.”

He remarked that footage of the attack “leapt across the internet faster than social media censors could remove it,” calling it “a grotesque first-person-shooter-like record of man’s capacity for inhumanity.”

He added that what made this particular attack different from other high-profile acts of terrorism “is both the methodical nature in which the massacre was conducted and how it was apparently engineered for maximum virality.”

“Though platforms like Facebook, Twitter, and YouTube scrambled to take down the recording and an accompanying manifesto from the gunman, they were no match for the speed of their users; new artificial-intelligence tools created to scrub such platforms of terrorist content could not defeat human cunning ...”

Ben Collins of NBC News, whose beat is covering the rise of “dystopia,” tweeted: “After all of this, I still have hope we can end the YouTube radicalization cycle.”

He went on to ask: “When was the last time you were recommended an ISIS video on YouTube or Facebook? The answer is probably never. That’s because law enforcement and tech companies made it a top priority.”

He elaborated further on how he thought tech companies should clamp down on free speech to do so:

“It will take a realignment of priorities for tech companies to snuff out white supremacists seizing on faulty algorithms to incite violence. Both law enforcement and tech co’s need to treat violent white supremacy for what it is: terror cells taking advantage of the vulnerable.”

He condemned tech companies, accusing them of pandering to hate and somehow enabling terror attacks: “tech companies need to get over the politics of upsetting white supremacists, and the dark money that dogwhistles them into immense power. They need to show some guts during a spate of terror they unknowingly helped abet.”

Lucinda Creighton, a senior adviser at an international policy organization called the Counter Extremism Project, joined in condemning tech companies for not removing enough content. "While Google, YouTube, Facebook and Twitter all say that they're cooperating and acting in the best interest of citizens to remove this content, they're actually not because they're allowing these videos to reappear all the time,” she wrote.

CNN cited John Battersby, a counter-terrorism expert at Massey University in New Zealand, who lamented that the globalization of the internet has made terror networks an international problem.

"This fellow live streamed the shooting and his supporters have cheered him on, and most of them are not in New Zealand,” he commented.

CNN also referenced its law enforcement analyst Steve Moore, who said the spread of the video could inspire copycat attacks.

"What I would tell the public is this: Do you want to help terrorists? Because if you do, sharing this video is exactly how you do it," Moore warned

When WIRED magazine covered the topic, Facebook issued a statement expressing sympathy for the victims and the community. Facebook claimed to have “quickly removed both the shooter’s Facebook and Instagram accounts and the video. We're also removing any praise or support for the crime and the shooter or shooters as soon as we’re aware.”

Echoing its point that the platforms may have “grown too big to moderate,” WIRED added that “Some have suggested that, if these Christchurch videos are popping up faster than YouTube can take them down, then YouTube should stop all video uploads until it has a handle on the problem.”

"Queer scholar" Anthony Oliveira accused tech companies like YouTube of being complicit, “You built, plank by plank, the stage from which a virulent racist ideology reached so many children. We live in the nightmare your irresponsibility and greed helped build.”
