Facebook and Google have broken down barriers to media delivery, even for mass killers



The murder of 49 people at two mosques in Christchurch, New Zealand was designed to be viewed and shared on the world’s largest tech platforms, taking full advantage of Silicon Valley’s laissez-faire approach to content moderation.

It started with a racist manifesto uploaded to the Scribd document-sharing service and linked to Twitter for anyone to read.

Helmet-mounted camera footage of the attack was synced to a Facebook Live account, and a link to the feed was shared within a hate-filled online community.

Real-time footage of the massacre was served to the audience that followed the link.

Facebook only deleted the user’s account after local police alerted the company to the active shooting documented on the feed.

But by then, others had already posted the video to YouTube. That site, owned by Google’s parent company, Alphabet, moved quickly to remove new uploads of the video. Twitter said it was doing the same.

Soon the clips circulated on Reddit, a sprawling online bulletin board service owned by Condé Nast’s parent company, Advance Publications. Reddit removed the message boards titled WatchPeopleDie and Gore, which featured the video along with other clips of people being killed or injured. Those message boards had been operating on the site for seven and nine years, respectively.

Hours after the attack, users posted links to the original live stream in the YouTube comments below major news organizations’ coverage of the attack.

In one video, a user who identified himself as a 15-year-old spoke over a black screen, saying that the platform would not allow him to post the footage directly to the site, but that a link to the video was in the description.

The link led to the full 17-minute live broadcast of the mass shooting. It was hosted on Google Drive.

The unfiltering of the world has long been hailed as a utopian goal of the Internet, a means of dismantling the gates guarded by the bureaucracies of print and broadcast media. Blogs covering niche news and serving overlooked communities could proliferate. Amateur footage that would never be allowed to air on even the smallest cable channel could be seen by millions of people. Dissidents could share information that would otherwise be censored.


But that vision ignored the poisonous spores the gatekeepers had kept at bay.

The United Nations has implicated Facebook in stoking the flames of hatred against Rohingya Muslims in Myanmar, who have been the targets of a campaign of ethnic cleansing by the country’s military. YouTube has hosted millions of videos of child exploitation, and its recommendation algorithms have been identified as promoting violent white supremacy by suggesting increasingly radical channels to viewers. Twitter is infamous for its coordinated harassment campaigns, often fueled by virulent misogyny and bigotry.

“There are so few incentives for these platforms to act responsibly,” said Mary Anne Franks, professor of law at the University of Miami and chair of the Cyber Civil Rights Initiative, which advocates for legislation aimed at combating online abuse. “We have allowed companies like Facebook to escape categorization, by saying ‘we are not a media or entertainment company,’ and allowed them to evade regulation.”

In response, the tech giants have said it is impossible to police the billions of hours of content flowing through their platforms, despite the efforts of employees and hired contractors to sift through the worst material reported by users or flagged by automatic detection systems.

Those who share the footage of the Christchurch shooting “are at risk of committing an offense” because “the video is likely to be objectionable content under New Zealand law,” New Zealand’s Department of Internal Affairs said Friday in a press release.

“The content of the video is disturbing and will be harmful for people to view. This is a real tragedy with real victims and we strongly encourage people not to share or view the video.”

But the tech companies that host the images are largely shielded from legal liability in the United States by a 1996 telecommunications law that absolves them of responsibility for content posted on their platforms. The law has empowered these businesses, which generate billions in profits each year, by placing the burden of moderation on their users.

“If you have a potentially hazardous product, it’s your responsibility as an industry to make the proper judgments before you put it out into the world,” Franks said. “Social media companies have avoided any real confrontation with the fact that their product is toxic and out of control.”

The risks of live broadcasting have been present since the invention of radio, and the media have developed safeguards in response.

It was illegal for radio stations to broadcast live phone calls until 1952, when a station in Allentown, Pa., circumvented the law by inventing a tape delay system that allowed a degree of editorial control.

In 1998, a man from Long Beach used the live broadcast model to get his message across by parking at the interchange of the 110 and 105 freeways and pointing a shotgun at passing cars. Alarmed drivers called the police, and soon L.A.’s car-chase helicopters were on the scene, reporting live.

He fired shots to keep the police at bay and unfurled a banner on the pavement: “HMOs are in it for the money!! Live free, love safely or die.”

Then, to conclude what the Los Angeles Times called “one of the most graphic and bizarre events to ever take place on a Los Angeles freeway,” he set off a Molotov cocktail in the cab of his truck and shot himself in the head live on television.

In response to public outcry over the grisly images, which in some cases had interrupted afternoon cartoons, TV networks began to introduce tape delays more widely in live coverage and to approach situations involving visibly disturbed subjects with more caution.

The tape delay system is not perfect. In 2012, a glitch in the system led Fox News to accidentally broadcast live the suicide of a man fleeing from police. “It didn’t belong on television,” anchor Shepard Smith said in a shaken apology to viewers. “We took every precaution we knew how to prevent this from being on TV, and I personally apologize to you for what happened. … I am sorry.”

When Facebook Live streaming launched in 2016, Facebook CEO Mark Zuckerberg described a different mindset behind the feature to reporters at BuzzFeed News.

“Because it’s live, there’s no way it can be staged,” he said. “And because of that, it frees people to be themselves. It’s live; it cannot be perfectly planned in advance. In a somewhat counterintuitive way, it’s a great way to share raw, visceral content.”

[email protected]

Follow me on Twitter: @samaugustdean
