These days, mass shooters like the one now being held in the Buffalo, N.Y., supermarket attack don’t stop at planning their brutal assaults. They also craft marketing plans and arrange for their massacres to be broadcast live on social platforms in hopes of fomenting more violence.
Sites like Twitter, Facebook and now the game streaming platform Twitch have learned painful lessons from dealing with the violent videos that often accompany such shootings. But experts are calling for a broader discussion about livestreams, including whether they should exist at all, because once these videos are uploaded they are nearly impossible to erase completely.
The self-proclaimed white supremacist gunman who police say killed 10 people, all of them Black, at a Buffalo supermarket on Saturday had mounted a GoPro camera on his helmet to broadcast his assault live on Twitch, the video game streaming platform also used by a shooter who killed two people near a synagogue in Halle, Germany, in 2019.
He had previously outlined his plan in a detailed but rambling set of online diary entries that were apparently posted publicly before the attack, though it’s unclear how people might have seen them. His goal: to inspire imitators and spread his racist beliefs. After all, he was a copycat himself.
He decided not to broadcast on Facebook, as another mass shooter did when he killed 51 people in two mosques in Christchurch, New Zealand, three years ago. Unlike Twitch, Facebook requires users to create an account in order to watch live streams.
However, not everything went as planned. By most accounts, the platforms moved faster to halt the spread of the Buffalo video than they did after the Christchurch, New Zealand, shootings in 2019, said Megan Squire, a senior researcher and technology expert at the Southern Poverty Law Center.
Another Twitch user watching the live video likely flagged it to Twitch’s content moderators, she said, which would have helped Twitch end the stream less than two minutes after the first shots were fired, according to a company spokesperson. Twitch did not say how the video was reported.
“In this case, they did pretty well,” Squire said. “The fact that the video is so hard to find right now is proof of that.”
In 2019, the Christchurch shooting was livestreamed on Facebook for 17 minutes and quickly spread to other platforms. This time around, the platforms generally seemed to coordinate better, including sharing the video’s digital “signatures” used to detect and remove copies.
But the platforms’ algorithms can have a harder time identifying a copied video if someone has edited it. That created problems, such as when some internet forum users recut the Buffalo video with twisted attempts at humor. Tech companies would need “more sophisticated algorithms” to detect those partial matches, Squire said.
“It feels darker and more cynical,” she said of attempts to release footage of the shooting in recent days.
Twitch has over 2.5 million viewers at any given time, and around 8 million content creators upload videos to the platform each month, according to the company. The site uses a combination of user reports, algorithms and moderators to detect and remove violent content. The company said it quickly deleted the shooter’s stream, but didn’t share many details about what happened on Saturday, including whether the stream was flagged or how many people watched the rampage live.
A Twitch spokesperson said the company shared the live stream with the Global Internet Forum to Counter Terrorism, a nonprofit group created by tech companies to help others monitor their own platforms for replays. But snippets of the video still made their way to other platforms, including the Streamable site, where it was available to millions of people. A spokesperson for Hopin, the company that owns Streamable, said on Monday it was working to remove the videos and terminate the accounts of those who uploaded them.
Going forward, the platforms could face moderation complications from a Texas law, reinstated by an appeals court last week, that prohibits large social media companies from “censoring” users’ viewpoints. The shooter “had a very specific viewpoint,” and the law is vague enough to create a risk for platforms that moderate people like him, said Jeff Kosseff, associate professor of cybersecurity law at the U.S. Naval Academy. “It really puts a thumb on the scale of keeping up harmful content,” he said.
Alexa Koenig, executive director of the Human Rights Center at the University of California, Berkeley, said there has been a shift in how tech companies react to such events. In particular, said Koenig, coordination between companies to create fingerprint repositories for extremist videos so they cannot be re-uploaded to other platforms “has been an incredibly important development.”
A Twitch spokesperson said the company will review how it responded to the shooter’s live stream.
Experts suggest that sites such as Twitch could exercise more control over who can livestream and when, for example by building in broadcast delays or whitelisting verified users while banning rule breakers. More broadly, Koenig said, “there’s also a general societal conversation that needs to take place around the usefulness of livestreaming: when it’s valuable, when it’s not, and how we put safe norms around how it’s used and what happens if you use it.”
Another option, of course, would be to end livestreaming altogether. But that’s almost impossible to imagine, given how much tech companies rely on livestreams to attract and maintain user engagement in order to make money.
Freedom of speech, Koenig said, is often the reason tech platforms give for allowing this form of technology — beyond the unspoken profit component. But that should be balanced “with the right to privacy and some of the other issues that arise in this case,” Koenig said.
This story has been updated to clarify that all 10 people killed in the shooting were black.
Copyright 2022 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.