On Friday, March 15th, the world witnessed a horrific mass shooting as a lone gunman attacked two mosques in Christchurch, New Zealand. Friday is a holy day for Muslims, when congregations gather at mosques for prayer services. The shooter – a self-identified white supremacist – took advantage of this, killing at least 50 people and wounding dozens of others.
The gunman live-streamed the attack on Facebook for 17 uninterrupted minutes, and it wasn’t until New Zealand police brought the footage to Facebook’s attention that the social media giant began removing it.
Once aware of the incident, the company moved quickly: in the first 24 hours it removed roughly 1.5 million uploads of the video, of which more than 1.2 million were blocked at the point of upload – meaning around 300,000 copies were taken down only after they had already been posted.
In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload…
— Facebook Newsroom (@fbnewsroom) March 17, 2019
Despite Facebook’s efforts, people are still bypassing its monitoring systems and finding ways to share the video, as well as graphic stills from the massacre (this is happening on Twitter, YouTube, WhatsApp, and Instagram too, according to Reuters). Facebook, Twitter, and Alphabet Inc. (which owns YouTube) are taking down the videos as swiftly as they can, but authorities around the world have criticized them for not having a better system in place.
Facebook’s live-stream feature has been controversial in the past. In 2016, a woman used Facebook Live to record a police officer shooting her boyfriend in the driver’s seat of their car. In another instance, in 2017, a woman and three of her friends live-streamed the beating and torture of an 18-year-old for over 30 minutes.
The feature is inherently difficult to monitor, since users live-stream birthdays, gaming sessions, and other mundane moments alongside violence. Even so, we are likely to see proposals over the next few weeks that attempt to force social media companies to deploy better AI and deep-learning moderation tools to prevent another incident like this.
YouTube has also made significant efforts to curb the presence of the Christchurch footage on its platform, but those efforts haven’t been enough. U.S. Senator Mark Warner wrote in an email to Gizmodo that “It is ever clearer that YouTube, in particular, has yet to grapple with the role it has played in facilitating radicalization and recruitment.”
While the footage can’t be permanently erased from the Internet, the next few weeks will give us an indication of how much of it these companies can eradicate, and whether they can develop new tools to help them avoid a situation like this in the future.