Facebook, YouTube and Google said they were working to remove videos and other content associated with the deadly terrorist attacks in Christchurch. As many as 49 people were killed and 20 others injured in mass shootings at two mosques in the city on Friday, New Zealand Prime Minister Jacinda Ardern said.
At least 10 people were killed at Linwood Masjid mosque and 30 at the Al Noor Mosque near Hagley Park. The police have detained four people, including a woman, in connection with the shootings and recovered several improvised explosive devices.
Facebook said it removed the video after New Zealand police alerted the company to it, and also took down the suspected shooter’s Facebook and Instagram accounts.
“Since the attack happened, teams from across Facebook have been working around the clock to respond to reports and block content, proactively identify content which violates our standards and to support first responders and law enforcement,” a Facebook spokeswoman said. “We are adding each video we find to an internal database which enables us to detect and automatically remove copies of the videos when uploaded again. We urge people to report all instances to us so our systems can block the video from being shared again.”
YouTube said it is working to remove videos as it becomes aware of them, and urged users to flag videos that may violate the site’s guidelines.
“As with any major tragedy, we will work cooperatively with the authorities,” a spokesperson for YouTube parent company Google said.
The shooter also posted a 74-page manifesto on social media under the name Brenton Tarrant, identifying himself as a 28-year-old Australian white nationalist out to avenge attacks in Europe perpetrated by Muslims.