It took half an hour for anyone to report the live video stream of the Christchurch terror attack to Facebook's moderators.
About 4000 people had seen the original live video before anyone reported it as harmful content.
Facebook has released further information publicly, having previously tweeted only select details since the attacks.
Its vice president, Chris Sonderby, said that once the company became aware of the video, moderators removed it "within minutes", but he did not specify how long that took.
Mr Sonderby said fewer than 200 people watched the original live broadcast, and none of them reported it during its 17-minute run-time.
The video was not reported to Facebook until another 12 minutes had passed - 29 minutes after the livestream began - by which time thousands more had seen it.
He said a link to the live stream was posted on another website before Facebook was alerted to it.
Mr Sonderby said that of the 1.5 million copies of the video Facebook removed from the site, 1.2 million were blocked at upload, meaning they could never be viewed. That leaves 300,000 copies of the footage which were successfully uploaded before being deleted.
"We removed the original Facebook Live video and hashed it so that other shares that are visually similar to that video are then detected and automatically removed from Facebook and Instagram," Mr Sonderby said.
"Some variants such as screen recordings were more difficult to detect, so we expanded to additional detection systems including the use of audio technology."
The statement does not say how many people viewed the videos or how many shared them.
Mr Sonderby said Facebook is working directly with the New Zealand Police to support the investigation, and is committed to working with the government and the rest of the tech industry to counter the threat of terrorism.
The company's CEO, Mark Zuckerberg, has not made any public comments.
The company's livestreaming function has come under scrutiny from advertisers, internet service providers and the Prime Minister, who say the company took too long to remove the videos and question whether livestreaming should have been available to all of its users in the first place.
New Zealand's Privacy Commissioner, John Edwards, called for Facebook to hand over to police the names of everyone who shared the video. The Chief Censor, David Shanks, has classified the video as objectionable, making it illegal to view, possess or distribute.
Representatives from Facebook, Twitter, Google (including YouTube) and Microsoft have spoken since the attack under the banner of a joint industry cooperative, the Global Internet Forum to Counter Terrorism (GIFCT).
The group said it has added the "digital fingerprints" of more than 800 visually distinct versions of the attack video to its shared database, which allows any of the member organisations to recognise the videos as they are uploaded and block them.
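The GIFCT has not published the interface to that database, but the idea, in which each member contributes fingerprints and every member checks new uploads against the pooled set, can be sketched in a few lines. Every name below is hypothetical, and a real system would use perceptual hashes so that re-encoded copies still match; plain SHA-256 is used here only to keep the sketch short:

```python
# Hypothetical sketch of a shared fingerprint database gating uploads.
# GIFCT's real interface is not public; all names here are illustrative,
# and SHA-256 only matches byte-identical files (a production system
# would use perceptual hashing to catch re-encoded variants).
import hashlib

shared_fingerprints: set[str] = set()  # pooled across member companies

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def contribute(data: bytes) -> None:
    """A member company adds a known terror video's fingerprint."""
    shared_fingerprints.add(fingerprint(data))

def allow_upload(data: bytes) -> bool:
    """Any member blocks an upload whose fingerprint is already pooled."""
    return fingerprint(data) not in shared_fingerprints

contribute(b"bytes of a known attack video")
print(allow_upload(b"bytes of a known attack video"))  # False: blocked
print(allow_upload(b"bytes of an unrelated video"))    # True: allowed
```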
GIFCT says its mission is to collaboratively thwart terrorists' use of member companies' services and to play its part in addressing the global challenge of terrorism.
The GIFCT site states that Facebook removes 99 percent of IS and Al-Qaeda terror content before anyone flags it as harmful, and in some cases before it even goes live on the site.
It says that once Facebook is aware of a piece of terror content, it removes 83 percent of subsequently uploaded copies within an hour.