It’s happened again - a Netflix production with potentially dangerous content for vulnerable teens has sneaked past classification before being flagged by the public and changed.
The horror film The Perfection has been reclassified in New Zealand from 16+ to 18+, and not because of the gore. The film's scenes of sexual abuse were not mentioned in the streaming service's original 16+ rating, which noted only themes of language, violence and nudity. Netflix voluntarily changed its classification, noting "rape, sexual violence, suicide references and graphic violence".
It’s reignited concerns about flaws in the classification structure and copycat behaviour that first appeared after the Netflix teen drama 13 Reasons Why was released.
Chief Censor David Shanks says the fact that The Perfection wasn't viewed by any authority until after it launched on Netflix shows something needs to change.
Meanwhile, the government is looking to make the classification of on-demand video content carried on platforms such as Netflix or Lightbox mandatory, to bring it into line with other media and provide better guidance and protection for families and young people.
Netflix has been working with the Chief Censor's office, but in a reactive capacity, says Shanks.
“We’re still not in a situation where we are reviewing content before it is put out to air and streamed to the New Zealand public. As a result, we are necessarily in a reactive frame where if people raise complaints, if issues come to our attention we can then move - but that’s not ideal.”
In the case of The Perfection, it was a member of the public who raised a concern with the Chief Censor.
“As we were looking at it, we started seeing international reports indicating a number of reviewers were having quite strong reactions to seeing this film.”
Shanks says there isn't one particular aspect of the film that is the most disturbing, although it contains scenes of sexual violence, self-harm and attempted suicide that are disturbing and triggering for people. He says the film would have sat on the borderline between an R16 and R18 classification had it not contained such a large amount of concerning content.
“We know from research that mid-teens will essentially watch what they want, a mid-teenager will not refuse or turn away from an R18 publication or series or film just because it has an R18 classification on it and we’re realistic about that, which is why we are so focused on making sure that we get the right consumer information through.”
The medium is the message, says Shanks.
“On a Netflix show if you are looking at a film like The Perfection and deciding whether or not you’re going to watch it, imagine you’re a 15-year-old, heard about this show, you think this might be interesting, I’ve heard people at school talk about it, I might check that out. As that child or teen is looking at that film on Netflix, they do not see the warning on the front page, they see the content discussion about what’s in there written by Netflix…and simply the number R18 or R16.”
He says we need to think about how clearly those messages will come across to the viewer. Teenagers can make the call for themselves whether or not they want to watch something; they just need to be informed enough to do it, he says.
Volume and speed are part of the problem, he says, particularly with internet content. The Chief Censor's office is a small team and there is a lot of content online.
“Netflix senior people will tell you that their preferred model is to work with the local regulators to deliver the right kinds of warnings and the right kinds of classifications for a domestic audience, they’re in close to 200 countries. They appreciate it’s going to be near impossible for them to figure out what the cultural sensitivities and issues are in each jurisdiction they’re operating in. They need a local regulator they can partner with to give them a framework to develop fit-for-purpose classifications and warnings.”
There are various ways the problem can be tackled, says Shanks, and it's critical it is dealt with now, before technology moves on. We are already entering the next phase of virtual reality and augmented reality, and he says it shouldn't be left to the developers and content publishers to make up the rules for what kind of warnings come with it.
“We’ve got to think about how we can get the basics right, now.”