The Harmful Digital Communications Act 2015 was enacted on 2 July 2015. Sections 23 to 25 say that if certain steps are followed, online hosts are protected against almost all liability for content hosted on their platform. But there are a few issues that hosts should be aware of.
A brief explanation of how the safe harbour works is a good starting point. It applies to online content hosts. An online content host is a person who has control over an online system on which content is accessible to users. That includes Facebook, parts of Google, the New Zealand Herald, and most bloggers.
The safe harbour provisions are already in effect, as is the criminal offence of causing harm by posting a digital communication. The remainder of the Act will not come into force until regulations have been drafted.
When a user doesn’t like certain content that is available online, that user can complain to the host of that content rather than the author of the content. If the user does that, then provided the host follows certain steps and meets certain criteria set out in section 24, the host cannot be sued.
The protection afforded to hosts by section 24 does not extend to content posted or procured by the host itself, nor to copyright liability, breach of a suppression order, or breach of bail reporting requirements. But except for those matters, if the host follows section 24, no other content posted by users and hosted by a host can give rise to proceedings against the host. That means:
- Hosts are protected against liability for defamation. That expands the protection already afforded to online hosts by a limited definition of “publisher”. The Court of Appeal in Murray v Wishart [2014] NZCA 461 held that the person who controls a Facebook page is a publisher of information posted by third parties on that page only if the controller has actual knowledge of the allegedly defamatory post and fails to remove it within a reasonable time. Similar reasoning is likely to apply to the owners of blogs, media sites and other websites that allow comments or self-publishing. But now, even if a host does have actual knowledge of defamatory content, it will be protected against liability provided it follows the process in section 24. Sometimes that will result in the content being left online. Note that this does not prevent a host from arguing (as per Murray v Wishart) that it is not a publisher: section 23(3) preserves that and any other defence that would otherwise be available.
- The tort of invasion of privacy will not be available against a host. By way of example, if a visitor to the WhaleOil blog were to post pictures of Dan Carter’s children, Mr Slater would be protected from liability provided he followed the section 24 process (and, for example, did not re-publish them).
- Other areas in which the safe harbour might restrict liability include: liability under the HDC Act itself (for example, hosting content that incites suicide), the Human Rights Act 1993 (for example, publishing material that incites racial disharmony), breaches of the Privacy Act 1993, breach of confidence, and crimes that involve publishing (for example, the unauthorised disclosure of official information, or blasphemous libel, which is still apparently a crime). It would also appear to prevent a host from being classified as a party to any offending under section 66 of the Crimes Act.
There are some issues, though.
First, the safe harbour process must be followed to the letter, or the host loses its protection. If a host receives a notice of complaint and contacts the author, but the author doesn’t consent to the takedown, the content must be left in place. It is not hard to imagine situations in which the host would prefer to remove the content once brought to its attention (for example, if it is clearly objectionable), or is pressured by a member of the police or some other agency to remove the content. But if it does remove content without the author’s consent, it loses the safe harbour protection.
Second, there is now a gap in copyright liability protection for hosts. Under section 92C of the Copyright Act, hosts have a safe harbour against copyright infringement provided that, once made aware of infringing content, they take it down “as soon as possible”.
If an invalid notice is given to a host under section 24 of the HDC Act, or infringing material is otherwise brought to the host’s attention without complaint, then the host will be aware of the content, but does not immediately obtain the protection of the HDC safe harbour. To get that protection the host needs a valid notice of complaint. It also needs to leave the content up, unless it is able to contact the author and the author consents to the takedown, or if it does not receive a response from the author within 48 hours of contacting him or her.
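The section 24 sequence described above can be expressed as a simple decision procedure. The following is an illustrative sketch only: the class, function and field names are invented, and the statutory process is simplified to the two removal conditions the Act describes (author consent, or no response within 48 hours of contact).

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class Complaint:
    """Illustrative model of a notice of complaint under section 24 (names invented)."""
    valid: bool                                 # notice meets the formal requirements
    author_contacted_at: Optional[datetime] = None
    author_response: Optional[str] = None       # "consents", "objects", or None (no reply)

def may_remove_content(c: Complaint, now: datetime) -> bool:
    """Return True only when removal preserves the HDC safe harbour.

    Simplified sketch: removal is permitted if the author consents, or if
    48 hours pass after contact with no response. Removing content outside
    these cases forfeits the section 24 protection.
    """
    if not c.valid:
        return False  # an invalid notice does not trigger the process
    if c.author_response == "consents":
        return True
    if (c.author_response is None
            and c.author_contacted_at is not None
            and now - c.author_contacted_at >= timedelta(hours=48)):
        return True
    return False
```

The sketch makes the earlier point concrete: nothing in the process permits removal merely because the host (or the police, or another agency) wants the content gone, and an invalid notice gives the host no protection at all.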
So the host might be faced with a difficult choice if the content is both harmful and potentially in breach of copyright. It can take down the content immediately and obtain copyright protection, but lose the wider safe harbour in the HDC Act. Or it can follow the section 24 safe harbour process, but risk losing protection under the Copyright Act.
Third, hosts must have an easily accessible mechanism for users to report content under the HDC Act. That is a prerequisite for obtaining protection against liability. But that mechanism may be open to abuse. There is no penalty for making a false complaint or misrepresentation. Content that is not harmful but that someone wants suppressed could disappear if a malicious complaint is received and the author cannot be contacted. Anything posted anonymously could well be removed if just one person on the internet doesn’t like it.
At the other end of the spectrum, a controversial piece of content that goes viral could be the subject of hundreds (or thousands) of complaints. A host could easily be overwhelmed if it attempts to address each complaint.
However, to get safe harbour protection, a host arguably only needs to respond in full to the first complaint made about specific content. If the section 24 process is followed in relation to one complaint about specific content, then the safe harbour provides protection against any proceedings by any person in relation to that same content. It does not appear that section 24 protection is lost if a subsequent complaint in respect of specific content is ignored.
But protection could be lost if hosts respond incorrectly to a later complaint. For example, if an author doesn’t respond to a later complaint (because he or she doesn’t have to) and the host removes the content, then the removal would breach section 24(2)(d) and the host would lose the protection it obtained by correctly processing the first complaint.
Hosts should therefore maintain a register of complaints. Ideally, whatever reporting mechanism is put in place would check the URL of the offending content against a database of previous reports, and add a flag to the notice of complaint if it has already been dealt with.
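One way such a register might work is sketched below. This is a minimal illustration with invented names: complaints are keyed by a normalised form of the URL, and the register flags any complaint about content that has already been reported.

```python
from urllib.parse import urlsplit

class ComplaintRegister:
    """Minimal in-memory register of complaints, keyed by normalised URL (illustrative only)."""

    def __init__(self):
        self._seen = {}  # normalised URL -> number of complaints received

    @staticmethod
    def _normalise(url: str) -> str:
        # Drop the fragment and lower-case the scheme and host so that
        # trivially different links to the same content match. The query
        # string is kept, since it may identify distinct content.
        parts = urlsplit(url)
        base = f"{parts.scheme.lower()}://{parts.netloc.lower()}{parts.path}"
        return f"{base}?{parts.query}" if parts.query else base

    def record(self, url: str) -> bool:
        """Record a complaint; return True if this URL has been complained about before."""
        key = self._normalise(url)
        already_seen = key in self._seen
        self._seen[key] = self._seen.get(key, 0) + 1
        return already_seen
```

A complaint flagged as a duplicate could then simply be acknowledged rather than processed afresh, consistent with the view above that only the first complaint about specific content needs a full section 24 response. A production system would of course need persistent storage and more robust URL matching.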
It will be interesting to see how the courts respond to hosts seeking safe harbour protection over the next few years. It will also be interesting to see if the complaint system is abused, and if so whether certain large internet companies will – despite the increased risk of liability – refuse to remove content that is the subject of malicious or vexatious complaints.
Our thanks to Andrew Easterbrook for writing this article.