Intermediary liability, as a legal and policy concern, raises the question: to what extent should tech platforms bear financial or criminal responsibility for the online actions or omissions of their users? As digital platforms continue to reshape industries and societies, this issue lies at the heart of balancing innovation with accountability. In Nigeria, the question has gained increasing relevance with the rise of social media use, online defamation claims and legislative interest in regulating online harms caused by social media users. Although still evolving, Nigeria’s legal framework on this issue differs significantly from that of countries like the U.S., making it important for technology platforms to be aware of the potential legal risks. This legal update examines current legal developments surrounding intermediary liability in Nigeria.
1. Intermediary Liability for Online Defamation
Unlike in the U.S., where Section 230 of the Communications Decency Act shields tech platforms from liability for user posts, Nigeria has no direct equivalent of Section 230, meaning platforms could be treated as “publishers” by Nigerian courts. As a result, platforms may be fully liable for defamatory content posted by users. Generally, Nigeria follows common law defamation principles rooted in English law. While platforms may try to rely on the defence of “innocent dissemination,” this legal principle is not well-established in Nigerian courts, although it remains a plausible means of mitigating platform liability. To succeed with this defence, a tech platform must show that: (a) it was not the author, editor, or publisher of the defamatory content; (b) it took reasonable care regarding the publication of that content; and (c) it did not know, nor had reason to believe, that the content was defamatory.
Platforms may also be able to limit liability for online defamation under the Defamation Law of Lagos State, a state law which is not federally applicable. Accordingly, this statutory defence is available only if the cause of action arose within Lagos State. To rely on this defence, a platform must be able to show that it did not author the defamatory content and that it took remedial steps after it was notified of the infringing content by a claimant. It is important to note that the relevant provisions of the Defamation Law of Lagos State are yet to be tested in court. Accordingly, the position of the law remains largely unsettled, especially as it relates to the ability to maintain a common law action alongside a claim or defence based on the Defamation Law of Lagos State.
There could also be criminal liability for publishing defamatory matter under Nigeria’s Criminal Code Act, which is federally applicable (the “Criminal Code”). Under the Criminal Code, the publication of defamatory matter involves exhibiting the defamatory matter in public, causing it to be read or seen, or showing or delivering it (or causing it to be shown or delivered), with intent that it may be read or seen by the person defamed or by any other person.
2. Intermediary Liability for Censorship Decisions
Platforms can also face liability for their censorship decisions in Nigeria. Unlike in the U.S., where private companies are not generally subject to “free speech” challenges, Nigeria’s free speech protections – framed as Fundamental Human Rights (FHR) to free speech – can apply to private companies. This means private companies and platforms could face legal action if users believe their rights to free expression have been unjustly restricted. In practical terms, this makes content moderation decisions more complex, as they may be subject to judicial scrutiny. It would be prudent for platforms to exercise caution as they navigate between moderating harmful content and respecting users’ rights to free speech (or other fundamental human rights) under Nigerian constitutional law.
3. Intermediary Liability for Copyright Infringement
When it comes to copyright infringement, platforms in Nigeria are generally not held liable if they meet certain safe harbor requirements. Similar to the U.S. Digital Millennium Copyright Act (DMCA) safe harbor provisions, Nigerian law allows platforms to avoid liability if they:
- Have no actual knowledge of the infringing content or are unaware of obvious signs of infringement.
- Do not financially benefit from the infringing activity when they have the right and ability to control it.
- Act quickly to remove or disable access to infringing content once they become aware of it.
- Implement proper procedures to suspend the accounts of repeat infringers, as required by statute.
It is important to note that the statutory protection afforded to platforms pertains only to monetary relief for copyright infringement. Accordingly, claimants may still be able to seek other equitable, non-monetary reliefs in connection with copyright infringements on platforms.
4. Intermediary Liability Under the Cybercrimes (Prohibition, Prevention, etc.) Act, 2024
Tech platforms are required to comply with the provisions of the Cybercrimes (Prohibition, Prevention, etc.) Act, 2024 (the “Act”). In particular, tech platforms have a statutory obligation under the Act to provide assistance with: the identification, apprehension and prosecution of offenders; the identification, tracking and tracing of the proceeds of any offence or of any property, equipment or device used in the commission of the offence; and the freezing, removal, erasure or cancellation of service availability to an offender. The Act imposes both monetary penalties and jail terms on directors, as well as penalties on platforms, for failure to assist with investigations.
5. Regulatory Liability for Publishing Unlawful Content
Tech platforms may also be liable for publishing unlawful content. Under Nigerian law, “unlawful content” is defined as “any content that violates existing law in Nigeria”. However, tech platforms can avoid regulatory liability where they meet the legal requirements for safe harbour protection. To do so, a tech platform must be able to demonstrate that it took all reasonable steps to ensure that “unlawful content is taken or stays down” following the receipt of a notification, from a user or an authorised government agency, of the presence of unlawful content on the relevant platform.
Conclusion
Navigating intermediary liability in Nigeria requires tech platforms to be cautious about the content they host and how they moderate it. With no blanket immunity for user-generated content akin to Section 230 in the U.S., it would be prudent for platforms to proactively manage local risks related to online defamation, censorship decisions, copyright infringement and online harms generally.
This publication is not intended to provide legal advice and is not prepared with a specific client in mind. Kindly seek professional advice specific to your situation. You may also reach out to your usual Balogun Harold contact or contact us via support@balogunharold.com for support.