Content moderation refers to the set of actions that digital platforms apply to user-generated content to ensure that the platform complies with its community guidelines and with legal obligations in the relevant jurisdictions. Such moderation actions include banning, feature blocking, content takedowns, referral to law enforcement, temporary suspension, withholding payments, reducing visibility and labelling.
Recent Developments in Nigeria
On December 3, 2024, the National Information Technology Development Agency (NITDA) issued a public notice regarding the state of compliance with the Code of Practice for Interactive Computer Service Platforms/Internet Intermediaries, 2023. Amongst others, the Notice indicates that in 2023, across the TikTok, Google, X and Microsoft platforms,
(a) 4,125,283 (Four million, one hundred and twenty-five thousand, two hundred and eighty-three) complaints were registered;
(b) 65,853,581 (Sixty-five million, eight hundred and fifty-three thousand, five hundred and eighty-one) online posts were taken down;
(c) 379,433 (Three hundred and seventy-nine thousand, four hundred and thirty-three) online posts were removed and re-uploaded following appeal; and
(d) 12,099,633 (Twelve million, ninety-nine thousand, six hundred and thirty-three) accounts were closed and deactivated.
Legal framework for Content Moderation in Nigeria
The legal framework for content moderation in Nigeria encompasses a variety of laws, regulations, and guidelines that govern how online platforms moderate and manage user-generated content. Below is an overview of some of the key content moderation regulations in Nigeria.
(1) The Code of Practice for Interactive Computer Service Platforms/Internet Intermediaries, 2023 (the “Code”)
The Code sets out best practices for content moderation by internet intermediaries whose platforms are available in Nigeria. Amongst others, the Code categorises certain internet platforms as large service platforms. Also, the Code defines and categorises online content into three categories, namely unlawful content, harmful content and prohibited material. Additionally, the Code stipulates the procedures that platforms must follow when moderating online content in Nigeria and ascribes legal definitions to key concepts such as misinformation, disinformation, online harm and internet intermediaries.
(2) The Nigerian Data Protection Act, 2023 (“NDPA”)
The NDPA regulates the handling of personal data, including in content moderation activities. Amongst others, moderation processes must comply with data protection principles, such as transparency and lawful processing. Also, content flagged for moderation involving personal data must respect user privacy rights.
(3) The Nigerian Constitution, 1999 (the “Constitution”)
Nigeria’s Constitution is the source of all other laws and authority in the country’s legal system and is the standard against which all other laws and regulations are measured; it is supreme even over Acts of Parliament. The Constitution codifies the fundamental human rights enjoyed by Nigerians. These include the rights to life; dignity of the human person; personal liberty; fair hearing; private and family life; freedom of thought, conscience and religion; freedom of expression and the press; peaceful assembly and association; freedom of movement; freedom from discrimination; and the right to acquire and own immovable property, among others. Moderation processes must comply with these constitutional provisions.
(4) The National Broadcasting Commission Act, 2004 (the “NBC Act”)
The NBC Act regulates the broadcasting business in Nigeria and establishes the main principles by which broadcasters must abide. Amongst others, the NBC Act sets broadcasting standards, including the political, social, cultural, economic, professional and technological standards with which licensed broadcasters must comply. It is useful to note that certain types of technology companies and platforms are subject to the provisions of the NBC Act. Technology platforms that fall within the regulatory purview of the NBC Act are required to ensure that their moderation processes comply with broadcasting standards.
This publication is not intended to provide legal advice and is not prepared with a specific client in mind. Kindly seek professional advice specific to your situation. You may also reach out to your usual Balogun Harold contact or contact us via support@balogunharold.com for support.