ICO on online content moderation 


On 16 February 2024, the UK Information Commissioner’s Office (ICO) published its first guidance on content moderation, setting out how data protection law applies to this process and how it affects users’ information rights.

Image by vwalakte on Freepik

What is online content moderation, and why is it important?

The ICO emphasised platforms’ personal data protection obligations when they use content moderation to make their services safer. Content moderation refers to analysing the content that users generate and assessing whether it is appropriate for publication on a platform. This process may involve people’s personal information and can cause harm if platforms make wrong decisions based on inaccurate information. As a result, a platform may wrongly identify a user’s content as illegal or cut off a user’s access to the platform.

Guidance on content moderation 

The guidance provides practical advice to organisations carrying out content moderation, whether to comply with the Online Safety Act 2023 or for other purposes, on their obligations under the UK GDPR, the Data Protection Act 2018, and the forthcoming Data Protection and Digital Information Bill.

It concentrates on the moderation of content that users generate, upload, or share on user-to-user services. Such a service is “an internet service by means of which content that is generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service” (Section 3(1) of the Online Safety Act 2023). This content may “be encountered by another user, or other users, of the service by means of the service” (Section 55(3) of the Online Safety Act 2023).

Services likely to be accessed by children are subject to specific duties concerning pornographic content.

This guidance is part of the ongoing collaboration between the ICO and Ofcom (the UK Office of Communications) on data protection and online safety technologies. As the regulator under the Online Safety Act, Ofcom is responsible for implementing the regime and for supervising and enforcing the online safety duties.

Link to guidance: https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/online-safety-and-data-protection/content-moderation-and-data-protection/

Press release: https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2024/02/information-commissioner-s-office-tells-platforms-to-respect-information-rights-when-moderating-online-content/

For regular news updates, follow our LinkedIn page, and for further discussion, connect with me on LinkedIn. Cheers!

