Content Moderation and Data Protection | ICO releases a short PDF, which is a bit of a Curate’s Egg on moderation issues

So I am reading this and trying to be positive, but I still get the sense that "when all you do is data protection, everything is a data protection problem". This occasionally leaks out where the ICO refers to platforms that use (rather than practise) moderation, as if it were some sort of tool.

It's good to see a clear restatement of what pseudonymisation is. It's worrying but unsurprising to see perpetuated the belief that IP addresses may fairly be treated as personal identifiers. It's good to see sensible recaps of data minimisation that don't exclude stuff like posting history.

It’s great to see a restatement that special category information:

‘revealing’ or ‘concerning’ … race; ethnic origin; political opinions; religious or philosophical beliefs; trade union membership; genetic data; biometric data (…); health data; sex life; or sexual orientation.

…obligates “special processing”, because this is eventually going to bite age verification data (revealing or concerning sex life or sexual orientation) – even pseudonymised/linked – collected by adult-content / porn sites.

My sense, though, is that the document is geared towards enabling upcoming attacks on moderation processes on hypothetical grounds of race, religion, trade union membership, etc:

For example, a human moderator reviewing images and videos of people wearing certain clothing may be able to infer that they belong to a particular religious group, even if the content does not specify that information directly.

…which will be fun. Not.

https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/online-safety-and-data-protection/content-moderation-and-data-protection/

https://ico.org.uk/media/for-organisations/uk-gdpr-guidance-and-resources/online-safety-and-data-protection/content-moderation-and-data-protection-0-0.pdf [PDF]
