Privacy-enabling technologies like end-to-end encryption are under threat from governments that want to undermine people’s ability to have a private conversation.
This has been underway for many years. Previous rounds were framed around “drugs” and “terrorism”; the current round of attacks focuses especially upon child safety, with the co-operation of concerned but misguided and narrowly focused charities.

This children’s-charity astroturfing effort is being orchestrated on behalf of the Government by M&C Saatchi, with £534,000 (possibly more) of funding from the Home Office.
But attacking encryption won’t solve the problem of child abuse, especially not in the estimated 90%+ of cases where the abuser is “known to the victim”. We also face a societal challenge with parenting: parents must walk a line between supervising children and giving them agency, autonomy, space to grow, and frank awareness of how to spot danger. Parents could benefit from Government help and constructive resources on that front, but the Government appears to prefer spending its money on advertising against encryption.

Matters are not helped by the Government and children’s charities often conflating “numbers of reports” with the numbers of actual children at risk, abused, and harmed. The claimed numbers of “reports” are huge, usually in the millions [see errata], but after removing duplicates, stale information, and previously solved crimes (itself a massive challenge for police) the resulting counts of abusers and of abused children are comparatively small; obviously never small enough, but still small, especially compared to populations of millions or billions of internet users.
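As a toy illustration (with entirely made-up data, not real figures), deduplicating by content hash shows how “report” and “file” counts can dwarf the number of distinct items actually involved:

```python
# Toy illustration with made-up data: raw "report" and "file" counts
# versus the number of distinct items they actually contain.
from hashlib import sha256

# Each report cites one or more files; many reports cite the same file.
reports = [
    {"id": 1, "files": [b"video-A", b"video-A", b"image-B"]},
    {"id": 2, "files": [b"video-A"]},              # same file as report 1
    {"id": 3, "files": [b"image-B", b"image-C"]},  # only one new item
]

total_files = sum(len(r["files"]) for r in reports)
unique_items = {sha256(f).hexdigest() for r in reports for f in r["files"]}

print(f"reports: {len(reports)}, files: {total_files}, "
      f"distinct items: {len(unique_items)}")
# -> reports: 3, files: 6, distinct items: 3
```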
An impactful approach to child protection should focus upon preventing crimes before they happen, rather than detecting (and presumably thereby hoping to deter) them after the fact. If we want to protect kids we need to invest in fixing society, not in surveilling it.
Critics sometimes claim that encryption makes it impossible to subpoena or obtain a warrant for information from people’s phones; this is bizarre, because governments already demand such data. What they are actually complaining about is that the “platform”, for instance Facebook, no longer wants to be able to see the content itself. The warrant will simply have to be served upon the device owner, not upon the (social) network provider.
Good security demands that the data we share amongst family and friends remains available only to those family and friends; and likewise that the data we share with businesses remains only with those businesses, and is used only for agreed business purposes.
Network providers — and, importantly, messaging-network and social-network providers — are helping their users obtain better data security by cutting themselves off from the ability to access plaintext content. Simply: they don’t need to see it, and it’s not their job to police or censor it. Their adoption of end-to-end encryption makes everyone’s data safer.
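As a minimal sketch of what that looks like, assuming the PyNaCl library purely for illustration (real messengers layer much more on top, such as the Signal protocol’s ratcheting), the provider in the middle only ever handles ciphertext:

```python
# Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
# Real messengers use richer protocols; this only shows that the relay
# in the middle never holds a key capable of reading the message.
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device;
# only the public halves are ever uploaded to the provider.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# The provider stores and forwards `ciphertext` as an opaque blob:
# it cannot read, police, or censor content it cannot decrypt.

# Bob decrypts on his own device with his private key.
assert Box(bob_key, alice_key.public_key).decrypt(ciphertext) == b"meet at noon"
```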
The world needs end-to-end encryption. It needs more of it. We need the privacy, agency, and control over data that end-to-end encryption enables. And encryption is needed everywhere and by everyone — not just by politicians and police forces.
Child protection is a huge and important issue, but it cannot and must not carry the debate; not least because your children should grow to adulthood, and when they do they will need the privacy of end-to-end encryption in order to navigate this increasingly complex, increasingly online world.
Further Reading
- If you want to know more about why the Government would like to expend this much money to stop end-to-end encryption being put into Messenger, read this article
- If you want a simple, clear explanation of what makes a messenger application NOT end-to-end encrypted, try this video
- If you are interested in a perspective on how little we actually bother to measure the number of children who are rescued from abuse, as opposed to the far more exciting, headline-friendly “number of reports” statistics, try this article
- If you are interested in how end-to-end encryption enables many features you would not expect to need it (for instance, how Apple uses it to share bookmarks between devices) see this article
Errata regarding “Report Counts”
- The original version of this post said that report numbers are inflated by counting individual files; that is incorrect: they are apparently inflated only with duplicates, stale information, non-infringing content, and redundancy, as evidenced on this page (see graphic below).
- The problem exists partly as described in this blogpost: …in 2020 [NCMEC] received 21.8 million reports comprising 65 million files (slide 17 of this PDF, and elsewhere); yet a NCMEC source tells me that in that same year they only added 1.7 million “triple-vetted” image hashes to their database of CSAM; from this it might be possible to infer that more than 90% of NCMEC’s reports are in some way duplicates or irrelevant, but that would be unfair to do without more context… (a rough version of that arithmetic follows this list)
- See also this blogpost from Facebook: …we conducted an in-depth analysis of the illegal child exploitative content we reported to the National Center for Missing and Exploited Children (NCMEC) in October and November of 2020. We found that more than 90% of this content was the same as or visually similar to previously reported content. And copies of just six videos were responsible for more than half of the child exploitative content we reported in that time period. While this data indicates that the number of pieces of content does not equal the number of victims, and that the same content, potentially slightly altered, is being shared repeatedly, one victim of this horrible crime is one too many.
- And also this other blogpost from Facebook: …We worked with leading experts on child exploitation, including NCMEC, to develop a research-backed taxonomy to categorize a person’s apparent intent in sharing this content. Based on this taxonomy, we evaluated 150 accounts that we reported to NCMEC for uploading child exploitative content in July and August of 2020 and January 2021, and we estimate that more than 75% of these people did not exhibit malicious intent (i.e. did not intend to harm a child). Instead, they appeared to share for other reasons, such as outrage or in poor humor (i.e. a child’s genitals being bitten by an animal). While this study represents our best understanding, these findings should not be considered a precise measure of the child safety ecosystem. Our work to understand intent is ongoing.
- This is not new information; it has all been previously reported in Forbes …
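For what it is worth, the inference in the first bullet above is simple division; here is that back-of-the-envelope arithmetic, using only the figures quoted in this errata and deserving the same caveats about missing context:

```python
# Back-of-the-envelope arithmetic from the figures quoted above;
# treat the results with the same caution about missing context.
reports_2020 = 21_800_000      # NCMEC reports received in 2020
files_2020 = 65_000_000        # files those reports comprised
new_hashes_2020 = 1_700_000    # "triple-vetted" hashes added that year

fraction_new = new_hashes_2020 / reports_2020
print(f"files per report: {files_2020 / reports_2020:.1f}")      # ~3.0
print(f"new hashes per report: {fraction_new:.1%}")              # ~7.8%
print(f"possibly duplicate/irrelevant: {1 - fraction_new:.1%}")  # ~92.2%
```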
The matter is clearly a lot more complex than the “millions” headlines would suggest.

Posters
Each links through to an individual image-post which can be separately shared on social media.
Comments

It would be a better approach to enable E2EE only after the user confirms she is the right age, like “Please provide the year of birth before enabling E2EE”.
Interesting. How would that work to prevent people lying?