A couple of threads:
Hi David! I work with various civil society organisations re: end-to-end encryption, having formerly been an engineer at both FB & Sun Microsystems.
I know that it's dangerous to argue by analogy in this space, but I would be interested to know your perspective on the following:
In the USA alone, the provision of airbags in vehicles saves thousands of lives per year & prevents countless injuries; worldwide, even more. Yet airbags are not safe for children, and many horrible injuries and deaths occurred until we standardised on disabling airbags around them.
E2EE offers billions of people the assurance of not having data centrally hacked or tampered-with. There are issues re: keeping children safe, so perhaps they shouldn't be using it.
But the discourse is against the deployment of these digital airbags at all. That's not wise.
There is apparently no positive advocacy FOR end-to-end encryption within the bounds of UK Government.
I would like to see that changed.
https://www.nhtsa.gov/equipment/air-bags

Originally tweeted by Alec Muffett (@AlecMuffett) on 2021/04/30.
E2E is beyond a "facility" – it is a radical architectural choice that puts personal data (conversations, pictures, etc) only in the hands of the people who have shared interest in it.
Quite literally: those whom the data controllers deem "need to know".
E2EE conversations are "closed distribution lists" (CDLs) between mutually known participants, & the fundamental question of E2EE is whether people should be permitted to communicate using CDLs that do not automatically also include platforms and law enforcement as "participants".
Whitfield Diffie – co-inventor of public-key cryptography & digital signatures – puts it:
"it used to be that the only thing which was necessary for people to have a private conversation, was to walk into a field and talk."
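Diffie's point can be made concrete with a toy sketch of the property E2EE provides: the two endpoints agree a shared key between themselves, so a platform relaying the conversation handles only ciphertext. Everything below is illustrative assumption on my part – toy parameters, a made-up XOR "cipher", no authentication – and not how any real messenger is implemented:

```python
# Toy Diffie-Hellman sketch of E2EE's "need to know" property.
# NOT secure: small parameters, no authentication, demonstration only.
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime, used here purely as a toy modulus
G = 3           # toy generator

# Each endpoint picks a private value and publishes only g^x mod p.
alice_priv = secrets.randbelow(P - 2) + 1
bob_priv = secrets.randbelow(P - 2) + 1
alice_pub = pow(G, alice_priv, P)
bob_pub = pow(G, bob_priv, P)

# Both endpoints derive the same shared secret; the relay never can,
# because it only ever sees the public values.
shared = pow(bob_pub, alice_priv, P)
assert shared == pow(alice_pub, bob_priv, P)
key = hashlib.sha256(str(shared).encode()).digest()

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    """XOR against a hash-derived keystream; symmetric, so it also decrypts."""
    out = bytearray()
    for i, b in enumerate(data):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.append(b ^ block[0])
    return bytes(out)

plaintext = b"walk into a field and talk"
ciphertext = toy_encrypt(key, plaintext)

# The platform relays ciphertext; only an endpoint holding `key` recovers it.
assert ciphertext != plaintext
assert toy_encrypt(key, ciphertext) == plaintext
```

The design point is the asymmetry: the relay is a courier, not a participant, which is exactly the "closed distribution list" framing above.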
This – privacy – is a fundamental human want.
For me there are massive question marks hanging over whether it is wise, liberal, or proportionate to bar billions of people from creating "closed distribution lists" of messages unless/until they include scope for a third party to watch over them.
And there is worse:
The verbiage coming out of the EU includes blandishments re: the protection of privacy of journalists and their sources, doctors and their patients, and more.
This raises the question: how will they know, ex ante? Will there be a whitelist of "trusted private communicators"?
The notion is quite terrifying – both in terms of the illiberality that it represents, and because "trusted people" would be able to abuse that trust without oversight.
I believe the benefits of E2EE outweigh the deeply illiberal consequences of attempting to mitigate its downsides.
And further: the next generation – the next leap of scale in use of the internet – will require more people, more solutions, to adopt the need-to-know architecture of E2EE.
Because the notion of keeping all one's data in plaintext in one place, where it can be hacked, is doomed.
Oh, and in case it's not clear:
I'm not suggesting that E2EE is unwise for any particular platform, because E2EE is a valuable architectural approach for many apps.
From a safeguarding perspective, though, children are clearly unsuited to being users of many apps.
Whether it should be parents, platforms, or the state which is responsible for policing children's behaviour / access to apps, is another argument entirely.
Originally tweeted by Alec Muffett (@AlecMuffett) on 2021/05/01.