> “You cannot create a back door that only the good guys can go through,”

Alright, technical argument here. This is false and tech talking heads are spreading this lie for ideological reasons.

You don't need to backdoor the protocol, just the specific targeted client.

Now, if Signal said this is unfair because competitors won't also have to backdoor their apps, then I'd be with them. What the UK should probably have done is force phone makers, rather than app makers, to facilitate a backdoor.

Signal can scan messages before encryption and report to the authorities just fine. Whatever the UK government desires, Signal should accept it as the will of the UK people, especially given that politicians' stance on this has been public and has endured election cycles.
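
To make concrete what such pre-encryption scanning would even look like, here is a minimal Python sketch assuming a hash-matching approach; the blocklist and the reporting hook are purely hypothetical, not anything Signal actually ships:

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests an operator would push to the client.
BLOCKLIST = {
    hashlib.sha256(b"example forbidden message").hexdigest(),
}

def matches_blocklist(plaintext: str) -> bool:
    """Hash the plaintext and check it against the blocklist before any encryption."""
    return hashlib.sha256(plaintext.encode("utf-8")).hexdigest() in BLOCKLIST

def send(plaintext: str) -> None:
    if matches_blocklist(plaintext):
        # Stub for the hypothetical reporting hook.
        print("[client] match found, would report to authorities")
    # The message then continues through the normal, unmodified E2EE path.
    print("[client] encrypting and delivering as usual")

send("hello")                      # passes silently
send("example forbidden message")  # triggers the (hypothetical) report
```

Exact-hash matching is the crudest variant; real client-side-scanning proposals tend to use perceptual hashes of media rather than text, but the placement before encryption is the point.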

I do get their stance: people will stop using them if they cave in, and the UK government should know that as well.

I don't know UK law, but in the US dragnet surveillance is illegal while targeted, warrant-backed requests to backdoor apps are lawful.

>You don't need to backdoor the protocol, just the specific targeted client.

So how are you backdooring a client that, wherever you download it from, can be dumped from the phone and compared against a self-compiled client with a cryptographic hash?
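
A naive sketch of that comparison, assuming you already have the APK dumped from the phone and one you compiled yourself from the same source (in practice app signatures have to be stripped or the comparison done on archive contents, and the build must be reproducible, but the idea is the same):

```python
import hashlib
import sys

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage: python verify.py dumped_from_phone.apk self_built.apk
dumped, built = sys.argv[1], sys.argv[2]
a, b = sha256_of(dumped), sha256_of(built)
print(dumped, a)
print(built, b)
print("MATCH" if a == b else "MISMATCH: the distributed client differs from what the source produces")
```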

Introducing intentional, ubiquitous vulnerabilities is also a terrible idea, because exploits for those can leak or be stolen (that has already happened; see the Shadow Brokers case), and that's catastrophic in operating systems, because they're massively scalable.

Rootkits injected via such vulnerabilities can covertly modify systems so that the vulnerability and backdoor become unpatchable, and that would be a catastrophic scenario.

>Signal can scan messages before encryption and report to the authorities just fine

Writing a script that comments out the scanning code and compiles the application from source is trivial.
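
Something in the spirit of what's being described, sketched in Python; the file path, the marker comments, and the Gradle build step are all made up for illustration, not taken from any real app's source tree:

```python
# Strip a hypothetical scanning hook out of a source tree, then rebuild the client.
import pathlib
import subprocess

SRC = pathlib.Path("app/src/main/java/org/example/Sender.java")  # hypothetical file

patched, skipping = [], False
for line in SRC.read_text().splitlines(keepends=True):
    if "// BEGIN_CONTENT_SCAN" in line:   # hypothetical marker opening the scanning block
        skipping = True
    if not skipping:
        patched.append(line)
    if "// END_CONTENT_SCAN" in line:     # hypothetical marker closing it
        skipping = False

SRC.write_text("".join(patched))
subprocess.run(["./gradlew", "assembleRelease"], check=True)  # rebuild from source
```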

People write E2EE layers on top of existing messaging apps. One example is the OTR plugin for Pidgin/Gaim; another is CryptoCat, which at one point operated on top of Facebook's web UI.
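
The same layering idea in miniature, using the `cryptography` package's Fernet construction as a stand-in; the resulting opaque string can be pasted into any messenger, backdoored or not, and only someone holding the shared key can read it (real tools like OTR do an authenticated key exchange rather than this out-of-band shortcut):

```python
# Encrypt-before-you-paste layer over any existing messenger.
from cryptography.fernet import Fernet

key = Fernet.generate_key()       # shared once, out of band, between the two parties
f = Fernet(key)

token = f.encrypt(b"meet at the usual place")
print(token.decode())             # paste this blob into WhatsApp/Signal/Facebook chat/etc.

print(f.decrypt(token).decode())  # the recipient runs the same thing with the same key
```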

The genie is out of the bottle, and the bad guys are willing to go the distance to get secure comms. Bin Laden was using air gaps. Criminals are buying crypto phones (the ANOM case was a fun example of a smart targeted attack, though; maybe that approach can catch more criminals in the future too). What's left is the security of normal people against banana dictatorships, mass surveillance tools, etc. When you backdoor privacy tools, those are the people you'll hit, and if that's who you're going after, then you have no right to wield such power.

> Introducing intentional, ubiquitous vulnerabilities is also a terrible idea

No E2EE app has a compromised device in its threat model. People get RATted on their phones by criminals all the time. This is the same thing, except the government is nice enough not to look at everything, just to scan for specific content.

> Writing a script that comments out the scanning code and compiles the application from source is trivial. People write E2EE layers on top of existing messaging apps. One example is the OTR plugin for Pidgin/Gaim; another is CryptoCat, which at one point operated on top of Facebook's web UI.

Doesn't matter. Those apps can be backdoored too. The whole OS can. And the government doesn't mind playing whack-a-mole. You can disagree with their policy, but I disagree with openly collaborating on lying to policy makers about technical facts.

>No E2EE app has a compromised device in its threat model.

Oh really? Here's one I made earlier: https://github.com/maqp/tfc :-)

>The whole OS can.

So how are you backdooring a bash script that comments out lines of code from the Linux source before compiling it?

You lying to policy makers with an "it can be done" mindset sounds like a stupid con that burns a lot of money and time in the process.