Wanted: end-to-end encryption (with a backdoor for this guy)

Graham Cluley

UK Prime Minister David Cameron is worried that people are encrypting their communications, and that he (and indeed, law enforcement agencies) can’t see what you’re saying.

“In extremis, it has been possible to read someone’s letter, to listen to someone’s call, to mobile communications… The question remains: are we going to allow a means of communications where it simply is not possible to do that? My answer to that question is: no, we must not.”

Cameron, who is seeking re-election in a few months' time, thinks it would be a good idea either to make secure communication apps unlawful, or to force them to contain a backdoor which the police and intelligence agencies could exploit.


Of course, if you spend any time thinking about it, you know that’s crazy. Cameron is living in cloud cuckoo land.

Firstly, how would apps be outlawed? What’s to stop any Tom, Dick or Harry downloading an app without a government backdoor from a website hosted overseas to run on his PC? What’s to stop a terrorist or paedophile downloading the source code of a secure messaging app, and compiling it on their computer?

The fact is that the only people who would be using the backdoored messaging platform would be the innocent, regular members of the public. Criminals would stay well clear and use alternative systems that guaranteed they didn’t have the police and GCHQ breathing down their necks.

David Cameron is unhappy he can't read your messages

Secondly, if a messaging service has a backdoor – what’s to stop “enemies” of the UK also exploiting it?

Whenever you put a backdoor in a system, there’s a danger that the wrong people will walk through it. If you make encryption weak and crackable, or incorporate a method by which supposedly secure messages can be accessed, that makes it weaker for *everybody* (yes, even the security services). It also means it becomes an attractive target for online criminals, fraudsters and foreign intelligence agencies too.

Are government departments and the military going to feel comfortable using messaging systems that they know can be compromised? Or is it one rule for them and another for the rest of us?

Thirdly, if it’s not a technical backdoor, but instead a method for the secure messaging vendor to extract communications at law enforcement’s request, that still opens enormous dangers.

The vendor could be careless with their security, or they could have rogue staff, or they could find the demands of hundreds of different countries for access to messages too much to bear – and hanker for the old days when they didn't know what people were saying in their private communications.

Finally, if uncrackable encrypted communications become outlawed, the UK can wave goodbye to ecommerce and becoming a major player in the digital economy. Who is going to want to do business with a country which cannot promise to keep corporate secrets secret?

Cameron is talking codswallop. Or, as we say in ROT13 (which may be the strongest form of encryption we can get our hands on if this lunacy comes to pass), “pbqfjnyybc”. Utter “pbooyref”.
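(For the curious: ROT13 simply rotates each letter 13 places, which means applying it twice gets you right back where you started – hardly GCHQ-proof. A quick, purely illustrative sketch in Python:)

```python
import codecs

# ROT13 shifts each letter 13 places; since 13 + 13 = 26, encoding and
# decoding are the same operation.
print(codecs.encode("codswallop", "rot_13"))  # -> pbqfjnyybc
print(codecs.encode("pbooyref", "rot_13"))    # -> cobblers
```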

However, I’m a law-abiding fellow, and I have no wish to get into any trouble.

So, earlier today I dropped a line to CyberStreetWise – HM Government’s website designed to “measurably and significantly improve the online safety behaviour and confidence of consumers and small businesses” (a very noble aim, I’m sure you’ll agree).

If I wanted to keep my backdoor open for David Cameron, which messaging app should I use? I’m hoping someone can tell me in time for the election on 7 May 2015.

Graham Cluley is an award-winning keynote speaker who has given presentations around the world about cybersecurity, hackers, and online privacy. A veteran of the computer security industry since the early 1990s, he wrote the first ever version of Dr Solomon's Anti-Virus Toolkit for Windows, makes regular media appearances, and is the co-host of the popular "Smashing Security" podcast. Follow him on Twitter, Mastodon, Threads, Bluesky, or drop him an email.

10 comments on “Wanted: end-to-end encryption (with a backdoor for this guy)”

  1. Coyote

    Love the satire (even if they'll not likely see it).

    It is interesting to me that you would write about this. I've done similar but not to this degree (I certainly did not break down all the points). It is unfortunate, however, that there is reason to write about it at all: the fact that this is even up for debate.

    Most important of all:
    "Or is it one rule for them and another for the rest of us?"
    Two rings to rule them all seems so wrong… that means two rules to rule them all is also wrong. Where's Sauron when you need him?

    Less important (although still important):
    "Firstly, how would apps be outlawed? What’s to stop any Tom, Dick or Harry downloading an app without a government backdoor from a website hosted overseas to run on his PC? What’s to stop a terrorist or paedophile downloading the source code of a secure messaging app, and compiling it on their computer?"

    What's to stop me from writing my own, even? I know what the UK can do: they can blame the US for being less strict with encryption exports. Yes, that will solve all the problems, I'm sure of it; the blame game always works! That's why it is still around and is so popular, right? It is most amusing to me, and something I also find incredibly ironic, that they attempt to stop software piracy, fail miserably, and now somehow think they're going to stop encryption? More irony (well, sort of) is that the more that is encrypted, the less idea they have of what is inside it (that is, the contents – they can obviously read something sent as-is, but if it is encrypted, do they know what it might or might not contain?).

    Psychology 101: if you say to the mass population, you know what, while we used to be OK with it, we're going to no longer allow encryption (it actually has gone the other way around and only recently has there been hissy fits about it). Guess what will increase? (Maybe then, it is a good thing ? (Obviously not but it would be nice))

    In any case, the risks are many and this applies when you have less privacy, too. ID theft/fraud, bank theft/fraud, etc., will only be easier. The end result is this: if encryption is weakened, yes it'll be weaker for them too. But it'll also only make it weak for those they shouldn't care about (and yes absolutely will make purchasing more risky (because that isn't risky enough already, right?)). But what do I know? The NSA had this problem long before the attack in NY so I am just missing something new and only obvious to (them). Besides, it isn't like they've never taken secure systems and reduced it down to what they need ('secure' does not mean 'impossible to compromise' because there's so many attack vectors and it is so easy to make a mistake).

    1. Jerry Nixon · in reply to Coyote

      Good comment. Except for this: you said,

      "Psychology 101: if you say to the mass population, you know what, while we used to be OK with it, we're going to no longer allow encryption (it actually has gone the other way around and only recently has there been hissy fits about it). Guess what will increase? (Maybe then, it is a good thing ? (Obviously not but it would be nice))"

      First, I think you mean Sociology. But, for the sake of argument, let's keep calling it Psychology. I believe Psychology 101 [sic] would not agree with your conclusion. I believe you will actually find societal masses reliably respond to authority in a way that is complicit. There can be a punctuated moment (like browser cookies) where they are inordinately paranoid, but in a very short time they return to trusting authority. For example, in the United States, everyone knows a mobile conversation can be and probably is monitored. What is the change in the habit of the masses? Decreased use? No. Emails, text messages, voice calls – even as listening technologies by government agencies are uncovered, their use is not diminished. Conversely, tell the masses not to smoke in public – and they don't. Use your seat belt – and they do. You see – in the end it's all public safety. In that context, tell the masses anything you want and they comply. It's Psychology 101. It's also Sociology 101. And, it's the basis of society as a whole – which might actually make it Political Science 101. (Encryption is just an aside).

      Good comment though.

      1. Coyote · in reply to Jerry Nixon

        Yes and no. You could also bring up human nature (and any number of things). I think (I honestly don't remember for sure, but this is a possibility) I was originally referring to the phenomenon known as the Streisand effect (perhaps not the best example, and indeed psychology isn't the best way to describe it; the next part would be closer). The point is that by drawing attention to something that used to be a nothing, you have just increased the number of people who are aware of it (of course, you'd be right – 'aware of' does not necessarily equate to 'usage', as I get to below). Another way of explaining this (though, as I get to later, also somewhat flawed) is that we all crave most what we cannot have. If something is always kept away from you (especially 'just' kept away) and you want it, you're not going to want it less – it is on your mind more because you think of it not being there, and depending on how close it is (and how much you want it), it will be harder to resist over time. While it would depend on 'what' and on each person, this does happen. (And yes, there is always more to it than it seems.)

        But to be fair, it was something of an exaggeration (the end result). More specifically, I was remarking on the futility of it all. Of course, trying to reason with a politician (especially a stubborn one, and especially when it directly involves what they want most – power (see how I did that?)) is also futile.

        (and hopefully that all makes sense (or enough)… not certain it does).

  2. Coyote

    I was somewhat distracted earlier. I've now had a chance to read the above properly. Some of the points you make are interesting – interesting because they've already happened (not surprisingly).

    For instance, "Thirdly, if it’s not a technical backdoor, but instead a method for the secure messaging vendor to extract communications at law enforcement’s request, that still opens enormous dangers."

    Snapchat, is it? More than one leak, as I recall (I know for sure there was a significant one, and one that affected many kids). Yet people still think it is a good idea, that you can erase what is sent over the wire (or through the air). Some news outlet (an author for one, I don't remember which) dubbed it 'the erasable Internet' (or something like that). But that is a fallacy and a rather serious false sense of security – exactly what happened to those kids whose supposedly private and deleted photos became anything but (and that is for things supposedly deleted! Then think of things that are caught by web spiders and stored for X time, or seen by others).

    "Who is going to want to do business with a country which cannot promise to keep corporate secrets secret?"

    I understand the meaning, but if we're to be honest, they actually can't promise that now (at least not honestly) and in fact never could (and then there are government secrets not being secret, and worse, spying agencies that can't keep their secrets secret). Still, the point is well made and should be taken seriously by them. There are already other attack vectors that can be the source of leaked credit card (and other) data; do they really want another? There are already many instances – and indeed exploits – where encryption was there but was circumvented (or manipulated, as in MiTM attacks). Why on Earth create more?

    I would argue that every single example you give has already happened. Not once. Not twice. Many times, probably more times than are actually known. That is without these additional weaknesses. There are too many leaks as it is. That is scary.

    It really is about control and the perception that they aren't in control. None of this is different from before in concept – only the medium has changed. They feel out of control and they want to fix it, but they're playing a very dangerous game (even if it truly is only about wanting to protect citizens, doing that by creating more risks is counterproductive). In truth they have more control than they will admit; they are just ignorant of it. What they need is to be more aware and more educated!

  3. BenJ

    "Cameron is living in cloud cuckoo land" – I like it! Well as the general push seems to be getting everyone to use the Could Cameron should feel right at home there! Lol.

    And speaking of home, I hope he's packing his stuff at Number 10, because he'll be moving sooner than he thinks.

  4. "If I wanted to keep my backdoor open for David Cameron…" Gross! And you a married man! Actually, you would think that Eton would have taught Cameron all about backdoor exploits. (Presumably they do have IT classes?)

  5. Roger Leyland

    Is Cameron going to ban talking to people (in person)? As far as I know there is no way of hacking those "messages" unless you happen to be recording at the time (or are willing to resort to torture…).

  6. Why can't the British track and intercept camels with mounted broadband satellite dishes?

  7. I don't think I agree with you.

    As I think through your article, I come to different conclusions. I can appreciate your intent. In every way, I recognize the need for two positions in order to reach a reasonable middle ground. Having said that, I want to step through my thought process. I acknowledge that I am more conservative, and, naively perhaps, considerably sympathetic to the scale of threat any government must manage. Still, some of the things you say…

    1. Only innocent people would use a backdoor-enabled messaging platform.

    Yes, I agree with you, in part. If a pedophile has illegal images on his machine, he will use similar clandestine measures to obtain illegal messaging software. This point is a rational conclusion. Does such a law effectively address perversions like this? Probably not as much as we would want. But it does a little.

    Moreover, public organizations that standardize their communication platforms would be required to standardize on a complying one. It is an ill-conceived assumption that only back-alley crooks commit crimes. When public organizations conduct illicit operations, or their employees use their facilities for crime, the opportunity to hold them accountable would be real. Their communication platforms would be subject to inspection.

    There is another thing. If certain types of encryption were illegal, it would be 100% possible to detect this. The pedophile communicating within his amoral network would communicate securely, but the network traffic he generates would be a red flag for any ministry monitoring for such a thing. He could download source code and compile it, but using it would broadcast his illegal complicity – at the very least making him a suspect.

    2. What would stop “enemies” from using the backdoor?

    As interesting as this point sounds, it is remarkably moot. Consider the illusion of your front door lock (for your apartment). It stops your mail carrier from entering, but not a locksmith. A locksmith can use the model master and enter. A locksmith can manually set tumblers and enter. This is handy when you have locked your keys in your house or in your car. Think about it: what would stop “enemies” from using the same technique?

    Does this question imply that /IF/ your “enemy” can use the same technique, it logically follows that it is a bad idea? What if an “enemy” uses a gun – does that imply guns are also a bad idea? This is actually a simplistic example of the “appeal to consequences” logical fallacy. It only takes a few minutes of thinking it through to see it is an egregiously fallacious, moot argument. It just is.

    Reality check: everything has some kind of backdoor. It is just a matter of how practical it is. Even encryption. Yes, sometimes encryption has no practical timeframe for decryption, but you make the subtle implication that one system has a backdoor and another system does not. It is elusive misdirection – one system has the backdoor painted, the other is ivy-covered. Security is not an absolute; it is only measuring degrees of difficulty to access.

    3. What would stop “vendors” from using the backdoor?

    This argument clearly suffers from the same fallacies as the previous one. That being said, it is also the easier of the two to squelch. The idea of using multiple keys and asymmetric encryption to interoperate with a system is nothing new. You might as well say, “What is to stop the software developer from stealing money from the bank that employs him?” And yet, the banking industry survives.

    This senseless fear mongering implies we cannot secure a system. That is clearly wrong. Cryptographers do not have to allow the vendor access to the backdoor. Your article is discussing cryptographic algorithms, not physical backdoors. Just because the backdoor exists for the authority does not imply a backdoor for the vendor. Of course not. By the way, we have systems like this already – and they are nothing novel.
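    (A minimal, purely illustrative sketch of the kind of multi-key arrangement described above, assuming a hybrid scheme in which a copy of each message key is escrowed to an authority's RSA public key – the names and the use of Python's `cryptography` package are this example's own, not anything specified in the article or the comment:)

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Key pair held by the authority only; the vendor never sees the private half.
authority_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
authority_public = authority_private.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# A per-message symmetric key encrypts the actual content.
message_key = Fernet.generate_key()
ciphertext = Fernet(message_key).encrypt(b"meet at noon")

# Escrow copy: the message key wrapped for the authority's public key.
escrowed_key = authority_public.encrypt(message_key, oaep)

# Only the holder of the authority's private key can unwrap it and read the message.
recovered_key = authority_private.decrypt(escrowed_key, oaep)
assert Fernet(recovered_key).decrypt(ciphertext) == b"meet at noon"
```

    (Whether anyone should build such a thing is, of course, exactly what the article and this thread are arguing about.)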

    4. If they outlaw uncrackable encryption, the UK can say goodbye to ecommerce.

    Wow. We should circle back to the realities of security. It is silly to refer to encryption as uncrackable. Perhaps “impractical” is the word you could use, but you are creating a false choice. First, an algorithm with a backdoor can be as uncrackable as any other algorithm. Period. You are implying it cannot be. A backdoor does not inherently imply the algorithm is insecure. No. You may be forgetting that uncrackable (read: impractical) algorithms with backdoors are in use today. A backdoor does not make algorithms more insecure.

    Then, you draw an outlandish conclusion that if algorithms with backdoors continue in the UK, the entire internet economy would collapse. In fact, you said, “the UK can wave goodbye to ecommerce and becoming a major player in the digital economy.” Algorithms with backdoors are nothing new, so we're not really talking about change; there's nothing to even hint at some revolutionary economic collapse.

    Forgive me, but I cannot imagine you have much cryptographic or economic education. Your article seems based on fallacies, crazy misconceptions, and outlandish conclusions drawn from popular movies, not sound reasoning or knowledge of real technologies.

    Should the UK regularly snoop in on communication? That is one question. Should the UK be able to read private communication? That is another question. Both questions are very important. You are not discussing these questions. You are discussing conspiracy theories and pop culture paranoia. When I started reading, I hoped for a sensible investigation into the question of checks and balances. I hoped for a sensible investigation into what constitutes privacy and rights. I was hopeful.

    Instead, this article is more like a conversation in a pub. The loudest guy at the table recalls a television show plot he saw last night and tries to apply it to a limited set of recent tabloid headlines. Everyone brainstorms worst-case scenarios over drinks, while shaking their fists at their leaders as the roots of all evil. This is certainly an “opinion” piece; I see that. Nevertheless, shouldn’t this article be a little rational? A little grounded?

    And surely – not to beat a dead horse here – you can see how referring to David Cameron as “this crazy guy” is an ad hominem (personal attack) logical fallacy, right? It makes you sound so trite. Like a child. From the title, I should have set my expectations correctly. I blame myself for reading this article for news or insight. This is just link bait and off-the-wall satire. There is nothing practical or realistic in it. It’s like watching television. I just wish I had concluded that before I wrote this thesis of a comment! :-)

    1. Coyote · in reply to Jerry Nixon

      It isn't an ad hominem. He didn't just say Cameron is crazy. It is ironic indeed, because you claim he doesn't have an education in cryptography[1] and you're throwing about how he's talking about conspiracy theories (and this and that). Yet what he is suggesting isn't at all crazy. I would like to point something out about conspiracy theories: a conspiracy theory is an explanation of an event that was (possibly) covered up – so it has already happened. Unless you're suggesting he is clairvoyant? Your claiming he's making conspiracy theories (and again, by definition they aren't) is quite amusing to me, given that they are anything but.

      [1] I imagine you don't either. But I won't assume such – I have no evidence and therefore I won't claim it (maybe you should show your evidence as to why you think he isn't educated?). I will state, though: it is very, very advanced stuff. Perhaps you know this? Assuming you do: the idea of uncrackable, and you saying it isn't anything new (or that algorithms with backdoors are just as good as those without – they aren't, by the way)? There is a difference between brute force and cracking outright. Yes, you can brute force salted hashes, but it is still a one-way process. I know this from both sides: dictionary attacks (yes, I mean that, from years ago, and yes, you can classify them under brute forcing) and also being the programmer of something that uses salt+hash for authorisation. But I'll get back to this again, because it is actually important.
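      (For anyone unfamiliar with the salt+hash approach mentioned above, here is a minimal sketch using only Python's standard library; the function names are illustrative, not taken from any particular system:)

```python
import hashlib
import hmac
import secrets

def hash_password(password: str) -> tuple[bytes, bytes]:
    # A random per-user salt means identical passwords produce different
    # digests, defeating precomputed (rainbow-table) attacks.
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    # Recompute with the stored salt; the hash is one-way, so the only route
    # back to the password is guessing candidates (dictionary/brute force).
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)
```

      (That is the distinction being drawn: even with the salt and the digest in hand, there is no "cracking outright" – only guessing candidates and comparing.)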

      (Incidentally, satire isn't childish per se – if you want to talk about someone using satire and sarcasm to the extreme, and indeed to the point of childishness, that would be me, not him.)

      It isn't fear-mongering. You claim: "This senseless fear mongering implies we cannot secure a system. That is clearly wrong. Cryptographers do not have to allow the vendor access to the backdoor."

      First, there is no such thing as secure – there is only as secure as can be, given the circumstances. A 100% secure system does not exist and never will. Second, the problem is that backdoors have already been put in encryption algorithms (and indeed in implementations thereof – there's a difference!). As you say, 'nothing novel', right? It is interesting, though, because you also suggest what I suggest: "Security is not an absolute; it is only measuring degrees of difficulty to access." That contradicts your other point (that we can secure a system). Yes, yes, semantics, you might say. But it is still valid.

      "Just because the backdoor exists for the authority does not imply a backdoor for the vendor."

      Actually it does. Hint: 0days. Hint 2: security through obscurity, and more so how it is argued to be effective: just because they think (and claim) no one knows (and somehow never will) doesn't mean that will always be the case. So if there is a backdoor, then it is there for everyone, whether you believe it or not – it might not be known now, but it can be found out in time. That is scary, a false sense of security (when relied upon for security by itself) and otherwise dangerous.

      … I could go on and on, but the point is the same. I want to add one thing, though: the satire you liken to being childish (or that is the impression I got) – those who use it also understand just how effective it is at taking a serious (i.e. not fun/light) issue and making others realise that there is always a light side to it. Yes, that is a side effect, but it doesn't change the fact that it still has the effect.

      … and hopefully that reads okay – was not meant to be aggressive but I know I can seem that way, especially in recent times.
