I use the internet. You use the internet. Just about all of us use the internet.
Including bad guys…
But that doesn’t mean that only bad guys use the internet. Just as it doesn’t mean that everyone who keeps knives in their kitchen drawer, or hires a rental car, is a murderer.
I use the internet to communicate and help plan my day. You use the internet to communicate and help plan your day too. Just about all of us use the internet to communicate and help plan our day.
So, yes, of course terrorists use the likes of Google, Facebook, and WhatsApp because they’re useful tools.
Would weakening or banning encryption (as some politicians would like) stop terrorism?
Nope.
Even if politicians could convince technology firms to give them a backdoor into their systems, criminals would simply stay well clear and use alternative systems that guaranteed they didn’t have law enforcement breathing down their necks.
Terrorists and paedophiles might even download the source code of a secure end-to-end encryption messaging app, and compile it on their own PC.
Where would a backdoor in end-to-end encryption apps leave the rest of us? With mainstream messaging apps that fail to secure our privacy, that could potentially be spied upon by law enforcement, organised criminal gangs and rogue nations.
Do you want to live in a world where it’s impossible to have private and confidential conversations? I don’t.
Furthermore, I believe in my right to protect myself from the incompetence of others. Every day we read headlines of huge data breaches – either through hacking attacks or human error (or often a mixture of both).
Properly implemented end-to-end encryption (with no backdoors) means that if technology companies cannot read our messages, then hackers can’t get at them either.
You use encryption many times every day. You use it when you make an online purchase. When you log into your favourite website. You’re using it right now to read this article.
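If you’re curious, you can see that everyday encryption for yourself. Here’s a minimal sketch in Python (standard library only, with an example hostname standing in for any HTTPS site) that prints the TLS version and cipher suite your browser quietly negotiates every time you load a secure page:

    import socket
    import ssl

    hostname = "www.example.com"  # stand-in for any HTTPS site you visit

    context = ssl.create_default_context()
    with socket.create_connection((hostname, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            # The handshake has already happened by this point.
            print("TLS version:", tls.version())   # e.g. 'TLSv1.3'
            print("Cipher suite:", tls.cipher())    # (name, protocol, key bits)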
Encryption isn’t a bad thing. It’s a good thing. It protects your privacy and secures your information. It stops hackers looting your bank accounts. It saves lives from oppressive regimes. It defends us from intelligence agencies that are prepared to break the law to collect vast amounts of data about us.
There is a danger that politicians will use ghastly incidents of terror as a platform to push forward their agenda of weakening encryption. It makes for an easy headline. It makes them sound tough in the fight against terror – at least to people who don’t know much about technology. But it won’t make a blind bit of difference to the bad guys.
The people it will most hurt are the regular members of the law-abiding public.
What do you expect from a Secretary of State who thinks that this can all be prevented by engaging officers who "understand the necessary hashtags"? Hashtags and Twitter have nothing to do with the inner workings of encryption.
The Guardian have written a simple article which explains why a backdoor wouldn't work. It also makes the obvious point that the Home Secretary is merely distracting people:
"But perhaps it is also true the home secretary would rather the focus was on the workings of WhatsApp than on her own department, the police or security services."
https://gu.com/p/66q73/stw
What most disturbs me is not that Amber Rudd is thick (no further evidence is needed after her performance over the weekend, even when questioned by that useless, ill-prepared, incompetent Andrew Marr), but the fact that she was presumably thoroughly briefed by her senior civil servants, and said civil servants were presumably briefed by their technical experts. On that evidence alone, the Home Office is clearly not fit for purpose.
If Rudd had been interviewed by Andrew Neil rather than Marr she would have been finished after a couple of questions. Neil is invariably properly briefed by competent researchers.
Oh so true! If someone of Donald "the cyber" Trump's calibre can make it to the top, who do you think we've got up there?!? There are very few people on this planet who understand the security used in IT nowadays, let alone down to protocol and cipher level. Governments need to consult these people and promise not to murder them after they've done their work. ;-)
'Terrorists and paedophiles might even download the source code of a secure end-to-end encryption messaging app, and compile it on their own PC.'
I usually thoroughly enjoy your blogs, Graham, but this is a seriously misinformed statement. The vast majority of the criminals that I have helped to put in prison wouldn't have the slightest chance of compiling their own messaging app – just like the majority of end users, they are not deeply technical people. Of course there are exceptions, but they are often not the brightest tech users. However, most of them have tech companies to do the hard work for them, which is what this is all about.
Put it another way. Your law abiding man or woman on the street cannot have their landline tapped into (assuming they still use one) or be surveilled without damned good reasons and a warrant signed by a judge. But there are examples where these very good reasons exist and we are all fortunate that it is possible to track certain conversations and follow certain people wherever they go. Any system that allowed every single person to vanish and never have their landline tapped into would be great security for everybody – including very bad people.
There's absolutely a balance to be reached, and any attempt to ban encryption altogether strays into dangerous, totalitarian territory. But please don't try to claim that there are no good reasons for security services asking for backdoors into things, because there absolutely are. Reasons that could save lives. Why not have a system whereby backdoors can only be activated where authorised by a judge and/or the home secretary and an oversight committee maintains overall control?
"Why not have a system whereby backdoors can only be activated where authorised by a judge and/or the home secretary…"
Could you explain to me how this backdoor will function? How do you write code that will check for the existence of a properly authorised request and activate the backdoor but will keep it closed to everybody else? The fact is that if there's a backdoor ANYBODY can make use of it. You can do your best to keep its workings secret but that secret will eventually leak or people will find a way to spoof the authorisation.
Your comment about most criminals not having the necessary skills to compile their own message app is probably true for the vast majority. The thing is, they wouldn't have to. As soon as there was a backdoor available to law enforcement in popular apps like WhatsApp then compiled versions of alternatives would become available on dark (and not-so-dark) sites; all they'd have to do would be to download the compiled version just like any other app.
I don't want terrorists or criminals to succeed any more than you do but this isn't the way to tackle the problem.
Hear, hear!
Chris,
You've missed Graham's point when unfairly criticising him. You said:
"The vast majority of the criminals that I have helped to put in prison wouldn't have the slightest chance of compiling their own message app"
Graham was pointing out that the code is open source. This means that ANYBODY can use a pre-made messaging app by downloading the source. Compiling does not equate to making the software from scratch; it's 2-3 clicks and the compiler builds it from the source.
Organised criminals are more than capable of doing this, either by following tutorials or by having somebody do it for them – but either way, you're missing the point.
Open source means anybody, anywhere in the world can compile the software and then offer it as a download. That requires zero technical knowledge for the person downloading. Ban WhatsApp in the UK and people will use foreign-made / home-compiled software provided by overseas benefactors.
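To illustrate how low the bar really is, here's a minimal sketch (assuming the open-source PyNaCl library – pip install pynacl – and made-up names) showing working end-to-end encryption in a handful of lines. Building blocks like this can be downloaded, wrapped in an app and redistributed by anyone, anywhere:

    # Minimal end-to-end encryption using the open-source NaCl primitives
    # that many messengers build on. Names and message are illustrative.
    from nacl.public import PrivateKey, Box

    alice = PrivateKey.generate()                      # sender's keypair
    bob = PrivateKey.generate()                        # recipient's keypair

    sealed = Box(alice, bob.public_key).encrypt(b"meet at the usual place")
    print(Box(bob, alice.public_key).decrypt(sealed))  # only Bob's private key opens it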
Your idea will cause law enforcement even more difficulties. You'll have more of a problem with disparate, unknown messaging apps because they're less likely to be centralised – i.e. there's no chance of getting any metadata (who is speaking to whom) let alone the message itself!
Assuming you work within the CJS I'm sure you'll be aware of RIPA. Not handing over encryption keys is an offence which is punishable with a term of imprisonment.
"Why not have a system whereby backdoors can only be activated where authorised by a judge and/or the home secretary and an oversight committee maintains overall control?"
How does a maths sum (which is what encryption is) know the difference between a warrant issued by a judge and a hacker?
Answer: it doesn't. Once the backdoor is there, it's freely used by anybody.
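To make that concrete, here's a minimal sketch (using the third-party 'cryptography' package and an invented message) of the point: decryption depends only on possessing the key, and there is no 'authorised party' branch the maths could take:

    # pip install cryptography -- illustrative only
    from cryptography.fernet import Fernet, InvalidToken

    key = Fernet.generate_key()                       # whoever holds this can decrypt
    ciphertext = Fernet(key).encrypt(b"meet at 6pm")

    print(Fernet(key).decrypt(ciphertext))            # judge, hacker, anyone with the key: works

    try:
        Fernet(Fernet.generate_key()).decrypt(ciphertext)   # wrong key
    except InvalidToken:
        print("No key, no plaintext - the maths never asks to see a warrant")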
Google the 'Clipper chip'. The Americans tried this kind of backdoor in the 90s before abandoning it; its escrow mechanism was broken within a short period of time. The weakened, export-grade crypto of that era has plagued us for years through various downgrade vulnerabilities. Backdoors = insecurity.
You find yourself in the unfortunate position of not understanding the technology. Tech firms would love to help IF it didn't mean undermining the security of their entire user-base. They make clear to Law Enforcement that it's not possible and installing a backdoor would compromise the privacy of all their users.
You're asking for the impossible / repeating the pre-scripted tripe reported by the media.
Graham is a technologist; he understands the technology. You, with respect, don't – which has led you to inaccurately conclude that he's wrong or that he's against catching criminals.
You're completely missing my point. Where have I said anything about banning WhatsApp? Quite the opposite: I am pro-encryption. My idea would not cause my colleagues or me more difficulties, since I currently have to jump through some absurd, time-consuming hurdles in specific situations involving certain messaging clients that I will not be discussing in this open forum. Time that I could much better spend making people like you safer.
Yes, failing to give up passwords/encryption keys when a valid request has been made under RIPA can come with a 2-year sentence. Tell me, if you were a paedophile with hundreds of gigabytes of the most extreme IIOC in your encrypted container that would put you in prison for a long time and on the Sex Offenders' Register for life, would you give me the key or would you take the 2-year sentence? You'd be out within 1, no register, free to reoffend, free to gain access to children again. Or if you were a terrorist with a cache of data that would massively compromise your and your terror organisation's operations, what would you choose? A short spell in prison, making lots of noise about what a martyr you are for the cause and getting further radicalised.
You find yourself in the unfortunate position of being extremely patronising and not having a clue what I do for a job. How hard would it be for the messaging platforms that implement a web interface (i.e. most of them), and that rely on key exchange for asymmetric signing, to provide keys to LE or the security services so that the suspect's phone does the decryption for us and we can monitor the messaging traffic, with the usual warrant and oversight in place? The reality is that tech companies are simply not willing to do this – not for any lofty reasons of giving a tinker's cuss about the security of their user base, but because it's good publicity to have end-to-end encryption and to make lots of noise about how there are no backdoors to your platform, including for LE. Whether that makes their legitimate user base less safe in their day-to-day lives or not.
"Where have I said about banning WhatsApp? Quite the opposite, I am pro-encryption."
I think you're misunderstanding how encryption works. You're either in favour of encryption or you're not. There's no such thing as a 'backdoor' in encryption; thus you're either in favour of backdoors (no encryption) or you're in favour of no backdoors (encryption).
If you've got a backdoor then it ceases to be called an encryption scheme. Encrypting something means that *only* the person with the password/key can decrypt the message. If WhatsApp had a backdoor then it'd be called an authorisation scheme (not an encryption scheme) because anybody who is authorised can break into the messages. The problem with authorisation schemes is that hackers, computer experts, blackmailers, snoops can break in too. With encryption *nobody* apart from the intended recipient can decrypt the message[1].
[1] That's not to say the police can't break into encrypted messages, they can by working around it – e.g. implanting malware, hacking the phone etc. The difference with hacking a phone is this is targeted against a suspect; it can't be used indiscriminately like a backdoor. You may argue that the police wouldn't misuse a backdoor but anybody with access to the software (i.e. every single user) would be able to look at the software and discover the backdoor.
I accept the point made in respect of RIPA, but that is a societal problem and not something tech firms are in a position to help with. But there are workarounds, such as those explained above – in your example you could monitor his internet connection, break into any encrypted traffic, etc. It can and has been done[2].
[2] http://www.ibtimes.co.uk/fbi-crack-tor-catch-1500-visitors-biggest-child-pornography-website-dark-web-1536417
About people being radicalised in prison, that's got nothing to do with encryption. More should be done but that's the responsibility of the prison authorities and government.
Chris, I do have a clue what you do for a job – you're a Digital Forensics Investigator; or am I wrong?
"requires key exchange for asymmetric signing to provide keys to LE or Security Services"
This isn't a secure way of doing things. Look at the volume of data leaked from the CIA by a person or persons unknown. The secrets there are now out to the whole wide world. Remember Snowden? Manning? The list is endless.
Nothing stops a malicious person from storing up the message traffic (it's sent across the internet for everybody to see); they just wait until the next leak of these keys and then decrypt everything. Just as happened with BlackBerry when it implemented *exactly* the sort of scheme you suggest. Now nobody trusts them.
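A deliberately simplified sketch of that risk (again using the 'cryptography' package, and pretending the escrowed key is the message key) – recorded traffic plus one leaked escrow key means the whole archive opens up, retroactively:

    # Why key escrow plus recorded traffic is so dangerous: one leaked key
    # unlocks everything that was ever intercepted. Illustrative only.
    from cryptography.fernet import Fernet

    escrowed_key = Fernet.generate_key()               # the key handed over / held in escrow
    archive = [Fernet(escrowed_key).encrypt(m)         # traffic an eavesdropper quietly records
               for m in (b"msg 2015", b"msg 2016", b"msg 2017")]

    # Years later the escrow database leaks...
    for intercepted in archive:
        print(Fernet(escrowed_key).decrypt(intercepted))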
Alternatively somebody could attempt to break the encryption by a series of attacks which are possible against such encryption schemes.
When you offer such advice, please remember that some of us understand the technologies somewhat better than you. That's not being extremely patronising. I'm not going to tell you how to conduct your investigations, but I can tell you that the people who are genuinely experts agree with me. Partly for philosophical reasons, but mainly because it's impossible to create a secure backdoor scheme. We'd all love to be able to help create a crime-free world, but we don't live in a utopia.
If my educated guess is correct Chris, would I be correct in saying MSc Forensic Information Technology Portsmouth?
I won't divulge any more information on a public forum.
'[1] That's not to say the police can't break into encrypted messages, they can by working around it – e.g. implanting malware, hacking the phone etc.'
This is not how things are done at all. What you're describing are crimes. That's not how the police or security services do things. Even if it was, you cannot scale such an approach. I don't think you are remotely grasping the scope of this problem and how much the balance of power has tipped into the hands of malefactors, thanks in part to holier-than-thou companies who want to preserve their share price.
That's not my job title, but it does touch on areas of digital forensics. I don't really see where Snowden and Manning fit into this. I'm not talking about the indiscriminate/bulk collection or tapping of digital data by authorities, which I happen to disagree with as an approach. When you say this is partly a philosophical approach, I happen to agree with you. The problem is that the philosophy from a lot of self-proclaimed experts is that all government/police/security services are bad and rarely have good intentions because Julian Assange says so, but for-profit companies who make lots of noise about their encryption schemes are absolutely in the right and have our best interests at heart. The marketing department at Apple and the rest love people like you because you do their job for them.
Chris,
It's not a crime to implant malware if you're duly authorised. In the UK, which is where you're based, you will be fully aware of the powers vested by Parliament allowing equipment interference. That you don't know that, or purport not to know that, would suggest that you're either being disingenuous or genuinely lacking in your legislative knowledge.
The experts, who are engaged by government departments in-house, will confirm this as will any lawyer who specialises in the field.
You talk about discovering a paedophile and being unable to unlock his device. If you can prove that he's downloaded indecent material then that's sufficient. It'd be a bonus to unlock the storage device, but failing to do so is certainly not fatal to the case.
I'm surprised that, working in forensics, you aren't more careful with your personal privacy. That you can be readily identified suggests you aren't au fait with current deanonymisation techniques.
Apple understand the problems with backdoors as do world leading independent experts. It is a difficult pill to swallow when you tell me that E2E encryption is thwarting investigations and the *only* solution is a backdoor. This suggests a conceptual misunderstanding.
If you're advocating backdoors on the basis that it'll make people safer then you need to explain that there's no secure way of doing this. That's an entirely different argument and people need to appreciate this.
My point is underscored by the fact that 'going dark' is nothing new. In years gone by suspects would have had a conversation on the phone and there'd be no record of that after the event (unless they were being monitored of course). If you get given a mobile device you can seek cloud records, attempt to crack the encryption on the device, compel the passcode, seek the assistance of the other person and there are other methods too.
Suggesting we're going dark is alarmist and obtuse. We've never had so much information – cell site records, CCTV, biometrics etc.
Suspects would simply seek more secure, unregulated channels or even devise their own rudimentary code or one-time pad.
When you get military experts, programmers, cryptographers, security aficionados, Queen's Counsel and the government's own reviewer of counter-terrorism agreeing that backdoors are bad, your argument holds little force.
The police are just the servants of the people; not their masters. When society says "no" then you've got to sit down, shut up and accept the will of the people. That's not to say you can't have your own opinion – you just can't force legislative change on an unwilling public.
We're both on the same side (I hope) but I don't believe that intruding on personal privacy and endangering everybody at the same time is justified considering the outcome. Criminality and terrorism are at an all-time low. Don't let the terrorists win by destroying our vital civil liberties.
Chris, I must strongly disagree with you. I know little about the technicalities of encryption or backdoors but it seems obvious that any weakening of a secure system can be exploited by the bad guys, be they criminals or others.
I consider I have a right to privacy in my personal life. As Graham states there are enough examples of security breaches in unsecure systems to make us all desire the best methodology to protect information which we consider to be personal and private.
The notion of incorporating a backdoor, protected by warrants or other means assumes that all writers of such software live in countries which will support such measures. As for compiling stuff I would assume it only takes one guy to do this and make it available to the criminal masses via the Dark Web. There seem to be plenty of talented software engineers working for the wrong side and selling access to so-called secure systems would easily become a cash cow for these people.
If we look at the recent attack in London, I am not aware of any definitive evidence that any such breaking of encryption would have identified Masood. By definition, the “lone wolf” attacker will never be identified by intercepting coded messages.
As a small point, in my own experience it is not necessary for a warrant to be issued for the police to carry out surveillance of suspects.
I have no answer to the dilemma but I do know that weakening secure systems is a bad thing. I can appreciate and sympathise with the frustration of those in the security services who feel that they are being hampered by encryption, but driving the bad guys to use some other form of communication is surely not productive and harms the vast majority of law-abiding citizens.
Some great points! This entire discussion may become moot if quantum computing comes into the mainstream. Then everything could be decrypted almost instantaneously, and with all financial transactions hacked out into the open, money could no longer be the backbone of human society. These are interesting times indeed. Recommended reading for such events – http://www.antipope.org/charlie/blog-static/fiction/toast/toast.html#antibodies
Some excellent points. Quantum computing will be a challenge for cryptography, but 256-bit symmetric encryption is supposed to provide some protection against it. Much depends on what advances are made.
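To put rough numbers on that (a back-of-the-envelope aside, assuming Grover's algorithm is the relevant quantum attack on symmetric keys): Grover only gives a quadratic speedup, so a 256-bit key retains roughly 128-bit strength against quantum brute force; it's public-key schemes like RSA and elliptic curves that Shor's algorithm would actually break.

    # Back-of-the-envelope: Grover's algorithm reduces a brute-force search
    # over 2**n keys to roughly 2**(n/2) operations (a quadratic speedup).
    n = 256
    classical_effort = 2 ** n          # ~1.2e77 operations
    quantum_effort = 2 ** (n // 2)     # ~3.4e38 operations - still utterly impractical
    print(f"classical: 2^{n}, quantum (Grover): 2^{n // 2}")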
I know this is old but … quantum physics? I recall reading an article years ago about how quantum cryptography was uncrackable, blah blah blah. I wrote a comment saying if you can make it you can break it – or find some way round it. A week later they wrote another article stating it had been done.
Not the article but here's a funny one – https://www.wired.com/2013/06/quantum-cryptography-hack/.
Besides, the best security in the world is broken at the weakest link in the chain – the human.
Graham
Thank you for bringing this difficult subject into focus again, it's only by public debate and discourse that we will get enough interest and general knowledge to move towards the right balance of liberty and state oversight.
In response to Chris's comment, firstly a genuine thank you for your service I assume in law enforcement – it is not appreciated enough in our society.
I believe it's true that most criminals are not able to compile apps, but the point is they don't need to; the cybercrime world is particularly good at the 'as-a-service' model, and it only takes one or two talented bad guys to put it out there as open source, or as a service, as has happened with other vulnerabilities, exploits and tools that have been released. So, with respect, Graham's point is very much a valid one. It is already trivial to get hold of well-encrypted communication apps.
As to backdoors, it's been debated fairly extensively already, but I would just make a couple of points.
1) These issues wholly traverse national jurisdictions. You can't expect a foreign app company to comply exactly, in complete confidence, with our requirements, however carefully checked and balanced according to our societal norms; nor would we necessarily wish to risk even telling them what we wanted in case it leaked.
2) Government oversight is hard enough where there is a recognisable commercial entity to deal with. It's realistically impossible where it's a software dev group or non-profit that by its nature may not be prepared to be personally identified and/or may be politically non-aligned, may not exist tomorrow, may morph into other entities or even divide into multiple groups.
A senior politician with experience of signing intercept warrants gave a talk on this recently. When asked at the end of his speech about this issue, he commented that in a civilised society there probably is no final right to absolute privacy – i.e. if your neighbour is making a WMD in his house, most would vote for the state to have the power to intervene – but that only parliament can determine where the line between liberty and state protection finally lies, as the place where the legislature and the will of the people meet.
Obviously this isn't the place to discuss techniques that could be used but suffice it to say that the same principles that have always worked, in measure, given enough resource, with every technique from carrier pigeons to microdots will still work in this highly challenging new environment.
"there probably is no final right to absolute privacy"
There isn't. Even the ECHR recognises this. WhatsApp does not guarantee "absolute privacy"; its encryption prevents snoopers from gaining access. Government bodies have the ways and means to bypass it.
"in measure, given enough resource, with every technique from carrier pigeons to microdots will still work in this highly challenging new environment."
Law enforcement are back in the position they were in before mobile phones. They've got to target suspects instead of indiscriminately surveilling everybody. It's not highly challenging: other enforcement bodies, including the CIA, FBI, NSA and GCHQ, have broken into encrypted messages – it can be done.
Similar attacks have happened for thousands of years. It's not a new problem. Intelligence is key and thinking that everybody uses WhatsApp misses the point somewhat.
The government have intelligence on suspects but fail to act upon it. That's a problem which has nothing to do with the technology.
"There isn't. Even the ECHR recognises this."
I know, and it's obvious it must be so, but I thought it worth putting out there for comment.
"the CIA, FBI, NSA, GCHQ have broken into encrypted messages – it can be done"
Maybe. But properly encrypted material, I believe, is a challenge even for such well-resourced agencies, because a) it takes time, and b) the useful lifetime of the message may be short – i.e. whatever it is may have happened before it's decrypted.
But I agree it's not a new problem in principle. It's just that the borderless nature of it, the scale of it, and the speed of it are new and real issues.
I have debated this subject over the years with my colleagues and have formed the view that encrypted messaging should NOT be allowed. There is no need to encrypt a message. It's overkill. If you need to hide something so desperately that you require end-to-end encryption, then that says you're probably up to no good.
The fact that it is now baked into standard messaging apps such as WhatsApp means the security services are pretty much in the dark, and the bad guys know it.
Remember WW2? The Enigma machine?
Today, the equivalent messages cannot be decrypted, and the bad guys win.
Sure, if messaging encryption was banned or backdoored, then the most extreme would move to other secure methods, but that's a different conversation. At the moment it's far too easy, and we are protecting them because of a privacy argument that will not really affect you. Unless you are dodgy.
And in WW2 the British broke the Enigma machine and kept that fact utterly secret. The Germans never found out until the war had ended.
If you really believe that the intelligence services can't break WhatsApp then you're burying your head in the sand. All the leaks prove that they can and do break into WhatsApp and other encrypted messaging services.
Encryption makes it difficult for criminals to get access to your messages; it doesn't stop a nation state breaking in.
Encryption is overkill? Look at TalkTalk being hacked three times in one year for one example. You've had hundreds of companies hacked because they've not had proper security and/or encryption.
If you think encrypted messaging services should be banned please emigrate to North Korea. Those who are prepared to sacrifice liberty for security deserve neither.
The Digital Minister believes in strong encryption, as does the MoD's former cyber security chief, Major General Jonathan Shaw. I doubt you have anywhere near his experience, given that you advocate banning encrypted messaging while he is a staunch supporter of it.
https://www.theregister.co.uk/2017/03/27/digi_minister_matt_hancock_praises_crucial_role_of_encryption/
https://www.theregister.co.uk/2017/03/27/whatsapp_crypto_row/
Don't have anything to hide unless you're dodgy? Well then please fork over your:
– passwords
– banking credentials and history
– tax records
– bills and receipts
– IDs
– all keys
– all passcodes
– and everything else
Don't want to? Then you must be 'dodgy'!
And of course the Enigma machine was broken. If you think that it wasn't, you have a serious ignorance of the Second World War. Unless you're suggesting what might have happened if it hadn't been broken. But it was – thanks to cribs, recovered Enigma machines, captured settings sheets and other things.
"To (some) cybersecurity experts, Friday’s incident (#wannacry attack) showed exactly why technology companies such as Microsoft, Google and Apple are so defensive about the idea of backdoors into their services and devices.
Law enforcement agencies may want a way into highly secure gadgets and apps to further their investigations — such as when the FBI pressed Apple last year to hack into the iPhone used by a gunman in the San Bernardino terror attack. But the companies have repeatedly pointed out that there’s no safe way to build an entry point just for trusted government organizations."
http://www.latimes.com/business/technology/la-fi-tn-ransomware-exploit-20170512-story,amp.html