Over a number of years, TrueCrypt gained a reputation and a sizeable following as a reliable and stable, tried and tested free full disk encryption solution.
The architecture is believed to be sound and, as far as is known, no critical bugs have been found in version 7.1a, released around two years ago.
Indeed, Edward Snowden has expressed his faith in it, and he should know what can be cracked by the best minds in the business using the best tools available!
A crowd-funded audit had released its first results, which raised no major concerns.
Then suddenly and without warning at the end of May, the developers declared it unsafe to use and effectively killed it, recommending users move to BitLocker (for Windows) or other tools for Mac and Linux.
This came very shortly after I had taken delivery of a shiny new SSD, while I was planning the transfer of my Windows 7 installation to it, encrypting it on the way. Since BitLocker is only available in the Ultimate and Enterprise editions of Windows 7 – and Ultimate offers nothing else I might need – TrueCrypt seemed the obvious choice.
Should I still use it?
The whole saga has brought into focus an issue which has been central to security thinking in government circles for many years, though much less so more widely: that of assurance.
I take out insurance to make good if my house burns down. But assurance is the measures I might take to reduce the risk of it burning down in the first place.
Product and system assurance schemes such as Common Criteria, CCTM and CAPS are at the heart of government security policies.
In simplistic terms, they check whether the Target of Evaluation (or ToE) can be relied upon to do what it says on the tin.
A product may be built on the latest hyper-secure bullet-proof technology, but with only the vendor’s word for it a security architect working in the government arena would favour an independently assured product, even if theoretically not as strong. However, formal evaluation is not the only form of assurance.
So, comparing TrueCrypt against BitLocker, what sources of assurance do we have?
- Steve Gibson examined TrueCrypt a while back and declared his faith in it.
- As we’ve already noted, Edward Snowden has endorsed it more recently.
- An independent audit has delivered its verdict on the boot code with no critical issues found. And it’s inconceivable that the NSA wouldn’t have dwarfed that audit by its own efforts, despite which, TrueCrypt is understood to have thwarted the best efforts of criminal investigation teams.
As for Windows, versions since NT and including Windows 7 and Server 2008 have been subjected to formal Common Criteria evaluations.
Much has been written on the negative side regarding TrueCrypt.
For example, the audit raised serious questions about code quality and the antediluvian build environment. But this is a bit like criticising a donkey for not being a horse. The groundwork was laid when Microsoft was only just waking up to the need for a secure development methodology.
This leads to the question of the lack of ongoing support and updates – which, one hopes, would otherwise have encouraged the development processes to evolve.
But updates are only needed to fix bugs or to introduce new features. There appear to be no known critical bugs, and the features are sufficient, at least up to Windows 7.
Feature creep leads to complexity, and complexity is the enemy of security. We have something which works, doesn’t crash and doesn’t trash your data.
The Heartbleed bug and subsequent revelations about OpenSSL code quality have shaken confidence in open source.
The source of TrueCrypt is available for anyone to look at, but in practice, everyone assumes someone else will do so and in the end, no one does.
Whilst this is true, quite apart from the fact that the TrueCrypt audit is continuing, there is a key difference. OpenSSL is huge and has a vast repertoire of functions. Consequently it has an enormous and very complex attack surface.
TrueCrypt has an attack surface during installation; if your installation environment has been compromised then there’s no hope for you, whatever product you use. The next point of potential weakness is in booting and password entry, which have already been covered by the audit.
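One practical mitigation at install time is to check the installer's hash against a digest published through a channel independent of the download itself. A minimal sketch in Python – the filename and digest below are illustrative placeholders, not real TrueCrypt values:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so even a large installer fits in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            digest.update(block)
    return digest.hexdigest()

# PUBLISHED_DIGEST is a placeholder, not a real TrueCrypt digest -- substitute
# a value obtained through an independent channel, not the download page itself.
PUBLISHED_DIGEST = "<digest published out of band>"

# Hypothetical usage:
# if sha256_of("TrueCrypt Setup 7.1a.exe") != PUBLISHED_DIGEST:
#     raise SystemExit("installer does not match the published digest")
```

This doesn't help if the publication channel is also compromised, but it raises the bar considerably over trusting a bare download.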
After that, if an attacker is able to freeze your RAM with liquid nitrogen in the time it takes for you to put your coat on to go home then he may have you, and likewise if he can mount an “evil maid” attack.
This applies to any encryption product to a greater or lesser extent.
But if you can shut your computer down and give the RAM a minute to die, then the attacker’s only recourse is a direct assault on the cryptography, which rarely if ever succeeds, given a decent password.
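To see why a direct assault on the cryptography rarely succeeds against a decent password, consider the raw search space. A back-of-envelope sketch – the guess rate is an assumed figure, and it ignores the deliberate slowdown of key-derivation, which only strengthens the point:

```python
def years_to_exhaust(charset_size, length, guesses_per_second):
    """Worst-case time to try every password of the given length."""
    keyspace = charset_size ** length
    seconds = keyspace / guesses_per_second
    return seconds / (365.25 * 24 * 3600)

# An assumed rate of a billion guesses per second -- generous to the
# attacker, and ignoring the per-guess cost of the key-derivation step.
RATE = 1e9

print(f"8 lowercase letters: {years_to_exhaust(26, 8, RATE):.2e} years")   # minutes of work
print(f"12 printable chars:  {years_to_exhaust(95, 12, RATE):.2e} years")  # tens of millions of years
```

The constants are debatable; the gap between a weak password and a decent one is not.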
So how much assurance does the Windows Common Criteria evaluation offer?
Evaluating a product of the size of Windows is a vast undertaking, and it’s not clear, on the surface at least, how much attention was given to BitLocker. A common mistake in using evaluated products is to assume that the specific security enforcing function you are relying on is fully included in the Target of Evaluation, and tested in sufficient depth.
BitLocker does appear to have been included in the evaluation, but unlike TrueCrypt, which is a single-user product (lose your password, lose your data), BitLocker is aimed at the enterprise.
As such, it includes enterprise-grade key management and password recovery features. In contrast to a direct cryptographic attack which practically never succeeds, key management is a minefield and extremely difficult to get right.
And as well as technical attacks, social engineering can often be leveraged.
As for the Windows 7 Evaluation Report, the NSA’s name is on the front page. Who knows what vulnerabilities they might have failed to disclose, or what back-doors they might have persuaded Microsoft to include?
Formal assurance methodologies and their methods of application have evolved considerably over the years.
There was a time, around 10 years ago, when an evaluated version of the PIX firewall firmware was preferred in government applications long after it had reached the point where no self-respecting network engineer would touch it with a bargepole.
Because it was evaluated. Those days are long gone, but we still need to look at assurance from the widest perspective.
So, am I still going to use TrueCrypt? In an enterprise or serious business you can’t afford to use a product without a good support model. It has to be BitLocker – unless you’re a journalist guarding Edward Snowden’s files.
But TrueCrypt does have an advantage in being cross-platform: in extremis I could in principle do an offline virus scan, or fix other serious problems using Linux-based tools.
So yes, for my own use I might still use TrueCrypt. But first, I have to work out why Microsoft System Backup is giving me an error 0x8007045D. There must be an answer – after all, Windows is formally evaluated!
"Who knows what vulnerabilities they might have failed to disclose"
Exactly what I've been saying for a long time. During Windows 7 and 8 development, Microsoft allowed the NSA to review the source code for "vulnerabilities". Given what we know now about the NSA, what happened is obvious: the NSA found X vulnerabilities – and told Microsoft about X-Y vulnerabilities. There can't be any doubt this happened – unless the NSA couldn't FIND any vulnerabilities, of course, which is about as likely as my flying to the moon by flapping my arms.
My approach to assurance? NO product can be fully trusted until it's been retired after long service, and has been attacked unsuccessfully by multiple, independent, resourceful, and motivated attackers (not just some red team hired to do so, but attackers who WANT it broken for their own gain.) Once a product has done that, we can say it was reasonably secure. Until then, it's just a guess.
Actually, even your approach is not secure. The article I referred to in my response (I only thought of this later that evening and didn't get to it until now) actually explains why. Yes, it is possible to do what Ken Thompson did: make the compiler (and he could have done this to the assembler, the linker, …) not only recognise that it was compiling login (the program used for logging into a UNIX machine… well, especially then) and insert the appropriate bug, but also completely remove any sign that he had changed the source of the compiler or anything else at all. Indeed, once he was done the bug was not even in the source; all he would need to do is plant the bugged binaries on a system and watch it evolve.
He explains it better, but it is an interesting concept (at least to programmers) and one I've played with before (not for what he did, but, as a programmer, I have a variety of uses for it, including the learning aspect). So while yes, you're right in a sense – if a product is not in use then there is no risk – you cannot be sure it was 100% safe. Maybe you aren't claiming 100%, but the point remains the same: unless you personally can verify every part of the process, you are ultimately relying on trust in some form, and that is the problem.
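For readers who haven't seen Thompson's paper, the trick can be caricatured in a few lines. This toy is entirely hypothetical and nothing like his real code; the "compiler" is just a string transformer, but it shows the essential move – the backdoor lives only in the compiler's behaviour, never in the source being compiled:

```python
# A toy caricature of Thompson's "Trusting Trust" attack -- not his real code.

BACKDOOR = 'if password == "magic": return True  # inserted backdoor'

def evil_compile(source: str) -> str:
    """Pass source through unchanged, except for two recognised targets."""
    if "def check_login(" in source:
        # Target 1: the login program gains a backdoor absent from its source.
        return source.replace("# login body", BACKDOOR)
    if "def evil_compile(" in source:
        # Target 2: compiling a *clean* compiler source would re-insert the
        # hack into the new compiler binary (elided here), so removing it
        # from the source does not remove it from the system.
        return source
    return source

login_source = """
def check_login(user, password):
    # login body
    return password == lookup(user)
"""

compiled = evil_compile(login_source)
print(BACKDOOR in login_source)  # False: the source is clean
print(BACKDOOR in compiled)      # True: the "binary" is backdoored
```

Auditing the source of login – or even of the compiler – tells you nothing here; only the binary carries the payload.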
Here's another example that I just thought of: piggybacking on an antivirus. You thought you were checking your system for viruses, but you actually allowed a TSR virus to infect every single file that can be infected (based on what it infects). Here you are thinking you're doing the safe, proper thing, but what you didn't know is that you had an infected system prior to the scan, and now you have a worse problem (and imagine a dropper doing this instead of the actual virus – e.g., you weren't infected but were set up to be, and now you are). And no, I did not invent that scenario – it really did (maybe still does? who knows?) happen; just in case you didn't know about it… it isn't something I've heard/read about in a long time.
Steve Gibson is a charlatan. I know I already pointed this out on another post but I'll elaborate more. Wait, I take that back. Taken from the website I referred to (Attrition), there is this: http://radsoft.net/news/roundups/grc/
(Plus other things there, other sites as well as things I've written here in responses)
Sums it up quite nicely.
As for open source: OpenSSL's recent issues don't make open source somehow less trustworthy (I grant that trust matters here too, but in general it is too easily given). And I will state that yes, people DO evaluate the code all the time. I can give examples, but it doesn't really matter (I will, however, point out some irony in a moment): those who are sold on proprietary software – whose vendors often know of bugs and ignore them, hence a real use of 0-days: making companies fix their problems before someone malicious finds and exploits them to cause havoc – are only going to see what they want to see (in general that is how everyone works, but those who have seen both sides know best). Ironically, regarding your claim about the code not being checked: did you forget how the flaws were found in the first place? Did you also note how quickly the bug was fixed once discovered, despite the software being open source and free? (There's that thing about open source programmers being simply very passionate about their "baby"… really, that is how it is.) Exactly. That's one of the reasons different Linux distributions use it: there are no legal implications. That's a general thing with open source.
As for your entire point, it is easily summarised as this: trust is given far too easily and trust is abused far too often. And unfortunately too many people just don't get that fact. A good example (and it is old, mind you, which further proves my point): Ken Thompson's article 'Reflections on Trusting Trust'. If there is ANYTHING to be used to learn about trust and its flaws, THAT is it.
We had to dig in TrueCrypt code when developing Passcovery Suite.
Everything looks quite good there and we did not detect any suspicious spots. The protection is really strong: the password-search speeds achieved by our software (probably the fastest available) are low. So we continued to use TrueCrypt for our purposes.
"…In an enterprise or serious business you can’t afford to use a product without a good support model. It has to be BitLocker – unless you’re a journalist guarding Edward Snowden’s files…"
This has got to be one of the most outrageous statements in the entire article.
I have a customer that makes asphalt. They run an asphalt mixer that is controlled by a Windows XP system connected to the kiln by a serial port.
Microsoft has decreed Windows XP has NO support left. So my customer is supposed to throw out their working asphalt kiln that cost $250,000.00 and replace it with a new one that probably will cost $500,000 and is run by Windows 7 – because "they can't afford to use a product without a good support model"?
Are you out of your mind? Seriously!!
Code is code. The ONLY REASON this entire ridiculous "support" nonsense was invented by commercial software is that some companies decided it would be a good idea to distribute software binary-only. Kind of like delivering an automobile without a service manual. And a lot of small residential-operator types, who were running their businesses like they run their homes (instead of like businesses), decided "okey dokey".
TrueCrypt is Open Source. Yes, it is not Open Source in the way that OSI may have defined Open Source, because OSI is a political organization with certain political goals. But, before OSI and GNU came around, a lot of Open Source had as many (or more) license restrictions as TrueCrypt. A business that wants to use TrueCrypt – or ANY OTHER open source software – only has to ask 4 questions:
1) Does it do what I want and is the cost/benefit good?
2) Is it written in a modern language like C or C++ or Java or something that's worth my time to modify?
3) Can I get all the development tools used to compile it?
4) Can I ignore any copyrights/patents/licenses/encumbrances on it without ill effects?
If the answer to these 4 questions is YES then in the words of the Borg, "support is irrelevant. We will add your source code and technological distinctiveness to our own"
To put it simply, if you and I are directly competing, and my use of unsupported software of any type vs your use of approved and supported software allows me to take business from you then you can make the most elegant and convincing arguments you want about how serious businesses must use supported software, but the only people listening to them will be the people standing next to you in line at the unemployment office.
Oh, how wrong you are. It is so wrong it is hysterical. Your response is littered with flaws and incorrect (i.e. false) statements.
Re: "The ONLY REASON that this entire ridiculous "support" nonsense was invented by commercial software is because some companies decided it would be a good idea to distribute software binary only."
Really. So is Linux kernel 2.2 supported? Is the 2.4 kernel supported? Is Fedora Core <= 18 supported? What acout CentOS <= 4? Your claim is complete and utter nonsense. Actually, it is so far off that it would be hilarious, except you seem to be serious! It has nothing to do with binary and nothing to do with source. It has to do with design and with changes in standards (and, for Windows, layer upon layer of … so much so that it had to be cleaned up in a better way; do you really think their ROI is better continuing ancient source over newer? That their productivity would hold up? That security would be as manageable as in their more recent releases?). Clearly you aren't a programmer of any real-sized project (or if you are, you do little of it). In fact, given your question 4, you sure have a great number of languages under your belt… Funny thing is how old C is. Yes, it is common because it is so efficient, but that doesn't make it a modern language (the more recent standards are as close to that as you'll get). And modifying software to fit your needs when it cannot be trusted is a complete and utter disaster waiting to happen. You would rather risk your company's (or your own) assets? One can hope you aren't a manager of a huge project. And question 3… I would like to see you try to compile newer code on older compilers. I would also like to see you compile certain old code on newer compilers without getting warnings, if not flat-out errors (and ignoring the former is a big mistake).
Re: "This has got to be one of the most outrageous statements in the entire article."
Re: "Are you out of your mind? Seriously!!"
I think the better thought is that you are very naive at best, or very angry that XP is no longer supported. But see below – it has nothing to do with binary-only distribution (by the way, ever hear of a disassembler or decompiler? Or is that too advanced for you?)
Re: "allows me to take business from you then you can make the most elegant and convincing arguments you want about how serious businesses must use supported software,"
On OSI being political: really now? No – perhaps you should re-evaluate TrueCrypt's licence instead: it is inconsistent and it has serious problems. And open source in those days was very different, so you are comparing apples and oranges.
You only hear/see what you want to hear/see and you only know what you heard/saw. That is a summary of your suggestion.
As an aside: why is it you're remarking about XP when the subject is TrueCrypt? Is it your (incorrect) suggestion that open source versus binary is the reason for lack of support? Take that for what it is worth, if you even see it and get this far.
That is: What about CentOS <= 4? Yes, I made a mistake. Primarily my mistake was not noticing it when knowing full well my keyboard has been acting odd (and I know why but haven't fixed it).
Ah, I see that the ceasing of TrueCrypt might be related to the end of life for XP. In that case I am sorry about the aside. But still, regardless of that, the binary-versus-source claim you make about end of life (EOL is in fact more than 'end of line'… indeed EOL also refers to a software product's lifetime) is completely false, and I only hope it is out of frustration/anger rather than deliberately misleading.