
In the latest episode of our podcast, Get With IT, we spoke with Mark Thacker, senior manager of product management for Red Hat Enterprise Linux, about the threat quantum computing poses to encryption.
Here is an edited and abridged version of that conversation.
Since we’re talking about the actual threat of quantum computers breaking very sophisticated encryption, how would you describe the scale of that threat? Is it something we need to worry about today, given that quantum computing is not yet as sophisticated as everybody says it will become down the road? What’s the status of things today?
I’ve been in the industry now for 35 years, and I can remember various inflection points in the industry — Y2K was probably a similar concept of, is this really going to be a problem? At least with Y2K you had a date, right? You knew it was an actual date that was really going to occur. There was nothing that you could do to stop that.
With quantum computing, I would actually make a provocative statement: The problem is now. It isn’t that you have to wait for quantum computers to get powerful enough. Why would I say the problem is now? Do I believe that there will actually come a point at which quantum computers are sophisticated and scalable enough to break, for example, the classic cryptography that we use today? Absolutely. Do I know when that is? No. There are various dates thrown around. I’ve heard Gartner say it’s 2029, and some researchers are saying some portions of public key cryptography could be broken as soon as 2027.
It doesn’t really matter, and I will tell you why. There is a practice called harvest now, decrypt later. Everything we are doing today, from public internet exchanges and TLS communications to your bank over the wire, to bank-to-bank transfers and secret communications used for highly sensitive installations, all of that could be recorded today for later decryption.
Won’t that data kind of have lost its value by that point?
You just said the key word there: value. One of the things that we talk about with customers is: what is the value of data? Not all data is equally valuable, right? So let’s go through an example. Back in June of 2022, Red Hat published an article, and one of the examples is exactly what you’re talking about, the value of data: a credit card number recorded in a transaction today and then decrypted three years from now. That’s probably not a big deal. Your credit card has probably expired and been replaced by then. I just had a credit card replaced with a new number; I don’t care if anyone has the old number.
The issue is when the data that you’re trying to protect or that you’re encrypting today happens to contain private information, such as a confidential informant talking about an authoritarian government regime and what they are doing. That information is incredibly valuable, regardless of its date.
Harvest now, decrypt later is making people think, what is the value of data? What kind of data are you protecting? Where is that data? Is it in flight on networks? Is it stored on a tape device, in a vault somewhere? Is it sitting on a hard drive?
The problem is now because there are extremely valuable pieces of data being protected today using classic cryptography mechanisms that could still have value well beyond what you think of as a financial transaction piece of data.
NIST has been working on these post-quantum encryption standards since 2016, and maybe even earlier, right? If somebody harvested data now for decryption later, once quantum computers become that sophisticated, wouldn’t the onus be on organizations using encryption to keep updating it to the newest standard, so that even though the data was harvested years ago, it’s now protected by new encryption that couldn’t be broken as quickly?
There’s actually a basic expression for this: the lifetime of your data plus the time it takes to re-encrypt that data. If that sum exceeds the time it takes for a quantum computer to be developed that can break the encryption, then you’ve got a problem.
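The trade-off described above is often formalized as Mosca’s inequality. A minimal sketch, with all time values expressed in years and the sample numbers purely illustrative:

```python
# Sketch of the timing trade-off described above (Mosca's inequality):
# if (shelf life of the data) + (time to migrate/re-encrypt it) exceeds
# the time until a cryptographically relevant quantum computer exists,
# the data is at risk. All values below are illustrative assumptions.

def at_risk(shelf_life_years: float,
            migration_years: float,
            years_until_quantum: float) -> bool:
    """True if the data could still matter once it becomes decryptable."""
    return shelf_life_years + migration_years > years_until_quantum

# A credit card number: short shelf life, quick to rotate.
print(at_risk(shelf_life_years=3, migration_years=0.5,
              years_until_quantum=7))   # → False

# Confidential-informant records: valuable for decades.
print(at_risk(shelf_life_years=30, migration_years=2,
              years_until_quantum=7))   # → True
```

The point of the sketch is that the estimate for `years_until_quantum` can be uncertain and the conclusion still holds for long-lived data: thirty years of shelf life exceeds almost any credible forecast.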
It’s unrealistic to assume that all data can be re-encrypted and constantly kept up to date with the latest and greatest. It’s also true that, yes, work has been ongoing since even before NIST started on it. In preparation for this discussion, I went back and found references to 1994, when Peter Shor identified an algorithm that quantum computers can use to break RSA, Diffie-Hellman, and even elliptic curve cryptography. This has been ongoing for a while.
I do believe that it is incumbent upon everybody to understand the relative value of their data and how it’s being protected today. Yes, some data most likely needs to be re-encrypted using new post-quantum crypto algorithms, but you’re not going to re-encrypt everything. That’s not realistic, right?
Is there anything that organizations can do now to start protecting their data from the future that we’ve described?
I think that there are definitely things to be done. Number one, do an inventory of your data. Physically, where is it? What value does it have? What kind of data are you trying to protect? This is classic IT security, good hygiene practice, but it’s especially important when you’re talking about a post-quantum cryptography world. If you know what is important and you know how you’re protecting it, then you can choose to take action on the high-value data that has a longer shelf life. So number one, start with an inventory.
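The inventory-and-triage step described above can be sketched as a simple filter. The record format, field names, and sample entries here are hypothetical, not a real schema:

```python
# Hypothetical data-inventory triage: flag high-value, long-lived data
# that is still protected by classical (quantum-vulnerable) algorithms.
# All names, algorithms listed, and sample entries are illustrative.

CLASSICAL = {"RSA-2048", "ECDSA-P256", "DH-2048"}  # quantum-vulnerable

inventory = [
    {"name": "card-transactions", "algorithm": "RSA-2048",
     "shelf_life_years": 3, "location": "network"},
    {"name": "informant-records", "algorithm": "ECDSA-P256",
     "shelf_life_years": 30, "location": "tape-vault"},
    {"name": "public-docs", "algorithm": "ML-KEM-768",
     "shelf_life_years": 50, "location": "disk"},
]

def needs_migration(item, horizon_years=10):
    """Long-lived data under a classical algorithm is the priority."""
    return (item["algorithm"] in CLASSICAL
            and item["shelf_life_years"] > horizon_years)

for item in inventory:
    if needs_migration(item):
        print(f"re-encrypt first: {item['name']} ({item['location']})")
# → re-encrypt first: informant-records (tape-vault)
```

This mirrors the reasoning in the conversation: the credit card data ages out before it matters, while the long-lived sensitive records surface as the migration priority.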
Number two, be aware of the regulations that affect you. In the US, CNSA affects all federal government customers and the vendors supplying work to those customers. But the same problem exists worldwide, right? Various government organizations are requiring the use of post-quantum crypto. So be aware of the regulations in your space.
Number three, be aware of what is happening in the community. Earlier on, David, you mentioned that NIST has been working on these algorithms since 2016. That’s absolutely true. I don’t have the exact numbers in front of me, but something like 70-plus different cryptographic algorithms were submitted in one of the rounds of NIST standardization, and almost all of them were broken after they were picked to go through that first round of evaluation. So the world changes constantly. The community changes constantly. Red Hat and other Linux vendors are actively working in the open source community to push forward implementations of the NIST cryptographic standards that are done in the open, so you as a customer, or you as a cryptographic researcher, can dig into the code and see what was actually implemented, which is super important in building trust. So, be aware of what’s going on in the community.