Restoring trust in the network: The silicon perspective: Part 2

In my blog last month, I discussed the claims that the National Security Agency (NSA) influenced software, hardware, and semiconductor companies to undermine user privacy and security by deliberately building vulnerabilities and backdoors into their products. This month I want to discuss the QorIQ platform security implementation in light of specific claims related to the NSA, NIST, and random number generators.

Virtually every PowerQUICC and QorIQ product developed since 2000 has incorporated a security engine (we call it the SEC) that includes a random number generator. Random numbers are important inputs to cryptographic key exchange protocols such as IKE (Internet Key Exchange) and the SSL client-server handshake. If the ‘random’ input to the key exchange protocol isn’t so random, it is easier for an attacker to guess the value of the key exchanged via an otherwise strong protocol, thereby allowing eavesdropping on an encrypted communications session.
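To make the risk concrete, here is a minimal Python sketch (not any real protocol; the nonce size, key-derivation step, and names are purely illustrative) showing how a key derived from a low-entropy ‘random’ input can be recovered by simple enumeration:

```python
# Illustrative sketch only: a session key derived from a weak "random" input
# can be brute-forced because the attacker can enumerate the small seed space.
import hashlib

def derive_session_key(nonce: int) -> bytes:
    # Stand-in for key material produced by an otherwise strong exchange.
    return hashlib.sha256(nonce.to_bytes(8, "big")).digest()

# The victim's "random" input is really only a 16-bit value (e.g., a weak counter).
weak_nonce = 0x2A17
session_key = derive_session_key(weak_nonce)

# The attacker tries every possible weak input: 65,536 guesses instead of 2**256.
recovered = next(n for n in range(2**16)
                 if derive_session_key(n) == session_key)
assert derive_session_key(recovered) == session_key
```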

Random number generator is an imprecise term; it doesn’t tell you anything about the source of the randomness. More precise terms are TRNG (‘true’ random number generator) and PRNG (‘pseudo’ random number generator).

TRNGs are entropy sources, amplifying the uncertainty in the analog world to create a digital value. Attackers attempt to control physical variables (voltage, temperature, external timing, transistor manufacturing) to cause TRNGs to produce lower entropy values.

PRNGs, aka DRBGs (deterministic random bit generators), are cryptography’s great oxymoron. Their outputs are completely deterministic if you know the algorithm used and the starting value, called a seed. Even if the attacker knows the DRBG algorithm, so long as the seed remains secret, the subsequent outputs appear totally random. Wonderful, but why use a ‘deterministic’ RNG when true RNGs exist? For starters, when NIST certification is required!
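A minimal sketch of that property, using a simple SHA-256 hash chain (illustrative only, not the NIST SP 800-90A Hash_DRBG construction): anyone holding the seed can reproduce the entire output stream, while anyone without it just sees values that look random.

```python
# Toy deterministic generator: a SHA-256 hash chain driven by a secret seed.
# (Illustrative only; real DRBGs follow the NIST SP 800-90A constructions.)
import hashlib

def drbg_outputs(seed: bytes, count: int, size: int = 16):
    state = hashlib.sha256(seed).digest()
    for _ in range(count):
        state = hashlib.sha256(state).digest()
        yield state[:size]

first_run = list(drbg_outputs(b"secret seed", 3))
second_run = list(drbg_outputs(b"secret seed", 3))
assert first_run == second_run  # same seed, same "random" stream
```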

The US National Institute of Standards & Technology is responsible for validating cryptographic implementations. Prior to 2004, NIST didn’t have a standard for certification of random number generators. They offered what were effectively ‘best practices’ for statistical evaluation of a TRNG’s output, but statistical evidence wasn’t sufficient for NIST to certify that a TRNG always generated unpredictable outputs. Last decade, NIST began issuing certificates for PRNGs and more recently DRBGs. PRNGs and DRBGs are easy to test and certify; NIST gives you a seed value, and you give them a few megabytes of random numbers generated from the seed, demonstrating that you’ve applied the deterministic algorithm properly.
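In effect, this is a known-answer comparison. A hedged sketch of the idea (re-using the same illustrative hash-chain generator as above, not an actual validation suite): the lab supplies a seed, the implementer returns the generated bytes, and the lab checks them against its own reference run of the same deterministic algorithm.

```python
# Sketch of a known-answer style check (illustrative, not an actual NIST suite).
import hashlib

def drbg_outputs(seed: bytes, count: int, size: int = 16):
    # Same toy hash-chain generator as in the earlier sketch.
    state = hashlib.sha256(seed).digest()
    for _ in range(count):
        state = hashlib.sha256(state).digest()
        yield state[:size]

lab_seed = bytes(32)                                   # seed supplied by the lab
device_output = list(drbg_outputs(lab_seed, 1000))     # "device under test"
reference_output = list(drbg_outputs(lab_seed, 1000))  # lab's reference model
assert device_output == reference_output               # deterministic algorithm applied properly
```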

In addition to certifiability, PRNGs/DRBGs are used because they can have significant performance advantages over TRNGs. When a security protocol like IPsec needs 16 bytes of random initialization vector per packet, and you need to support millions of packets per second, it takes a PRNG/DRBG to keep up. For this reason, the RNGs on PowerQUICC and QorIQ processors combine a TRNG with a PRNG or DRBG. External seeds are accepted for testing purposes, but operationally, the TRNG initially (and periodically thereafter) creates a high-entropy seed for the high-speed deterministic portion.
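A minimal sketch of that hybrid arrangement (illustrative Python, not the actual SEC/RNG4 hardware interface; os.urandom stands in for the TRNG, and the reseed interval is an arbitrary placeholder):

```python
# Sketch of a hybrid RNG: a slow entropy source seeds a fast deterministic
# generator, which hands out 16-byte IVs and is periodically reseeded.
import hashlib
import os

class HybridRNG:
    RESEED_INTERVAL = 1_000_000  # arbitrary placeholder for periodic reseeding

    def __init__(self):
        self._reseed()

    def _reseed(self):
        # os.urandom stands in for the hardware TRNG entropy source.
        self._state = hashlib.sha256(os.urandom(32)).digest()
        self._outputs = 0

    def next_iv(self) -> bytes:
        if self._outputs >= self.RESEED_INTERVAL:
            self._reseed()
        self._state = hashlib.sha256(self._state).digest()
        self._outputs += 1
        return self._state[:16]  # e.g., a per-packet IPsec IV

rng = HybridRNG()
per_packet_ivs = [rng.next_iv() for _ in range(4)]
```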

What, then, are the most recent claims regarding the NSA, NIST, and US companies? As mentioned above, PRNGs and DRBGs use cryptographic algorithms, not analog inputs, to generate their outputs. NIST specifies several acceptable algorithms for DRBGs, including hashes, HMACs, block ciphers, and asymmetric encryption based on elliptic curve cryptography. This last method, specifically the Dual Elliptic Curve DRBG, has been discovered to leak information that attackers can use to determine the future outputs of the DRBG. Suspicions were raised about Dual EC even prior to its standardization, leading some to conclude that NIST either cooperated knowingly with the NSA to push a flawed standard, or lacks the independent cryptographic expertise to evaluate algorithms. Either way, NIST’s international standing as a non-political technical resource for crypto standards development has been tarnished.

The dust from the fielding of this flawed standard has yet to settle. Software implementations need to be located and patched. Fortunately, there doesn’t seem to be any instance of a major silicon vendor implementing Dual EC as its integrated RNG engine. It won’t be easy, but Dual EC can be rooted out of software products; replacing fielded silicon would be cost-prohibitive.

As previously mentioned, we have been integrating RNGs in our processors since before NIST offered a standard to certify against. Our entropy sources are submitted to third-party labs for statistical analysis and design review. Previous RNGs used SHA-1 for the deterministic portion; our most recent RNG (RNG4) uses SHA-256 (and its internal TRNG claims full entropy).

Why SHA-256 and not Dual EC for RNG4? Although we’ve integrated elliptic curve acceleration in our SEC engines from the beginning, it is an algorithm we hesitate to offer for more than public key operations. ECC has a history of intellectual property rights ambiguity and many permutations, and overall it seems less studied than the alternatives. As a DRBG algorithm, it is far slower than the hash, HMAC, or block cipher alternatives. By contrast, at the time we selected it, SHA-256 had just survived considerable scrutiny following reports of successful attacks on reduced-round SHA-1, and we knew it would support our key exchange and random IV performance requirements.

Our conservative choice of a SHA-256 DRBG implementation doesn’t mean we believe cryptography should sit still. New algorithms do need to be invented, but they should be standardized only after they have been competitively analyzed in a highly transparent process. NIST committing to such a course of action is one way it can restore its reputation as an honest broker.

Do our efforts to take our crypto accelerator implementations through NIST certification still provide value to our customers? We believe so. When the value of the algorithm itself isn’t in question, NIST certification proves the correctness of our implementation, which assures interoperability as well. When our implementations are certified, our customers can leverage our certificates and avoid repeating those tests at a system level.

A complete list of NIST certificates achieved by QorIQ devices can be found online.


Geoffrey Waters
Geoff Waters serves as a Distinguished Member of Technical Staff, covering high-end multi-core products and trusted computing for the Digital Networking group. He leads the Trust Architecture user’s group, and is a regular contributor to the MultiCore for Avionics (MCFA) working group. When Geoff is not working on security acceleration and hardware roots of trust, you'll find him on the river competing in canoe ultra-marathons and kayak races.
