App Delivery On-Demand

Out of Resources in the 2048-bit Key Length Zone?

Playing off the title of an earlier blog entry, "Out of Resources in the Twilight Zone," this entry explores the performance implications of moving from 1024- to 2048-bit key lengths, a shift the National Institute of Standards and Technology (NIST) recommends to strengthen data encryption. The issue involves Secure Sockets Layer (SSL), a security protocol that uses RSA's public- and private-key encryption system to establish an encrypted link between a web server and a browser. The protocol relies on SSL certificates that currently employ 1024-bit RSA key lengths, but NIST has set January 1, 2011 as the target date for organizations to begin using 2048-bit key lengths.
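
To see what the change involves in practice, here's a minimal sketch of generating one of the 2048-bit RSA private keys behind these certificates. It assumes the third-party Python "cryptography" package (my choice for illustration, not something mentioned in the original post), and it omits the CA-specific steps of requesting and signing the certificate itself.

```python
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# Generate the 2048-bit key pair that NIST's guidance calls for.
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Serialize the private key to PEM so a web server or ADC can load it.
pem = key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.TraditionalOpenSSL,
    encryption_algorithm=serialization.NoEncryption(),
)
print(pem.decode())
```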

The "recommendation" is more like an imperative, because the move is well underway already. Many certification authorities (CAs)--Entrust, GeoTrust, and VeriSign, to name just a few--are now only issuing 2048-bit certificates, and some major web browser and software vendors have thrown in their support, as well. It seems inevitably that everyone will need to make the shift, so what does that mean for your organization?

If your application and web servers are still doing all your SSL processing using 1024-bit key lengths, that's already consuming about 30 percent of your CPU and memory resources just to establish the connection (the handshake) and then decrypt and encrypt the transferred data. The problem is, when those same servers start processing 2048-bit keys, your users are going to see a roughly 5x reduction in performance (measured in transactions per second), regardless of the vendor platform. Stated another way, it's going to take about 5 times more CPU cycles to process those 2048-bit SSL transactions.
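
If you want to gauge the gap on your own hardware, here's a rough benchmark sketch (again assuming the Python "cryptography" package; the iteration count is arbitrary and exact figures will vary by machine) that times the RSA private-key operation--the expensive step in every SSL handshake--at both key lengths.

```python
import time

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

for bits in (1024, 2048):
    key = rsa.generate_private_key(public_exponent=65537, key_size=bits)
    n = 200
    start = time.perf_counter()
    for _ in range(n):
        # Each signature exercises the RSA private-key math that
        # dominates server-side handshake cost.
        key.sign(b"handshake", padding.PKCS1v15(), hashes.SHA256())
    elapsed = time.perf_counter() - start
    print(f"{bits}-bit: {n / elapsed:.0f} private-key ops/sec")
```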

Why does the performance penalty (the increase in CPU cycles) grow so much faster than the key length itself? The longer the key, the more expensive the underlying RSA math: the private-key operations at the heart of each handshake scale roughly with the cube of the key length, so doubling the key multiplies the work several times over. In fact, if you were to leap directly from 1024- to 4096-bit key lengths, you would probably see a whopping 30x reduction in server performance. With a 5x performance hit (let alone a 30x one), it's safe to assume you're not only going to have a lot of unhappy users, you're also going to need additional server capacity to absorb the added load.
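
The arithmetic behind those figures can be sketched in a few lines (a back-of-envelope model, not vendor measurements): naive RSA private-key math costs roughly O(k^3) for a k-bit modulus, and real implementations claw some of that back with optimizations such as the Chinese Remainder Theorem.

```python
# Back-of-envelope model only: cost of an RSA private-key operation
# grows roughly with the cube of the key length.
for bits in (2048, 4096):
    print(f"1024 -> {bits}-bit: ~{(bits / 1024) ** 3:.0f}x more raw arithmetic")
# Prints ~8x and ~64x; CRT and faster multiplication bring the observed
# slowdowns closer to the ~5x and ~30x figures cited above.
```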

If you're not already offloading SSL functions from your virtualized application or web servers, now is a good time to give it some serious consideration. By offloading SSL processing to an advanced application delivery controller (ADC) device, you can reclaim those CPU cycles. And, if the ADC includes specialized hardware designed for RSA acceleration, it will handle that SSL processing far more efficiently than your application and web servers ever could.
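
The pattern itself is simple, even if a production ADC implements it in purpose-built hardware. Here's a toy illustration in Python (hypothetical filenames, addresses, and ports throughout; it sketches the idea, not any vendor's product): the device at the edge performs the handshake and decryption, and the backend server sees only plaintext.

```python
import socket
import ssl

# Terminate SSL at the edge with the 2048-bit certificate.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.load_cert_chain("server-2048.crt", "server-2048.key")  # hypothetical files

with socket.create_server(("0.0.0.0", 443)) as listener:
    with ctx.wrap_socket(listener, server_side=True) as tls_listener:
        conn, _ = tls_listener.accept()  # CPU-heavy handshake happens here
        # Forward one decrypted request to the backend and relay its reply
        # in the clear; the web server never touches SSL.
        with conn, socket.create_connection(("10.0.0.10", 80)) as backend:
            backend.sendall(conn.recv(65536))
            conn.sendall(backend.recv(65536))
```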

If you're wondering whether a virtual network appliance would fit the bill, consider this: early testing of 2048-bit SSL processing on a virtual network appliance running on 64-bit commodity hardware revealed that it could handle only hundreds of transactions per second. Compare that to tens of thousands of transactions per second on a physical ADC, and it doesn't take long to figure out that a virtual appliance isn't a viable alternative here. (Virtual ADCs are an excellent alternative to physical devices for some use cases, but this isn't one of them.)

For some organizations, the challenge is that they're still running legacy applications, many of which don't (and possibly never will) support 2048-bit keys directly. If you're in that boat and yet still must comply with certain regulatory requirements (for example, FIPS 140-2), you have no choice but to find a workaround to support 2048-bit key lengths. Again, here's where an advanced ADC device placed at the edge of your network can help. Install your 2048-bit certificate on your ADC device to process incoming client requests, and then use your existing 1024-bit keys to re-encrypt the data and forward it on to your application and web servers. With this solution, you can continue providing secure SSL connections, keep pace with new NIST guidelines, and avoid purchasing additional web and application servers to make up for the huge performance hit they would otherwise take.
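
Extending the earlier sketch, that workaround looks something like this (again with hypothetical filenames and addresses): the edge terminates the client's 2048-bit session, then opens a fresh SSL session to the legacy server's existing 1024-bit certificate, so neither clients nor the legacy application need to change.

```python
import socket
import ssl

# Client-facing side: present the new 2048-bit certificate.
edge = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
edge.load_cert_chain("edge-2048.crt", "edge-2048.key")

# Backend-facing side: re-encrypt to the legacy 1024-bit server.
# Verification is relaxed here because the hop stays inside the
# trusted network; tighten this to taste.
backend_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
backend_ctx.check_hostname = False
backend_ctx.verify_mode = ssl.CERT_NONE

with socket.create_server(("0.0.0.0", 443)) as listener:
    with edge.wrap_socket(listener, server_side=True) as tls_listener:
        client, _ = tls_listener.accept()  # 2048-bit handshake with the browser
        raw = socket.create_connection(("10.0.0.10", 443))
        with client, backend_ctx.wrap_socket(raw) as backend:
            # Relay one request/reply, re-encrypted on the 1024-bit session.
            backend.sendall(client.recv(65536))
            client.sendall(backend.recv(65536))
```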

Posted by Karl Triebes on 12/22/2010 at 12:47 PM

