The End-to-End Encryption Trend

Defense in depth might make a system more secure, but it also takes power out of buyers' hands.

Frustrated by the challenge of controlling access to enterprise assets, IT architects are embedding encryption at every layer of the stack. They're not just encrypting traffic between one perimeter and another but are also encrypting data repositories inside the perimeter, and even encrypting the flows of data between processors and other subsystems.
While we applaud the principle of defense in depth, eWEEK Labs sees more than trace contaminants of vendor self-interest in this trend. The resulting systems may be more secure (the devil, as always, is in the details of implementation), but they also tilt the balance of power from buyers toward closed-system suppliers and from content users toward more powerful providers.
Enterprises must understand the trade-offs thus created and may need to vote loudly, with their wallets and with their input to legislators and industry standards bodies, if they desire continued freedom to find an optimal mix of products in a competitive IT market.

In the second half of this year, as previously reported by eWEEK, Transmeta Corp. will release Crusoe-family processors with on-chip accelerators for common encryption algorithms and with facilities for secure on-chip storage of keys and other sensitive data. This opens new doors to partnerships between hardware makers and application developers, along the lines suggested by Microsoft Corp.'s Palladium and Intel Corp.'s LaGrande initiatives, that could restrict machines to an approved slate of applications and enforce any number of use or access policies.

It's long been axiomatic that a computer is a byte pump: A program is a stream of bytes, decoded as instructions, while data is a stream of bytes that may mean anything at all. Battles royal have been fought over different encoding schemes for instructions, ranging from low-level machine code to the byte codes that are further translated by a "virtual machine" (for example, when programs are written in Smalltalk or Java).
Other battles have been fought over data representation standards, ranging from ASCII text to the elaborate (and undisclosed) formats used by Autodesk Inc.'s AutoCAD or Microsoft Word. Even so, although these schemes often go by the name of "code," that word is used in the sense of representation, not concealment.
In the same way that anyone with a soldering iron could once extend the range of operations performed by a computer, anyone with a debugger or other low-level software tool has been free to examine what one computer (or even one piece of a computer) was saying to another.
But that turns out to be a postulate, not an axiom, as the geometers might say. Just as one can devise geometries with no such thing as parallel lines, one can build computers on which there's no such thing as "free" software (free, as Richard Stallman defines it, "as in free speech, not as in free beer").
Software that could always be patched at the binary level, or modified in a more maintainable way with access to source code, could be on its way into the history books, joining medicines not approved by the Food and Drug Administration and airplanes not certified by the Federal Aviation Administration as technologies in which the demand for safety took priority, with the consequences of higher costs, fewer suppliers and a far more deliberate pace of innovation.
In an essay published last July, software engineer Adam Barr (author of the book "Proudly Serving My Corporate Masters: What I Learned in Ten Years as a Microsoft Programmer") traced the beginning of this path to a worthy goal: the desire to boot a PC securely from a network connection. Working on this problem in 1997, said Barr, he encountered the problem of securing the "boot loader": the first piece of software that a system runs, which must verify the integrity (by means such as cryptographic "code signing") of all subsequent modules.
"A clever hacker could corrupt the loader as it went by," Barr explained, "then have his modified loader bring up a version of the kernel that did not do the code signing check, and at that point all heck could break loose on the users machine (or, for the quick and dirty version, the corrupted loader could just format the hard drive and then stop)."
It soon became clear that the first secure link in the chain had to be embedded in BIOS hardware, a notion since formalized by Intel as Boot Integrity Services.
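The chain of trust Barr describes can be modeled in a few lines: each stage carries a pinned digest of the stage that follows it, and the root of the chain is anchored in firmware. The Python below is a hypothetical sketch only; it uses hash pinning in place of real cryptographic signatures, and all payloads and names are invented for illustration, not drawn from Boot Integrity Services.

```python
import hashlib

def digest(blob: bytes) -> str:
    """SHA-256 digest of a stage's image, as a hex string."""
    return hashlib.sha256(blob).hexdigest()

# Hypothetical stage images. The loader embeds the trusted digest of the
# kernel it is willing to hand control to.
kernel = b"kernel image bytes"
loader = b"boot loader bytes||" + digest(kernel).encode()

# Root of trust, modeled on a digest pinned in BIOS/firmware: it vouches
# for the loader, and the loader in turn vouches for the kernel.
ROOT_TRUSTED_LOADER_DIGEST = digest(loader)

def verify_chain(loader_blob: bytes, kernel_blob: bytes) -> bool:
    # Firmware step: refuse to run a loader that doesn't match the pin.
    if digest(loader_blob) != ROOT_TRUSTED_LOADER_DIGEST:
        raise RuntimeError("loader failed integrity check")
    # Loader step: extract its embedded pin and check the kernel against it.
    expected_kernel = loader_blob.rsplit(b"||", 1)[1].decode()
    if digest(kernel_blob) != expected_kernel:
        raise RuntimeError("kernel failed integrity check")
    return True
```

Barr's "clever hacker" is exactly the case this structure closes off: a modified loader no longer matches the firmware's pinned digest, so it never gets the chance to skip the kernel check.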
On general-purpose computers, that secure chain ends at the operating system, whose job then becomes the loading and execution of whatever applications a user may choose. There's nothing but custom and expectation, though, to prevent that chain from extending into the domain of applications, as it does on game machines such as Microsoft's Xbox or Sony Corp.'s PlayStation.
Ironically, that extension of control may merely shift the battleground from the domain of technology to the domain of business process. If an accounting employee is transferred from receivables to payables, and if provisioning systems and procedures fail to revoke former privileges in the process of granting new ones, a classic opportunity for fraud arises, regardless of any restrictions that may exist on the applications that are allowed to run on a machine, or the configurations in which that machine can be placed. But the opportunity for software companies to enter the marketplace with innovations that promote good practices may be materially reduced by the need for a platform imprimatur.
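The provisioning failure described above is a policy bug, not an encryption bug, and no platform lockdown catches it. A hypothetical sketch (the role names and conflict table are invented for illustration) of a provisioning step that revokes before it grants, alongside a separation-of-duties check:

```python
# Pairs of roles that no single user may hold at once (separation of duties).
CONFLICTING_ROLES = {frozenset({"receivables", "payables"})}

class Provisioner:
    def __init__(self) -> None:
        self.roles: dict[str, set[str]] = {}  # user -> roles currently held

    def grant(self, user: str, role: str) -> None:
        """A bare grant: models a procedure that forgets to revoke."""
        self.roles.setdefault(user, set()).add(role)

    def transfer(self, user: str, old_role: str, new_role: str) -> None:
        """Revoke the former privilege in the same step that grants the new one."""
        granted = self.roles.setdefault(user, set())
        granted.discard(old_role)
        granted.add(new_role)

    def violations(self, user: str) -> list[frozenset]:
        """Conflicting role pairs this user currently holds in full."""
        granted = self.roles.get(user, set())
        return [pair for pair in CONFLICTING_ROLES if pair <= granted]
```

A bare grant() of "payables" to a user who already holds "receivables" leaves a violation behind, which is the fraud opportunity in the scenario above; transfer() does not, because revocation and grant travel together.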
If one is still looking for axioms, eWEEK Labs offers this: The concealment that results from end-to-end encryption, imposed by closed systems that are not subject to peer review, will inevitably be penetrated by attackers, whose attacks will then be camouflaged by that same concealment.
Buyers will pay the price at both ends, in higher costs both for what they buy and for figuring out how to protect what they own.
Technology Editor Peter Coffee can be reached at peter_coffee@ziffdavis.com.

Peter Coffee is Director of Platform Research at salesforce.com, where he serves as a liaison with the developer community to define the opportunity and clarify developers' technical requirements on the company's evolving Apex Platform. Peter previously spent 18 years with eWEEK (formerly PC Week), the national news magazine of enterprise technology practice, where he reviewed software development tools and methods and wrote regular columns on emerging technologies and professional community issues. Before he began writing full-time in 1989, Peter spent eleven years in technical and management positions at Exxon and The Aerospace Corporation, including management of the latter company's first desktop computing planning team and applied research in applications of artificial intelligence techniques. He holds an engineering degree from MIT and an MBA from Pepperdine University, and he has held teaching appointments in computer science, business analytics and information systems management at Pepperdine, UCLA, and Chapman College.