Another Whack Against GAK

by J. Orlin Grabbe

GAK is short for Government Access to Keys.
Currently you don't have to leave copies of your house
keys, your office keys, or your car keys in the hands of
some government "escrow" agency. Like the local office
of the FBI. Not yet, anyway.

But the government insists that it has a right to
have your cryptological keys. Cryptological keys are
those used to encrypt messages in such a way that only
the owner of the proper key or keys can decode and read
the contents. The government wants your keys, because it
believes it should have the right to access and read
anything you might encrypt. It's their business, they say.
After all, you might be a fellow traveler of one of the Four
Horsemen of the Government Apocalypse: a Terrorist,
Spy, Drug Dealer, or Child Pornographer. So it needs
your keys to catch you pursuing your nefarious trade.

The government, naturally, conveniently omits
mention of the terrorists, spies, drug dealers, and child
pornographers in government itself. Or politicians who
are on the take from terrorists, spies, drug dealers, or child
pornographers. No, no bad people work for the
government. Rather, they are Out There somewhere--
somewhere beyond the Beltway. Out There is some
person writing secret things. That person may be having
bad thoughts. How can Big Brother fix their bad
thoughts, if it doesn't know what they are? As for the Bill
of Rights--what's that? Only militia members talk about
the Bill of Rights--are you a militia member? Here, let
me mark that in your government file. What's in your
government file, you ask? None of your business. Our
file on you is secret. And we have a monopoly on secrets.

Cryptology software was also declared a munition,
so that its export would be controlled by the International
Traffic in Arms Regulations (ITAR). This was supposed
to keep good cryptology out of the hands of foreigners, so
that the U.S. National Security Agency (NSA) could spy
on them. But this didn't seem to work. Those wily
foreigners simply bought good cryptology from non-U.S.
companies. Well, if you can't spy on foreigners, there's
always your own citizens.

So the Clinton administration came up with its
Clipper Chip proposal. The easy way to spy, the thinking
went, was to make people buy equipment built for that
express purpose. Now, the government didn't exactly tell
home builders they had to install a government keyhole in
every office and every bedroom. The proposal was more
subtle than that. The Clinton administration instead
attempted to force companies to build "wiretap-ready"
computers, "spy-accessible" cable TV set-top boxes, and
"the-FBI-is-also-on-the-line" telephones.

On April 16, 1993, the Clinton Administration
announced two new Federal Information Processing
Standards. One was the Escrowed Encryption Standard
(EES)--a.k.a. "Clipper". The other was the Digital
Signature Standard (DSS). All private companies doing
business with the government were potentially affected.

What was the purpose of these new proposals?
One government agency, the Office of Technology
Assessment (OTA), answers the question this way: "In
OTA's view, both the EES and the DSS are federal
standards that are part of a long-term control strategy
intended to retard the general availability of 'unbreakable'
or 'hard to break' cryptography within the United States,
for reasons of national security and law enforcement. It
appears that the EES is intended to complement the DSS
in this overall encryption-control strategy, by
discouraging future development and use of encryption
without built-in law enforcement access, in favor of key-
escrow encryption and related technologies" (Office of
Technology Assessment, Information Security and
Privacy in Network Environments, September 9, 1994).

No encryption "without built-in law enforcement
access". No communication without a GAK backdoor.
The EES was promulgated by the Clinton Administration
as a "voluntary" alternative to an existing Data Encryption
Standard (DES). It was intended to remain "voluntary",
just as long as everyone adopted it anyway.

The EES involved a bulk data encryption
algorithm called Skipjack, which would be contained on a
tamper-resistant chip, called the Clipper Chip (or MYK-
78). The chip would be manufactured by VLSI Logic, and
programmed with the algorithms and keys by Mykotronx
at a facility in Torrance, California. Each chip would
contain a trapdoor that would allow the government, using
a two-part key (U = U1+U2), each half deposited with a
different escrow agency, to decode any communications
sent through the chip.
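
The two-part escrow key can be illustrated with XOR secret splitting, under the assumption that the "U = U1+U2" combination means XOR (a common reading, but the real combining operation is an assumption here). Neither half alone reveals anything about U:

```python
import os

# XOR secret splitting of an 80-bit unit key U into two escrow
# shares. Either share alone is statistically independent of U;
# only the two together recover it. (Reading "U = U1+U2" as XOR
# is an assumption for illustration.)
U = os.urandom(10)                        # the chip's 80-bit unit key
U1 = os.urandom(10)                       # share held by one agency
U2 = bytes(a ^ b for a, b in zip(U, U1))  # share held by the other

recombined = bytes(a ^ b for a, b in zip(U1, U2))
assert recombined == U
```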

Who were the escrow agencies that would each
hold half of the key? These were announced later: the
Department of Commerce's National Institute of
Standards and Technology (NIST) and the Treasury
Department's Automated Systems Division. Think about
it.

The Department of Commerce, whose head, Ron
Brown, went on oh-so-many trade missions, and
reportedly demanded a $700,000 payment for influencing
policy with respect to Vietnam. The Department of
Commerce, from whence the Indonesian Lippo Group's
John Huang raised money for the Democratic National
Committee while influencing trade policy in Southeast
Asia. You know, a nice, safe, non-political place like the
Department of Commerce.

And also the Department of Treasury's computer
department. Those people responsible for creating the
systems that keep track of your IRS files. The private and
secure IRS files, like the ones that were delivered to the
White House, as part of the Filegate affair, or the ones
that were sold for $500 a copy out of the Covington,
Kentucky regional office. So, how much for a copy of
someone's secret keys?

Here is how the two-key encryption process is
(was?) intended to work. In addition to the Skipjack
encryption algorithm, each Clipper Chip will contain an
80-bit family key F that is common to all chips; a 30-bit
serial number N; and an 80-bit secret "unique" key U
which can be used to unlock all messages sent through the
chip. Suppose I have my secure device get in touch with
your secure device. The first thing that happens is our
two chips agree on a randomly generated 80-bit
symmetric session key K, which will be used only for this
one conversation. The Clipper Chip takes our message
stream M and encrypts it with K, using the Skipjack
algorithm, producing the encrypted message K(M).
Simple enough.

But the chip also does other things, on the side, for
the government. As an entirely separate process, it also
takes the session key K and encrypts it with the secret key
U, producing U(K). Then it tacks the serial number N on
to the end of the encrypted session key, giving the
sandwich U(K)+N. Then it takes the family key F and
encrypts the sandwich, giving F[U(K)+N]. The encrypted
sandwich, F[U(K)+N], is called the LEAF, or "Law
Enforcement Access Field." Both the encrypted message
K(M) and the LEAF, F[U(K)+N], are sent out over the
telephone or computer data line. Your Clipper Chip
receives both these, but mostly ignores the LEAF. Your
chip simply takes the previously agreed session key K and
uses it to decrypt the encrypted message, yielding
K[K(M)] = M.

But suppose Fred is an FBI agent who wants to
listen in on this. He gets a warrant (maybe, but probably
not), and has the phone company plug him into the
conversation. With his listening device, he siphons off
both my encrypted message K(M) and the LEAF,
F[U(K)+N]. As a member of the FBI he is allowed to
know the family key F, which he uses to decrypt the
LEAF, yielding the sandwich: F{F[U(K)+N]} = U(K)+N.
So now he knows the serial number N. He then takes N
along with his warrant over to the first escrow agency
(Commerce), which gives him half of the secret key, U1.
He takes N with his warrant over to the second escrow
agency (Treasury), which gives him the other half, U2.
He now knows the secret key U = U1+U2. He uses U to
decrypt the encrypted session key: U[U(K)] = K. Now he
knows the session key K, which he uses to decrypt my
encrypted message: K[K(M)] = M.
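
The full round trip described in the last two paragraphs can be condensed into a short sketch. Skipjack was classified, so a repeating-key XOR stands in for it here, and the key formats and sizes are illustrative assumptions, not the real chip logic:

```python
import os

def skipjack(key: bytes, data: bytes) -> bytes:
    # Toy stand-in for the classified Skipjack cipher: repeating-key
    # XOR, which is symmetric (the same call encrypts and decrypts).
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

F = os.urandom(10)        # 80-bit family key, common to all chips
U1 = os.urandom(10)       # half of the unit key, held at Commerce
U2 = os.urandom(10)       # other half, held at Treasury
U = bytes(a ^ b for a, b in zip(U1, U2))  # U = U1+U2, read as XOR
N = b"0001"               # chip serial number (illustrative size)

# --- Sending chip: produce K(M) and the LEAF F[U(K)+N] ---
M = b"attack at dawn"
K = os.urandom(10)                        # fresh 80-bit session key
ciphertext = skipjack(K, M)               # K(M)
leaf = skipjack(F, skipjack(U, K) + N)    # F[U(K)+N]

# --- Receiving chip: ignores the LEAF, decrypts with K ---
assert skipjack(K, ciphertext) == M       # K[K(M)] = M

# --- Agent Fred: opens the LEAF with F, then reassembles U ---
sandwich = skipjack(F, leaf)              # F{F[U(K)+N]} = U(K)+N
encrypted_K, serial = sandwich[:10], sandwich[10:]
assert serial == N                        # tells him which chip
recovered_U = bytes(a ^ b for a, b in zip(U1, U2))  # from escrow
recovered_K = skipjack(recovered_U, encrypted_K)    # U[U(K)] = K
assert skipjack(recovered_K, ciphertext) == M       # Fred reads M
```

The point the sketch makes concrete: the legitimate receiver never touches the LEAF, while anyone holding F plus both escrowed halves can recover every session key the chip ever generates.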

Of course a good hacker or spy would simply
attack the key storage computers at Commerce and
Treasury, and thus gain access to all U.S.
communications. In other words, GAK is a foreign spy's
pornographic dream. You don't think it's possible? In
1994 a team of in-house hackers was assembled by the
Defense Information Systems Agency to test the security
of Department of Defense computer systems. The
hackers were able to gain control of 88 percent of the
8,900 Pentagon systems they attempted to break into. And
only 4 percent of these attacks were noted by Defense
Department system operators (Intelligence Newsletter,
no. 269, July 27, 1995).

Despite this artificially created vulnerability,
industry was urged to build the Clipper Chip into every
type of communication device: computer modem,
telephone, fax, and set-top TV converter. Of course to do
so (no surprise) would make a product subject to State
Department ITAR export controls. But AT&T, at least,
promptly popped the Clipper Chip into the AT&T
Security Telephone Device 3600, which had a retail price
of about $1,100, because they had been offered a large
government contract, and were thus "suitably
incentivised".

A memorandum prepared for the Acting Assistant
Secretary of Defense had noted a number of U.S.
computer industry objections to a trapdoor chip, such as
the Clipper Chip: "The industry argues persuasively that
overseas markets (much less drug lords or spies) will not
look with favor on U.S. products which have known
trapdoors when offshore products which do not have
them are available. In support of their argument, they
note that powerful public-key cryptography developed
and patented by RSA using U.S. tax dollars is free to
developers in Europe, subject to royalties in the United
States, and cannot be exported without expensive and
time-late export licenses. These charges are true. . .
Despite these concerns, the President [Bill Clinton] has
directed that the Attorney General [Janet Reno] request
that manufacturers of communications hardware use the
trapdoor chip, and at least AT&T has been reported
willing to do so (having been suitably incentivised by
promises of government purchases)" (Ray Pollari,
Memorandum for the Acting Assistant Secretary of
Defense (C3I), April 30, 1993).

Another implementation of the EES was to be the
Capstone Chip (Mykotronx MYK-80), which would
include Clipper's Skipjack algorithm, and add to it digital
signature, hash, and key-change functions. While Clipper
was mostly intended for telephone communication,
Capstone was designed for data communication. Finally
there was to be Tessera, a PCMCIA card that would
contain a Capstone Chip. Despite generating universally
negative comments, the EES was approved by Ron
Brown's Department of Commerce as a federal standard in
February 1994.

But private industry didn't care for the EES. The
original Clipper proposal didn't go over very well in
Silicon Valley, where all those computer companies had
done so much to help out the election of Bill Clinton.
These companies turned their attention to Congress and,
in conjunction with civil liberties organizations like the
Electronic Frontier Foundation, defeated legislative bills
that would have allowed the government to insert its nose
into every private conversation. But the Clipper proposal
wouldn't die. When government power is involved, bad
ideas have a habit of coming back. Clipper was resurrected.
It was the same proposal, but the language changed. No,
the government didn't want to spy on you. It
didn't want to access your communications. Instead, the
language turned into mushy talk of hot milk, warm
blankets, and security. It was now a "key recovery"
proposal. You know, in case you lost yours, the
government would be able to supply you with another
copy, isn't that nice? The children will be safe. And
along with the gooey talk of security, the proposals
became increasingly less "voluntary", demanding that
everyone do what's "good" for them, and do it now.

"Clipper III" was announced on May 21, 1996.
The Clinton Administration proposed to use a
government-sanctioned key certification system as an
incentive to virtually impose key escrow on domestic
users. Naturally the proposal was clothed in political
language connoting sweetness and light. It was entitled
"Achieving Privacy, Commerce, Security and Public
Safety in the Global Information Infrastructure". One
could envision annual awards for those who had
successfully achieved "privacy and commerce"--whatever
that might mean.

Clipper III would establish a "public key
infrastructure" for encryption. What's a public key
infrastructure? Well, it's one that will 1) enable users to
clearly identify who they're talking to, and 2) help them
manage their cryptological keys. So--what's so bad about
that? After all, we all know the government is a good
manager, and has much to teach us. But here's the catch:
All users of the public key infrastructure would have to
ensure government access to their encryption keys by
using an approved key escrow authority.

Clipper III was an attempt by the government to
disguise key escrow as part of a key certification
program. But escrow and certification are not the same
thing. Key certification is a way for a third party to be in
a position to say that a particular public key "belongs to" a
particular person, organization, or agency. "Public key
8G293F666 really belongs to Bill Clinton." In popular
programs like PGP, for example, I can certify I am the
source of a message by signing it with my private key.
Then anyone can verify the signature using my public
key. Only I could have signed it, because only I know my
private key. And I don't give my private key to the
government. Meanwhile, you know the public key
belongs to me, because the certification authority said so.
Much like the Department of Motor Vehicles certifies that
your photograph is properly matched with the other
personal information on your driver's license.
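
The signing half of that story can be made concrete with textbook RSA (toy key sizes, purely illustrative; PGP's actual implementation is far more involved). The private exponent d never leaves my machine, while the public pair (e, n) is all a certification authority ever needs to vouch for:

```python
# Tiny textbook RSA signature, illustrative only (insecure sizes).
p, q = 61, 53
n = p * q                          # public modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (Python 3.8+),
                                   # known only to the signer

def sign(message: int) -> int:
    # Signing uses the PRIVATE key d; no escrow agency is involved.
    return pow(message, d, n)

def verify(message: int, signature: int) -> bool:
    # Anyone can check the signature with the PUBLIC key (e, n)
    # that the certification authority has vouched for.
    return pow(signature, e, n) == message

sig = sign(42)
assert verify(42, sig)        # genuine signature checks out
assert not verify(43, sig)    # altered message fails
```

Certification only requires the authority to see (e, n); escrow demands d itself, which is exactly the difference the proposals tried to blur.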

Now, suppose the government sets up a central
certification authority. You ask them to certify your
public key. Well, they'll refuse to do so unless you first
give the government a copy of your private key. You say
you don't want to participate? Then you lose your ticket
to the Global Information Infrastructure. The Clinton
administration has been working hard on the European
Community and other OECD countries to impose similar
key certification/key escrow schemes. That's to keep you
from getting privately certified in a foreign country
without giving up your keys there also. Moreover, it
would allow governments to exchange keys among
themselves. You know, in case the White House owes the
Lippo Group a favor for all those donations. Or maybe
they would want to trade arms dealer Wang Jun (the
chairman of Poly Technologies--owned and run by the
People's Liberation Army--and a recent White House
guest) a few good keys for a few good missiles.

It takes a Global Information Infrastructure to spy
on the global village. Even when you don't know whom you
are inviting to dinner.

But, alas, the life of Big Brother isn't all a piece of
cake. On Monday, December 16, 1996, a federal district
court judge in San Francisco struck down part of the
government's self-proclaimed monopoly on secrets. It
was only a small victory. But even small victories are
encouraging when you are dealing with the Leviathan
state.

Daniel J. Bernstein is a Research Assistant
Professor at the University of Illinois at Chicago. He
developed a new encryption algorithm he wanted to
publish in professional journals and on the Internet. The
government said he must first register as an arms dealer
and seek government permission before publishing
anything. So Bernstein sued, saying the government
requirements were a violation of his First Amendment
right of free speech.

The government argued that Bernstein's ideas
were not protected by the First Amendment because they
were partly expressed in computer language (source
code). (Sort of: "You are not allowed free speech if I
don't understand what you are saying.") Judge Marilyn
Hall Patel rejected that argument. Then she also ruled
that the Arms Export Control Act is an unconstitutional
"prior restraint on speech". Why? Because it requires
Bernstein to first submit his ideas about cryptography to
the government for review, to also register as an arms
dealer, and then to apply for and obtain from the
government a license--all this before publishing anything.
Judge Patel used the Pentagon Papers case as precedent.

"I'm very pleased," said Bernstein. "Now I won't
have to tell my students to burn their textbooks." Philip
Zimmermann, Chairman of PGP, Inc., said "It's nice to
see that the executive branch does not get to decide
whether we have the right of free speech." And John
Gilmore, co-founder of the Electronic Frontier
Foundation, which backed the suit, commented: "There's
no sense in 'burning the constitution in order to save it'."