If you're a Windows 2000 user, be warned: Your security
software may not work the way you think it does.

Microsoft intentionally designed Windows 2000 so that
export versions can use a notoriously weak encryption method to scramble
information sent over the Internet and intranets, leaving sensitive data
exposed to hackers and eavesdroppers.

This design choice has alarmed security experts, not least
because so many Microsoft products recently have had so many problems. The
company spent the last week acknowledging embarrassing security holes in
its Hotmail service, Internet Explorer browser, and Outlook mail client.

A Microsoft manager on Monday defended the decision to let Windows 2000
computers in some circumstances switch from the highly secure triple-DES
algorithm to the notoriously weak single-DES variant. Triple-DES is up to
70,000 trillion times stronger.

Ron Cully, lead program manager for Windows networking,
said that companies might have thousands of machines and some might not have
triple-DES installed. Because of U.S. export restrictions and other countries' import restrictions,
Microsoft ships triple-DES in a separate "high encryption pack."

"It's somewhat expected behavior that someone will
misconfigure an end system and not install the high-security pack," Cully
said. Having at least some encryption is better than nothing, he said.

That's not the point, charge Cully's peers at other companies
that are working on the same security standard, called IPsec. In a
no-holds-barred critique that began last week on the IPsec mailing list --
run by the Internet Engineering Task Force -- they argued it was another
example of slipshod Microsoft security.

Their beef: If two Windows 2000 computers without triple-DES
are talking and the system administrator has configured triple-DES-only links,
only single-DES gets used. The only error shown is an invisible one -- in
an audit log file -- so users may have a false sense of security.

"From an administrator perspective, it is hard to imagine
how a security hole could be worse: Windows lets you think all is OK but
in reality something else happens on the wire," wrote Sami Vaarala of NetSeal
Technologies, an information security firm in Espoo, Finland.

"This is *seriously* brain-damaged. I've given up expecting
good software design from Microsoft (actually, from most vendors), since
they (and everyone else) are far too arrogant about their abilities to design
and write error-free code," Steve Bellovin, a cryptologic researcher at
AT&T, wrote on the IPsec list last week.

> There have been allegations that NSA influenced Microsoft's encryption
> support (one reason that NSA could afford to relax export controls
> could be that they've already subverted the highest volume US
> products).

The FBI has been pretty blatant about their efforts, too. Both Microsoft
and FBI officials told me on background last year that the FBI wanted to
be sure MS included no encryption that wasn't easily broken.

Granted, neither side would go on the record. But the fact that both sources
told me they were willing to be cited as "company executives" and "government
officials" speaks volumes about the PR war the feds have been waging.

>There have been allegations that NSA influenced Microsoft's encryption
>support (one reason that NSA could afford to relax export controls
>could be that they've already subverted the highest volume US
>products). It's pretty well acknowledged that NSA did this to Crypto
>AG's hardware products decades ago, and has been reading the traffic
>of those who depended on those products. An eavesdropper doesn't need
>to break the encryption if they can break the user interface and make
>it lie about whether it is really encrypting.

While John may be speculating about NSA subversion of strong crypto, specific
examples of this would be very helpful. Here are a few firms for consideration
as candidates for today's Crypto AGs besides Microsoft (meaning latest products,
not those that have been suspected in the past):

Cylink
IBM
Lotus
TIS
RSA
PGP

Perhaps it would be fair to list all firms that are now exporting strong
crypto if John's speculation is accurate.

How to get any compromise out in the open is the question. Presumably, secrecy
agreements or NDAs are in effect for any complicit firm and its employees. We've
gotten a couple of anonymous letters recently about Cylink but nothing on
the others.

Duncan Campbell's exchanges with Microsoft have been squelched by MS, but
one final exchange is in the works which
summarizes what MS has publicly stated and what suspicions remain unanswered.
Similar queries in depth could be made to the other crypto exporters, if
for no other reason than to assure their foreign customers that they can
take and answer hard criticism. Otherwise, suspicions of complicity may undermine
credibility of all US crypto products.

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
On Wed, 17 May 2000, John Young wrote:
> While John may be speculating about NSA subversion of strong crypto,
> specific examples of this would be very helpful. Here are a few firms
> for consideration as candidates for today's Crypto AGs besides Microsoft
> (meaning latest products, not those that have been suspected in the past):
>
> Cylink
> IBM
> Lotus
> TIS
> RSA
> PGP
PGP's source code has always been available for public review. This has
not changed. There are no "back doors" for the NSA in PGP, and PGP has
never supported weak (under 128 bit) encryption, and never will.
> Perhaps it would be fair to list all firms that are now exporting strong
> crypto if John's speculation is accurate.
His speculation, however, is also based on the fact that Microsoft uses
DES with its security products.
> How to get any compromise out in the open is the question. Presumably,
> secrecy agreements or NDAs are in effect for any complicit firm and its
> employees.We've gotten a couple of anonymous letters recently about
> Cylink but nothing on the others.
Well, I can tell you that my NDAs do not cover secrecy agreements for
compromises made with the NSA. If PGP were in any way compromised by the
NSA (or any other party, for that matter) I would not be working here.
Look at the code.
> Duncan Campbell's exchanges with Microsoft have been squelched
> by MS, but one final exchange is in the works which summarizes
> what MS has publicly stated and what suspicions remain unanswered.
> Similar queries in depth could be made to the other crypto exporters,
> if for no other reason than to assure their foreign customers that they
> can take and answer hard criticism. Otherwise, suspicions of
> complicity may undermine credibility of all US crypto products.
I think that PGP has done just about everything that we can to assure the
users that our software is not in any way compromised. If there is
anything else we can do I would like to know about it.
__
L. Sassaman
System Administrator | "Everything must end;
Technology Consultant | meanwhile we must
icq.. 10735603 | amuse ourselves."
pgp.. finger://ns.quickie.net/rabbi | --Voltaire
-----BEGIN PGP SIGNATURE-----
Comment: OpenPGP Encrypted Email Preferred.
iD8DBQE5I07HPYrxsgmsCmoRAh+dAKC+5wwI09TXrbXvAv/OpMgD1lJyewCdGLi5
iynCfAueqfhIHfyoq4VOv7Y=
=rgGw
-----END PGP SIGNATURE-----

> PGP's source code has always been available for public review. This has
> not changed. There are no "back doors" for the NSA in PGP,

<paranoia>Unless they are particularly subtle ones, based on a mathematical
understanding that is not yet publicly known. Remember that the NSA
knew about differential cryptanalysis well before anyone else. Times
have changed, but maybe less than we think.</paranoia>

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
PGP, Inc. and many other security companies were purchased by Network
Associates in 1997/98. PGP, Inc. was (and still is) one of the
standard bearers for anti-key-recovery solutions including pioneering
the methods for publication of scannable source code in book form.
What happened is that NAI also bought TIS. TIS was like the anti-PGP
at the time in terms of key recovery philosophy. Both companies had
good quality products, but TIS definitely had a radically different
philosophy from PGP, Inc.
Anyway, to make a long story short: NAI never enrolled in the KRA.
TIS was a member. When NAI bought TIS, the KRA changed the name on
their web page to NAI -- presumably because it looked better for them
to put a big company's name there. Shortly thereafter, NAI requested
(because Phil Zimmermann informed the NAI execs about the evils of
key recovery) that its name be removed and it was. The membership
was not renewed since the time when TIS originally was a part of it.
It's that simple.
So in any case, the issue was rapidly corrected, and within months of
NAI purchasing TIS, TIS had killed all of its key recovery features,
and the KRA membership had been cancelled.
So effectively, NAI was responsible for purchasing one of the most
vociferous supporters of key recovery and rapidly eliminating that
stance from it and all the features that stood behind it. I'm
actually quite pleased with the rapid turnaround and the outcome.
The KRA now appears to be dead, their website does not respond, the
TIS group is just as anti-key-recovery as PGP ever was, PGP's
government lobbying efforts against export controls were majorly
enhanced by the backing of NAI, and PGP's regular source code
publishing played a pivotal role in the change of government policy.
You may have forgotten now that there was a time when mandatory key
recovery and stronger export controls seemed just around the corner
in the US. It was a difficult time, and I would ask that you
recognize that the PGP group was at every moment unflinching in its
opposition to all of this, and that we were influential in making
sure those things never came to pass.
Dennis Glatting wrote:
>
> "L. Sassaman" wrote:
> >
> > PGP's source code has always been available for public review.
> > This has not changed. There are no "back doors" for the NSA in
> > PGP, and PGP has never supported weak (under 128 bit) encryption,
> > and never will.
> >
>
> Who's PGP? Last I looked PGP Inc. was owned by Network Associates,
> a key recovery alliance member.
- - --
Will Price, Director of Engineering
PGP Security, Inc.
a division of Network Associates, Inc.
-----BEGIN PGP SIGNATURE-----
Version: PGP 7.0 (Build 170 Alpha)
iQA/AwUBOSOOo6y7FkvPc+xMEQI+xwCfQBj0KqZUFLAu57GlgLILDNycX4oAmwSi
RKsTk/l9kpvCubos3cjvXBcg
=irgt
-----END PGP SIGNATURE-----

>So in any case, the issue was rapidly corrected, and within months of NAI
>purchasing TIS, TIS had killed all of its key recovery features, and the KRA
>membership had been cancelled.

There's a paper on adding GAK to IPSEC by someone from NAI in the GAK issue
of Computers and Security (January 2000, p.91). This is pure
GAK and isn't disguised as "business data recovery" or some other nicety:
"Governments must be able to intercept the [Common Key Recovery Block] at
the time of key establishment or periodically while the SA remains active".

Will Price has made an exemplary statement on behalf of PGP. It should be
a model to match or beat by the other producers. Any firm which does not
come up to that level with public statements should be noted widely as
contributing to distrust of US crypto products and policy.

Even so, the statements may need refreshing to fit the changing tides of
global markets and closed door international treaty agreements.

The report issued yesterday by the House Permanent Select Committee
on Intelligence, for the Intelligence Authorization Act for Fiscal
Year 2000, vividly describes, among other failings, the sorry state
of NSA's cryptological prowess and recommends that the agency form close
ties with industry to get it back in the lead of snooping on global
communications:

Much funding is to be poured into this gov-com alliance, and there will surely
be stringent tests of principle to not be sucked into secret agreements on
behalf of US national security -- and perhaps domestic security as the home
threat is boosted:

Clinton authorized another $300 million yesterday for anti-terrorism, including
the chimerical cyber threat:

Will popular demand for crypto deliver the profits corporations need to resist
government contracts, as set out in the New York Times today, or will the
Crypto AG gameplan win the secret bidding contest again as it has for so
long?

What is the latest on Austin Hill's ongoing meetings with the feds to assure
them that ZKS products are compatible with law and order? He candidly reported
the first few, promised to publicly report the upcoming, but none have appeared
that I have seen.

Who else is giving such assurances under the baleful watch of VCs and plain
old Cs?

KRA may be defunct, or it may have gone behind closed doors, for the key
recovery program at NIST is still going like a bat out of hell, making sure
the USG gets GAK in place and assuring the rest will follow the money.

At 08:58 AM 5/18/00 -0400, Russell Nelson wrote:
>L. Sassaman writes:
> > PGP's source code has always been available for public review.
> > This has not changed. There are no "back doors" for the NSA in
> > PGP,
>
><paranoia>Unless they are particularly subtle ones, based on a
>mathematical understanding that is not yet publicly known. Remember
>that the NSA knew about differential cryptanalysis well before
>anyone else. Times have changed, but maybe less than we
>think.</paranoia>
If there are weaknesses that the NSA didn't put there, they're holes,
not back doors. If the NSA knows how to break some of the algorithms
(IDEA, CAST-128, 3DES, RSA, SHA1, El Gamal, etc.), that's still not a
back door, it's a successful cryptanalysis. It seems very unlikely
to me, but maybe an F-16 would have seemed pretty damned unlikely to
Orville Wright, too.
On the up side, if NSA knows how to break (say) CAST-128 with few
enough resources to be useful (e.g., 2^{80} work, 2^{40} memory, a
few thousand known plaintexts), that fact will be kept secret. Which
means that they will have to be *very* careful making any use of
information recovered from that break, to avoid leaking the fact that
they can break it.
>-russ nelson <sig@russnelson.com> http://russnelson.com
- --John Kelsey, kelsey@counterpane.com
-----BEGIN PGP SIGNATURE-----
Version: PGPfreeware 6.5.1 Int. for non-commercial use
<http://www.pgpinternational.com>
Comment: foo
iQCVAwUBOSTXcSZv+/Ry/LrBAQENeAP/VL1RU+d6ClOD+hvoeY20r1XmyJ5eLvms
isjHq0NuK05Rs3kJ0Hnpx1qv0kB9h2DiMOGLO/Z+lWjCt93F4z6t7aRDQGVKhNPK
LM+Pv9bTyywLpPPAYDYUIvJQjSUcF63OiSpCDpWmVMO6BY2Vdp/9Mh5qvWZ+8Td5
3BpMyMpKBgY=
=WBJe
-----END PGP SIGNATURE-----

> ..... But a cooperative relationship between Microsoft and NSA
>(or any vendor and their local signals security agency) can be more
>subtle. What if Microsoft agreed not to fix that bug? What if
>Microsoft gives NSA early access to source to look for bugs? The NSA
>may not need much more than an agreement that certain portions of,
>say, the RNG object code will never change (or only change
>infrequently, with lots of notice). That might be enough to insure
>that NSAs viruses and Trojan horses can always find the right spot to
>insert a patch that weakens random number generation.

This is one of the more believable scenarios I've heard for back-doors supported
by organizations outside of Microsoft. I remain skeptical that NSA, Microsoft,
or anyone else can build a truly foolproof back-door (i.e. one that doesn't
spring open by mistake when Matthew Broderick happens to call). I doubt NSA
would want to entrust national security to the problematic behavior of a
software flaw as opposed to a thoroughly designed and analyzed back door
mechanism. But I like the idea of diddling with the RNG. People are unlikely
to look for such an attack, but it gets them what they want, especially when
they use best crypto practices and change keys often.

>It may be time to question whether we should ever expect that mass
>market operating systems from commercial vendors will protect users
>against a targeted attack from a high resource operation such as the
>major signals intelligence agencies. .....

I think there's a much more profound risk of such a back-door being installed
by a hostile overseas organization or by organized crime. If the NSA approaches
Microsoft to acquire their support of NSA's surveillance mission, then the
information will have to be shared with a bunch of people inside Microsoft,
and they're not all going to keep it secret.

On the other hand, any well-heeled organization could approach a single
disgruntled employee with a hefty bribe (say, a recent employee with newer,
less valuable stock options). If the employee is involved in bugfixing, the
organization could probably purchase a back-door. The employee will have
a strong motivation to not tell anyone about it since it would get him fired,
it might be in violation of various laws, and the folks who paid him might
come after him if word gets out. Those same motivations for silence aren't
as much at work if someone is told to do something on NSA's behalf as part
of their regular work duties.

I suppose if the mafia or the KGB-du-jour can do this, then so can the NSA,
if there's a bureaucrat there who's enough of a risk taker and has enough
motivation. I remain skeptical that there's anyone there with enough budget
and guts to take that approach to data collection, regardless of the perceived
benefits. Too much risk to the career if word gets out.

> If the NSA approaches Microsoft to acquire their support of NSA's
> surveillance mission, then the information will have to be shared
> with a bunch of people inside Microsoft, and they're not all going
> to keep it secret.

Two people in Microsoft would need to know. Bill Gates, and the lead
programmer on the part of the product with the security or privacy bug.
The lead programmer would do it and keep quiet if "Bill" personally asked
him or her to. Nobody else would need to know, and it's unlikely that
anybody else would stumble on the bug (particularly if the lead programmer
does the maintenance on that part of the code).

The US Government was doing such things as early as 1919, when they approached
the head of Western Union. A messenger picked up all the telegrams
of the last 24 hours, daily, brought them to Herbert Yardley's "Black Chamber",
and returned them by the end of the day. The entire operation was completely
illegal. The same was done with the Postal Telegraph company in
1920. (Puzzle Palace, pg. 11-12.) I doubt very many employees
were in on the secret.

I have a well-founded rumor that a major Silicon Valley company was approached
by NSA in the '90s with a proposal to insert a deliberate security bug into
their products. They declined when they realized that an allegation
of the bug NSA wanted (using a "large prime" that was really composite) would
be detectable and verifiable by customers and competitors. (There have
been allegations of NSA-induced bugs in Crypto AG equipment, but the company
just denies them and nobody has proven they exist yet. This one would've
been easier to find once the allegation was made.)

Turning down the offer on verifiability grounds left them wondering whether
they really would have done it if it'd been possible to keep the whole thing
secret. The quid pro quo offered by NSA would be that their products
would have no trouble getting through the (at the time) draconian export
controls. Of course, there was no way to enforce the deal either; "blowing
the whistle" if NSA refused export permission would have revealed the company's
security products as untrustworthy, probably kicking it out of the security
market.

Anybody tested the primes in major products lately?

Did you ever wonder how certain companies' products got export licenses when
other similar companies just couldn't export?

How hard is it to factor a product of two primes when one of them isn't really
prime? (I.e. to factor a product of three primes?)

PGP 5 will, under certain well-defined circumstances, generate public/private
key pairs with no or only a small amount of randomness. Such keys are insecure.

Chances are very high that you have no problem. So, don't panic.

WHO IS AFFECTED?
----------------

The flaw has been found in the PGP 5.0i code base. It is specific to
Unix systems such as Linux or various BSD dialects with a /dev/random device.

Versions 2.* and 6.5 of PGP do NOT share this problem.

PGP versions ported to other platforms do NOT share this problem.

The problem does NOT manifest itself under the following circumstances:

- You typed in a lot of data while generating your key, including
long user ID and pass phrase strings.

- A random seed file PGP 5 could use existed on your system before
you generated the key.

However, the problem affects you in the worst possible manner if you started
from scratch with PGP 5 on a Unix system with a /dev/random device, and created
your key pair non-interactively with a command line like this one:

pgpk -g <DSS or RSA> <key-length> <user-id> <timeout> <pass-phrase>

WHAT TO DO?
-----------

If you have generated your key non-interactively, you may wish to revoke
it, and create a new key using a version of PGP which works correctly.

DETAILS
-------

In order to generate secure cryptographic keys, PGP needs to gather random
numbers from reliable sources, so keys can't be predicted by attackers.

Randomness sources PGP generally uses include:

- a seed file with random data from previous sessions

- user input and input timing

Additionally, certain Unix systems such as OpenBSD, Linux, and others, offer
a stream of random data over a central service typically called /dev/random
or the like. If present, this service is used by PGP as a source of
random data.

PGP 5.0i's reading of these random numbers does not work. Instead of random
numbers, a stream of bytes with the value "1" is read.

In practice, this implies two things:

1. PGP 5 will generally overestimate the amount of randomness available.
We have not researched the effects of this in detail.

However, we believe that the amount of randomness gathered from input data,
timing information, and old random data will be sufficient for most
applications. (See below for a detailed estimate.)

2. In situations in which no other randomness sources are available, PGP
relies on the /dev/random service, and thus uses predictable instead of random
numbers. This is not a flaw of the random service, but of the PGP 5
implementation.

One particular example of such a situation is non-interactive key generation
with a virgin PGP 5 installation, as described above.

In fact, RSA keys generated this way are entirely predictable, which can
easily be verified by comparing key IDs and fingerprints.

When using DSA/ElGamal keys, the DSA signature key is predictable, while
the ElGamal encryption subkey will vary. Note that fingerprints and key IDs
of the predictable DSA keys depend on a time stamp, and are themselves not
predictable.

The count parameter is always set to the value 1 by the calling code.
The byte read from the file descriptor fd into the RandBuf buffer is subsequently
overwritten with the read() function's return value, which will be 1.
The actual random data are not used.

This can be fixed by replacing line 1324 by the following line of code:

read (fd, &RandBuf, 1);

2. "Random" data

A dump of random data gathered during an interactive key generation session
is available at
<http://olymp.org/~caronni/randlog-keygen>. This
was dumped as passed to the pgpRandomAddByte() function, one byte at a time.

Note the streams of bytes with the value 1 which should actually contain
data gathered from /dev/random. Also note that the pass phrase ("asdf")
and the user ID ("roessler@guug.de") are clearly visible, but mixed with
timing data from the individual key presses.

None of the random data occurring after the second stream of ones was generated
from external events prior to the generation of the DSA key in question.

3. Some estimates

We give a back-of-the-envelope upper estimate of the amount of random bits
PGP may gather during interactive key generation. We assume that
/dev/random reading is flawed, and that no seed file exists prior to running
PGP. Timing is assumed to have a resolution of 1 us (gettimeofday()).

During a PGP session of one minute, we can get at most 60*10^6 ~ 2^26 different
time stamps.

Note that one time stamp close to the point of time of key generation is
known to attackers from the time stamp PGP leaves on the key.

So the intervals between individual key presses remain as a source of randomness.

Assuming that the user types at a rate of about 120 characters per minute,
we have an interval of approximately 0.5 seconds between two key presses.
Dropping the uppermost, non-random bit of the interval length, we get about 18
bits of random timing information per key press.

This estimate gets worse for experienced and fast-typing users.

With a user ID of 20 characters, and no pass phrase, PGP will have gathered
roughly 300-400 random bits interactively. While this is not bad, it
is not sufficient by PGP's own standards.

The "Intelligence Training" reference you posted earlier has received 1200+
downloads referred from jya.com.

(FYI - IntelBriefing was created in response to the IntelForum members having
no place to post long papers. As I had IntelBriefing.com spare, pointing
to comlinks.com, I created the new site, which is going Gangbusters!)

After five months of spirited debate, extensive meetings, and hard work,
the Federal Trade Commission's (FTC) Advisory Committee on Online Access
and Security has released its final report. The report is certain to
help shape the policy debate as Congress and various government agencies
attempt to develop policies to guarantee the protection of personal information
on the Internet. Links to background documents, Committee members'
individual statements, and public comments can be found at
www.ftc.gov/acoas, along with the Final Report of the FTC Advisory Committee
on Online Access and Security.

- PECSENC Meets to Discuss Encryption Export Regulations

At a recent meeting of the President's Export Council Subcommittee on Encryption
(PECSENC), the Department of Commerce offered a few suggestions as to the
direction in which they may be headed in their summer review of the January
14, 2000 encryption export regulations. The Bureau of Export Administration
(BXA) has received a handful of comments, which it will consider in drafting
final rules. BXA hopes to release a draft of the final rule for circulation
in late June and to issue the rule by July.

- D.C. Court of Appeals Hears Oral Argument in USTA
v. FCC

On May 17th, a three-judge panel from the U.S. Court of Appeals for the
District of Columbia heard arguments in USTA, et al. v. FCC, et al. -- the
first case concerning implementation of the Communications Assistance for
Law Enforcement Act (CALEA) to reach oral argument. The case centers
around an August 31, 1999 Federal Communications Commission (FCC) CALEA
Order. Shortly after the FCC's order was released, several privacy
groups and industry associations appealed it, arguing that the FCC had failed
to consider the proper statutory criteria (such as preserving privacy and
minimizing costs) when establishing the Order. The panel is expected
to hand down a decision in the late summer or fall.

- Internet Companies to Demo P3P

The Internet privacy standard known as the Platform for Privacy Preferences
(P3P) is inching closer to implementation. Microsoft, IBM, Privada,
and others will unveil websites and browsers incorporating P3P at a special
"interoperability" session at 10 am on Wednesday, June 21 at the AT&T
auditorium in New York City. If its sponsors can drum up significant
interest in the session, it could turn out to be a giant leap for P3P.
More information on the June 21 interop session is available on the website
of the W3C:

Steptoe & Johnson LLP grants permission for the contents of this publication
to be reproduced and distributed in full free of charge, provided that: (i)
such reproduction and distribution is limited to educational and professional
non-profit use only (and not for advertising or other use); (ii) the
reproductions or distributions make no edits or changes in this publication;
and (iii) all reproductions and distributions include the name of the author(s)
and the copyright notice(s) included in the original publication.