Digital signatures represent one of the most explosive clusters of
privacy-threatening technologies, motivations and processes yet invented.
Enormous care must be invested in the development of digital signature
infrastructure, and in the parallel development of privacy protections.

1.
Introduction

This paper identifies the privacy implications of digital signatures, and
some weaknesses in current approaches in law and policy in Australia.

It is intentionally restricted in scope in several ways. In particular:

it assumes general familiarity by the audience with privacy issues,
cryptography techniques and digital signatures;

it does not address more general privacy implications of cryptographic
methods, e.g. in relation to the encryption of messages; and

it considers the interests of individuals in their own digital signatures,
but does not address the question of corporations' interests in the digital
signatures issued to their employees to use on the company's behalf, nor the
interests of the employees in relation to company digital signatures.

Asymmetric (or public key) cryptography involves two related keys, one of
which only the owner knows (the 'private key') and the other which anyone can
know (the 'public key'). The advantages this technology has provided are that
only one party needs to know the private key; and that knowledge of the public
key by a third party does not compromise security.

A digital signature is a 'message digest' (created by processing the message
contents using a special algorithm) encrypted using the sender's private key.
The recipient can, by re-creating the message digest from the message that they
receive, using the sender's public key to decrypt the digital signature, and
comparing the two results, satisfy themselves not only that the contents of the
message received must be the same as that which was sent (data integrity), but
also that the message can only have been sent by the purported sender (sender
authentication), and that the sender cannot credibly deny that they sent it
(non-repudiation).
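The sign-and-verify cycle just described can be sketched with a toy RSA key
pair. The numbers below are illustrative only (real keys use primes of many
hundreds of bits, and reducing the digest modulo n as done here destroys all
security); the sketch shows only the structure of the mechanism, not a usable
implementation.

```python
import hashlib

# Toy RSA parameters -- illustrative only, offering no real security.
p, q = 61, 53
n = p * q                           # modulus, part of both keys
e = 17                              # public exponent (public key: e, n)
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (private key: d, n)

def digest(message: bytes) -> int:
    # Message digest, reduced mod n so the toy numbers can handle it.
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes) -> int:
    # The digital signature: the digest 'encrypted' with the private key.
    return pow(digest(message), d, n)

def verify(message: bytes, signature: int) -> bool:
    # Recreate the digest from the received message, 'decrypt' the
    # signature with the sender's public key, and compare the two.
    return pow(signature, e, n) == digest(message)

msg = b"pay $100 to Alice"
sig = sign(msg)
assert verify(msg, sig)             # integrity and sender authentication
```

A tampered message or a signature produced with a different private key will
(with overwhelming probability in a real-sized system) fail the comparison,
which is what gives the recipient data integrity, sender authentication and
non-repudiation.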

Digital signatures are subject to a form of 'spoofing' by creation of a bogus
public key that purports to be that of a particular person, but is not. In
order to address that risk, 'certification authorities' (CAs) are envisaged,
which will certify that a public key is that of a particular person.

In the United States, the National Institute of Standards and Technology
(NIST) established a federal digital signature standard (DSS) during the period
1991-94. Many U.S. States are in the process of establishing legal frameworks
for digital signatures, most of them based on Utah's legislation (1995). A
commentary on matters of concern about the Utah model, including privacy
aspects, is provided by Biddle (1996).

In Australia, a report on digital signatures by an ad hoc committee, entitled
Strategies for the implementation of a Public Key Authentication Framework
for Australia, was published as a Miscellaneous Publication of Standards
Australia in 1996 (PKAF Report, 1996). The various architectural options for
digital signature infrastructures raised in that Report (and particularly its
`preferred option') will be used throughout this paper to illustrate some of
the privacy issues.

The Public Key Authentication Framework Task Group which produced the PKAF
Report (see Appendix G for membership) contained no-one representing an
institutional commitment to privacy protection (the only contender, the
Commonwealth Attorney-General's Department, has complex conflicts of
interest). It is therefore not surprising that its Report, while not hostile
to privacy protection and recommending some valuable protections, provides
inadequate consideration of the issues involved. The following discussion will
identify those strengths and weaknesses.

On the closely related issue of encryption policy, no overall Australian policy
has yet been released, but the OECD's encryption policy guidelines will soon be
released and are likely to be influential, particularly as an Australian
representative chairs the Committee developing them. For the latest known
draft, see the annexure `OECD Cryptography Guidelines near finalisation'
(Unofficial extracts from the OECD December 1996 Draft Cryptography Policy
Guidelines) (1996) 3 Privacy Law & Policy Reporter 126.

Another relevant document is a policy paper prepared by Gerard Walsh for the
Attorney-General's Department, Review of Policy Relating to Encryption
Technologies (Attorney-General's Department, Security Division, October
1996) (Walsh Report 1996), which has been suppressed to date (see 3 PLPR 181
for details).

For an overview of the topic, see (Clarke 1996), at
http://www.rogerclarke.com/II/CryptoSecy.html,
and for more details see Volume 3 No 2 of Privacy Law and Policy
Reporter, a special issue on encryption policy. For a more substantial
authority on cryptographic techniques, see Schneier (1996).

2.
Private Keys - Direct Privacy Implications

The practice is emerging of using separate key-pairs for encryption of
message-content and for digital signatures, as the PKAF Report notes (p10). The
draft OECD encryption guidelines insist that this distinction should be taken
into account in development of national policies on access to keys (Greenleaf
1996b at 70). This separation is crucial for the protection of privacy in
digital signatures, because the public interest in obtaining access to private
keys used for message encryption is likely to be stronger than the public
interest in obtaining access to private keys used for digital signatures.

This paper deals only with the privacy implications arising in relation to
digital signatures. Significant additional and different privacy issues arise
in relation to message encryption keys.

A first concern relates to the manner in which private keys are generated.
From a security viewpoint, it is essential that key-generation is undertaken
entirely under the control of the individual concerned, and that the private
key never leave the possession of that person without strong security
precautions being taken. If any other approach is taken (such as generation by
a service organisation, or by a government authority), serious security and
privacy issues arise, because the scope exists for the individual to be
convincingly impersonated.

The PKAF Report recommends only that key pairs be generated by a `trusted
process' (p25), and is silent on whether individuals should be allowed to
generate their own. It says that any user-generated pairs
would have to comply with guidelines set up by the relevant CA, `otherwise, the
key pair might not be secure'. Of course, it might also be too secure
for the liking of governments.

For privacy protection, individuals need a right to generate their own keys,
and to not be restricted in the strength of the keys they generate.

A further concern relates to the manner in which private keys are stored,
and are backed-up, and in which backup copies are stored. Where other
individuals or organisations are involved, the private key must be the subject
of strong cryptography-based security precautions; otherwise impersonation
risks arise from this source.

Cryptographic measures exist, or may be invented, which may make it feasible
for a person to store and backup their private key with multiple individuals or
organisations, in such a way that the collusion of multiple parties is
necessary in order to gain access to the key. This may represent a
sufficiently secure means of secondary storage.
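One such measure already exists: threshold secret-sharing (Shamir, 1979), in
which a key is split into N shares such that any K of them suffice to
reconstruct it, but K-1 reveal nothing. The following is a minimal sketch of
the arithmetic; parameter choices are illustrative, and a real implementation
would use a cryptographically secure random source rather than the `random`
module.

```python
import random

PRIME = 2**127 - 1   # field modulus; must exceed the secret being shared

def make_shares(secret: int, k: int, n: int):
    # Random polynomial of degree k-1 whose constant term is the secret.
    # (Illustrative only: real code needs a CSPRNG, not random.randrange.)
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789
shares = make_shares(key, k=3, n=5)   # any 3 of the 5 custodians suffice
assert reconstruct(shares[:3]) == key
assert reconstruct(shares[1:4]) == key
```

Distributing the five shares to five independent custodians means that no
fewer than three of them, colluding, can recover the private key.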

'Escrow' is an arrangement whereby something is placed on deposit with a
trusted party, but may be accessed by third parties under certain conditions.
It was originally used for title deeds for real property, and is used for
source-code for software packages. Escrow can also be used for private keys,
in which case it is referred to as 'private key escrow', which is commonly
shortened to 'key escrow'.

The U.S. Government has sought to impose a requirement that individuals deposit
a copy of their private key into escrow with a government agency, or a
government-approved service provider. This is completely inimical to the
security and privacy interests of individuals.

It may be, however, that the techniques discussed earlier, relating to
partial-key storage with multiple parties and multi-party collusion to gain
access to the key, would enable such a measure to be
undertaken without privacy being unreasonably sacrificed at the altar of
national security and law enforcement.

National security and law enforcement agencies may claim an interest in
gaining access to private keys, in the sense of knowing what the key is. Since
access to private keys used for digital signatures does not provide access to
content of messages, or provide the only source of identification details of
the holder, the reasons which would support such access seem limited. The
danger of such access is, of course, that it enables surveillance agencies to
convincingly impersonate the holder of the key. It is clear from recent
Australian history that there is a risk of our police and security agencies
abusing such access.

Access without a warrant is inimical to privacy protection. The
Telecommunications Bills 1996, at present before Parliament, would allow
surveillance agencies to issue their own public interest certificates
concerning call data; this is a bad precedent for encryption policies.

Warrants should be issued by a high level of judicial authority. A case should
have to be made as to why the digital signature should not be revoked at the
same time as it is compromised through the access. Otherwise, there would need
to be a
mechanism whereby the access became known at some later point in time, to
enable the individual concerned to revoke the now-compromised key, and generate
and publish a new one.

When grounds exist for believing that a private key may have been
compromised, the key pair must be withdrawn, or 'revoked'. This involves
identification of the party who is requesting the revocation. This
identification is necessarily intrusive, because the risk exists of an
impersonator requesting revocation, and certification of a replacement key.
An impersonator would need to succeed for only a few key minutes in order for
a fraud to be perpetrated, e.g. in relation to the purchase or sale of shares,
or the transfer of funds from a bank account.

The consequences for individuals of wrongful key revocation are sufficiently
important that there should be a legal right to compensation if a key is
wrongfully revoked. Further issues surrounding revocation and Certificate
Revocation Lists (CRLs) are discussed below.

3.
Public Keys - Privacy Issues?

It may seem strange to some that something which is intended to be a
`public' key, and the utility of which depends on it being known, should raise
privacy issues. However, it is a commonplace of privacy policy that some of the
most privacy-intrusive practices arise from the existence and/or misuse of
`public registers' of various types, such as the Electoral Roll, telephone
books, motor vehicle registers, and council registers of building approvals.

No one is yet proposing that possession of a digital signature be compulsory,
and some might think that its `optional' or `voluntary' nature, as a tool for
business and the technologically literate, would remove privacy concerns.
However, if individuals increasingly find it necessary to provide digital
signatures for mainstream transactions, and to participate effectively in
cyberspace, it is likely that they will be forced to establish their identities
with one or more certification authorities in order to do so.

We must remember that the possession of an `Australia Card' was to have been
`voluntary', and the use of a Tax File Number is still (in theory) voluntary.
But the reality in both cases was `don't leave home without one'!

Public keys are designed to be widely available, and so privacy issues will be
the exception, rather than the rule. However, the privacy issues that may arise
are complex and important, and deserve more consideration than they have
received to date.

There are also some potential problems that may arise in relation to the
identification requirements for certification of public keys. Certification is
intrusive, because it requires people to expose data about themselves that they
may wish to keep private. For some people (e.g. those who are stateless, or
whose birth details are uncertain), it may be acutely embarrassing. The level
of identification required is a significant privacy issue.

The PKAF Report proposes `a points based scheme to establish an entity's true
(unique) identity (similar to the procedure for obtaining a passport or a bank
account) based on at least two pieces of independent evidence' (p74). It
properly proposes that these details should be protected by legislative privacy
principles (p80). The PKAF Report does not seem to envisage that these details
will be passed `up the chain' to the PARRA, but it is of considerable
importance that this be prevented.

If any central public register(s) of all public keys is maintained, then in
order to sufficiently describe the person who holds each digital signature
(because names are inadequate), personal information such as addresses,
date-of-birth etc may be included, leading to problems of secondary uses for
other purposes. We can describe such a register as a `positive' register, as
it contains identification information about every holder of a digital
signature.

The PKAF Report does not propose any such central `positive' register, but is
based on a certificate system. This reduces one of the main privacy dangers, as
the owner of the digital signature chooses who to send it to (accompanied by
its certification). However, there is still a need for the certificate which is
issued to include sufficient identification details of the holder for the
receiver to be sure which person is identified by the digital signature they
have received. A three-way identification is needed: person - digital
signature - message. The certification must therefore contain some identifying
particulars of the person, and since names are ambiguous, there are privacy
issues of how much other personal information (address, d-o-b etc)
certification will have to contain.

An ambiguous aspect of the PKAF Report is its mention of `procedures for
"unlisted" certificates (similar to unlisted telephone numbers in concept)' and
`procedures for distribution of certificates to ... directories' (p77). The
Report says that CAs `may distribute certificates to a publicly available
facility such as a Directory' (p27). The extent to which `listing' certificates
will genuinely be optional is crucial, if there is to be no central `positive'
register.

The PKAF Report proposes that the PARRA (Policy and Root Registration
Authority) `[g]enerates and publishes the national and international
Certificate Revocation Lists (CRLs) for and from all subordinate and peer
authorities' (p55).

This appears to anticipate a national public register of revoked digital
signatures. It will be to some extent unsafe to rely on any digital signature
because of the possibility it has been revoked. The only way to remove this
uncertainty is to check that it has not been revoked, and the PKAF Report
proposes the most `efficient' method of conducting such a check: one central
public register of revocations. It is possible to have no such central
register, but then the recipient would have to identify the relevant ICA or OCA
and check with it concerning revocation.
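The check a signature recipient would have to perform against such a central
register can be sketched as follows. Everything here is hypothetical: the
register name, the serial-number format, and the idea of the CRL as a simple
set (a real CRL would be a signed, timestamped data structure distributed by
the authority).

```python
# Hypothetical central 'negative' register: a set of revoked
# certificate serial numbers (serial format invented for illustration).
parra_crl = {"AU-0001-4821", "AU-0003-0092"}

def safe_to_rely_on(cert_serial: str, crl: set) -> bool:
    # A certificate can only be relied upon if it is absent from the
    # revocation list at the time of checking.
    return cert_serial not in crl

assert not safe_to_rely_on("AU-0001-4821", parra_crl)   # revoked
assert safe_to_rely_on("AU-0002-7755", parra_crl)       # not revoked
```

The privacy point is that every such lookup would leave an entry in the
register operator's logs, recording who checked which certificate and when.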

The PARRA revocation register is the digital signature equivalent of the Credit
Reference Association of Australia's national `negative reporting' database, or
perhaps closer to the old `Australia Card register' - the place you have to
check before you can rely on the information you have been given.

The great dangers of a central register arise from its potentials for political
abuse and for surveillance. A `negative' register such as is proposed for PARRA
poses somewhat fewer dangers than a `positive' register, simply because it need
contain no identification details, `merely' a list of revoked digital
signatures.

However, it does pose two significant privacy dangers:

If a digital signature (or all digital signatures held by a person) could
be revoked on the central `negative' PARRA register (perhaps irrespective of
revocation by the issuing ICA/OCA) for reasons unrelated to compromise of the
signature, then a person could become incapable of participation in cyberspace.
The PKAF Report compares issue of a digital signature to issue of a passport
(p85), and indeed the capacity to hold non-revoked digital signature(s) will
become the cyberspace equivalent of a passport, and a domestic one at that.

If it becomes routine for signature recipients to check PARRA for
non-revocation of digital signatures, then PARRA logs will be a centralised
surveillance facility, capable of indicating which cyberspace entities a person
is transacting with over a period of time. To some extent the surveillance
could be real-time, but more often would provide logs over time. Either way,
police and other investigative agencies are likely to show a keen interest, as
they already do with telephone call data held by carriers.

There are strong pressures towards increasing expectations that members of
the public should identify themselves when they conduct transactions. These
pressures include:

the technological imperative, i.e. `it can be done, so it should
be done';

the marketing imperative, i.e. `the more that marketers know
about consumers, the more efficient marketing communications will be, and the
better-informed the consumer is'; and

the social control imperative, i.e. `the public is not to be
trusted, and data about their behaviour is essential in order to deter
non-compliance and detect and prosecute offenders'.

Digital signature technology adds a new dimension to the technological
armoury, because it provides apparently high-reliability identification of the
individual who conducts a transaction.

An early application is likely to be electronic commerce, where whichever party
delivers first is interested in assurances that the other party will keep their
part of the bargain. Knowing the identity of the other party is one way of
gaining that assurance. It is, however, only one way; and there are ways of
designing transactions such that neither party is at risk of default by the
other.

A particular form of electronic commerce, electronic publishing, may be another
area in which identification may become mainstream. This is because
period-subscriptions and multiple-issue subscriptions to digital versions of
documents need to be the subject of controls. User-names and passwords are
adequate in most circumstances, but high-cost and limited-distribution
subscriptions may be seen to justify the requirement for digital signatures.

There are many other potential cyberspace applications. For example, bogus
postings to newsgroups and e-lists, and bogus private mail, could be overcome
if everyone signed their mail - although only to the extent that recipients
actually used the signature to check the authenticity of the message.

Keys used for digital signatures are very long series of bits, which can be
represented as long series of alphanumeric characters. Unlike Personal
Identification Numbers (PINs), it is simply not feasible for individuals to
remember them. They must therefore be stored in a manner which is convenient,
portable, but secure.
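The contrast with a PIN can be made concrete. This is a minimal sketch; the
1024-bit key size is an illustrative assumption, not a figure drawn from the
PKAF Report.

```python
import secrets

# Why private keys cannot be memorised like PINs: compare the length of
# a 4-digit PIN with the alphanumeric form of a key-sized random number.
pin = "4829"                       # 4 digits: memorable
key = secrets.randbits(1024)       # assumed 1024-bit key, for illustration
key_hex = format(key, "x")         # its alphanumeric representation
print(len(pin), len(key_hex))      # 4 characters versus roughly 256
```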

The most likely current technology to support such storage is a chip. The chip
could be embedded in a variety of carriage-mechanisms, such as a ring, watch or
brooch. At present, the main form used is a plastic card. Any such device
gives rise to security and privacy issues.

A fundamental concern is the means whereby the private key within the card is
authorised for use. A PIN is feasible, but easily compromised. The
possibility exists to store an individual's biometric in the card, and unlock
the use of the private key only if a new measurement of this matches that
stored in the card. People are very wary of biometrics, and many of them are
highly intrusive in one sense or another.

Even if the individual's biometric measure remains solely on the card
carried by the individual, a considerable level of security and privacy concern
exists.

If the measure were to be stored by a third party, even if only for the
purposes of backup, then a much higher level of security and privacy concern
exists. A central repository for such biometric identifiers would present
opportunities for social control that are the stuff of anti-utopian novels.

The PKAF Report recommends (Appendix C: Legal Issues) implementation of the
PKAF scheme via legislative standards rather than a `bureaucratic' `purpose
created statutory body'. It suggests that the PARRA (to be approved by
Standards Australia) would have `some form of corporate structure ... with a
widely-based membership' representing major interest groups in use of digital
signatures.

The only privacy protection proposed is that digital signature participants
(CAs etc) would not be accredited unless they complied with the Standards
Australia standard. The legislation would require the standard to provide that
use of digital signatures would be voluntary, and that `information about the
holders of key pairs should be protected in accordance with privacy
principles'.

This meagre consideration of privacy is inadequate on a number of counts:

The standards/accreditation approach, while it obviously has its uses in
improving corporate behaviour, is completely inadequate as a means of creating
or protecting civil rights (or any other individual rights). Standards give
individuals no right to obtain remedies for breaches, and threats of loss of
accreditation are no substitute. There needs to be legislation providing for
rights, investigation, and remedies.

The privacy issues, and the individual rights that are needed, go beyond
the protection of personal information in accordance with existing privacy
principles, as will be illustrated below.

The Privacy Act 1988 (Cth) only applies to the Commonwealth public
sector, with some limited extensions not very relevant here. However, the
Commonwealth Attorney-General has proposed to extend its operation to the
private sector (see the Discussion Paper, Privacy Protection in the Private
Sector (Attorney-General's Department, September 1996), Attorney-General
Williams' launch speech in 3 PLPR 81).

Extension of the Privacy Act and its Information Privacy Principles (IPPs) to
the private sector will go a little way toward controlling the collection, use
and disclosure of personal information used in relation to digital signatures,
particularly unauthorised uses, but will have little impact on the main
privacy issues as they are outside the scope of the IPPs. One main reason is
that the Privacy Act's IPPs (except the collection Principles) do not apply to
a `generally available publication' (see the Act's definition of `record'),
which will exclude any registers of signatures or of revoked signatures from
the scope of the Act. The second main reason is that it is not clear that the
collection principles in the IPPs provide any guarantees against systems being
built which require digital signatures or specific forms of digital signatures.
Another reason is that the IPPs, for various definitional reasons, may fail to
deal with some cyberspace transactions (see Greenleaf, 1996a). For these
reasons, as discussed below, new privacy rights going beyond the IPPs are
needed.

Various submissions to the Attorney-General on the Discussion Paper (see
various papers in Volume 3 Nos 9 and 10 of PLPR, special issues on submissions
on the Discussion Paper) have recommended wholesale revision of the IPPs, to
enable them to address these issues more directly. Some of the suggestions
below already appear in various submissions, particularly those influenced by
the Australian Privacy Charter.

The Australian Privacy Charter, Principle 17 (Public registers) provides:
`Where personal information is collected under legislation and public access is
allowed, these Principles still apply except to the extent required for the
purpose for which public access is allowed' (see 2 PLPR 44 for the text of the
Charter).

Some such principle is needed to attempt to control the secondary uses which
may be made of digital signature certificate and revocation registers. The
`under legislation' restriction may already be too narrow, given that PKAF
registers are proposed to operate under standards, not under legislation.

The Australian Privacy Charter, Principle 10 (Anonymous transactions)
provides: `People should have the option of not identifying themselves when
entering transactions', and is subject to any justified over-riding public or
private interests. This new privacy principle needs to be adopted and applied
in legislation and guidelines concerning digital signatures. Some aspects of
its application are given below.

In order to prevent digital signatures, and the infrastructure that
surrounds them, becoming a pervasive surveillance mechanism, it will be
necessary to give individuals the right to participate in cyberspace
communications without using digital signatures (ie `unauthenticated
transactions') wherever there is no strong social interest supporting the need
for authentication. Some commercial transactions will probably always require
authentication. Operators of some discussion lists may well justify
authentication of all messages sent to a list, because of the dangers inherent
in widespread publication. On the other side of the coin, ISPs and others
should never be able to require that private communications between individuals
should require digital signatures. The PKAF Report recognises that use of
digital signatures should be voluntary, but there need to be legal guarantees
of this, not just `voluntariness' in the sense of `no signature, we won't deal
with you'.

Even where communications must be authenticated, that does not mean they
must necessarily be identified to the recipient. The PKAF Report says that PKAF
infrastructure `may' support anonymous certificates, and this is a helpful
recognition, but individuals need a right not to be excluded from transactions
because of unnecessary demands for identification. One of the central
privacy struggles of cyberspace will be between the market (and surveillance
agencies) that want identified transactions at all costs, and individuals who
wish to resist this.

The PKAF Report is really supporting digital pseudonyms, since by `anonymous
certificates' it means certificates bound to account numbers or other indirect
identifiers, but which are `capable of indirectly being traced to the actual
user'. This recognition
of pseudonymity is useful, but the important privacy issue is who will be
capable of making such an indirect identification, and under what circumstances?

The PKAF Report says that PKAF infrastructure `must support multiple
certificates or multiple keys for a single user', referring to another aspect
of pseudonymity. It is very important to individual privacy that recipients of
digital signatures do not normally have the ability to aggregate profiles of
the transactions that a person enters into using multiple digital identities.
The obligations on CAs and others to maintain the privacy of the multiple
identities a person uses, with appropriate exceptions for fraudulent or other
illegal use, must be made clear.

6.
Conclusions

Digital signatures and the PKAF infrastructure within which they will
operate are difficult to understand. Digital signatures will be championed by
many players that the public distrusts, including national security agencies,
law enforcement agencies, and consumer marketing companies. Digital signatures
will be associated with increasingly intrusive expectations that people
identify themselves. Digital signatures will inevitably be associated with
cards. Digital signatures will inevitably be associated with biometric
identifiers.

As a result, the public will be very suspicious about digital signature
technologies. They will seek counter-measures and subversion opportunities.
They will demand explicit privacy protections, far more substantial than the
weak and patchy regime that is presently in place. The protections proposed by
the PKAF Report are also quite inadequate, though promising in some respects.
Successful implementation of digital signatures and PKAF infrastructure will
require far more attention to privacy issues by policy-makers and business
interests.
