Privacy Law and Policy Reporter
This document was prepared in support of a presentation to a seminar on Esecurity and Ecrime, run by the UNSW Continuing Legal Education Programme, Sydney, 19-20 July 2001. This document is at <www.anu.edu.au/people/Roger.Clarke/DV/IdCertainty.html>.
Human identification is the lynchpin of the burgeoning technologies that enable data surveillance. Ignorance prevails about the nature of identification and identity authentication, and about anonymity and pseudonymity. Yet more disturbing is the ignorance within the national security and law enforcement communities of the dramatic impact of these technologies on civil freedoms and democracy. The notion of ‘certainty of identity’ is a highly dangerous nonsense.
The title of this article, ‘Certainty of identity’, was presumably intended to be provocative. But unfortunately it reflects very nicely the simplistic perceptions that are evident within the agencies of social control and among the technology providers that sell to them. This brief paper argues that certainty of identity is an extraordinarily dangerous notion, which represents a far greater threat to society than the evils that security technologies are supposed to combat.
The paper surveys the technologies of surveillance, and shows how identity is central to them. It presents key concepts relating to identity and identification, and juxtaposes the alternatives of anonymity and pseudonymity. It identifies inappropriate presumptions that are commonly made by staff in national security and law enforcement agencies. It concludes that these agencies, and the attitudes rife in them, are among the most serious threats to society.
The paper is brief, but provides access to a substantial literature.
Visual and electronic surveillance have been complemented, and are increasingly being supplanted, by surveillance of individuals and populations through the copious data trails that are generated about their activities.
Mass dataveillance provides an efficient means of monitoring large numbers of people in order to generate suspicion about specific individuals and select them for closer attention. Larger numbers of people than ever before can be subjected to more intensive personal dataveillance, because the techniques are largely automated.
Key technologies of surveillance include the following:
Surveillance technologies depend upon mechanisms for the identification of human beings. This is a remarkably poorly understood topic. One frequently overlooked facet is that individual entities of all kinds, including people, have multiple identities, rather than just one.
Conventional identifiers such as names and codes are associated with identities rather than with entities. Law and practice in civilised countries recognise this, and permit the use of multiple identities. Sanctions are applied where individuals perform significantly antisocial actions, including those that depend upon multiple identities; but the use of multiple identities is seldom in itself an offence. Naturally, criminals use the scope provided by this freedom to adopt multiple identities as a means of avoiding retribution. This is just another of the many tensions between the need for freedom and the need for control over criminal behaviour.
Some identifiers are capable of reaching behind the identity and recognising the entity itself. These are termed biometrics, because they measure some feature of the individual, or of the individual’s behaviour.
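The entity/identity/identifier distinction drawn above can be sketched in code. This is a minimal illustrative model only; the class and field names are this sketch's own, not drawn from the paper:

```python
# Illustrative sketch of the entity / identity / identifier model:
# one underlying person (entity), several identities, and conventional
# identifiers that attach to identities rather than to the entity.
from dataclasses import dataclass, field

@dataclass
class Identity:
    """A persona an entity presents; carries its own identifiers."""
    identifiers: dict  # e.g. {"name": "J. Citizen", "customer_no": "C-1234"}

@dataclass
class Entity:
    """The underlying person. Biometrics attach here, not to any identity."""
    biometric_templates: dict = field(default_factory=dict)
    identities: list = field(default_factory=list)

person = Entity(biometric_templates={"fingerprint": "<template>"})
person.identities.append(Identity({"name": "J. Citizen", "employee_no": "E-77"}))
person.identities.append(Identity({"pen_name": "A. Nonym"}))

# One entity, several identities: only a biometric reaches through to
# the entity itself; names and codes point only at an identity.
assert len(person.identities) == 2
```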
Identification is the process whereby an identifier is acquired, and an association achieved between an identity and information stored in a database. Identity authentication is the further process whereby a sufficient degree of confidence is established that the identification process has delivered a correct result. Identity authentication can be performed by collecting multiple identifiers, acquiring knowledge that only the right individual is expected to have, or inspecting tokens that only the individual is expected to possess.
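The three sources of authentication evidence just listed can be sketched as an accumulation of confidence rather than a binary verdict. All names, weights and values below are hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical sketch: identity authentication as the accumulation of
# evidence from three sources (identifiers, knowledge, tokens), yielding
# a degree of confidence, never a certainty. Weights are illustrative.
def authenticate(presented_ids, knowledge_answer, token_serial, on_record):
    score = 0.0
    # 1. multiple identifiers, cross-checked against the stored record
    matches = sum(on_record["identifiers"].get(k) == v
                  for k, v in presented_ids.items())
    score += 0.4 * matches / max(len(presented_ids), 1)
    # 2. knowledge only the right individual is expected to have
    if knowledge_answer == on_record["secret_answer"]:
        score += 0.3
    # 3. a token only the individual is expected to possess
    if token_serial == on_record["token_serial"]:
        score += 0.3
    return score

record = {"identifiers": {"name": "J. Citizen", "dob": "1960-01-01"},
          "secret_answer": "tiberius", "token_serial": "T-9001"}
s = authenticate({"name": "J. Citizen", "dob": "1960-01-01"},
                 "tiberius", "T-9001", record)
# All three sources agree, so confidence is high -- but it is evidence,
# not proof: every input above can be forged or stolen.
```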
The concept of ‘certainty of identity’ is a forlorn hope. All identification and authentication techniques are subject to error. In addition to accidental errors, all can be circumvented with varying degrees of ease. False inclusions arise, including successful masquerades; and the tighter the tolerances are set, the greater the frequency of false exclusions. The burden of false exclusion falls on the affected individuals; and the less easily compromised techniques impose heavy demands on the people who are subjected to them.
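The trade-off between false inclusions and false exclusions can be made concrete with a toy calculation. The match scores below are invented for illustration; the point is only that tightening the threshold suppresses one error class at the cost of inflating the other:

```python
# Illustrative numbers only: tightening the match threshold trades false
# inclusions (masquerades accepted) for false exclusions (the right
# people rejected), the tension described in the text above.
genuine   = [0.62, 0.75, 0.81, 0.90, 0.95]  # match scores, true identity
impostors = [0.30, 0.45, 0.55, 0.65, 0.70]  # match scores, masquerades

for threshold in (0.5, 0.7, 0.9):
    false_incl = sum(s >= threshold for s in impostors)
    false_excl = sum(s < threshold for s in genuine)
    print(f"threshold={threshold}: "
          f"false inclusions={false_incl}, false exclusions={false_excl}")
# As the threshold rises, false inclusions fall and false exclusions rise.
```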
Rather than the naive concept of ‘proof of identity’ (POI), the focus needs to be on ‘evidence of identity’; and rather than the self-serving military concepts of ‘absolute security’ and ‘absolute trust’, the real world is about the management of risk and the balancing of competing interests.
A lot of discussion about security makes the blithe presumption that it is normal for transactions to be identified. The presumption is false. A great deal of human activity has always been conducted anonymously. Common examples include:
The contemporary trend towards authoritarianism, aided by technological developments, has been rapidly undermining anonymity, through demands for identification in all manner of circumstances, and the creation of new data trails that can be tracked.
Many kinds of people resent demands for identification and seek ways of obscuring their identities and selves. Of course, some of these people have criminal intent. Others are intent on undermining the current political system, or are ‘scurrilous rumour mongers’. But there are many other motivations, including:
The kleptomania of government agencies and marketing organisations for identified personal data has stimulated a great deal of constructive behaviour by software developers. Tools to deny information, deny identity and assure anonymity are readily available, especially in the electronic context, and are increasingly popular.
Anonymity compromises accountability, in that it undermines society’s ability to impose sanctions on miscreants, and therefore reduces the extent to which fear of retribution curbs disapproved behaviour.
A further form of nymity exists, which has the scope to achieve a balance between personal freedoms and social accountability. Instead of an identifier, what is associated with data is a pseudo-identifier or pseudonym.
In principle, the relationship between the pseudonym and a person is able to be discovered (otherwise it would be anonymous). In practice, however, it may or may not be able to be discovered, because the link is protected by technical, legal and organisational arrangements. For those protections to be circumvented, particular conditions need to be fulfilled, such as the issuing of a search warrant or other form of court order.
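The protected link between a pseudonym and a person can be sketched as a separately held mapping that is only consulted when a defined condition is satisfied. All names below are illustrative, and a boolean flag stands in for the legal and organisational machinery of a court order:

```python
# Minimal pseudonymisation sketch: the data trail carries only a
# pseudonym; the link back to the person sits in a separate, guarded
# table, resolvable only under a defined condition (here a flag that
# stands in for a court order or warrant).
import secrets

link_table = {}  # guarded store: pseudonym -> real identity

def issue_pseudonym(real_identity):
    nym = "P-" + secrets.token_hex(4)
    link_table[nym] = real_identity
    return nym

def resolve(nym, court_order=False):
    if not court_order:
        raise PermissionError("link is protected; a court order is required")
    return link_table[nym]

nym = issue_pseudonym("J. Citizen")
transactions = [{"who": nym, "amount": 20}]  # trail shows only the nym
```

In practice the table would of course be protected by technical and organisational controls, not a function argument; the flag merely marks where those controls sit.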
There are several mechanisms that can be used to give effect to pseudonymity, including ‘identity escrow’, escrow of partial identifiers, and ‘secret sharing’. This is not a mere theory, nor a new idea. Longstanding examples exist in such contexts as auctions and financial exchanges, epidemiological research and the arts.
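Secret sharing, in its simplest two-custodian form, can be sketched in a few lines. This is an illustrative XOR split, not a description of any deployed scheme: the key protecting the pseudonym link is divided so that neither custodian alone learns anything, and both must cooperate (for instance, under a warrant) to reconstruct it:

```python
# Two-of-two secret sharing by XOR: split a key into two shares, each
# individually indistinguishable from random; only their combination
# recovers the key. A toy instance of the escrow mechanisms named above.
import secrets

def split(secret: bytes):
    share1 = secrets.token_bytes(len(secret))
    share2 = bytes(a ^ b for a, b in zip(secret, share1))
    return share1, share2

def combine(share1: bytes, share2: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share1, share2))

key = b"link-table-key"
s1, s2 = split(key)  # held by, say, the scheme operator and a regulator
assert combine(s1, s2) == key  # both shares together recover the key
```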
If the discussion can be moved beyond the trivial level of assuming that ‘certainty of identity’ is a meaningful concept, then a fuller model of identity, identification and nymity could be used as a basis for designing schemes that achieve suitable balances between security and freedoms.
Against this background, it might be hoped that some serious-minded discussions are happening between the law enforcement community and representatives of the broader community. Tensions exist between law enforcement and other social objectives or values, and enormous care is needed in implementing invasive technologies such as Caller-ID, reverse telephone directories, MOLI, payment mechanisms, road tolling schemes, ATM and railway station surveillance, road traffic surveillance, biometrics and DNA databases.
Regrettably, however, the law enforcement community appears to see no need to compromise its use of such technologies, no need to consult with the community about them, and no risk to their waning public credibility if they proceed in accordance with the technological imperative and the blandishments of their favoured technology providers.
A serious rift is developing between hard-headed law and order devotees on the one hand, and lovers of freedom and democracy on the other.
Here are some presumptions that are conventional among some kinds of people.
1. National security is all-important. The enemy is at the door. Terrorists are about to unleash a campaign of terror. The sky is falling, the sky is falling.
2. Law enforcement is all-important. Mankind is born evil, not good. No-one can be trusted, except for law enforcement agency employees. Freedoms must be constrained because they are used by criminals. Property is more important than limb and even life.
3. Social control is vital. Taxpayers’ money must be protected. Welfare recipients are cheats. Taxpayers are cheats. No-one can be trusted. Freedoms must be constrained, except freedom to make money.
Contrast those with the following perceptions that are shared by many people around the world.
1. Vast quantities of personal data are interchanged among government agencies, and coalitions of agencies impose cross-system enforcement in order to achieve their aims.
2. Crimes are not restricted to activities that the society as a whole deems to be beyond the bounds, but are also defined to suit the government of the day, government agencies, and/or powerful corporations.
3. Elements within law enforcement agencies, often at high levels, are closely linked with organised crime.
4. Police kill more people than terrorists do.
5. Law enforcement agencies serve the government of the day, even where that involves breaking the law in the process.
6. National security agencies operate independently of the elected government, and in concert with agencies of foreign powers.
We can feel comfortable about statements like those when they are used in relation to, say, Sierra Leone, Indonesia or Russia. What is disturbing is that all are capable of being used in relation to Australia, with degrees of credibility ranging from dubious (5) through highly feasible (2, 3 and 6), to clearly true (1 and 4).
Given the explosion in privacy invasive technologies, and their blind application, it is difficult not to feel deeply pessimistic about the directions our society is taking. The world is recognising the threats that technologies pose for the survival of the species; but, in the meantime, the survival of society as we know it is under dire threat in a much shorter time-scale.
Dataveillance technologies threaten to dramatically increase the power of the organisations that control their deployment. Power corrupts, and the scale of power that can be delivered by dataveillance technologies will increase the degree of corruption of the organisations that control them. When lists of ‘public enemies’ are drawn up, national security, law enforcement and social control agencies will need to be not just included, but placed high up on the scale.
Meanwhile, the balance of power in an increasingly globalised world is changing. Transnational and even large national corporations are increasingly above the law, and will impose and enforce law as they wish it to be, and co-opt law enforcement agencies to their own needs. Alliances between government agencies and private sector corporations are still in their infancy. As they become more common and more pervasive, personal data will leak across organisational boundaries, and organisations will cross-leverage their power over individuals.
Pitifully weak data protection laws will not even be able to retard the bushfire of the surveillance society, let alone quench it. Individuals who stand out against the use of power will be increasingly subjected to dataveillance, psychological pressure, and countermeasures.
The technologies of surveillance need to be resisted, not just by criminals but also by people who actually like the ideas of freedom and democracy. While ever people are capable of contemplating a concept as vacuous as ‘certainty of identity’, law and order devotees will pursue simple-minded objectives of subjugating society. Nymity services are going to be very big business.
Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra, a Visiting Fellow in the Department of Computer Science, Australian National University, and a member of the Editorial Board of PLPR.
The following are the source materials, researched over the last quarter-century, that underlie the arguments in this paper.
‘Introductory papers on dataveillance and privacy’ at: <www.anu.edu.au/people/Roger.Clarke/DV/Popular.html>.
Definitions (1997-) at: <www.anu.edu.au/people/Roger.Clarke/DV/Intro.html>.
‘The underlying theory’ (1988) at: <www.anu.edu.au/people/Roger.Clarke/DV/CACM88.html>.
‘Technologies of mass observation’ (2000) at: <www.anu.edu.au/people/Roger.Clarke/DV/MassObsT.html>.
‘While you were sleeping ... surveillance technologies arrived’ (2001) at: <www.anu.edu.au/people/Roger.Clarke/DV/AQ2001.html>.
‘IT as a weapon of authoritarianism or a tool of democracy’ (1994) at: <www.anu.edu.au/people/Roger.Clarke/DV/PaperAuthism.html>.
Review (1993) at: <www.anu.edu.au/people/Roger.Clarke/DV/NotesAntiUtopia.html>.
‘Human identification’ (1994) at: <www.anu.edu.au/people/Roger.Clarke/DV/HumanID.html>.
‘Anonymity and pseudonymity’ (1999) at: <www.anu.edu.au/people/Roger.Clarke/DV/UIPP99.html>.
Privacy enhancing technologies (PETs): ‘Introducing PITs and PETs: technologies affecting privacy’ (2000) at: <www.anu.edu.au/people/Roger.Clarke/DV/PITsPETs.html>.
‘The technologies’ (1999) at: <www.anu.edu.au/people/Roger.Clarke/DV/Florham.html#Techno>.
Resources (1999) at: <www.anu.edu.au/people/Roger.Clarke/DV/PEPST.html>.
‘The technology’ (1994) at: <www.anu.edu.au/people/Roger.Clarke/DV/MatchIntro.html>.
‘The failure of cost/benefit analysis to control IT’ (1994) at: <www.anu.edu.au/people/Roger.Clarke/DV/MatchCBA.html>.
‘The technology’ (1993) at: <www.anu.edu.au/people/Roger.Clarke/DV/PaperProfiling.html>.
‘Direct marketing’ (1998) at: <www.anu.edu.au/people/Roger.Clarke/DV/DirectMkting.html>.
‘The PBL/Acxiom conspiracy’ (1999) at: <www.anu.edu.au/people/Roger.Clarke/DV/InfoBase99.html>.
‘The Australia Card proposal’ (1987) at: <www.anu.edu.au/people/Roger.Clarke/DV/OzCard.html>.
‘The tax file number conspiracy’ (1991) at: <www.anu.edu.au/people/Roger.Clarke/DV/PaperTFN.html>.
‘The resistible rise of the national personal data system’ (1992) at: <www.anu.edu.au/people/Roger.Clarke/DV/SLJ.html>.
‘The parallel data matching scheme manoeuvre’ (1994) at: <www.anu.edu.au/people/Roger.Clarke/DV/PaperMatchPDMP.html>.
‘Smart card technical issues starter kit’ (1998) at: <www.anu.edu.au/people/Roger.Clarke/DV/SCTISK.html>.
‘Application of the technology’ (1997) at: <www.anu.edu.au/people/Roger.Clarke/DV/IDCards97.html>.
‘Design requirements’ (1997) at: <www.anu.edu.au/people/Roger.Clarke/DV/IDCards97.html#DesOpt>.
‘The technologies’ (1999) at: <www.anu.edu.au/people/Roger.Clarke/DV/PLT.html>.
‘Safe-T-Cam’ at: <www.rta.nsw.gov.au/frames/safety/c_f.htm?/frames/safety/c1a&/safety/c1a_c.htm&Safe-T-Cam&0>.
‘Melbourne CityLink’s e-tag’ at: <www.transurban.com.au/>.
‘MOLI (your mobile phone as the spy in your own pocket)’ at: <www.acif.org.au/MOLI>.
‘The impacts’ (2000) at: <www.anu.edu.au/people/Roger.Clarke/EC/eTP.html>.
‘The digital persona’ (1994) at: <www.anu.edu.au/people/Roger.Clarke/DV/DigPersona.html>.
‘The information infrastructure is a super eye-way’ (1988) at: <www.anu.edu.au/people/Roger.Clarke/DV/Monitor.html>.
‘Basics of internet privacy’ (1996) at: <www.anu.edu.au/people/Roger.Clarke/DV/IPrivacy.html>.
‘Developments in internet privacy’ (1998) at: <www.anu.edu.au/people/Roger.Clarke/DV/ICurr9908.html>.
‘Privacy risks in digital signature technology’ (1997) with GW Greenleaf at: <www.anu.edu.au/people/Roger.Clarke/DV/DigSig.html>.
‘Public key infrastructure position statement’ (1998) at: <www.anu.edu.au/people/Roger.Clarke/DV/PKIPosn.html>.
‘Current status’ (2000) at: <www.anu.edu.au/people/Roger.Clarke/DV/PKI2000.html>.
‘The fundamental inadequacies of conventional public key infrastructure’ (2001) at: <www.anu.edu.au/people/Roger.Clarke/II/ECIS2001.html>.
‘The technologies’ (1994) at: <www.anu.edu.au/people/Roger.Clarke/DV/HumanID.html#Bio>.
‘Biometrics and privacy’ (2001) at: <www.anu.edu.au/people/Roger.Clarke/DV/Biometrics.html>.
‘The OECD Data Protection Guidelines’ (1989) at: <www.anu.edu.au/people/Roger.Clarke/DV/PaperOECD.html>.
‘Beyond the OECD Guidelines: privacy protection for the 21st century’ (2000) at: <www.anu.edu.au/people/Roger.Clarke/DV/PP21C.html>.
‘Dataveillance and information privacy resource pages’ at: <www.anu.edu.au/people/Roger.Clarke/DV/>.
‘Major electronic resources on dataveillance and privacy’ at: <www.anu.edu.au/people/Roger.Clarke/DV/index.html>.
Annotated Bibliography of the author’s papers on dataveillance and privacy at: <www.anu.edu.au/people/Roger.Clarke/DV/AnnBibl.html>.