Privacy Law and Policy Reporter
This paper was presented by the Federal Privacy Commissioner at the Biometrics Institute Conference ‘Biometrics – Security and Authentication’, Sydney, 20 March 2002, and represents the Commissioner’s first observations on the subject of privacy and biometrics. The Commissioner wishes to acknowledge the significant input into the preparation of this paper by Robin McKenzie, Senior Policy Advisor with the OFPC.
The first part of the paper, published in this issue, starts by discussing the biometric market and some of the current or proposed applications in Australia. It then canvasses some of the drivers for increasing use of the technology and notes the range of claims — the solution to everything, or the end of the world as we know it — that are made about the interaction of privacy and biometrics. The final part of the paper, to be published in a following issue, continues to explore the impacts of biometrics on privacy through an analysis of how the Privacy Act may apply to biometric applications in both the public and private sectors — General Editor.
Biometrics is a generic term that refers to a wide range of measures of biological data. The use of biometrics is not new. The signature has been used for authentication for a very long time. The systematic use of physical measurement in law enforcement can be dated back to 1879, when a French police official named Alphonse Bertillon suggested that people could be precisely identified by carefully measuring different parts of the body. Although his original approach was to record a wide array of body measurements, including the tilt of the forehead and length of the right ear, the system was fine-tuned to a photograph, with a quick physical description alongside a set of fingerprints.
This enthusiasm for biometrics as the best way of uniquely identifying someone took off from there and what was once a trickle has now become a flood. A whole range of well-known and new biometrics is being used and experimented with. These include fingerprints, hand geometry, face, voice, iris or keystroke recognition, and DNA.
The Electronic News reported in December 2001 that the total non-automated fingerprint identification system (non-AFIS) biometrics market would climb from US$66 million in 2000 to US$900 million by 2006. ZDNet Australia reported in the same month that hardware sales are projected to increase tenfold to more than US$590 million in 2003 and that biometric consulting and integration revenues could reach nearly $1.8 billion. Business Review Weekly on 28 February 2002 reported that the Washington-based International Biometric Industry Association expects the biometrics market, including recognition technology based on fingers, hands, eyes, faces, voice and handwritten signatures, to turn over US$660 million in 2003, up from US$100 million in 2000.
Biometrics are making their presence felt in Australia. A small sample includes the following.
What are the reasons for this explosion of interest in and use of biometrics?
The quest for ever more efficient and fraud-proof means of authentication has been one of the main driving forces. Another has been the drive for better means of identifying criminals and suspects for law enforcement reasons. The attraction of biometric information is that it is potentially hard to forge and it uniquely identifies a person, the Andrew Niccol film Gattaca notwithstanding.
Also, biometric information can be stored on computers. Developments in technology have made the use of biometrics as an identity tool more feasible. Its limitations before the advent of computers are amply demonstrated by the problems the French authorities encountered when the Mona Lisa went missing in 1911. Although fingerprint identification from a thumbprint left on the glass of the painting implicated one Vincenzo Peruggia, the identification was no use in locating him because of the chaotic filing system the authorities had been using since they began collecting prints almost 20 years before. The ability to store and organise massive amounts of data in databases on fast computers has the potential to solve this problem. Increasing processor speed, faster access times for disk and memory, and improved compression algorithms have improved the performance of identification and authentication systems using biometrics enormously. New templates are taking up fewer bytes and can be matched at ever increasing speeds and with greater accuracy.
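The kind of template matching the last sentence alludes to can be sketched in a few lines. This is a minimal illustration only, not any vendor's algorithm: the compact bit-string ‘templates’, the Hamming-distance comparison (of the kind iris systems use) and the 10 per cent tolerance are all assumptions introduced here.

```python
# Illustrative sketch only: compact bit-string "templates" compared by
# Hamming distance. The template contents and the 10% tolerance are
# invented for this example.

def hamming_distance(a: bytes, b: bytes) -> int:
    """Count the bits that differ between two equal-length templates."""
    assert len(a) == len(b)
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def templates_match(a: bytes, b: bytes, tolerance: float = 0.10) -> bool:
    """Declare a match if fewer than `tolerance` of the bits differ.
    Biometric matching is always approximate, never exact equality."""
    return hamming_distance(a, b) < tolerance * len(a) * 8

enrolled = bytes([0b10110010, 0b01101100] * 16)  # a 32-byte stored template
fresh = bytearray(enrolled)
fresh[0] ^= 0b00000001                           # one bit of sensor noise
print(templates_match(enrolled, bytes(fresh)))   # still matches despite noise
```

Because each comparison is a cheap bitwise operation over a few dozen bytes, very large collections of such templates can be searched quickly, which is the point the paragraph makes about storage and speed.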
As governments and business organisations rely increasingly on electronic remote communication for interaction and commercial transactions, remote electronic methods of authentication are in demand. Examples of these kinds of transactions include use of ATMs and commerce on the internet. The personal identification number (PIN) has become one of the main ways of remote authentication. However, people are being swamped by PINs, which they lose, forget or handle in an insecure way.
Biometrics are also being explored as a means of assisting people who are unable to use conventional means for accessing systems or services. For example, the National Australia Bank has launched its first voice activated ATM, which aims to improve access to ATMs for the visually impaired and aged.
In addition, the technology needed to implement a biometric system is becoming available at lower cost. A key reason for this is that biometric systems can increasingly be integrated with existing systems. For example, face recognition and iris systems can operate with cameras on multimedia computers. An article in Security Electronics in February 2002 announced an agreement to combine Visionics’ FaceIt technology with Nice Systems’ digital recording system to enhance the use of face recognition technology on video surveillance cameras in real time. Increasing interoperability via the newly released biometric application programming interface standard, BioAPI, is likely to remove a long-standing obstacle to growth.
And of course, since September 11, governments have been driven to explore biometrics, and in particular face recognition, for surveillance purposes and as a way of identifying people who are a threat to security.
The horse has clearly bolted. It is too late to turn back the tide of biometric technology.
There are in fact mixed views among the various interest groups about whether the use of biometrics is privacy enhancing or privacy invasive. The burgeoning use of biometrics has been accompanied by increasing expressions of alarm from some privacy advocates and civil rights groups about the threats to privacy this poses. Some in the biometrics industry argue that biometrics are without doubt the answer to threats to privacy resulting from identity theft. For example, Richard E Norton of the International Biometric Industry Association (IBIA) says:
Simply put, it’s getting harder and harder to preserve personal privacy without using biometrics. It’s a misperception that biometrics somehow compromise privacy; in actuality, they are the best way to lock up a record and ensure that an identity cannot be stolen. Biometrics are designed to give the user total control over who has access to his or her information, and provide a clear audit trail if someone tries to obtain data from a record. Which would consumers rather have — a system like we have now, with your name, social security number, birth date, address and phone number available to anyone who has PIN, password, or ‘hacked’ access to customer records, or a system that prevents a record from being penetrated unless it’s unlocked through biometric verification? Privacy advocates are on thin ice here, especially when they claim that a record can be compromised or stolen. A biometric cannot be reverse-engineered to find out who you are, and it cannot be used to link records together — in fact, the technology by definition prevents it. Finally, you can’t be an impostor by using someone’s biometric; the template is dynamic, and the data is encrypted. Biometrics raise the bar against fraud and abuse at no cost to privacy.
Others with an interest in privacy also argue that if properly constructed, biometric systems have the potential to act as privacy enhancing technologies (PETs). A discussion paper released by the Information Privacy Commissioner, Ontario, Canada states:
Biometrics need not subvert informational privacy. A pro-privacy position should not be construed as anti-biometric. The technology can actually be privacy enhancing if systems are designed with that objective in mind.
In a similar vein, the Ontario Information and Privacy Commissioner has challenged industry to develop security technologies enhancing privacy, or STEPs.
On the other hand, some privacy advocates and others have described the use of biometrics as a major threat to privacy and some even describe it as the end of the free world as we know it. For example, Roger Clarke writes:
Biometric technologies, building as they do on a substantial set of other surveillance mechanisms, create an environment in which organisations have enormous power over individuals. Faced with the prospect of being alienated by employers, by providers of consumer goods and services, and by government agencies, individuals are less ready to voice dissent, or even to complain.
This is completely contrary to the patterns that have been associated with the rise of personal freedoms and free, open societies. It represents the kind of closed-minded society that the Soviet bloc created, and which the free world decried. The once-free world is submitting to a ‘technical imperative’, and permitting surveillance technologies to change society for the worse. Biometrics are among the most threatening of all surveillance technologies, and herald the severe curtailment of freedoms, and the repression of ‘different-thinkers’, public interest advocates and ‘troublemakers’.
All of these perspectives have a relevant bearing on how to think about biometrics. Another perspective that needs to be kept in mind as well is that while the use of biometrics may pose a threat to privacy, there are many possible benefits to individuals, including the possibility of better protection from identity theft and the convenience of not having to remember multiple PINs or passwords.
The task I have as the Privacy Commissioner, along with other Commissioners, is to engage actively with the issue. We need to consider what can be done to protect privacy while still achieving the benefits that the use of biometrics is capable of bringing to society and to individuals. Indeed, wherever possible, the real objective should be to seek ways of ensuring that biometric technologies achieve these benefits while actually enhancing privacy.
To answer these challenges, some careful analysis is needed. Relying on untested or interest driven assumptions about biometrics will not result in good privacy solutions.
So what I would like to explore in this article is the question of just what threat to privacy the use of biometrics poses. Are the issues unique? Or are the threats similar to the ones posed by a number of other techniques of identification and authentication? How should the threats to privacy be tackled? Are current laws and approaches adequate or are there reasons why new approaches are needed?
Much will depend on the use that is made of biometric systems and the kind of biometric used. Biometrics can be put to a range of uses. One biometric, DNA, has potential use as a predictor of disease or disability. The use of DNA poses its own unique issues, which are being explored in a joint inquiry run by the Australian Law Reform Commission and the Australian Health Ethics Committee, and I do not propose to consider them in this article.
I will begin by considering what privacy is. In 1890, in what is now regarded as the key early modern writing on privacy, Samuel Warren and Louis Brandeis popularised Judge Cooley’s suggestion that privacy is the ‘right to be let alone’ and argued for the need for a legal protection of this right in the face of ‘recent inventions and business methods’.
While the face of the world and business methods have changed, the Warren and Brandeis formulation remains one of the simplest and most meaningful answers to the question of ‘what is privacy?’.
Some fundamental part of human dignity requires privacy. Privacy is part of the claim to personal autonomy. It supports the various freedoms that democratic countries value. As Professor Zelman Cowen said in the 1969 Boyer lectures:
A man without privacy is a man without dignity; the fear that Big Brother is watching and listening threatens the freedom of the individual no less than the prison bars.
The International Covenant on Civil and Political Rights is one of a number of international instruments that recognise privacy among the basic rights. Article 17 states:
No one shall be subjected to arbitrary or unlawful interference with his privacy, family, home or correspondence, nor to unlawful attacks on his honour and reputation.
David Banisar of EPIC suggests privacy can be divided into four separate but related concepts: information privacy, bodily privacy, privacy of communications, and territorial privacy.
The development of new businesses and technology has led to court cases where more complexities about ‘the right to be let alone’ were debated — including the notion that this right needed to be weighed against the public’s right to know about things of legitimate public concern.
It is often the case that privacy is something that arouses more thought and interest in its absence or when it is threatened than in its presence. Prince Edward, a member of the British royal family, illustrated this when he was quoted as saying on the eve of his marriage that you do not value your privacy until you have lost it.
Another point worth making is that people often do not value other people’s privacy until their own is threatened. A good example is the Tasmanian police force, which remained apparently uninterested while legislation was going through the Tasmanian Parliament to establish forensic procedures for contributing to the CrimTrac DNA database (covering criminals and suspects who might have done nothing more than be stopped for a random breath test). The police have developed a sudden interest in privacy issues now that the Government is proposing that they contribute their own DNA samples to a database, as have the Victorian Police.
So what is it about the use of biometrics that threatens our ability to be ‘let alone’, our dignity, our chance for anonymity and solitude? Does use of biometrics raise the spectre of Big Brother watching our every move so that there is never a chance to be on our own?
At one level, the extent to which biometrics threaten (or enhance) privacy depends on the use to which they are put. Some uses appear to have the potential for greater privacy threats or enhancements to privacy than others. However, it is not possible to be too dogmatic about this. The actual level of the threat or enhancement will vary according to the particular context.
Use of biometrics for authentication may have a low level of privacy risk provided that the authentication system involves the individual knowingly exercising a choice to enrol in a system and the system does not require the authenticating body to hold large amounts of information about an individual except that necessary to establish that the person is who they say they are.
Use of biometrics for identification has the potential to be more privacy invasive in some cases; for example where it involves the identifying organisation holding large amounts of information about individuals that it may or may not need, or that the individual may or may not know about. In the case of identification in a criminal context, it often involves bodily intrusive methods of collection from suspects, for example DNA sample collection, iris recognition, or in some cases fingerprints, and may require the giving of a sample which is not voluntary.
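The difference between the two modes can be sketched as follows. This is a hedged illustration, not any real system: the function names, the toy byte-agreement score and the 0.9 threshold are all assumptions introduced here.

```python
# Sketch of authentication (1:1) versus identification (1:N).
# The similarity score and threshold are invented for illustration.

THRESHOLD = 0.9

def similarity(t1: bytes, t2: bytes) -> float:
    """Toy similarity: the fraction of bytes on which two templates agree."""
    return sum(x == y for x, y in zip(t1, t2)) / len(t1)

def verify(claimed_id: str, sample: bytes, database: dict) -> bool:
    """Authentication (1:1): compare the sample against ONE stored template.
    The verifier need hold nothing beyond the claimant's own record."""
    return similarity(database[claimed_id], sample) >= THRESHOLD

def identify(sample: bytes, database: dict) -> list:
    """Identification (1:N): search EVERY stored template for a match,
    which presupposes a population-wide database."""
    return [uid for uid, tpl in database.items()
            if similarity(tpl, sample) >= THRESHOLD]

db = {"alice": b"\xaa" * 16, "bob": b"\x55" * 16}
sample = bytearray(db["alice"])
sample[0] = 0x00                              # slightly noisy reading
print(verify("alice", bytes(sample), db))     # True: 15/16 bytes agree
print(identify(bytes(sample), db))            # ['alice']
```

The privacy contrast falls out of the signatures: `verify` touches only the one record the individual chose to enrol, while `identify` cannot work without holding, and scanning, everyone's templates.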
Use of biometrics for surveillance is likely to be a major privacy concern, particularly when carried out covertly. A key principle of privacy is that, generally speaking, people should have control over their personal information. People have no control if identifiable information about them is collected without their knowledge. Some biometrics are particularly capable of being collected covertly. These include facial or appearance characteristics, voice characteristics and keystroke behaviour.
Other privacy risks arise regardless of the proposed use. Some of the privacy risks result from the nature of biometric information itself. Biometric information about a person is unique (or very close to unique). Also, the initial biometric information is inseparable from the person and so is hard to forge.
These great strengths, however, are also the source of key privacy risks and weaknesses, especially if systems are not properly designed and/or regulated. As is the case with all unique identifiers, it is easy and very tempting to use the one identifier in a whole range of contexts and then to link the information for purposes other than the original purpose for collection (otherwise known as function creep). We have already seen the debates about this around the proposal for an Australia Card. Public interest advocates in the US are keen to ensure that ‘the same thing didn’t happen with biometric information that happened with Social Security numbers’.
This particular problem also provides a demonstration of where suitable design may be able to resolve it. The number of different biometrics that can be collected about each individual is probably limited only by our imaginations. The number of technologies for processing and protecting each of these is also large. Collection and use of a different biometric using a different technology for each of the different purposes is one way of technologically limiting or even preventing such linkage. An iris recognition technology might be used solely to facilitate a payments system; a palm scan might be the technology used for accessing one workplace, while voice recognition might be the way to unlock the car door. This is similar to the different ‘personas’ that some people use for conducting different parts of their lives online.
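The separation just described can be sketched with invented data. The record contents and key names below are hypothetical; the point is simply that without a common identifier the two databases cannot be joined, whereas a single universal key makes merging trivial.

```python
# Sketch of function creep versus per-purpose separation (all data invented).

# Each purpose keyed to a *different* biometric: no shared identifier.
payments_db = {"iris:7f3a": {"last_payment": "$40"}}
workplace_db = {"palm:c21d": {"door": "Lab 2"}}

shared_keys = payments_db.keys() & workplace_db.keys()
print(shared_keys)  # set(): nothing on which to link the two profiles

# By contrast, one universal biometric key lets the records be merged
# into a single profile, the essence of function creep.
universal_db = {
    "bio:0001": {**payments_db["iris:7f3a"], **workplace_db["palm:c21d"]},
}
print(universal_db["bio:0001"])  # {'last_payment': '$40', 'door': 'Lab 2'}
```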
Uniqueness and difficulty to forge also make a biometric a potentially powerful authentication or identification tool. But the downside is that there is a risk that it will be impossible for a person to repudiate a transaction or repair the situation if something has gone wrong. As one commentator says about biometrics:
[I]t doesn’t handle failure very well. Imagine that Alice is using her thumbprint as a biometric, and someone steals it. Now what? This isn’t a digital certificate, where some trusted third party can issue her with another one. This is her thumb. She only has two. Once someone steals your biometric, it remains stolen for life; there’s no going back to a secure situation.
These features also make it difficult for a person to escape from situations of misuse in the hands of individuals or governments with malign intent. A powerful example of misuse was in Argentina, which was one of the first countries to adopt the Bertillonage system. One police officer evangelised the practice of keeping records of fingerprints. The first person was convicted of murder based on fingerprint evidence gathered at the scene of the crime in 1892. Nearly 80 years later, the Argentine police were using a system called Digicom to track down ‘dissidents’ in the streets of Buenos Aires. Combining digital processing with radio technology, the system scanned in fingerprints and relayed the information from police cars back to a central database. Each individual had a national identity card with a photo on the front and a complete set of fingerprints on the back. The Digicom system enabled the Videla Government to keep tabs on Argentina’s population, thirty thousand of whom ‘disappeared’ between 1976 and 1981.
Another privacy risk that comes from collecting information from a person’s body is that the information may reveal more information than just identity regardless of the intended use. Some of this is very sensitive. For example, voice can reveal emotions; the face may reveal information about a person’s emotions and health. Iris recognition and retinal scans may also reveal information about a person’s health.
Aside from unintended collection of this information, it seems that there are already products on the market that aim to collect this kind of information, for example to detect deception through voice. The authors of ‘At face value’ predict that biometrics used to expose emotions, through voice, face and keystroke dynamics, will have great influence in the future because these characteristics can be measured without consent. Examples of where they are likely to be used include multimedia contexts, particularly to collect more information from a person than they intend, and for e-commerce or telemarketing to influence the purchase patterns of customers.
A further privacy risk that seems bizarre but which cannot be dismissed is the possibility that people may mutilate other people’s body parts in order to use someone else’s biometric identity for criminal purposes, for example, access to money or buildings.
Other privacy risks arise from the nature of the technology used for biometrics.
The effectiveness and efficiency of current biometric uses depends on computer technology and electronic devices. This means that most of the privacy risks associated with computer technology also apply to biometric systems. Systems that involve storage of data on, and processing and transmission using, computer technology are subject to hacking and unauthorised access, use and disclosure. Although it may be difficult for a person to fake a fingerprint, a voice or hand, there is a view among a number of commentators that it could be relatively simple for a person to hack into a system and copy the digital image of a biometric and replay it whenever he or she wishes to pass as the person whose image it is.
Although human characteristics may be unique, all technologies so far developed for measuring them have built-in tolerances. This is because of the inaccuracy of the techniques and the different circumstances under which a biometric may be presented. This tolerance results in false acceptances (measured as the false acceptance rate, or FAR) and false rejections (the false rejection rate, or FRR). Privacy risks resulting from this include the following.
The concern is that there may be a false illusion of Fort Knox around biometric systems, which may leave individuals in vulnerable or impossible positions when things go wrong. As the authors of ‘At face value’ point out:
It is important to stress that, when biometrical systems are used, there is always a fraction of false acceptances. Corruption of personal data due to false acceptances will occur. The use of biometrics however might create the illusion that the personalization is always correct.
An example of this has arisen in relation to fingerprint matching, where the reliability of fingerprint matches has recently been questioned in court.
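The tolerance trade-off behind false acceptances and rejections can be illustrated numerically. The match scores below are invented sample data and the thresholds are arbitrary; the point is only that tightening the threshold lowers the false acceptance rate at the cost of a higher false rejection rate.

```python
# Invented match scores: higher means the system thinks it is the same person.
genuine = [0.91, 0.84, 0.78, 0.95, 0.88]   # attempts by the true user
impostor = [0.12, 0.35, 0.81, 0.22, 0.40]  # attempts by other people

def rates(threshold: float) -> tuple:
    """Return (FAR, FRR) at a given decision threshold."""
    far = sum(s >= threshold for s in impostor) / len(impostor)  # wrongly accepted
    frr = sum(s < threshold for s in genuine) / len(genuine)     # wrongly rejected
    return far, frr

print(rates(0.5))   # (0.2, 0.0): one impostor accepted, no genuine user rejected
print(rates(0.85))  # (0.0, 0.4): no impostors, but two genuine users rejected
```

No threshold makes both rates zero at once, which is why the quotation above can insist that some fraction of false acceptances will always occur.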
In some cases the nature of the technology will limit the ability of a person to remain anonymous. For example, if a telephone network relies on voice recognition to get access to it, people may no longer have the option of using a payphone to remain anonymous.
Finally, the intense focus on human characteristics that biometrics gives rise to may lead to increased knowledge about the relationship between human characteristics and other habits, behaviours or emotions. For example, it could be discovered that people with red hair are more likely to buy financial products, or people with low voices are more likely to join a particular political party. It is not hard to imagine the possible misuse of this kind of information.
All these considerations show just how important design and policy stance are in considering the use of biometrics. As Privacy Commissioner, I strongly support applications of such technologies in ways that produce benefits that include privacy enhancement. Where this is not possible, strong and explicit justification, strong external monitoring and clear accountability are the minimum requirements that should be considered. Legislation is one of a number of ways of addressing these questions.
Biometrics and Privacy Part II: Coverage of the Privacy Act will appear in a forthcoming issue of PLPR.
Malcolm Crompton, Federal Privacy Commissioner.
 Douglas J-V ‘Bertillonage in-disguise?’ ZDNet Australia 13 February 2002 <www.zdnet.com.au/newstech/security/story/0,2000024985,20263453-4,00.htm>.
 Descriptions of the range of biometrics can be found at Hes R, Hooghiemstra T F M and Borking J J ‘At face value: on biometrical identification and privacy’ (1999) Registratiekamer September, also available online at <www.cbpweb.nl/documenten/av_15_At_face_value.htm>; or DeVoney C and Hakala D ‘2001: The Year We Make Contact’ ZDNet Australia 27 December 2000 <www.zdnet.com.au/newstech/ecurity/story/0,2000024985,20107874-1,00.htm>.
 Electronic News General News Section December 2001 p 12.
 DeVoney and Hakala in above note 2.
 Kirby J ‘Thumbprint security’ Business Review Weekly 28 February 2002 p 79, available online at <www.brw.com.au/stories/20020228/1358.asp>.
 ‘University opts for biometric security’ Computerworld Australia 4 February 2002, p 3.
 Reported on ABC TV program Catalyst 28 February 2002; transcript available online at <www.abc.net.au/catalyst/stories/s486753.htm>.
 Above note 7.
 Above note 1.
 Above note 2 at p 15.
 Lebihan R ‘Australia Launches First Voice-Activated ATM’ ZDNet Australia 1 March 2002 <www.zdnet.com.au/newstech/communications/story/0,2000024993,20263777,00.htm>.
 Above note 2.
 ‘Face off’ Security Electronics 1 February 2002 p 4.
 ‘Biometrics set for explosive growth’ Electronic News 1 December 2001 p 12.
 Interview with Richard E Norton, International Biometric Industry Association (IBIA), by Ted Dunstone, Biometrics Institute, 30 October 2000; transcript at <www.biomet.org/001029_ibia_interview.htm>.
 ‘Consumer Biometric Applications: A Discussion Paper’ Information and Privacy Commissioner/Ontario September 1999 p 33; available at <www.ipc.on.ca/english/pubpres/papers/cons-bio.htm>.
 Cavoukian A ‘Commissioner issues challenge to technologists: Take the next STEP’ Information and Privacy Commissioner/Ontario 10 January 2002 <www.ipc.on.ca/english/pubpres/ext-pub/steps.htm>. She has cited, as an example, a recently developed airport body scanner that shows where a weapon appears to be concealed instead of showing pictures of the naked body as the scanner seeks to reveal concealed weapons. The former is seeking to identify potential suspects compared with the latter which is assuming we are all guilty until proven innocent. The latter also won a ‘Big Brother Award’ in 2000 as one of the world’s most privacy invasive developments of that year. See <www.privacyinternational.org/bigbrother/us2000>.
 Clarke R ‘Biometrics and Privacy’ ANU 15 April 2001 <www.anu.edu.au/people/Roger.Clarke/DV/Biometrics.html>. See also Steinhardt B ‘Loss of privacy is cost’ USA Today 28 January 2002 <www.usatoday.com/news/comment/2002/01/28/ncoppf.htm> and Lowe S ‘Face it, you may be screened before flying’ SMH 11 January 2002 <old.smh.com.au/news/0201/11/text/national14.html>.
 The inquiry released an issues paper ‘Issues paper 26: Protection of Human Genetic Information’ in October 2001 available at <www.austlii.edu.au/au/other/alrc/publications/issues/26/>.
 Warren S and Brandeis L ‘The right to privacy’ (1890) 4 Harvard Law Review 193 available at <www.louisville.edu/library/law/brandeis/privacy.html>. They credit Judge Cooley in his A Treatise on the Law of Torts (2nd ed) Callaghan Chicago 1888 p 29 with the phrase ‘the right to be let alone’.
 Cowen Z The Private Man The Boyer Lectures Australian Broadcasting Commission 1969 p 9.
 Available at <www.unhchr.ch/html/menu3/b/a_ccpr.htm>.
 Banisar D ‘Privacy and human rights 2000: an international survey of privacy laws and developments’ Privacy International 2000 <www.privacyinternational.org/survey/>.
 Warner G ‘Greens join fight against police DNA testing plan’ The Mercury 13 February 2002 <www.themercury.news.com.au/printpage/0,5942,3766912,00.html>.
 Wilkinson G ‘Police to fight DNA bid’ Herald Sun 1 March 2002.
 See for example Clarke R ‘Just another piece of plastic for your wallet: the “Australia Card” scheme’ ANU 1987 <www.anu.edu.au/people/Roger.Clarke/DV/OzCard.html>.
 Oliver G M ‘A study of the use of biometrics as it relates to personal privacy concerns’ University of Maryland: European Division 31 July 1999 p 12 <faculty.ed.umuc.edu/~meinkej/inss690/oliver/Oliver-690.htm>.
 Schneier B ‘Biometrics: Truths and Fictions’ Crypto-Gram Newsletter 15 August 1998 <www.counterpane.com/crypto-gram-9808.html>.
 Above note 2 at p 16.
 See for example Tomko G ‘Biometrics as a privacy-enhancing technology: friend or foe of privacy?’ Department of Social Services Connecticut 15 September 1998 <www.dss.state.ct.us/digital/tomko.htm>; and above note 28.
 Above note 2 at p 24.
 ‘In a ruling on 17 January, Louis Pollak, a federal judge in Pennsylvania [in the case of United States v Plaza] decided that fingerprint evidence was unreliable. He [will require] evidence to persuade a jury that [fingerprints] are the same or, as the case may be, are ,not.’ at ‘Printing errors’ Economist 17 January 2002 <www.economist.com/science/PrinterFriendly.cfm?Story_ID=939896>.