Privacy Law and Policy Reporter
Part 1 of this article, ‘The end of the world as we know it or the white knight of privacy’, appeared in (2002) 9(3) PLPR 53. The Privacy Commissioner wishes to acknowledge the significant input into the preparation of this paper by Robin McKenzie, Senior Policy Adviser with the OFPC — General Editor.
There is an array of laws in Australia that protect privacy at the Federal, State and Territory level. In this section of the article I will focus on just one of these laws, the Privacy Act 1988 (Cth) (the Act). The analysis that follows is a further exploration of the impact that the increasing use of biometric information may have on privacy, and of the protection that the Act offers in relation to the use of biometrics.
In this analysis I will point to a few areas in the interface between biometrics, privacy and the Act that raise issues or need to be carefully monitored. I note in this regard that the Government has announced that the Privacy Commissioner will review the operations of the new private sector provisions in the Act two years after they commence, that is December 2003.
The Act covers Commonwealth public sector agencies and a fair part of the private sector. It is worth noting that this coverage leaves some gaps. For example, not all State or Territory public sectors are covered by State or Territory public sector legislation. Also, in the private sector, the Act does not cover most of the activities that employers carry out in relation to employee records. This could be of concern, unless federal and State workplace relations law provides sufficient protection, because biometric systems have a number of potential uses in the employment context.
The Government has announced that it will review, in conjunction with the States, existing Commonwealth, State and Territory laws to consider the extent of privacy protection for employee records and whether there is a need for further measures. The findings of this review are expected to feed into the general review of the Act mentioned earlier.
The Act applies to ‘personal information’. A threshold question is whether biometric information is personal information. The Act in s 6 defines personal information to be:
Information or an opinion (including information or an opinion forming part of a database), whether true or not, and whether recorded in a material form or not, about an individual whose identity is apparent, or can reasonably be ascertained, from the information or opinion.
Biometric information is clearly information about ‘an individual’. On the question of whether biometrics is identifying information, the authors of the paper ‘At face value’, who include Dr Borking from the Registratiekamer in the Netherlands, say that:
In the context of biometrical identification it can also be argued that this person is generally identifiable, since the biometrical data is used for identification or authentication, at least in the sense that the person concerned is distinguished from any other person.
The authors go on to say that with this approach, the identifiability of the person does not depend on the availability of other data, which — jointly or separately — allow the person concerned to be identified.
Of course the use of biometrics generally involves a number of transformative processes that involve manipulation of the data and may include mathematical transformation of the information into a code. The authors of ‘At face value’ conclude that:
There is no reason to think that what applies to the human characteristic itself, would not apply to the digital representation of that characteristic, the templates which are composed on the basis of these representations, and to any subsequent transformation. As the process continues, the amount of detail will change, but the unique link with the person concerned is kept. It is reasonable therefore to conclude that the data involved will remain personal data in most, if not all, stages of their processing.
The Act regulates information privacy in the Commonwealth public sector and the private sector nationally. It does not directly address the issue of bodily privacy, which is often addressed elsewhere in general law or statute law. However, both the Information Privacy Principles (IPPs) for the public sector and the National Privacy Principles (NPPs) for the private sector require that information be collected in a way that is not unreasonably intrusive. This may be adequate protection in many cases, but is unlikely to be adequate in cases where a person has no choice about whether or not to give a biometric.
The issue of choice is likely to arise where governments are considering use of biometrics. The first step in building in privacy is to take account of it at the decision-making phase. There will be some contexts, for example in law enforcement, where there will appear to be prima facie arguments for mandatory collection of biometric information. Even in these cases I strongly encourage a systematic consideration of issues such as any alternatives available, who the measure will affect, whether it is proportional to the problem and what safeguards might be needed. This approach is discussed in more detail in a paper I presented in 2001 to an Australian Institute of Criminology Conference. One approach in cases where a government may require a person to provide a biometric in intrusive circumstances is to have a separate law governing the circumstances. For example, some States have enacted separate laws under which individuals are required to give DNA samples for law enforcement reasons. In other cases, governments as well as private sector organisations are to be encouraged to build in choice and to think about necessary safeguards.
In the private sector, uses of biometrics that involve bodily intrusive collection methods are likely to be strongly resisted by consumers. However, consumer resistance is only possible where the market gives them real choice. The extent to which the market provides real choice to consumers in the privacy area is a matter on which I am currently keeping a close eye, in relation to a number of areas of operation of the new private sector provisions.
The first step in allowing or encouraging people to exercise choice is to make sure they know they have a choice in the first place. The Act can play an important role here.
Both the IPPs and the NPPs require that information be collected by lawful and fair means (IPP 1, NPP 1.2). Generally speaking, in the Guidelines to the National Privacy Principles and in the Plain English Guidelines to the Information Privacy Principles, we have interpreted this to mean that information must not be collected covertly, although exceptions have been made for investigations of criminal offences. Both the IPPs and the NPPs require an agency or organisation collecting personal information to take reasonable steps to tell, or ensure that a person is aware of, certain information when collecting personal information directly from the person (IPP 2, NPP 1.3). NPP 1.5 includes the same requirement when collecting from third parties. This would generally speaking require an organisation or agency to make a person aware that it has collected biometric information.
The policy approach in the private sector provisions of the Act is to require an organisation to get the individual’s consent before it collects information of a sensitive nature (NPP 10). IPP 3, which applies to Commonwealth agencies, may give some protection. It requires that collection must not intrude to an unreasonable extent on the personal affairs of the individual. However, it may be argued that in the context of a government agency collecting biometrics, a stronger requirement is needed. For example, the law might need to require that a person, generally speaking, should have the choice about whether to give biometric information, or the law might possibly allow collection of a biometric when specifically authorised or required by law. This is an important issue because Commonwealth public sector agencies are increasingly interested in the use of biometrics systems for a range of security and other reasons.
What protection the NPPs would give depends on the question of whether biometric information is ‘sensitive information’ as defined in the Act. If biometric information is ‘sensitive information’ then, generally speaking, private sector organisations could only collect biometric information with the individual’s consent. Sensitive information is defined in s 6 as personal information that is information or an opinion about an individual’s:
There appears to be nothing inherent in a biometric that would make it sensitive information under the Act. It would only become sensitive because it reveals information that falls within the definition.
Use of biometrics usually involves the capture or measurement of a human characteristic and the creation of a template. In some cases, at the stage of collecting the ‘raw’ or unprocessed template, the biometric information could be regarded as containing information that relates to a person’s race or possibly their state of health. For example, raw facial recognition data could reveal skin colour (although it is questionable whether this is necessarily determinative of race, it may be used to form an opinion nonetheless) or certain signs of illness. If general surveillance videos are used to collect facial information, they could also incidentally pick up a whole range of intimate personal information about a person’s behaviour, including information about a person’s sexual practices or preferences. In the view of an iridologist, iris recognition and retinal scans would definitely involve collecting health information. It also seems that the quality of a fingerprint can be affected by race, gender, occupation and age. Although this is usually discussed in the context of false rejections, it seems to me that this feature of fingerprints also illustrates another way that biometrics can reveal more information about a person than simply the characteristic itself.
Once the information is transformed beyond the raw or unprocessed template, it is unlikely that the information indicating skin colour or state of health could be derived from the data.
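The protective effect of these transformations can be sketched in code. This is an illustrative simplification only: the feature values and salt below are invented, and real biometric matching is approximate rather than exact, so a plain cryptographic hash stands in for the more tolerant transforms real systems would use.

```python
# Illustrative sketch only (hypothetical feature values): once a raw
# biometric sample is reduced to a template and then hashed, attributes
# visible in the raw sample (e.g. skin colour in a face image) can no
# longer be read back out of the stored value.
import hashlib

def make_template(raw_features: list[float]) -> bytes:
    # Quantise the measured features into a compact template.
    return bytes(int(f * 255) % 256 for f in raw_features)

def protect_template(template: bytes, salt: bytes) -> str:
    # A salted one-way hash: usable for exact-match verification,
    # but the original measurements cannot be derived from it.
    return hashlib.sha256(salt + template).hexdigest()

raw = [0.12, 0.87, 0.45, 0.33]   # hypothetical facial measurements
stored = protect_template(make_template(raw), salt=b"per-system-salt")
# Verification repeats the same transform and compares the results.
assert stored == protect_template(make_template(raw), salt=b"per-system-salt")
```

The design point is the one made in the text: downstream of the transformation, the stored value keeps its unique link to the person but no longer reveals the incidental attributes present in the raw sample.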
Information collected through voice recognition, which is another kind of biometric that could easily be collected covertly, would be unlikely to reveal health information. Information about emotions, which could come from facial recognition, keystrokes or voice recognition, would not fall within the definition of sensitive information unless, for example, it reveals psychiatric information.
Given the potential sensitivity around what biometric information may reveal about a person and the potential for the easy covert collection of a number of these characteristics, it could be argued that the starting point should be collection with explicit consent and real choice about whether to provide it, including other viable options besides providing it. Once again, this is an issue that I am keeping a close eye on in a number of areas of operation of the private sector provisions. Where choice is not an option then a much more stringent consideration of privacy risks is required. In the absence of choice, privacy will be best protected if approaches such as the following are built in:
Biometrics, by their nature, are generally inconsistent with anonymity. Yet the starting point for privacy is the ability of citizens to go about their business freely and unobserved. This issue has not been directly addressed in the IPPs governing Commonwealth agencies. However, the NPPs covering the private sector include a principle (NPP 8) that requires organisations to give individuals the option of not identifying themselves when entering transactions with an organisation where it is lawful and practicable to do so. This is a protection not included in the OECD guidelines or EU Directive and so is fairly untested.
The key will be determining practicality. Adopting a biometric system is likely to require a very large investment. If agencies and organisations do not consider the possibility of building in anonymity or at least ‘pseudonymity’ from the start, and if biometrics technology developers do not build in from the start the possibility of interacting anonymously or pseudonymously as far as possible, non-practicality is likely to be a foregone conclusion. NPP 8 will be a lame duck because in many cases it will not be practicable to retrospectively build in anonymity. For this aspect of privacy risk to be adequately addressed, systems developers, agencies and organisations will need to consider anonymity well before the question of individual complaint under the Act becomes an issue.
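One way to build pseudonymity in from the start is to derive a different, stable pseudonym for each organisation from the same underlying secret. The sketch below is hypothetical (the secret and organisation names are invented) and illustrates the design idea only, not any scheme mandated by NPP 8 or Gatekeeper:

```python
# A minimal sketch of 'pseudonymity by design' (all names invented):
# derive a different, stable pseudonym per organisation, so records held
# by different organisations cannot be linked by comparing identifiers.
import hashlib
import hmac

def pseudonym(user_secret: bytes, organisation: str) -> str:
    # HMAC keyed with a secret held by the user (or a trusted party);
    # each organisation sees only its own derived value.
    return hmac.new(user_secret, organisation.encode(),
                    hashlib.sha256).hexdigest()[:16]

secret = b"user-held-secret"
at_bank = pseudonym(secret, "bank.example")
at_clinic = pseudonym(secret, "clinic.example")
assert at_bank != at_clinic                           # unlinkable across bodies
assert at_bank == pseudonym(secret, "bank.example")   # but stable within one
```

The point of the design is that each organisation can still recognise a returning client, while no two organisations hold a common identifier that would let them pool their records.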
Similar issues have arisen in relation to the use by Commonwealth agencies of Public Key Infrastructure (PKI). To address these and other privacy issues arising out of the development of the Gatekeeper PKI trust framework, the National Office for the Information Economy (NOIE) has developed Gatekeeper privacy requirements and also asked the Privacy Commissioner to develop best practice guidelines for Commonwealth agencies to help them design and implement PKI applications and processes when using Gatekeeper digital certificates with individual clients.
Gatekeeper privacy requirement 12 reads:
The CA [certification authority] shall have the ability to provide anonymous or pseudonymous certificates where appropriate. Gatekeeper policy requires a PKI design that enables individuals to: choose to use any Distinguished Name in a certificate, except where it would be impracticable to do so; and conduct pseudonymous transactions except where the agency demonstrates that it is impracticable to do so.
Guideline 9 of the Privacy Commissioner’s Guidelines, Privacy and Public Key Infrastructure: Guidelines for Agencies using PKI to communicate or transact with individuals (issued in December 2001), reads:
Agencies should provide their clients with anonymous and pseudonymous options for transacting with them, to the extent that this is not inconsistent with the objectives and operation of the relevant online application.
The PKI Guidelines also provide an additional protection here by requiring at Guideline 1 that:
Agencies should allow their clients to choose whether to use PKI for a particular transaction and to offer them alternative means of service delivery.
These Guidelines at least alert agencies to the need to consider and implement anonymity when considering the use of PKI. However, given that in many cases the object of PKI and biometrics is authentication, the possibility of anonymity might seem impossible. I suspect the key to this conundrum will be in lateral thinking about how the technology might be developed. Dr George Tomko, Chairman of Photonics Research, Ontario, Canada, has some ideas about how biometric encryption can be used to de-identify information in a database. The issue of using privacy enhancing technologies to achieve privacy protection through anonymity has been of major interest to the Information Privacy Commissioner in Ontario, Canada and the Registratiekamer in the Netherlands, which have examined the question in detail in a two volume report, Privacy-Enhancing Technologies: The Path to Anonymity. Of particular interest is the work outlined in volume II, in which the Registratiekamer, in conjunction with the TNO Physics and Electronics Laboratory, looked in close technical detail at the components of information systems and reached conclusions about which components did or did not require identity to function successfully. It goes on to consider how information systems can be set up specifically to take these conclusions into account. I would strongly recommend that all developers of information systems involving personal information, and particularly those developing biometrics systems, take a close look at this work.
Because human characteristics collected are unique (even if the measures are not necessarily accurate), there is considerable potential for the data collected from biometrics to be used as a unique identifier. The privacy issues associated with unique identifiers are not new and are widely recognised. They were canvassed widely and loudly in the Australian community in relation to the proposal for an Australia Card in the 1980s. In response to these concerns, the Government instead strengthened the use of tax file numbers (TFNs) and enacted special legislation to give their use legislative protections, including strong penalties for misuse, making production of a TFN voluntary albeit with some incentive to produce it. The Privacy Commissioner has the power to monitor the records of the Commissioner of Taxation to ensure that he or she is not using TFN information for purposes beyond his or her powers and to ensure that he or she is taking adequate measures to prevent the unlawful use or disclosure of the TFNs he or she holds. The Privacy Commissioner also has the power to audit the records of TFN recipients for security, accuracy and compliance with guidelines the Office has issued. Despite the original intention to constrain the use of TFNs, successive parliaments have expanded the authorised uses for TFNs to assistance agencies and for superannuation purposes. For example, many people are now generally ineligible to receive social security payments unless they have provided their TFN.
It is possible that the lack of an all encompassing unique identifier in Australia has contributed to a lower incidence of identity theft than that found in the US, where the social security number has enabled a whole range of information about a person ,to be linked and then fraudulently appropriated.
So far it has been governments that have had the ability to generate the kind of unique identifiers that pose the greatest privacy threat. IPPs 9, 10 and 11 place some restrictions on use or disclosure for purposes other than the particular purpose of collection. Consent, awareness, authorisation by law, law enforcement and some other public interest reasons are exceptions to this rule.
The new private sector provisions recognise the potential for government identifiers to be taken up by the private sector and used as their own unique identifiers. NPP 7 stops private sector organisations from adopting, using or disclosing Commonwealth government identifiers. However, the NPPs do not directly address the possibility that private sector organisations will develop biometric information into a unique identifier of their own: one that a number of organisations could use to link data collected for one purpose with data collected for other purposes, and then use to track a person or for yet another purpose.
The potential for function creep gives rise to the question of whether there may need to be additional legislative or other measures around the use of identifiers to address the threats biometrics may pose as a unique identifier. The State of Texas has adopted this approach for both the public and private sectors, with quite stiff penalties for breach. The technical solutions proposed by Tomko are another approach. Taking away the temptation for function creep by de-identifying the information, or taking other steps to make linking and function creep impossible, may be the answer.
It is hard to predict whether a particular biometric or biometric technique will emerge as a unique identifier. For example, the handwritten signature is in widespread use in our society, but it has not emerged as such an identifier, even as the technologies supporting it have advanced. It will be a matter of keeping a close eye on whether a particular biometric technique does look like emerging as a unique identifier. I intend to contribute to this surveillance.
IPP 4 and NPP 4 both require organisations to take reasonable steps to protect the information they hold from misuse, loss or unauthorised access, as well as modification or disclosure.
Making these principles practical and effective, however, may take more work. For example, it may be important to develop biometric security standards to ensure that the requirements of these principles are met effectively and measurably. This is one of a number of areas raised in this article where the Biometrics Institute could make a very valuable contribution.
However, as with all things, the greater the prize, the greater the incentive to steal or otherwise abuse it, and the greater the potential harm to individuals. This is the ‘Fort Knox’ syndrome, illustrated so well by the James Bond film Goldfinger. Biometric techniques, like other powerful new technologies, may take on these characteristics in personal data protection, especially if one technique begins to dominate. The situation is also far from static. Yesterday’s Fort Knox rapidly becomes tomorrow’s open door. For example, 128 bit encryption took over from 56 bit as the internet ‘standard’ only a few years ago, yet 256 bit keys are already being discussed as necessary in the near future.
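The arithmetic behind this shifting ‘standard’ is worth spelling out. The brute-force rate below (one billion key trials per second) is an assumed figure for illustration only:

```python
# Why yesterday's 'Fort Knox' becomes tomorrow's open door: each extra
# key bit doubles the brute-force search space, so the gap between
# 56-bit and 128-bit keys is a factor of 2**72, not of 128/56.
def keyspace(bits: int) -> int:
    return 2 ** bits

assert keyspace(128) // keyspace(56) == 2 ** 72

# An attacker testing an assumed one billion (1e9) keys per second
# exhausts a 56-bit keyspace in a little over two years, but would need
# on the order of 1e22 years for a 128-bit keyspace.
seconds_56 = keyspace(56) / 1e9
years_56 = seconds_56 / (3600 * 24 * 365)
assert years_56 < 3   # under three years at the assumed rate
```

As hardware speeds grow, the assumed rate only improves for the attacker, which is why key lengths that once seemed extravagant come to be treated as minimums.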
The question of whether the security principles in the NPPs and the IPPs are up to these challenges again needs to be monitored closely. The two year review at the end of 2003 will be an early opportunity to do so in the case of the NPPs.
IPP 8 requires Commonwealth agencies to take steps that are reasonable in the circumstances to ensure that the information they use is accurate, up to date and complete. Information collected by agencies must be up to date and complete (IPP 3). NPP 3 requires private sector organisations to take reasonable steps to make sure that the personal information they collect, use or disclose is accurate, complete and up to date. This would seem to require agencies and organisations to consider, for example, the false rejection rates and false acceptance rates of their biometric systems. Depending on the use of the biometric system, this could mean close attention to the false acceptance rate in particular, because false acceptances are the most likely to compromise the accuracy of personal information held on a biometric system. On the other hand, organisations and agencies would need to minimise the false rejection rate where acceptance is the key to eligibility for a key service or benefit. Minimising false rejection rates will also be critical where a person has no choice about giving the biometric.
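How an agency or organisation might actually measure these error rates can be sketched simply. The match scores below are invented for illustration; the sketch only shows the trade-off between the two rates at different decision thresholds:

```python
# A sketch (with invented score data) of measuring the false acceptance
# rate (FAR) and false rejection rate (FRR) of a biometric matcher at a
# given decision threshold.
def rates(genuine_scores, impostor_scores, threshold):
    # A comparison is accepted when its match score meets the threshold.
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return far, frr

genuine = [0.91, 0.88, 0.75, 0.95, 0.69]    # same-person match scores
impostor = [0.10, 0.42, 0.85, 0.31, 0.22]   # different-person match scores

far, frr = rates(genuine, impostor, threshold=0.8)
# Raising the threshold lowers FAR but raises FRR, and vice versa:
far_strict, frr_strict = rates(genuine, impostor, threshold=0.9)
assert far_strict <= far and frr_strict >= frr
```

The trade-off is the practical content of the accuracy principles: the threshold an organisation chooses determines which kind of error it accepts more of, and that choice should follow from the consequences of each error for the individual.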
Given that there will always be some room for error, the right of a person to get access to the personal information an agency or organisation holds about them, and to challenge or alter a result if it is wrong, will be critical. NPP 6 and IPP 6 require organisations and agencies to give a person access to their personal information and to correct it if it is wrong. However, in addition there may be a need for further mechanisms to ensure that a person is able, in a relevant forum, to challenge a decision or evidence based on a faulty or inaccurate biometric result. This will be particularly important where a wrong result may have a major impact on a person’s life. The forum could be a court, a Privacy Commissioner, or another administrative mechanism. In addition, as part of the principle of openness required by NPP 5, biometric developers and users should be open about the accuracy of their technology.
This article is my first major engagement with the privacy issues relating to biometrics. It is clear from this engagement that biometrics have the potential to benefit individuals and society and indeed could have privacy enhancing capabilities. However, it is also very clear that the potential of biometrics to be a privacy enhancing tool will only be realised, and the potential risks to privacy prevented, if very close attention is paid to privacy from the time that a biometrics system is just a ‘twinkle in the eye’. This means that agencies and organisations must consider, from the outset, the privacy risks that a proposed use of biometrics will pose and what privacy enhancing options there might be. It also means that those developing biometric software, hardware and other technology must build privacy protection and privacy enhancement into the system’s structure. One important key will be identifying when knowledge of a person’s identity will be necessary and when it will not be, and then building the system so components that require identity are kept separate from those that do not. Another will be building in structures that make linking of data for purposes other than the original purpose impossible.
If these measures are in place, the burden on regulation and law will be that much lighter.
The Act is a practical starting point for regulation. But it is light touch legislation and, at least so far as the private sector goes, is in the very early stages of operation. The Act aims to give individuals control over their personal information by a number of means, including information, consent in some circumstances, and reliance on market forces to allow people to exercise choice about who they enter into transactions with and who they do not. Whether this approach is adequate or whether more is needed is a question I will be tracking closely for the purposes of the review of the legislation in two years’ time.
Where biometrics are to be used in circumstances where people do not have choice about participation in systems using biometrics, including where there may be a covert use of biometrics, adequate privacy protection will require strong legal protection, including appropriate mechanisms for accountability. Legal protections could include specific measures such as:
It will be an interesting journey and will require constant vigilance and imagination on all sides if this fast moving area of biometrics is to realise its potential as the white knight of privacy.
Malcolm Crompton, Federal Privacy Commissioner.
The Government has established a review to answer this question; Fact Sheet: Employee Records, 22 December 2000 at <www.law.gov.au/privacy/newfacts/EmployeeRecords.htm>.
Hes R, Hooghiemstra T F M and Borking J J, ‘At face value: on biometrical identification and privacy’, Registratiekamer, September 1999, p 17, also available at <www.registratiekamer.nl/cgi-bin/modules/print.cgi>.
 Above note 2.
Future Directions, Crime Prevention, Legal Responses & Policy at <www.privacy.gov.au/news/speeches/sp34_files/frame.html>.
For a description of some of these see Issues Paper 26: Protection of Human Genetic Information, October 2001, p 370, available at <www.austlii.edu.au/au/other/publications/issues/26/>.
See Office of the Federal Privacy Commissioner, Guidelines to the National Privacy Principles, September 2001 at <www.privacy.gov.au/publications/page2.html#59.75>, and Plain English Guidelines to Information Privacy Principles at <www.privacy.gov.au/publications/page1.html#1>.
Above note 2 at p 9, which quotes Newham E, The Biometrics Report (2nd ed) SJB Services UK 1999.
 Above note 2 at p 20.
 See <www.privacy.gov.au/government/guidelines/index.html#a>.
 Tomko G, ‘Biometrics as a privacy-enhancing technology: friend or foe of privacy?’ Department of Social Services Connecticut 15 September 1998 at <www.dss.state.ct.us/digital/tomko.htm>.
Information and Privacy Commissioner/Ontario Canada and Registratiekamer, The Netherlands, Privacy-Enhancing Technologies: The Path to Anonymity Volume I August 1995 at <www.ipc.on.ca/english/pubpres/papers/anon-e.htm> and Registratiekamer, The Netherlands, Privacy-Enhancing Technologies: The Path to Anonymity Volume II August 1995 at <www.ipc.on.ca/english/pubpres/papers/anoni-v2.pdf>.
 Section 28 of the Act.
Scott R L, ‘Protecting biometric identifiers’, Health Law Perspectives, Health Law and Policy Institute, 24 August 2001, available at <www.law.uh.edu/healthlawperspectives/Privacy/010824Biometrics.html>.
For example, a set of standards developed by IBG includes as a best practice standard that biometric information must never be used as a universal unique identifier, and sufficient protections must be in place to ensure to the degree possible that biometric information cannot be used as a unique identifier: Standard 2 at <www.bioprivacy.org/best_practices.htm>.