Privacy Law and Policy Reporter
John Fahey, Minister for Finance and Administration, when announcing the formation of the Government Public Key Authority (GPKA) in May, made a point of stressing that the Federal Government was ‘aware of the privacy concerns of some individuals due to the growing utilisation of electronic communications by all sectors of the community’. Having recently been embarrassed by his government paying insufficient attention to privacy issues in its initial proposals for outsourcing government functions, Mr Fahey was apparently keen to avoid being bitten twice by the same issue.
The Office of Government Information Technology (OGIT) did consult privacy groups, but only at the eleventh hour in the development of its blueprint for the GPKA. The Gatekeeper Report (Gatekeeper — A Strategy for Public Key Technology Use in the Government, May 1998) recognises and ignores privacy issues in about equal measure. The brief chapter on privacy reads as something of an ‘add-on’ (though a valuable one), and its recommendations are watered down in later parts of the Report. Gatekeeper doesn’t shut the door on privacy, but leaves it ajar, pending further developments.
The GPKA is to oversee the use of encryption technologies in communications by and to Commonwealth government agencies (the Government Public Key Infrastructure — GPKI). It will affect uses for both authentication (digital signatures) and key exchange (message encryption for confidentiality purposes). The GPKA will publish and administer accreditation standards for Certification Authorities (CAs) that wish to provide public key facilities to government, and will recommend standards within the GPKI. It will work through the Chief Government Information Officer (CGIO), head of OGIT. The GPKA is to be subordinated to the root certification authority (PARRA) of a National Public Key Infrastructure (NPKI) once an NPKI structure exists, a matter on which the National Office of the Information Economy (NOIE) is expected to report soon.
The GPKI is explicitly intended to cover communications between agencies and the general public, such as benefit recipients or applicants, not just business-to-government or inter-agency communications. Extensive consumer use of digital signatures in communications with government may still be some years away, but the GPKI structure being built now will determine how those interactions take place, and must be built with them in mind. However, consumer interests are rarely mentioned in the Report. Reflecting this narrow focus, Gatekeeper proposed a GPKA with four (voting) representatives from government agencies and two (non-voting, ‘advisory’) representatives from IT industry associations (AIIA and AEEMA), but no representatives of consumer and privacy interests. Some of the problems identified below may be a by-product of Gatekeeper’s focus on how the GPKI is intended to operate in relation to agencies, and to businesses contracting with government, rather than on intended policy toward consumers. These problems need fixing before the GPKI is constructed.
The Gatekeeper Report is not very explicit about when end-users will be required to use digital signatures in order to interact with government, beyond general references to the voluntary nature of their use. Digital signatures have the potential to become an internal passport to cyberspace ‘existence’, with the dual dangers of reducing the small scope for anonymity that remains and of increasing the potential for data matching and profiling through the consistent presence of a digital signature. Genuine freedom to communicate and transact with government without the use of digital signatures needs to be preserved, except to the extent that demonstrated public interest considerations require otherwise. Assurances of ‘voluntariness’ and choice are too vague unless we know what they mean: do they mean only that people may choose to communicate with government by non-electronic means, so that anyone who communicates with government electronically must use a digital signature (at least for some as yet unspecified purposes)?
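The data matching danger can be illustrated with a minimal sketch. The agency names, record fields and certificate fingerprint below are invented for illustration only; the point is simply that any stable identifier attached to every transaction allows records from unrelated agencies to be joined.

```python
# Hypothetical sketch: a consistent digital signature (reduced here to a
# certificate fingerprint) lets unrelated agency records be matched.
# All names and fields are invented for illustration.
from collections import defaultdict

tax_records = [
    {"cert_fingerprint": "ab:12", "event": "lodged return", "year": 1998},
]
benefits_records = [
    {"cert_fingerprint": "ab:12", "event": "claimed allowance", "year": 1998},
]

def match_by_certificate(*record_sets):
    """Group records from different agencies by the certificate that signed them."""
    profile = defaultdict(list)
    for records in record_sets:
        for rec in records:
            profile[rec["cert_fingerprint"]].append(rec["event"])
    return dict(profile)

# A single fingerprint ("ab:12") now links activity across both agencies.
print(match_by_certificate(tax_records, benefits_records))
```

Multiple key pairs and pseudonyms (discussed below) break exactly this linkage, because each agency then sees a different fingerprint.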
Since the Gatekeeper Report, OGIT has clarified that:
There will be no requirement for users to formally identify themselves electronically if there is no current requirement to identify themselves in the ‘conventional’ environment.
How will the Privacy Act 1988’s guarantee of collection of the minimum necessary personal information by government apply to government requirements for use of digital signatures? Does the Proportionality Principle of the OECD security policy Guidelines limit the scope of where digital signatures may legitimately be required? These legal and policy issues are not explicitly addressed by Gatekeeper. OGIT’s statement is a good start, but it is important that the legal issues are clarified, and the policy of voluntary digital signature use is more precisely defined and explicitly guaranteed.
Mr Fahey was quite explicit that individuals ‘may, if they wish, possess multiple key pairs’, ‘with a variety of different labels or pseudonyms’. Gatekeeper’s privacy chapter gives general endorsement to the use of multiple key pairs and pseudonyms, and recognition of the anonymity principle, but the application later in the Report qualifies this by stating that such transactions ‘may be undertaken’ after a risk analysis. There needs to be a clear government policy that the GPKI should allow anonymous and pseudonymous transactions wherever possible.
Gatekeeper states a ‘commit[ment] to a clear distinction in the use and application of PKI into two categories, authentication and key exchange’, and only separate key pairs for the two purposes will be endorsed. The dilemma for governments is that they want to maximise key recovery facilities for keys used for key exchange (to assist investigations), but must avoid any key recovery for authentication keys, because recovery would compromise the non-repudiation of those keys when it comes to prosecution (or when others attempt to enforce contracts). Unfortunately for governments, business and consumers don’t have such a strong interest in keeping key pairs separate. Requiring separate key pairs makes it easier to institute surveillance measures in relation to key exchange (message encryption) while leaving the authenticity of digital signatures intact.
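Why separation serves the surveillance interest can be shown with a toy sketch (this is not real cryptography: the tiny RSA primes and the escrow step are illustrative assumptions, not anything Gatekeeper specifies). With two independent key pairs, a recovery scheme can hold a copy of the confidentiality key without ever gaining the ability to forge signatures.

```python
# Toy sketch (NOT real cryptography): two independent RSA key pairs, one for
# signing (authentication) and one for key exchange (confidentiality).
# Tiny textbook primes are used purely for illustration.

def make_keypair(p, q, e=17):
    n = p * q
    d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (modular inverse of e)
    return (e, n), (d, n)               # (public key, private key)

# Two separate pairs, as Gatekeeper requires:
sign_pub, sign_priv = make_keypair(61, 53)   # authentication only
kx_pub, kx_priv = make_keypair(73, 79)       # key exchange only

# Signing with the authentication pair, verifying with its public half:
message = 42
signature = pow(message, sign_priv[0], sign_priv[1])
assert pow(signature, sign_pub[0], sign_pub[1]) == message

# A key-recovery scheme could escrow kx_priv (confidentiality) without any
# ability to forge signatures, because sign_priv is a distinct, unescrowed key.
escrowed_copy = kx_priv
assert escrowed_copy != sign_priv
```

If a single key pair served both purposes, escrowing it for message decryption would necessarily also hand over the power to sign, destroying non-repudiation; separation removes that obstacle to key recovery.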
Private keys used for key exchange within the GPKI ‘must comply with relevant government policy’, says Gatekeeper, but it is ambiguous whether this means that key recovery must be available. Mandatory key recovery is a hot issue in many countries, but one that Gatekeeper seems to fudge. The requirement that private keys used for communication with government (but which may be used for many other communications) must comply with (unknown) government policies may be a device to extend the reach of government-approved key recovery schemes far beyond transactions with government. Will business and consumers be able to use, for transactions with government, key pairs that they also use for other communications? Could (or should) this be stopped?
The OECD Guidelines for Cryptography Policy, while noting that processes for lawful access to cryptographic keys must distinguish keys that are used only for confidentiality, has as one of its most general principles:
Government controls on cryptographic methods should be no more than are essential to the discharge of government responsibilities and should respect user choice to the greatest extent possible.
How the Gatekeeper recommendations meet this requirement needs more explanation and justification. OGIT insists that the GPKI ‘is not setting standards for encryption, or key recovery’, but it clearly is doing so within the GPKI, and may do so de facto on a wider canvas.
Gatekeeper says nothing about controls on access to information relating to Certificate Revocation Lists (CRLs), other than a general but useful statement that CRLs should be dispersed, not centralised. The purpose of CRLs is to provide public notice of whether and when a certificate has been revoked, and investigative agencies will have access to this information on the same basis as anyone else. However, there is far more valuable information for investigative purposes which is not likely to be part of the public CRL, namely the logs of all inquiries that have been made as to whether a certificate has been revoked. These logs, which CAs might keep in case of disputes about reliance upon their CRLs, will provide some history of who has been relying upon a person’s digital signature, and when, thereby assisting in building up a profile of that person’s activities. They are the equivalent of ‘call data’ in the telecommunications context, and access to them by investigative agencies should be controlled, by a requirement for a warrant or by some similar independent control.
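The distinction between the public CRL and the CA's private inquiry log can be sketched as follows (the record fields, party names and serial numbers are invented; real CA logging practices will vary):

```python
# Hypothetical sketch: the public CRL versus the CA's private log of CRL
# inquiries. All names, serials and fields are invented for illustration.
from datetime import datetime, timezone

crl = {"cert-0042"}   # public: serials of revoked certificates

inquiry_log = []      # private: who checked whose certificate, and when

def check_revocation(relying_party, cert_serial):
    """Answer a public CRL query, but record the inquiry for dispute purposes."""
    inquiry_log.append({
        "relying_party": relying_party,
        "cert_serial": cert_serial,
        "time": datetime.now(timezone.utc).isoformat(),
    })
    return cert_serial in crl

check_revocation("bank", "cert-0099")
check_revocation("landlord", "cert-0099")

# The public answer reveals little; the log reveals who has been relying on
# the holder of cert-0099, and when -- the 'call data' of the PKI.
print([entry["relying_party"] for entry in inquiry_log])
```

Each query is innocuous on its own; accumulated, the log maps the certificate holder's dealings, which is why access to it warrants the same independent controls as telecommunications call data.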
CRL logs are only one aspect of the sensitive personal information that will be held by private sector Certification Authorities as part of the GPKI. Detailed proof of identity (POI) information will also be held by them, but Gatekeeper’s technical standards cannot give consumers an effective remedy against such private sector bodies collecting excessive information.
The Privacy Act 1988 needs to be extended to cover CAs in their activities as part of the GPKI. The ‘outsourcing’ legislation currently before Parliament to extend the Privacy Act to private sector contractors should be amended where necessary to achieve this.
Gatekeeper apparently accepts a choice of key generation methods, in line with the OECD guidelines, but this may be made illusory by the device of shifting liability for any choice of un-endorsed methods of key generation to those who use such methods. Gatekeeper states that ‘the user will have to accept liability arising from such a choice’ and requires ‘the burden of liability clearly moved to the individual or entity generating the keys’. To the extent that such shifting of liability is unjustified by the risk to government, the supposed choice is illusory. Depending on what liabilities may be shifted to a consumer or business, this may be a major consumer and business issue which could undermine confidence in the fairness of the GPKI. The government needs to provide considerably more clarification of what this burden-shifting means.
Gatekeeper’s ‘Legal Issues’ chapter ignores contractual and non-contractual relationships between CAs and end users (consumers or businesses), as if these were irrelevant to government concerns. When consumers or businesses place reliance on CAs, with whom they may or may not have contractual relationships, in order to be able to deal with government, the fairness of this legal relationship is the concern of government.
By a combination of being ‘first cab off the rank’ in establishing a vital part of Australia’s PKI, and the inconvenience of maintaining one standard for dealings with government and another for all other dealings, one effect of the GPKI is likely to be that the standards it sets become the norm for all types of authenticated and encrypted transactions. In many ways this may be no bad thing, but it can be a subtle device for obtaining methods of surveillance that would be difficult to achieve directly through legislation. The privacy issues in relation to the GPKI are therefore more important than their immediate context indicates. Public policy considerations such as privacy should drive the development of PKI, not be treated as add-ons, risk factors or PR considerations. Gatekeeper does not provide sufficient answers to these concerns, and more detailed statements of government policies, guarantees, and some legislation, are needed.
Since Gatekeeper was released, I have been appointed by the government as a (non-voting) member of the GPKA, representing consumer and privacy interests. It is up to governments to find answers to these difficult privacy and consumer issues, but I will try to make it more difficult for them to be ignored.
Graham Greenleaf, General Editor. (email@example.com)
(Details of the GPKA, and the Gatekeeper Report, are at http://www.gpka.gov.au/)