Privacy Law and Policy Reporter
Graham Greenleaf

The Commonwealth Attorney-General's announcement of the extension of privacy laws to the private sector (see 3 PLPR 81) did not propose any changes to the 11 Information Privacy Principles (IPPs) set out in s 14 of the Australian Privacy Act 1988. The Government's Discussion Paper Privacy Protection in the Private Sector (September 1996) noted that they would `form the basis' of the new law and that they `were drafted to apply equally to both the public and private sectors'.
It is therefore worth considering some of the consequences if the existing Privacy Principles became enforceable under Australian law in relation to private sector Internet service providers (ISPs), and Internet users, both in Australia and elsewhere. What is likely to happen if they are expected to comply with, and rely on, privacy laws and principles which have their origins `BC' (`before cyberspace') and have been drafted with little or no thought given to any special problems the Internet may raise?
The s 14 IPPs are `first generation' IPPs of the early 1980s, deriving in part from the OECD's privacy guidelines and the Australian Law Reform Commission's Report on Privacy (No 22, 1984), but with some significant strengthening in the aftermath of the `Australia Card'. They are less comprehensive than more recent formulations such as the Australian Privacy Charter (discussed later), but are similar to the privacy principles found in many other national privacy laws.
This article aims to raise some, but by no means all, of the issues which need to be resolved in drafting `second generation' sets of IPPs, privacy principles suitable for cyberspace, by using the s 14 IPPs as a starting point for criticism.
Throughout the article `ISP' is used to refer to any provider of services or content, unless a more specific usage is apparent from the context.
There are at least four possibilities even in the easiest case, that of e-mail addresses:
It is therefore a question of fact whether an individual's identity can be ascertained (though only with a degree of probability) from transactional details where only an e-mail address was collected, and it is a further question of fact whether it can `reasonably' be so ascertained. The fact that e-mail addresses can easily be `spoofed' also needs to be considered, though this situation will change with increased security measures on the Internet.
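The ease of `spoofing' follows from the fact that mail headers are merely sender-supplied data which nothing in the mail protocols verifies. A minimal Python sketch (the addresses are hypothetical, and the helper name is ours) illustrates that the `From' header is simply whatever the sender chooses to write:

```python
from email.message import EmailMessage

def spoofed_message(claimed_from, to, body):
    """Construct a mail message claiming an arbitrary sender.

    SMTP does not check the From header against the connection
    that actually delivers the message, so any value can be claimed.
    """
    msg = EmailMessage()
    msg['From'] = claimed_from   # unverified: just data supplied by the sender
    msg['To'] = to
    msg['Subject'] = 'hello'
    msg.set_content(body)
    return msg
```

A transactional record keyed to such an address is therefore only probabilistic evidence of who sent the message.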
Every computer connected to the Internet has unique addresses (a numerical `IP address' such as 18.104.22.168, and a name under the Domain Name System such as bondi.austlii.edu.au). This applies equally to computers which have many users (either simultaneously or singly) and to machines which normally have only a single user (such as the one on a person's desk at work or home). These addresses are the basis of some Internet protocols, most importantly the hypertext transfer protocol (http) which is the basis of the world-wide-web. Machine addresses (for example, law34.law.unsw.edu.au or arkady.austlii.edu.au) rarely directly identify a person, though this is not impossible. However, the IP address and the domain and sub-domain information (for example, austlii.edu.au) allow the geographical and system location of the computer to be identified, along with the identity of the person with delegated responsibility to allocate the machine's addresses.
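How much locating information a machine name carries can be seen by reading off its chain of parent domains, each of which identifies an organisation or jurisdiction responsible for the names beneath it. A minimal sketch (the helper name is ours; the host name is one of the examples above):

```python
def parent_domains(fqdn):
    """Return the chain of domains above a host name, most specific first.

    Each element names a progressively wider administrative unit:
    the organisation, then the sector ('edu'), then the country ('au').
    """
    parts = fqdn.lower().rstrip('.').split('.')
    return ['.'.join(parts[i:]) for i in range(1, len(parts))]
```

So even without identifying any individual, the name alone locates the machine within an organisation and a country.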
It is therefore a question of fact whether an individual's identity can be ascertained (though only with a degree of probability) from transactional details where only a machine address was collected, and it is a further question of fact whether it can `reasonably' be so ascertained, in the absence of generally accessible Internet facilities to match machine addresses with individuals. However, it would not be difficult for any police officer, private investigator or other person to make inquiries to determine who (if anyone) was the predominant user of a particular computer.
The scope of the Privacy Act's definition of `personal information' is therefore problematic in its application to cyberspace, and will require clarification by legislation or Commissioner's rulings, if the IPPs are to have any consistent or sensible application to cyberspace.
The approach of this definition misses the point to some extent. Information about, for example, the interests, understanding or consumption habits of a particular person can be aggregated by an Internet service provider (or providers), by use of e-mail or machine addresses, for purposes such as e-mailing customised direct marketing materials to that address, or to customise the appearance of a web page so as to appeal most to requests which come from a particular machine address. It makes no difference whether the ISP can `reasonably ascertain' the identity of the person who is associated with either the e-mail address or the http request, because the information about their consumption habits has been aggregated and used to market back to them, without them necessarily being aware of this or having consented to it. More serious consequences may also follow from such aggregation, such as decisions to limit access, or to deny some goods or services. If the definition of `personal information' excludes such activity, IPPs will be very weak in cyberspace.
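The kind of aggregation described above requires nothing more than a web server's ordinary access log, in which every request is already keyed by machine address. A minimal sketch (the log lines and helper name are illustrative, in the Common Log Format that web servers typically produce):

```python
import re
from collections import defaultdict

# First fields of a Common Log Format entry:
# remote-host identd user [timestamp] "METHOD path HTTP/x.y" status bytes
LOG_PATTERN = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:\S+) (\S+) [^"]*"')

def profile_by_address(log_lines):
    """Aggregate the pages requested by each machine address.

    The resulting profile of reading habits is built without ever
    ascertaining the identity of the person behind the address.
    """
    profiles = defaultdict(list)
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if m:
            host, path = m.groups()
            profiles[host].append(path)
    return dict(profiles)
```

The profile can then drive customised marketing back to that same address, which is precisely the activity a narrow definition of `personal information' would leave unregulated.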
Collection for a proper purpose
The principle that organisations must not collect personal information unless it is collected for a lawful purpose directly related to a function or activity of the organisation and is directly related to that purpose (IPP 1) has limited meaning in relation to commercial organisations which can define their functions and activities in broad and unconstrained ways. A broad definition at the outset can be used to avoid the `finality' principles imposing a significant limitation on later uses of personal information (discussed later).
Fair means of collection
The principle that the means of collection must be lawful and fair (IPP 1) could be interpreted to prohibit any surreptitious collection of personal information by Internet service providers, and also by employers and others with capacity for surveillance of Internet communications.
A major issue here will be whether the collection and aggregation of usage information identified by machine address, for the purpose of marketing uses (or research or other uses), will constitute collection by `fair means', and if not, what notice can be given to users so as to render it fair. A practical difficulty here is that web users may (usually) browse to any page on a web site, and do not necessarily start at some `front page', so questions of how to provide adequate notice -- and avoid giving it repeatedly to the same person -- raise difficulties not found in other contexts.
Informing users why information is collected
This principle will come as a shock to most ISPs. Before collecting information from the person the information is about (or as soon as practical thereafter), organisations must ensure that the person is generally aware of:
These requirements will apply most clearly whenever an ISP requires or requests a user of their facilities (typically, users browsing a web site) to enter name, address, e-mail address and other details into an online form.
The sting in this principle is that, having informed the user of the intended (internal) use of the collected information, and the intended (external) disclosures, the ISP is then bound by the finality principles to adhere to those stated purposes. Marketing and other uses must be disclosed `up front' or they will be a breach of the law.
This principle is currently limited to where `the information is solicited by the collector from the individual concerned'. It is therefore unlikely to apply where an individual browses a web site (thereby disclosing a machine address), unless the very provision of a web page (in a `public_html' directory after all!) is regarded as solicitation to disclose machine addresses to all those who care to access it (the `honey pot' interpretation of `solicit'). This issue will therefore have to be dealt with under the `fair collection' principle above (if at all).
Where an individual responds to an invitation on a web page to browse to some other site (which then captures the machine address), there may be solicitation -- but not necessarily by the party capturing the address! This weakness needs to be addressed.
The use principle
The current Act's `finality' principles provide that organisations may not use personal information for a purpose other than that for which it was collected (IPP 10), except:
(a) with the consent of the person;
(b) to prevent a serious and imminent threat to any person's life or health;
(c) as required or authorised by law;
(d) where reasonably necessary for the enforcement of criminal laws or revenue protection; or
(e) for a directly related purpose.
In the case of exception (d), but not otherwise, the organisation must keep a log of all such uses.
It is therefore crucial that ISPs define at the outset the purpose for which information is collected, and (where necessary) disclose this to users. Otherwise, any subsequent uses of the information will breach the law. These exceptions have been drawn with the interests of government agencies in mind. There is no exception, for example, for the `reasonably necessary protection of the interests of the organisation that controls the information'. The meaning of a `directly related purpose' is problematic, particularly in relation to marketing uses by the collecting organisation or its associated entities. Even the use of personal information captured in relation to suspected security breaches of a site might be problematic if it was not for the law enforcement exemption.
The disclosure principle
Organisations may not disclose personal information to anyone else (IPP 11), subject to the same exceptions (a)-(d) as apply to Principle 10. There is another exception, where the subject of the information is reasonably likely to be aware of the practice of disclosure. This is of crucial importance here, because it means that organisations can justify disclosures to third parties (for example, selling marketing lists) if they give `reasonable' disclosure of this to users.
The disclosure principle requires that the recipient of information under any of these exceptions may only use it for the purpose for which it was disclosed. This has not been of much importance as yet in relation to government agencies, but is likely to be important in relation to the private sector, if applied similarly. Recipients of personal information would be liable to actions taken against them by the subjects of the personal information, not only by the provider of the information, if they use it for purposes other than those for which it was disclosed.
It is odd that the disclosure principle does not, on its face, prohibit a recipient from making any use of personal information which should not have been disclosed at all. However, the `fair collection' principle (discussed above) says that information may not be collected by `unlawful means', and it is arguable that this applies to information obtained in breach of another IPP.
`Unauthorised access' and `interception' offences in relation to the Internet
IPP 11 does not seem to address the fundamental issue of the obtaining of personal information without any consent of the disclosing organisation (as distinct from the subject of the information). In the Internet context, both `hacking' into databases (that is, unauthorised access) and unauthorised extraction of data by otherwise authorised users pose significant dangers to privacy. Although the requirement of collection by fair and lawful means could be relevant (IPP 1), it only applies if the party obtaining the information is bound to comply with the IPPs -- which would not apply at present, in the case of a hacker.
These problems are, in fact, comprehensively covered by the `computer crimes' provisions in the Commonwealth Crimes Act 1914, Part VIA (Offences Relating To Computers),1 supplemented by provisions in the Telecommunications (Interception) Act 1979 (Cth) (discussed in 3 PLPR 93). It is a more serious category of offence if access is to data which the person knows or ought reasonably to know relates to the `personal affairs' of a person (see s 76D(2)(b) of the Crimes Act 1914).
Access and correction principles
These principles provide that a person has a right of access to personal information held by an organisation (IPP 6) and that organisations must make corrections, deletions and additions to personal information to ensure that it is accurate and relevant, up-to-date, complete and not misleading (given the purpose of collection and related purposes) (IPP 7).
ISPs that collect personal information about users may need to develop secure online means of providing access and handling complaints of inaccuracy etc, or they may find that the costs of providing access and correction facilities by other means detract from the efficiencies of their online operations.
Failure by an ISP to protect personal information against misuse by adopting reasonable security safeguards would be a breach of the law (IPP 4). So it is not only the `hacker' or unauthorised user who can breach the law (see above), but also those responsible for lax security.
This principle includes an obligation on the ISP to do everything reasonably within its power to stop authorised recipients misusing the information.
Under the existing principles, any person has a right to know (on request) whether an organisation holds personal information (whether about the person or not), and if so:
(a) its nature;
(b) the main purposes for which it is used;
(c) the classes of persons about whom it is kept;
(d) the period for which each type of record is kept;
(e) the persons who are entitled to have access to it, and under what conditions; and
(f) how to obtain access to it (IPP 5).
Each organisation must maintain an inspectable register of this information, and must inform the Privacy Commissioner annually of its contents. The Commissioner then publishes an annual compilation (the Personal Information Digest) that receives little use.
It is questionable whether any benefit would be served by general private sector `registration' of such information with any central authority. However, such `openness' requirements would impose relatively little burden on ISPs if they were allowed to publish the required details on the net. The Commissioner could simply create an index of links, or better still a searchable copy of all entries, and it would be more useful than the current Digest.
Data quality principles
Organisations must take reasonable steps to ensure that personal information is accurate, up-to-date and complete (given the purpose of collection and related purposes) before using it (IPP 8), and may only use personal information for purposes to which it is relevant (IPP 9).
A purpose limitation principle
The Australian Privacy Charter addresses the question of defining the acceptable purposes of surveillance (the `Achilles heel' of finality) at the outset, but obliquely, by stating that potentially privacy-invasive systems should not be introduced `unless the public interest in so doing outweighs any consequent dangers to privacy'.
It also requires a `precise' purpose for collection to be specified (Principle 11), though how `precise' remains undefined.
Principle 1 -- justification and exceptions
Technologies, administrative systems, commercial services or individual activities with potential to interfere with privacy should not be used or introduced unless the public interest in so doing outweighs any consequent dangers to privacy.
Exceptions to the principles should be clearly stated, made in accordance with law, proportional to the necessities giving rise to the exception, and compatible with the requirements of a democratic society.
The self-defined purposes of the organisation are clearly not the determining factor in these principles. The Charter does not attempt to address how the principles are to be implemented, so it does not prescribe a process for determining the public interest considerations.
The Charter also rejects `consent' as the sole touchstone of legitimacy:
Principle 2 -- consent
Individual consent justifies exceptions to some privacy principles. However, `consent' is meaningless if people are not given full information or have no option but to consent in order to obtain a benefit or service. People have the right to withdraw their consent.
In exceptional situations the use or establishment of a technology or personal data system may be against the public interest even if it is with the consent of the individuals concerned.
The Privacy Act's IPPs are deficient in not dealing with the way in which personal information which has previously been made available to the public in one form (usually written) is transformed by its provision on the Internet, including the use of search engines. This makes the more general issue of the secondary use of information available from public registers much more pressing.
The Australian Privacy Charter states one blunt solution:
Principle 17 -- public registers
Where personal information is collected under legislation and public access is allowed, these principles still apply except to the extent required for the purpose for which public access is allowed.
Anonymity and pseudonymity
While this principle could arguably be implied from the collection principles, the Australian Privacy Charter recognises the value, particularly in the cyberspace context, of an explicit right to anonymity.
Principle 10 -- anonymous transactions
People should have the option of not identifying themselves when entering transactions. Physical and virtual transaction structures should be required to give the option of anonymous or pseudonymous transactions, wherever reasonable.
The Privacy Charter also includes two other significant rights relevant to cyberspace:
Data export prohibitions in national laws
All European Union (EU) countries will soon prohibit the export of personal data to countries which do not have `adequate' privacy laws or meet other conditions (including individual consent).3 The privacy laws of Hong Kong, Taiwan and Québec also contain export restrictions.
Australian privacy laws should include export prohibitions, and the Australian Government's proposals suggest that such restrictions will be included (see 3 PLPR 83). They should prevent personal information held by Australian ISPs (or others) being transferred to countries such as the US where no effective privacy protections apply, unless there is explicit individual consent or other exceptions apply. However, it is very important that one of the main lessons of the Internet censorship debate should be learnt: no liabilities should be placed on Internet access providers who merely provide the communications and storage facilities -- only those who are responsible for the content should be liable.
Such restrictions will also increase the likelihood of our laws being regarded as `adequate' for the purposes of the EU Directive, and by other countries with similar restrictions, as they `close the loophole'.4 It is as yet unproven, but arguable, that strong privacy laws in a jurisdiction will serve more to attract businesses to locate there (for example, those with strong European links), than to deter them.
Need to provide protection for `foreigners'
Any extension of Australian privacy laws to cover the private sector should ensure that persons who live outside Australia have the same rights against Australia-based ISPs as do Australian citizens and residents.
Australia should try to ensure that the privacy laws of other countries extend the same protection to Australians. If any genuinely international privacy agreements emerge, one approach would be to include in such an agreement a requirement to extend such reciprocal national treatment to the citizens and residents of all state parties. However, it is questionable whether Australia should make the extension of such protection contingent on any international agreement, particularly as it may be necessary or desirable if Australia is to be considered to have `adequate' privacy laws in the terms of the EU Privacy Directive.
Limits of national laws -- international `good citizens'
Irrespective of what Australian laws say, a large proportion of Internet transactions from Australia will be with ISPs who are out of the effective reach of Australian law, situated in countries such as the US that have negligible relevant privacy laws. A high percentage of Internet privacy breaches against Australians are likely to be untouchable by our laws.
This is not an argument against the enactment of Australian privacy laws. The best that Australia can achieve is to be an international `good citizen' in cyberspace, looking after that part of the global problem which is in its backyard, and encouraging others to do likewise. Effective privacy protection in cyberspace must come from a patchwork quilt of national laws, eventually made more uniform by international agreements, and more enforceable through cooperation between national privacy authorities.
The `international black holes' argument is one variant of the argument that privacy laws dealing with cyberspace are futile because they can never be enforced -- at least not in the absence of massive surveillance of communications, which is hardly consistent with privacy protection. This argument can be rejected on two grounds. First, most regulatory laws are only ever partially enforced, but (if breaches are appropriately publicised and penalised) exemplary punishment can still have a strong deterrent effect. Second, the desire on the part of businesses to be law-abiding should not be underrated.
The Data Protection Commissioners' International Working Group on Data Protection in Telecommunications issued eight draft `guidance' principles in May 1996 (reproduced in this issue). From the `official' perspective, British Columbia Information & Privacy Commissioner David Flaherty puts it like this:5
I strongly believe that the Internet community needs to promote even more of a culture in which the tracking of digital footprints, by whatever method, is illegal, immoral and unethical without individual consent, despite all of the technological and commercial imperatives to the contrary.

The `Internet community' is also starting to respond, such as by the July 1996 formation of a wide coalition of interested groups, the Global Internet Liberty Campaign (GILC),6 who have included among their demands:
A note of caution on which to finish: John Perry Barlow once described the `European' predilection for relying on governments to protect privacy by laws and codes as `like having a peeping Tom install your window blinds'.7 While it is true that in many cases the best privacy protection is strong encryption, once our personal information is legitimately in the hands of a government agency or an Internet merchant, encryption is beside the point. International, accepted and enforceable privacy principles that make sense in cyberspace are then our best defence.
Graham Greenleaf, General Editor.
1. The complementary state and territory Acts will usually be unnecessary here, since Internet access requires use of telecommunications facilities: see ss 76D and 76E.
2. See 3 PLPR 88 for a discussion of the Robot Exclusion Standard; `Rogue' is used tongue-in-cheek because there are many robots that do not adhere to this voluntary standard at present, for technical reasons: see http://info.webcrawler.com/mak/projects/robots/active.html
3. See 2 PLPR 105, 127; see also G Greenleaf `A privacy code for Asia-Pacific cyberlaw' Journal of Computer-Mediated Communication , Vol 2, No 1, 1996, http://www.usc.edu/dept/annenberg/vol2/issue1/
4. See Greenleaf G, op cit, for a detailed discussion.
5. David Flaherty `Some reflections on privacy in electronic communications, with special reference to the Internet and the situation in British Columbia' Special Colloquium on `The Internet: Beyond the Year 2000', University of Toronto, May 1996.
7. John Perry Barlow `A pretty bad problem: foreword to PGP user's guide by Phil Zimmermann' http://www.eff.org/pub/Publications/John_Perry_Barlow/HTML/a_pretty_bad_problem_article.html.