Privacy Law and Policy Reporter
Most of the world's privacy laws are based around sets of (variously named) privacy principles which formally derive largely from two sources: the OECD privacy Guidelines (1980)1 and the Council of Europe privacy Convention2. There has been a lot of enthusiasm lately for the latest progeny of this legislative line, the European Union Privacy Directive of 1995, and the legislation it has influenced (before and after birth) in Québec, NZ, Taiwan, Hong Kong and elsewhere3. Australia's Privacy Act 1988 was one of the most important early embodiments of this approach outside Europe, and the current proposals to extend the Act to the private sector (see 3 PLPR 81) do not involve significant changes to the principles.
In most respects, these developments are consolidations of the privacy instruments of the 1980s (and thinking of the 1970s)4. Such legislation is highly desirable, but the principles it embodies have always contained deficiencies, and may no longer provide adequate protection against new surveillance technologies and administrative practices.
This paper takes a critical look at the content of the privacy principles of the 1980s, and at how three `1990s' instruments both reflect and go beyond the limitations of the 1980s model: The Canadian Standards Association's Model Code for the Protection of Personal Information (1995); the Australian Privacy Charter (1994); and the EU privacy Directive (1995).
One of the best and earliest formulations of this approach was by James Rule and colleagues in The Politics of Privacy (1980)6, who described the development of `the efficiency criterion' in the development of US and other privacy laws:
By this [`efficiency'] criterion, surveillance is considered acceptable provided that four conditions are met: first, that personal data are kept accurate, complete, and up to date; second, that openly promulgated rules of `due process' govern the workings of data systems, including the decision-making based on the data; third, that organisations collect and use personal data only as necessary to obtain `legitimate' organisational goals; fourth, that the persons described in the data files have the right to attest adherence to these principles.

As Rule et al conclude, this is a most opportune definition of `privacy protection' if you are an organisation interested in surveillance:
By these criteria, organisations can claim to protect the privacy of those with whom they deal, even as they demand more and more data from them, and accumulate ever more power over their lives.

This emphasis on making surveillance systems operate more justly and openly -- `efficiently' -- obscures the broader issues of the extent of surveillance that a democratic society should accept. What degree of surveillance is too intrusive, unforgiving or dangerous, irrespective of how fairly and openly it is done?
My argument, and Rule's, is that, without discounting the value of the `efficiency' elements of Privacy Principles, the more important criterion against which any Privacy Principles or laws should be measured is their capacity to place limits on the extent of surveillance carried out and, where appropriate, to stop proposed surveillance altogether.
This `limiting' or `critical' capacity of Privacy Principles (there isn't even an accepted term for it) is crucial because, if absent, any proposed surveillance scheme, no matter how broad in scope or intense in its intrusiveness, can be carried out fairly and openly. Under these circumstances the objective function of Privacy Principles, laws, and Commissioners will be, very often, to give legitimacy to surveillance schemes which they would otherwise lack, by pronouncing that such schemes do in fact comply with all the Privacy Principles. `The Privacy Commissioner will be kept fully informed' is the refrain often heard when politicians want to blunt public disquiet about some proposed surveillance measure -- if they know there is nothing the Commissioner can do to stop it.
So the question we need to ask about Privacy Principles, both those now in existence, and those proposed for the 21st century, is `to what extent do they act as smokescreens for surveillance?'.
The CSA Model Code7

According to the CSA's 10 principles, the organisation concerned is the only party that has a role in identifying the purposes for which personal information is collected (principle 2). Once an organisation has determined its purposes, everything flows from this: collection is limited to those purposes (principle 4); use, disclosure and retention are limited by those purposes (principle 5); and `data quality' (accuracy, completeness and currency) is to be appropriate to those purposes (principle 6).
The Code has nothing to say about any limitations on how broad or multi-purpose these purposes may be, or how inherently intrusive a purpose may be. One strength of the Code is that the only exceptions to use and disclosure that it allows are `with the consent of the individual or as required by law'. However, it is noticeable that there is no similar requirement of compliance with law explicitly stated in relation to the determination of the purposes of collection of information8 -- the assumption seems to be that this will normally be unconstrained.
The CSA Code adds the classic `efficiency' criteria of accountability (principle 1), security safeguards (principle 7), openness (principle 8), individual access (principle 9) and rights to challenge compliance (principle 10). Other than `finality' (principle 5), the Code does not impose any limitations on surveillance which go beyond the `efficiency criterion', with two possible exceptions.
There is a requirement of individual `knowledge and consent' (principle 3) to collection, use or disclosure of information, but both knowledge and consent are conditioned by the words `except where appropriate'. The Code indicates that the organisation concerned is to determine `appropriateness', but the Notes to principle 3 give examples of where consent might not be necessary. The openness of `appropriateness' is at least a point here at which it could be argued that individuals or third parties such as governments should have some say -- it is a potentially `limiting' or `critical' element in the Code, but one whose significance is not clear.
The consent principle also includes the valuable commentary that an organisation should not, as a condition of the supply of a product or service, `require an individual to consent to the collection, use or disclosure of information beyond that required to fulfil the explicitly specified, and legitimate purposes'.
The Code also includes the requirement of organisational `openness' about its personal data practices (principle 8), familiar since the OECD Guidelines. Openness is always two-faced: it is no formal limit on an organisation's surveillance practices to disclose what it does (if anyone asks), and it mutes opposition to say that there is nothing secret going on; but on the other hand information about practices is usually a pre-condition to effectively opposing them by political means, so `openness' has a critical edge.
In summary, there is relatively little in the CSA Model Code that goes beyond Rule's `efficiency criterion', other than the `finality' principle, and perhaps greater disclosure requirements. By itself, the Code may serve a valuable role in sanitising surveillance, but is less likely to serve to limit its growth and extent.
A rejoinder to this argument is that the CSA Code is not intended to operate in a vacuum, and that there may well be other external constraints, such as legislation, which limit the purposes for which information may be collected under principle 4. The commentary in the Code does mention that any applicable legislation must be considered in implementing it, and it does say that it is only a set of minimum requirements9. However, the discussions of the relationship between the Code and legislation that I have seen do not usually mention the need for such legislation to extend the principles in the Code, or stress the partial nature of what it covers10. Something that claims to be a `Model Code' should, at least within the wording of its ten Principles, recognise and indicate the source of other constraints that are crucial to its subject matter. It is at best a Model Code for part of `the Protection of Personal Information' -- arguably the less important part -- and it should make this clear.
Is it a good idea to call for the immediate legislative implementation of the CSA Code11? This is `the pragvocate's12 dilemma': is it better to have a valuable set of protections which ensure that surveillance will operate more fairly when it occurs, even if by doing so you increase the likelihood that it will occur, by providing a device which can legitimate surveillance but has little capacity to limit it? I suspect that it is a good idea, because fairness, openness and finality are so valuable in themselves, but it needs to be coupled with an energetic effort to obtain the other, more political, controls that are needed.
The Australian Privacy Charter13

The Charter does not have the same pedigree as the CSA Model Code, which had its genesis in hard negotiations between a wide variety of industry representatives and privacy advocates14. There are superficial similarities: the Charter came out of a process which involved participation by some industry representatives, and credit reporting and telecommunications industry representatives were involved throughout15. With these exceptions, however, the Charter drafting group consisted mainly of Australian privacy advocates and experts16, including senior staff of privacy agencies. It is best described as a `privacy advocates' Charter'. As a `log of claims' by privacy advocates, it is valuable for that reason alone, as I will mention later.
The Charter contains all the `efficiency' and `finality' principles found in the OECD Guidelines and the Australian Privacy Act 1988, but its 18 Privacy Principles go further.
Acceptable `purposes', public interest and consent
The Charter addresses the question of defining the acceptable purposes of surveillance at the outset, but obliquely, by stating that potentially privacy-invasive systems should not be introduced `unless the public interest in so doing outweighs any consequent dangers to privacy'.
Principle 1 -- Justification & exceptions

Technologies, administrative systems, commercial services or individual activities with potential to interfere with privacy should not be used or introduced unless the public interest in so doing outweighs any consequent dangers to privacy.

Exceptions to the Principles should be clearly stated, made in accordance with law, proportional to the necessities giving rise to the exception, and compatible with the requirements of a democratic society.

The Charter also requires a `precise' purpose for collection to be specified (principle 11), though how `precise' remains undefined.
The Charter also rejects `consent' as the sole touchstone of legitimacy:
Principle 2 -- Consent

Individual consent justifies exceptions to some Privacy Principles. However, `consent' is meaningless if people are not given full information or have no option but to consent in order to obtain a benefit or service. People have the right to withdraw their consent.

In exceptional situations the use or establishment of a technology or personal data system may be against the public interest even if it is with the consent of the individuals concerned.

The self-defined purposes of the organisation are clearly not the determining factor in these Principles. The Charter does not attempt to address how the Principles are to be implemented, so it does not prescribe a process for determining the public interest considerations.
It is also significant that the Charter requires the legitimacy of administrative systems and commercial services to be justified at the `system' level. In contrast, the `purposes of collection' approach of the CSA Code and similar instruments focuses only on the collection of information, and obscures the fact that it is the purposes of establishing whole surveillance systems that need justification, not just the purposes of collection.
Other new or `critical' elements in the Charter
Three other Principles in the Charter go beyond the usual formulations from the 1980s, and were considered to deserve separate statement, though it could be argued that some or all of them are implied by other Principles.
Principle 10 -- Anonymous transactions

People should have the option of not identifying themselves when entering transactions.

Principle 17 -- Public registers

Where personal information is collected under legislation and public access is allowed, these Principles still apply except to the extent required for the purpose for which public access is allowed.

Principle 18 -- No disadvantage

People should not have to pay in order to exercise their rights of privacy described in this Charter (subject to any justifiable exceptions), nor be denied goods or services or offered them on a less preferential basis. The provision of reasonable facilities for the exercise of privacy rights should be a normal operating cost.

In the two years since it was launched, the Charter has had some influence in Australia. It is referred to in reports by regulatory bodies, and in speeches by the Australian Privacy Commissioner. It was recognised as a major source of inspiration and content for a privacy Bill introduced into the Australian Parliament by the Australian Democrats shortly before the 1996 Australian election17. It has been adopted as an `in house' code by a few private sector organisations.
Despite these encouraging signs, the Charter will almost certainly have less impact and less widespread implementation than the CSA Code, for many reasons. One of those reasons, however, is precisely that it calls into question the surveillance interests and practices of any organisation, in a way that the CSA Code does not.
The EU privacy Directive18

The success of the Directive's extra-jurisdictional `scare value' also poses a problem for those of us outside Europe, as local legislators may accept the Directive as the definitive world standard of what is needed to protect privacy. For example, the NSW Attorney-General announced his intention to introduce privacy legislation on the basis that it would make NSW `the first state to meet the all-important [EU Directive], making NSW a magnet for business ventures'19. The Directive's wording has also had a direct influence on which personal data exports from Australia are to be allowed, not just on the acceptance of the need for restrictions on the export of personal data20.
Therefore, while accepting the legal and political value of the Directive within and outside Europe, we must still ask `to what extent does it embody a set of privacy principles which are adequate for the 21st century?'. This paper's contribution to this debate is simply to ask whether the Directive requires laws which go beyond `efficiency' considerations.
`Purposes' and the Directive
The key provision in the Directive concerning the purposes for which surveillance systems may be established is Art 7, which defines legitimate `processing' of personal data; `processing' includes the collection of personal data, as well as its storage, use and disclosure (Art 2(b)). Processing is only legitimate if it comes within one of the conditions in Art 7, so there is no assumption that organisations can simply make internal decisions about the purposes of collection. Legitimacy can be founded on the consent of the data subject, on the need to protect the vital interests of the data subject, on the legal obligations of the controller, or on necessity in the public interest.
However, where it is only the interests of the operator of the surveillance system that provides the justification, legitimacy is limited to where it is `necessary for the purposes of the legitimate interests pursued by the controller or by the third party or parties to whom the data are disclosed, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject ...'. Article 7 does not elaborate on how this balancing is to be achieved, but the preamble says that member states remain free to determine the appropriate balance in relation to use of information for `legitimate ordinary business activities' and conditions of disclosure for marketing purposes. Many of the most contentious privacy decisions are therefore still left to the member states to make, but the important thing is that the Directive explicitly includes `the interests or fundamental rights and freedoms of the data subject', and other public interests, as factors determining the legitimate `purposes' of surveillance systems.
Is this EU requirement `export quality'? Could a jurisdiction whose privacy protection was based on principles that left the determination of the legitimate purposes of collection (and, therefore, of `processing') solely to the self-assessment of the surveillance organisation be considered to have `adequate' privacy protection in EU terms21? Let's hope not.
Other critical provisions in the EU Directive
There are some other aspects of the Directive's substantive content which go beyond `efficiency' or define principles beyond the models of the 1980s. Those worth noting include:
(i) Where information is obtained from third parties, rights to be informed of matters such as the purposes of collection, obligatory nature thereof, intended recipients, and subject rights (A11);
(ii) Rights to have such corrections, erasures or blocking (suppression) of data communicated to third parties to whom the data has been disclosed (A12);
(iii) Rights to object to processing on `compelling legitimate grounds' (A14(1)), and an opportunity to object to data being used for direct marketing (by various forms of `opting out'22) (A14(2));
(iv) The subject's right not to be subjected to decisions which significantly affect him or her and which are based solely on automated processing intended to evaluate personal aspects relating to the individual23, except where pursuant to a contract or legislative authority, and where there are suitable measures to safeguard the data subject's legitimate interests (A15). The subject's right of access must also include a right to know `the logic involved' in any such automated decisions (A12(1)).
(v) National laws are to specify `processing operations likely to present specific risks', so that `prior checking' of such systems by the supervisory authority can occur (A20)24.
(vi) A general prohibition on the processing of personal data `revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership' and health or sex life (A8(1))25, and even stricter provisions concerning data on offences or `security measures'26.
Many of these elements do, of course, derive from national privacy laws of particular EU member states, but in that form little attention was paid to them outside Europe except by privacy aficionados. Their `percolation up' into the EU Directive has the valuable effect that, properly handled, they can gain the attention of legislators in places like Australia and Canada -- and perhaps, one day, even in the US!
A toast to the EU Directive
Moderate consumption of Australian bubbly (we can't call it `Champagne' because of other EU initiatives and litigation) is an appropriate response to the principles of the EU Directive. They may be riddled with exceptions and weakened by unenthusiastic national implementations, and they don't address many of the emerging `21st century issues', but they do go a good distance beyond a mere concern with surveillance `efficiency'.
Graham Greenleaf, General Editor.
1. Organisation for Economic Cooperation and Development's Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (OECD, Paris, 1981).
2. Convention for the Protection of Individuals with Regard to the Automatic Processing of Personal Data (Convention No 108), in force since 1985.
3. See details in Greenleaf G `A privacy code for Pacific cyberlaw' Journal of Computer Mediated Communications, (Special Issue on Emerging Law on the Electronic Frontier') Vol 2 No 1 (1996), located at http://shum.huji.ac.il/jcmc/vol2/issue1/index.html
4. See the article by John Gaudin (p 43) for comments on the history of the OECD developments.
5. I use `surveillance' in a neutral or non-pejorative sense, `to indicate monitoring for all sorts of purposes, both helpful and coercive', following Rule et al, op cit (below) p 47.
6. James Rule, Douglas McAdam, Linda Stearns and David Uglow The Politics of Privacy, Mentor Books, 1980.
7. National Standard of Canada CAN/CSA-Q830-96; see also Colin Bennett `PLUS 8830 -- Implementing Privacy Codes of Practice', Canadian Standards Association, 1995; see also Colin Bennett `Privacy codes, privacy standards and privacy laws: The instruments for data protection and what they can achieve' Visions for Privacy in the 21st Century: A Search for Solutions, Laurel Point Inn, Victoria, British Columbia, 9-11 May 1996
8. See below concerning the general statement which appears in the Code's commentary.
9. See `1. Scope' -- Canadian Standards Association Model Code for the Protection of Personal Information, 1995.
10. See for example, Bennett 1996 op cit Pt III `The Value-added of Data Protection Law'
11. Such calls have been made by the Canadian Direct Marketing Association (CDMA): see Bennett 1996 op cit p 11.
12. `Pragmatic advocate', coined by Simon Davies.
13. See <2 PLPR 44> for the full text of the Charter.
14. See Bennett 1996 op cit.
15. See Tim Dixon `Privacy Charter sets new benchmark in privacy protection' <2 PLPR 41>.
16. Including the writer.
17. See <2 PLPR 162>.
18. Directive of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data, Directive 95/46/EC of 24 October 1995.
19. See G Greenleaf ` `Revolutionary' NSW Bill to set the agenda' (1996) 3 PLPR 17.
20. See Commonwealth Attorney-General's Department Discussion Paper -- Privacy Protection in the Private Sector, 1996, p 20.
21. For a discussion of the meaning of `adequacy' of non-EU privacy laws, see G Greenleaf `A privacy code for Asia-Pacific cyber-law' (1996) Journal of Computer-Mediated Communication, V2 No1, `Emerging Law on the Electronic Frontier', http://www.usc.edu/dept/annenberg/vol2/issue1/, and more briefly in G Greenleaf `European privacy Directive and data exports' <2 PLPR 105>.
22. National laws can provide either for objection after the data subject has been informed that the data is to be used for direct marketing, or merely at the data subject's request.
23. The 1990 draft was limited to decisions `involving an assessment of conduct', and referred to `personality or profile'. The Parliament recommended that this only apply to assessments of `character', that there should be an exception where there is consent, but that there would be a right to be informed of and to challenge any such automated processing. The 1992 draft referred to processing defining a personality profile.
24. The authority must be notified of such proposed operations by the controller or the data protection official (A20(2)).
25. Subject to numerous exceptions: A8(2)-(4).
26. These can only be kept under official authority: A8(5).