Privacy Law and Policy Reporter (PLPR)

Gaudin, John --- "The OECD Privacy Principles -- can they survive technological change?" [1996] PrivLawPRpr 68; (1996) 3(8) Privacy Law & Policy Reporter 143


The OECD Privacy Principles -- can they survive technological change?

Part I

John Gaudin

In this article John Gaudin argues that technological changes require us to contemplate discarding the OECD principles and the data protection framework which has been built upon them. Proposals to amend Australia's federal privacy law, and to enact new State laws, make this a timely discussion. (General Editor).

The Organisation for Economic Cooperation and Development (OECD) Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (OECD, Paris, 1981) were adopted in 1980 after an intense process of discussion and development. They have gone on to influence the shape of most data protection legislation of the past 15 years.

The Guidelines did not articulate a wholly novel set of principles. They are essentially an exercise in compromise which draws on principles identified by the US Privacy Protection Study Commission as expressed in the 1974 US Privacy Act, which in turn have been traced to a series of five principles articulated in the US Code of Fair Information Practices developed by the Advisory Committee to the Department of Health, Education and Welfare.1 These principles were further developed during the late 1970s in successive drafts for a Council of Europe Convention on the Protection of the Individual vis-à-vis Automated Records.2 The OECD Data Bank Panel released a set of eight core principles in 1977, which were modified to assume their current form in a draft released in December 1978. Final agreement was reached in June 1979 and the Recommendations were accepted by the OECD in September 1980.

The impetus behind the Guidelines came from concerns that disparate national data protection legislation could create unreasonable obstacles to international flows of information. The Guidelines were intended to serve a dual purpose, to standardise national and local responses and to provide a basis for the regulation of international flows of personal data. They are premised on the desirability of a free flow of information between countries which is best served by clear and acceptable regulatory standards. They have gone on to influence the form of national data protection regimes in those states which had not adopted such regimes before 1981. In the process they have had to confront a range of practical issues which would not have been immediately evident in the international context which gave rise to them.

Because of the dual role of Justice Michael Kirby, as Chairman of the OECD Expert Group and Chairman of the Australian Law Reform Commission, it is scarcely surprising that the Guidelines played an influential role in the Law Reform Commission's 1983 Privacy Report which in turn influenced the shape of the 1988 Commonwealth Privacy Act.3 As the Australian states consider adopting complementary information privacy legislation and the Federal Government begins to contemplate the extension of the Privacy Act into the private sector, it seems a good time to ask whether the principles still reflect the best approach to the protection of personal information.

A variety of views have been expressed on this topic. Some commentators assert that the principles still provide an effective basic structure, and are sufficiently flexible to adapt to changing technologies and social processes. Others have suggested that the principles may no longer be appropriate, and that the European Union's Data Protection Directive provides a more up-to-date model to follow. The extent to which the Data Protection Directive represents a real advance on the Guidelines is a separate issue beyond the scope of this paper, and one which may only emerge within a longer-term perspective. It may be questioned whether the Directive marks a real break with the original principles, given that it was negotiated by data protection officials accustomed to thinking within the framework established by the OECD Guidelines and by the contemporary Council of Europe Convention. Article 6 of the Directive substantially incorporates the core OECD principles.

Technological change

It is trite to say that there have been many significant developments in information technology since 1981. The OECD experts contemplated a world in which flat-file databases were stored on mainframe computers by large government and corporate agencies and accessed from dumb terminals by human users using customised sequential data management programs. The personal computer only emerged as a viable business tool in 1982, and has undergone a process of intense development since then.

The process of networking has effectively been turned inside out, with intelligent terminals dominating the process of accessing data from a variety of servers, culminating in the complex phenomenon which is the Internet. Sequential programming is steadily being replaced by object-oriented approaches which assemble program modules from off-the-shelf libraries to process a variety of different data. Relational and object-oriented database structures are replacing the old flat-file databases, and databases themselves are being distributed over networks. A major effort is going into further automation of information retrieval, through expert systems based on artificial intelligence, data warehousing and the techniques known collectively as data mining or knowledge discovery in databases.4 Multimedia and geographical information systems are promoting a much more intensive overlay of data from a variety of sources.

Paralleling and interacting with these technical changes, there have been a variety of significant social developments: the decentralisation and restructuring of management in both the public and private sectors, the contracting out and privatisation of traditional public sector functions, the collapse of command economies, the globalisation of the media, and the spread of workplace automation and monitoring.

A detailed critique of the principles in the light of these changes is still lacking and this paper attempts to begin that process. I take it as axiomatic that any critique of the OECD principles needs to start from an assessment of the developments I have just outlined. In this article I wish to concentrate on the effect of technical change, rather than on the theoretical coherence of the principles themselves.

I anticipate the objection that guidelines in this area should be technologically neutral, and that what I am suggesting are guidelines which are more closely identified with specific forms of technology and which will therefore become even more rapidly obsolete. There are two possible answers to this objection. First, I would concede that guidelines should be technologically neutral, but argue that the existing Guidelines are deficient in this respect in that they implicitly assume particular forms of technology. Secondly, and more candidly, I would argue that technological neutrality is an ideal which may be striven for, but is ultimately impossible to attain.

My preferred position is an amalgam of these propositions. I do believe that the OECD Guidelines are a product of their time and reflect assumptions about the future development of information technology which we can now see to be limited.5 I also believe that the ideal of technological neutrality can be taken only so far: guidelines can encompass a variety of technical applications while still orienting themselves to the overall framework within which those applications are delivered. It is still valid to promote guidelines which are not specific to particular technologies but which adapt to the social and technical environment produced by an accumulation of technical changes.

Basic principles

Part Two of the Guidelines identifies eight `Basic Principles of National Application': collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation and accountability.

These principles are given context by definitions of the expressions `data controller', `personal data' and `transborder flows of personal data' in principle 1, by an attempt to specify limits on the scope of the Guidelines in principle 2, and by a series of principles which attempt to define how the basic principles should be given national and international effect.

To a casual observer, the basic principles may not be immediately identifiable in data protection legislation like the Australian Privacy Act 1988. This is because they have been grafted onto a `life cycle' or `information-processing cycle' model which is used to characterise the processing of data from its original collection from the individual concerned through to potential disclosure and eventual disposal. The principles are then inserted at each stage of the model where they are seen to be relevant.6

The life cycle model was popular among archival and record management practitioners at the time it was borrowed by the Law Reform Commission, but has since been largely replaced by a continuum model which emphasises the need for a consistent and coherent regime of management throughout the life of records.7 While the focus of this paper is not directed to this particular model, my conclusions about the OECD Guidelines will have obvious implications for such adaptations.

A broad critique will not be equally applicable to all of the principles. Indeed the continuing resilience of the Guidelines at a policy level stems from the fact that some of the principles are regarded as so eminently reasonable that they deflect criticism from those which are not. I therefore propose to focus successively on the eight general principles and single out the most significant developments in relation to each of them. My purpose is an essentially limited one, to ask how the principles stand up to the changes in the context of information processing. Clearly this does not exhaust what can be said about the Guidelines.

The purpose specification principle

This principle, together with the use limitation principle, is arguably the most crucial element of the Guidelines: these are the principles which receive the greatest attention from data protection proponents and, at the same time, the most difficult to continue to take seriously. They therefore provide the most appropriate starting point for this review. The purpose specification principle states that the purposes for which personal data is collected should be stated at or before collection, and that subsequent use should be limited to the fulfilment of these or compatible purposes, or to other purposes specified each time the purpose is changed.

The principle can be seen to have a threefold aim: to limit the collection of personal information to data which is necessary or relevant; to provide a basis on which data subjects can be informed how information collected from them will be used; and to inhibit the subsequent reuse of personal data for `unrelated purposes'. The principle rests on the assumption that procedures for processing data can be specified in advance of collection and that this can effectively limit their subsequent use. Data protection advocates are fond of citing instances where application of the principle has benefited all parties concerned, by restricting the thoughtless collection of unnecessary data and thereby reducing the costs of storage and access.

This may have made sense in the 1970s, when computer files on individuals were typically stored in customised flat-file databases designed to operate in specifically predictable ways on discrete combinations of data, when sorting and retrieval techniques were rudimentary and storage costs were high. Databases of this nature, the so-called legacy databases, still exist and their social significance cannot be dismissed. Increasingly, however, we are attempting to apply the principles to data which is stored and processed in radically different ways by a much broader body of users. The spread of microcomputers has been associated with off-the-shelf software designed to serve a variety of purposes. Database software is undergoing a process of refinement which makes it more readily available for a variety of uses.

The relational database model, which dominates the current generation of off-the-shelf database management programs, separates the physical storage of data from the way it is processed. It disintegrates the unitary aspects of data into a series of separate files or tables and emphasises the ability to recombine the data in a variety of relationships.
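The point can be made concrete with a small sketch in Python (the schema, table and column names are invented for illustration, not drawn from any system discussed in this article). Data collected to record sales is spread across several tables, none of which looks very `personal' on its own, yet a single query recombines them into something quite different from the purpose for which any one table was designed.

```python
import sqlite3

# A minimal sketch (hypothetical schema): personal particulars are split
# across several tables, none of which is obviously "personal" on its own.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE customers (cust_id INTEGER PRIMARY KEY, name TEXT, postcode TEXT);
CREATE TABLE products  (prod_id INTEGER PRIMARY KEY, description TEXT);
CREATE TABLE sales     (sale_id INTEGER PRIMARY KEY, cust_id INTEGER,
                        prod_id INTEGER, sale_date TEXT);
""")
con.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "A Citizen", "2000"), (2, "B Resident", "2770")])
con.executemany("INSERT INTO products VALUES (?, ?)",
                [(10, "prescription medicine"), (11, "garden hose")])
con.executemany("INSERT INTO sales VALUES (?, ?, ?, ?)",
                [(100, 1, 10, "1996-11-01"), (101, 2, 11, "1996-11-02")])

# The same tables, collected to record sales, can be recombined at any later
# time into a profile that was never specified as a purpose of collection.
for row in con.execute("""
    SELECT c.name, c.postcode, p.description, s.sale_date
    FROM sales s
    JOIN customers c ON c.cust_id = s.cust_id
    JOIN products  p ON p.prod_id = s.prod_id
    WHERE p.description = 'prescription medicine'
"""):
    print(row)   # ('A Citizen', '2000', 'prescription medicine', '1996-11-01')
```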

This tendency is carried further with object-oriented programming and object-oriented database management. Programs are put together using pre-constructed modules. A database consists of objects which represent entities in the real world, defined by attributes and grouped together in classes. Information hiding is used to enable users to manipulate objects without needing to know how an object is defined.
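A minimal sketch of information hiding may help (the class and attribute names are hypothetical). The caller works only with the methods an object exposes; the personal attributes from which the object was built remain invisible, which is precisely what makes a pre-defined `purpose' hard to locate.

```python
# A minimal sketch of information hiding (hypothetical class and attribute
# names): the caller manipulates Client objects through their methods,
# without needing to know which personal attributes each object encapsulates.
class Client:
    def __init__(self, name: str, year_of_birth: int, postcode: str):
        self.__name = name                  # personal attributes are hidden
        self.__year_of_birth = year_of_birth
        self.__postcode = postcode

    def age_band(self, current_year: int = 1996) -> str:
        """Expose only a derived, coarse-grained value."""
        age = current_year - self.__year_of_birth
        return "under 40" if age < 40 else "40 and over"

    def in_region(self, prefix: str) -> bool:
        return self.__postcode.startswith(prefix)

clients = [Client("A Citizen", 1950, "2000"), Client("B Resident", 1970, "2770")]

# The user of the class selects and groups objects without ever seeing the
# underlying personal data from which they were defined.
print([c.age_band() for c in clients if c.in_region("2")])
```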

It becomes difficult to establish a pre-defined purpose when a specific application is assembled from components and data which are already to hand, or when the user manipulates data objects without requiring any knowledge of the personal attributes they represent. Similar challenges to purpose specification and use limitation arise from the techniques collectively described as data warehousing, data mining and knowledge discovery in databases, which involve the deliberate attempt to make existing data yield answers to new or unanticipated questions through the application of rule generation and artificial intelligence.
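The flavour of such rule generation can be suggested with a deliberately simple sketch (the transaction records are invented, and real knowledge-discovery systems are far more elaborate): co-occurrence `rules' are extracted from data collected for an entirely different purpose, answering questions nobody specified at the time of collection.

```python
from collections import Counter
from itertools import combinations

# A minimal sketch of rule generation over data already to hand (records are
# invented): transaction histories collected for billing are mined for
# co-occurrence "rules" that no one specified when the data was collected.
baskets = [
    {"nappies", "baby formula", "wine"},
    {"nappies", "baby formula"},
    {"wine", "cheese"},
    {"nappies", "wine"},
]

pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# Report pairs bought together in at least half the transactions -- an
# "answer" to a question that was never asked at collection time.
threshold = len(baskets) / 2
for pair, count in pair_counts.items():
    if count >= threshold:
        print(f"{pair[0]} => {pair[1]} (support {count}/{len(baskets)})")
```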

Multimedia applications and geographical information systems represent further challenges to the purpose specification principle. Multimedia promotes the conversion of a range of media into a digital form in which it can be combined and re-used in striking and unintended ways.8 Geographical information systems overlay data from different sources for visual presentation in ways which are far more subtle than the rudimentary data matching or profiling which compares two or more text files.9

The problems posed by these new approaches to data processing are not altogether new. They have always existed in relation to research uses of existing data compiled for a purely administrative or care-oriented purpose. Many regulatory regimes have recognised the special needs of research by providing mechanisms which avoid a consent requirement before data is used or disclosed for research purposes. The changes I have referred to suggest that the research approach is becoming closer to the norm in data processing. In the process, the relationship between the regulation of research and regulation aimed at personal data generally is being turned inside out.

The use limitation principle

The principle states that personal data should not be used or disclosed for purposes other than the purpose of collection except with the consent of the data subject or by authority of law. The principle is related to the purpose specification principle although here the emphasis is on subsequent use and on the circumstances in which uses for purposes other than the purpose of collection can be permitted. The main exceptions are consent and authority of law.

Disclosure is included in this principle to the extent that the Guidelines do not propose a separate principle covering disclosure. This could be seen as problematic, in that many of the concerns which people have over the disclosure of information involve disclosures to third parties for purposes which are unrelated to data processing, for example disclosures of criminal records or similar data from police computers to private citizens.

Consent to use is an ostensibly attractive way of giving individuals control over their information. Yet it presents a number of problems when it is applied to specific data processing situations. First, it assumes that it is relatively easy to establish a nexus between a data subject and a particular instance of data use. This is rarely the case when dealing with computerised data which is stored and processed in aggregate forms. In what sense can it be said that an individual's data is used each time a computer runs a series of sorts which pass the data containing her records through a microprocessor?

The problem this presents for consent has been obvious for some time in the comparison of large scale individualised files known in a data protection context as data matching. In the Australian legislative context, the fact that these activities do not fit clearly within the Privacy Act's Information Privacy Principles has led to separate treatment of the data matching activities of selected agencies, and to non-mandatory Data-matching Guidelines issued by the Privacy Commissioner.

In a relational database the problem is posed in a somewhat different way. Which data do we identify as personal, when the particulars relating to an individual have been decomposed into a number of tables, and the identification of the person as an individual is either a fleeting grid of cross-references or is secondary to the main purpose of the data contained in the relevant tables?

One solution is to adopt a narrower definition of use which would exclude the automated processing which results in identifiable personal data. Even here, as the recent House of Lords judgment in R v Brown10 illustrates, the concept of use in a data processing context can be an elusive one. In that case a majority held that simply retrieving stored data onto a monitor did not amount to use within the meaning of the Data Protection Act; there had to be some form of human action which amounted to using the data.

Confining use to instances where there is some form of human intervention in the decision making process could also be seen to exclude forms of automated decision making, such as credit scoring, which impact on an individual even if it is impossible to say that another person has infringed on their privacy. Data protection would be effectively crippled if it only applied to instances where the affairs of one human were lodged in the consciousness of another human.
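A toy illustration of automated decision making, with invented weights and cut-off, shows how an outcome can affect an individual without any point at which a human being could be said to `use' the data in the ordinary sense.

```python
# A minimal sketch of automated decision making (the weights and cut-off are
# invented): an applicant is accepted or refused by the program alone, with
# no point at which another person ever reads or "uses" the data.
def credit_score(income: float, defaults: int, years_at_address: float) -> float:
    return 0.004 * income - 25.0 * defaults + 5.0 * years_at_address

def decide(applicant: dict, cut_off: float = 50.0) -> str:
    score = credit_score(applicant["income"], applicant["defaults"],
                         applicant["years_at_address"])
    return "approve" if score >= cut_off else "refuse"

applicant = {"income": 32000.0, "defaults": 1, "years_at_address": 2.0}
print(decide(applicant))   # the decision affects the individual regardless
                           # of whether any human ever looks at the file
```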

The other challenge to the viability of the use limitation principle stems from the kinds of exemptions it invariably requires when incorporated into legislation. The most obvious example is the exemption for uses which are `authorised or required by law'. In some instances this represents an insuperable obstacle to forms of information processing which are otherwise reasonable. In other circumstances it serves as a Trojan horse, whereby an organisation can prevent scrutiny of disclosures which would be seen as unreasonable against any other criteria.

Broad exemptions, such as use for law enforcement or revenue protection purposes, further disable the principle from having any application in areas where some control on the potential misuse of personal information is arguably crucial. Traditional safeguards against misuse of power, like warrants, freedom from self-incrimination and the exclusion of unfairly obtained evidence, are vulnerable to these kinds of exemptions.

Data protection clearly needs to restrict some uses of personal information. The question is whether permissible uses should be defined in relation to a concept of purpose which I have already suggested is problematic. We would arguably be better off with a principle which does not make such broad claims, only to have them negated in ways which undermine more traditional safeguards against abuse.

The collection limitation principle

There are three parts to this principle: an assertion of the need for limits to the collection of personal data; collection by lawful and fair means; and, where appropriate, collection with the knowledge or consent of the data subject. There are two approaches to interpreting the collection limitation principle: a looser recognition that collection can be excessive and that some limits should be specified, and a narrower insistence that a minimal amount of information should be collected, by reference to the associated principles of purpose specification and use limitation. The explanation in the Guidelines' Explanatory Memorandum of how the principle was arrived at suggests that the looser interpretation may be the preferred one.

The Expert Group discussed a number of sensitivity criteria, such as the risk of discrimination, but has not found it possible to define any set of data which are universally regarded as sensitive. Consequently, para 7 merely contains a general statement that there should be limits to the collection of personal data. For one thing, this represents an affirmative recommendation to lawmakers to decide on limits which would put an end to the indiscriminate collection of personal data.11

Whichever interpretation is chosen, the principle assumes that it is relatively easy to define what information is personal at the time of collection; otherwise we would have to interpret the principle as asserting that collection of any kind of information should be limited. Under principle 1, personal data is defined as any information relating to an identified or identifiable individual (the data subject). Unfortunately it cannot be assumed that information has this personal quality at the time of collection, even if it is subsequently processed in a way which allows an individual to be identified.

One example would be the collection of the transaction records of a genuinely anonymous payment card. Each transaction record may not constitute personal information. However, assuming I have a record of all the transactions made with one card, I only need an independent verification that one of those transactions was made by an identified individual (for example, a shopkeeper identifies the only person who made a transaction with a card between 1 and 2 pm). I can then reconstruct the individual's movements and activities. Similar issues arise with many transactions captured on a typical commercial relational database. While suppliers and customers are initially recorded as secondary elements in the transactions of primary interest to the business, sales and purchases, they are also capable of being redefined and targeted for marketing purposes. Geographical information systems take one step further this process of combining data until an identity ultimately emerges. One may start with geographical coordinates or remotely sensed images, but by overlaying them with a variety of census, commercial and cadastral data, a great deal of personal information may ultimately be revealed.
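The payment card example can be sketched in a few lines (all records and names are invented): the transaction log is nominally anonymous, but one independently verified sighting supplies the link, after which the card's whole history becomes personal information.

```python
# A minimal sketch of the payment-card example (all records invented): the
# transaction log is nominally anonymous, but a single independently verified
# sighting links a card number to a person, and the whole history follows.
transactions = [
    {"card": "C-417", "time": "09:12", "merchant": "pharmacy"},
    {"card": "C-417", "time": "13:05", "merchant": "corner shop"},
    {"card": "C-417", "time": "18:40", "merchant": "clinic"},
    {"card": "C-902", "time": "10:30", "merchant": "newsagent"},
]

# The shopkeeper's identification of the 1-2 pm customer supplies the link.
verified_sighting = {"card": "C-417", "name": "A Citizen"}

profile = [t for t in transactions if t["card"] == verified_sighting["card"]]
print(f"Movements of {verified_sighting['name']}:")
for t in profile:
    print(f"  {t['time']}  {t['merchant']}")
```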

It may be argued that these are all instances where the individual is identifiable, and while this serves as a theoretical answer it does not really address the practical problem of knowing that an individual is capable of being identified at the point of collection. It is not practical to scrutinise every instance where data is captured in order to establish its capacity for identification. Increasingly, data is captured automatically; do we make the computer responsible for determining whether it is potentially personal? It seems to me far more relevant to determine the capacity for data to be personal at the time it is processed, rather than to rely on a characterisation at the point at which it is collected.

To the extent that the collection limitation principle implies that collection should be limited by reference to its purposes, it is dependent on the purpose specification principle. I have already suggested that there are problems with that principle, and these carry over when attempts are made to establish purpose-based limits.

In some contexts purpose specification becomes a pretext for expanding rather than confining the collection of personal information. For example, the identification of users is an important means of regulating access to electronic databases, either for charging or for limiting access to authorised users. The collection limitation principle is too narrowly defined to promote alternatives to providing personal information, such as support for anonymous or pseudonymous access to electronic databases. This is likely to become a crucial issue with the spread of one-stop shopping for government information and services through multimedia kiosks.

The data quality principle

`Personal data should be relevant to the purposes for which they are used, and to the extent necessary for these purposes, should be accurate, complete and kept up to date'. I have few problems with this way of stating the principle which seems to me to encapsulate the essential concerns of data protection. If there is any potential for people to be harmed by data processing it would seem to arise from the use of inaccurate, incomplete or irrelevant data. However, it is necessary to consider the practical ways in which the principle might be applied, given the range of more complex forms of information processing I have already referred to.

Once we move beyond the flat-file database designed to perform a single clearly defined function, we can no longer assume the relevance of data for a specific purpose in advance of its use. The response of some privacy advocates when I have raised this point is often one of `so much the worse for the relational model or whatever'. If the relevance of data to a pre-defined purpose cannot be established then it should not be collected!

I do not consider that this is a realistic position to take. It really encapsulates what I see as a commitment to adhere to a set of principles irrespective of whether they actually apply to the way we process information. Of course this objection need not arise if we are prepared to accept that relevance should be left to be determined at the time when records are processed.

Pierre Peladeau has recently drawn attention to one aspect of this principle which is frequently overlooked by data protection practitioners: the analysis of the actual processing of personal data to make decisions affecting an individual.12 He points out that:

an information item's quality, relevance and necessity can only be assessed in relation to the decision it is used in;

and that decisions of this nature incorporate not only personal information about the subject, but also non-nominative personal information about other people and non-personal information. It is not only the personal data which needs to be relevant, accurate and up to date, but also the non-nominative data which is combined with it. An example from the area of genetic testing would be the accuracy and `up-to-datedness' of the probability statistics which may be used by insurers to estimate the risk arising from an identifiable genetic mutation.

It is not, therefore, only personal data which needs to be relevant, accurate and up to date, but any data used in combination with personal data in ways which could affect identifiable persons.
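Peladeau's point can be illustrated with some invented figures: the decision made about an identifiable person turns as much on the accuracy and currency of a non-nominative statistic (here, the penetrance assumed for a mutation) as on any data about the person herself.

```python
# A minimal sketch of Peladeau's point, using invented figures: the accuracy
# of a non-nominative statistic (the penetrance assumed for a mutation)
# drives the decision about an identifiable person as much as her own data.
def premium_loading(penetrance: float, claim_cost: float = 100_000) -> float:
    """Expected extra claim cost attributed to the mutation, recovered over 20 years."""
    return penetrance * claim_cost / 20

old_estimate = 0.80   # earlier figure from high-risk family studies (illustrative)
new_estimate = 0.40   # later population-based figure (illustrative)

print(premium_loading(old_estimate))   # 4000.0 per year
print(premium_loading(new_estimate))   # 2000.0 per year -- the out-of-date
                                       # statistic doubles the loading
```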

John Gaudin, Research Officer, NSW Privacy Committee. Part II of this article will appear in the next issue of PLPR.


1. US Personal Privacy in an Information Society: The Report of the Privacy Protection Study Commission, Washington 1977, pp 500-502.

2. Australia, Law Reform Commission, `Seminar on TransBorder Data Barriers and the Protection of Privacy', document 4, May 1978 at pp 19-20.

3. In fact it might be more accurate to say that the data protection principles incorporated into the Privacy Act more closely followed an earlier draft of the principles arranged to fit into a life cycle model of data from collection to disclosure.

4. For a discussion of the application of the Guidelines to KDD see D O'Leary, `Some Privacy Issues in Knowledge Discovery: The OECD Personal Privacy Guidelines', IEEE Expert, April 1995, p 48.

5. One of the more interesting attempts to anticipate future developments is the paper by K Lenk, `Information Protection Problems in Broadband Communications' presented to the 1974 OECD Seminar on `Policy Issues in Data Protection and Privacy', Paris 1976 p 278. Lenk's contribution was dismissed by the Australian delegate to the seminar as `esoteric'.

6. Australia, Law Reform Commission Report No. 22, Privacy vol 2, p 78; the inspiration for this approach can be traced to a series of research papers prepared by Kevin O'Connor and Mark Richardson for the Law Reform Commission in 1978. The subject matter of these papers, the records of the Commonwealth Census, employment records and health insurance records dictated a standardised arrangement of the material in a chronological form, which clearly prefigure the arrangement of the Privacy Act's IPPs. A seminar presentation by the Commission in June 1978 marked the progression of this editorial outline into a set of self-contained principles, which were subsequently presented in the Commission's 1983 Privacy Report. One might even suggest that the disproportionate focus on collection in the IPPs directly reflects the prominence given to collection in the Census research paper.

7. D Roberts, Documenting the Future: Policy and Strategies for Electronic Recordkeeping in the New South Wales Public Sector, Sydney, Archives Authority of NSW, 1995, pp 16-17.

8. Joel R Reidenberg, `Multimedia as a new challenge and opportunity in privacy: The examples of sound and image processing'.

9. H Onsrud, J Johnson & X Lopez, `Protecting privacy in using geographical information systems', paper from Conference on Law and Information Policy for Spatial Databases, Tempe, Arizona October 28-29, 1994.

10. R v Brown [1996] 1 All ER 545.

11. OECD Guidelines, Explanatory Memorandum, para 51.

12. P Peladeau, `Data protection as an art: a matter of process', Privacy Files, 2, 1, 10 (October 1996).


URL: http://www.austlii.edu.au/au/journals/PrivLawPRpr/1996/68.html