
Privacy Law and Policy Reporter


Gaudin, John --- "The OECD Privacy Principles -- can they survive technological change?" [1997] PrivLawPRpr 17; (1997) 3(10) Privacy Law & Policy Reporter 196


The OECD Privacy Principles -- can they survive technological change? - Part II

John Gaudin

In the first part of this article (3 PLPR 143) John Gaudin argued that technological changes require us to contemplate discarding the Principles in the OECD's Guidelines on the Protection of Privacy and Transborder Flows of Personal Data and the data protection framework which has been built up on them. Here, he continues his analysis of each Principle (General Editor).

The security safeguards principle

This principle calls for reasonable security safeguards against loss or unauthorised access, modification, use, disclosure or destruction. While few would quarrel with the need for security, it is questionable whether the principle as stated is sufficiently specific to cover the wide range of contexts in which personal data is stored and processed. In the past, databases assumed similar and predictable features, leading to a degree of unanimity over what were reasonable security safeguards. Where personal data is held in a variety of ways it would seem appropriate for the principle to provide greater guidance on the standards applicable to security. Principles such as proportionality, accountability, timeliness and the requirements for training and continuous risk evaluation are familiar in standards dealing with computer security and should be explicitly incorporated in Guidelines on data privacy.

The security safeguards principle reminds us that data protection is not solely concerned with restricting the use to which personal information can be put. Data subjects also have an interest in the retention and preservation of data whose existence can be for their benefit. This consideration needs to be kept in mind when attempting to graft a retention limitation principle onto the original principles.

The Guidelines do not specify that personal data should be stored for a limited period, and there have been various attempts since to include a principle to this effect. The OECD's own internal Computerised Information and Privacy Principles adopted in 1988 included a separate Storage Duration Limitation Principle.13 Its application to personnel records, where there is a clear link between period of employment and appropriate retention, is relatively unproblematic. To extend the principle to personal records generally raises the problem of circular definition. Retention is justified because it is seen to be necessary. Quite apart from this, an overly prescriptive approach to retention has the potential to adversely affect the interests of data subjects.

Clearly there is a need to balance the interest in not having irrelevant or untimely information used and the interest in preserving data, either because the subject has an interest in future access to it or for historical or research purposes. In the European Communities' original draft Data Protection Directive, personal data is to be `kept in a form which permits identification of the data subjects for no longer than necessary for the purpose for which the data are stored'.14

This was subsequently expanded to give explicit recognition to research interests.

The openness principle

Openness is urged as a general policy to be associated with developments, practices and policies with respect to personal data. This should include identifying the whereabouts of the data controller and providing means for readily establishing the existence and nature of personal data and the main purposes of its use. These examples underline the datedness of the Guidelines.

Legislation attempts to give effect to this principle by requiring organisations to disclose the ways they process personal information or by establishing systems for registering databases. Few of these systems can be seen as effective. Publications like the Australian Privacy Commissioner's Personal Information Digest are rarely consulted, not particularly informative, and involve disproportionate effort and cost. The British Data Protection Register experiences a high level of non-compliance and represents a substantial diversion of resources away from other areas of data protection.15

The principle clearly reflects a time when data was held in single purpose flat-file databases on a small number of government and corporate mainframes. To continue to insist on a strategy designed for these circumstances introduces an element of artificiality into the discourse of data protection. Simply to require agencies to notify details of their practices is no longer a realistic way of enlightening data subjects. It is one thing to require a handful of agencies involved in extensive data collection to be open about their practices. We have reached a point where personal information is processed by many organisations in a wide variety of ways which no individual subject can readily appreciate unless he or she makes a full-time study of the process. In other words, openness merely becomes a way of enlightening data protection professionals, whose view of the process often becomes increasingly ambivalent as they come to appreciate the sheer scale of personal data processing.

In saying this, I would certainly not wish to detract from the importance of openness as a vehicle for safeguarding people's interest in their personal information. To recapture the essential thrust of the principle we need to harness the emerging interactive capacities of data processing equipment to give people a more immediate role in tracing the way their personal data is processed. In the process, we may need to relearn some of the lessons imparted by the original principles. For example, if consumers are to be given the freedom to determine their own level of participation in an interactive environment, in many circumstances this may favour opt-out rules in preference to the opt-in strategies which are customised for a more passive, data-surrendering environment.

The individual participation principle

Under this heading, the Guidelines propose a number of practical strategies to ensure meaningful individual participation in personal data processing. The individual should be able, on request, to identify data held about him or her, to have that data communicated to him or her, to challenge data and have it corrected or removed, and to be given reasons where access or correction is denied.

My reservation about this principle is that it may not go far enough in reflecting the ways current and anticipated technology could be used to guarantee individuals meaningful participation. The remedies it proposes are still anchored in a realm where personal data accumulates in a few large databases, where its presence can be queried, and where its relevance and accuracy can be corrected in a relatively straightforward way.

Where data is replicated in multiple networks, the issue is not so much whether it is intrinsically accurate or relevant. Rather, it is whether its application is appropriate in a particular processing context. It is not realistic to expect each individual to check every site where his or her personal information may be processed. We need to use the interactive features of developing information technology to create opportunities for the individual to authorise specific uses or to confirm the relevance of information when uses occur.

Proceeding in this way is likely to overturn a number of privacy shibboleths. For example, it is likely to require one-stop shopping whereby corrections to personal information are propagated across the network. On-the-spot confirmation and authorisation is likely to require some form of secure and universal identifier, for the convenience of the data subject rather than the data processor. The way in which individuals exercise their rights of access and correction needs to be structured in a way which preserves accountability without sacrificing privacy. Requiring full identification before according individuals a right of access is not the best way to ensure this. We may well need to explore the technology associated with digital signatures to achieve an appropriate balance based on pseudonymous transactions, as the sketch below illustrates.
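By way of illustration, a pseudonymous access request might work along the following lines. This is a minimal sketch only, assuming the third-party Python `cryptography' package; the pseudonym register and the request format are invented for the example and do not describe any existing system.

    # A minimal sketch of a pseudonymous access request, assuming the
    # third-party 'cryptography' package (pip install cryptography).
    # The pseudonym register and request format are invented examples.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # The data subject registers a pseudonym once: the public key stands
    # in for his or her identity, so no real name need ever be disclosed.
    subject_key = Ed25519PrivateKey.generate()
    pseudonym = subject_key.public_key()

    # Later, an access or correction request is signed with the private key.
    request = b"communicate all records held against this pseudonym"
    signature = subject_key.sign(request)

    # The data controller checks only that the request comes from whoever
    # registered the pseudonym -- accountability without identification.
    try:
        pseudonym.verify(signature, request)
        print("request authenticated; release records for this pseudonym")
    except InvalidSignature:
        print("request rejected")

The point of the design is that the controller can hold the requester to account for the request without ever learning who he or she is.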

The participation principle substantially predates experience of freedom of information legislation, which has been used, more or less effectively, to give individuals rights to access and correct personal data held by public sector agencies. Where access is introduced as part of data protection legislation, it has the potential to overlap or conflict with rights of access under freedom of information.

The emphasis on individual participation can also be seen to define privacy as the limited preoccupation of the affluent in post-industrial societies. There is a growing appreciation of the ways in which advanced information technology is being used to exploit technologically disadvantaged societies or to reinforce repugnant regimes.16 There is surely scope for expanding the Guidelines in a way which recognises the rights of groups who are less able to participate meaningfully as individuals in the control of privacy invasive technology.

The accountability principle

`A data controller should be accountable for complying with measures which give effect to the principles ...' This principle seeks to fix responsibility for the storage and processing of personal data onto identifiable data controllers.

The emergence of networked and distributed data systems, and of more complex and spontaneous ways of processing data once it has been collected, complicates the application of this principle, as well as a number of the other principles. The managerial revolution which displaces hierarchical methods of work organisation renders the position of the data controller a problematic one. It is often difficult to identify who the data controller is or should be, and to define the scope of his or her responsibilities within an organisation.

As with other principles, I suspect that the key to retaining the idea of accountability will depend on freeing ourselves from an obsession with data as things which can be `owned' by a single controller. If, instead, we adopt a process model then we can recognise that responsibility attaches at the point where data is processed. This implies a more diffused and participatory concept of accountability, which emphasises that different people or organisations can assume responsibility for the same data, depending on the ways it is being processed at a particular time.

The accountability principle could be seen to implicitly endorse the idea of a data protection authority capable of monitoring compliance with the guidelines and calling data controllers to account. However, Principle 19 expands on appropriate means of ensuring internal enforcement of the Guidelines and makes it clear that member states can choose between administrative, legal or other forms of enforcement which might not include a regulatory authority. It should be noted that the US has avoided creating a national regulatory authority, and its information privacy legislation has been consistently criticised for this failure by privacy advocates. In revising the Guidelines, it may be a good time to consider the strengths and weaknesses of regulatory authorities in data protection.

The global scope of computer networks creates difficulties both for national or regional data protection authorities and for organisations whose data processing activities take place across a range of jurisdictions. The original Guidelines involved an explicit recognition of this fact and an initial attempt to come to terms with it. The increasing challenge is being recognised through more extensive co-operation between data protection agencies, and through the promotion of forms of self-regulation such as standards which can operate across jurisdictions.

An argument which has so far been broadly implicit in this article is that data protection authorities themselves are apt to be overwhelmed by the scale of technical change. They often remain committed to traditional regulatory structures and principles which are demonstrably out-of-date. There is a temptation for regulators to pursue a series of increasingly quixotic and fatalistic rear-guard actions. I believe that the way to avoid this self-defeating process is to accept a much more participatory model of regulation which is inherent in the technology and which should be inherent in the principles. Data protection authorities should be one player among many. They should recognise that the respect for individuals which data protection encourages needs to come from designers and processors themselves. They should promote dialogue and negotiation rather than declaim principles from the high moral ground.

Automated decision making

Although the Guidelines do not refer to automated decision making, a passing reference to the subject is justified by its relevance to the kind of argument I am pursuing. Even before the Guidelines were finalised, attention was drawn to the absence of a principle specifically covering automated decision making affecting the interests of data subjects.17 The omission can be seen as a consequence of a political decision to avoid explicit references to automated data in the final version.18 The European Data Protection Directive tackles the issue of automated decision making more directly, requiring openness about the logic used in processing involving automated decisions and, in art 15, asserting the right not to have decisions made on the basis of personal data solely by automated means.

Automated decision making has progressed beyond the paradigm example of credit scoring which would have been in the minds of regulators in the late 1970s. Credit scoring involves a fairly straightforward series of computations, which are substantially under the direction of the management of the credit provider. First a cut-off score is established, based on aggregated characteristics of previous applicants, the pool of available funds and manageable levels of risk. Individuals are scored in relation to the cut-off point by reference to individual risk factors such as current commitments and past credit record and their degree of conformity to the ideal debtor (married, no children, home-owner etc). The management of the credit provider can decide when to fine-tune the scoring system or when to substitute human judgement for the full rigour of automated scoring. The logic behind this process can be represented fairly easily, even though the actual methodology is likely to be guarded as a trade secret.
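The logic of this simple model can indeed be represented in a few lines of code. The following is a hypothetical sketch only; the cut-off, factors and weights are invented for illustration and do not reflect any credit provider's actual methodology.

    # A hypothetical sketch of the simple credit-scoring model described
    # above; the cut-off, factors and weights are invented for illustration.
    CUT_OFF = 100  # set by management from aggregate applicant data,
                   # the pool of available funds and manageable risk

    # Points awarded for conformity to the `ideal debtor'.
    WEIGHTS = {"married": 30, "no_children": 20, "home_owner": 40,
               "clean_credit_record": 50}

    def score(characteristics, current_commitments):
        """Score one applicant against the fixed, management-set rules."""
        points = sum(WEIGHTS.get(c, 0) for c in characteristics)
        # Existing commitments count against the applicant as an
        # individual risk factor.
        return points - current_commitments // 1000

    def decide(characteristics, current_commitments):
        # Management controls every element here: it can fine-tune WEIGHTS
        # and CUT_OFF, or substitute human judgement for the automated outcome.
        s = score(characteristics, current_commitments)
        return "approve" if s >= CUT_OFF else "refer to loans officer"

    print(decide(["married", "home_owner", "clean_credit_record"], 15000))

Every element of the decision sits in rules that management explicitly chose and can inspect, which is what makes the logic easy to explain.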

Automated decision making which incorporates expert systems and similar forms of artificial intelligence designed to elicit and update rules from existing data is being increasingly applied in a wide variety of fields, including medical treatment as well as the more `traditional' areas of credit and insurance. These forms of decision making are not susceptible to the same degree of centralised control as the simple model I have described above. The methods employed and the knowledge which is utilised may be beyond the understanding of management. Explaining the logic of how decisions are made in individual cases presents a much greater challenge.
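The contrast with the scoring model can be made concrete with a deliberately toy example. In the sketch below (invented data, invented rule), the decision rule is derived from past cases rather than laid down by management, so explaining why a particular applicant was refused means explaining the data from which the rule was induced.

    # A deliberately toy contrast with the fixed scoring model: the
    # decision rule is elicited from historical cases rather than set
    # by management. All data here are invented for illustration.
    history = [(20000, "default"), (35000, "repaid"), (28000, "default"),
               (52000, "repaid"), (41000, "repaid")]  # (income, outcome)

    # `Learn' a threshold: midway between the highest income that
    # defaulted and the lowest income that repaid.
    best_bad = max(income for income, outcome in history if outcome == "default")
    worst_good = min(income for income, outcome in history if outcome == "repaid")
    threshold = (best_bad + worst_good) / 2

    def decide(income):
        # The rule shifts whenever history is updated; no manager ever
        # chose the threshold, so the `logic' of an individual refusal
        # can only be explained by reference to the training data.
        return "approve" if income > threshold else "decline"

    print(threshold, decide(30000))  # 31500.0 decline

Even in this trivial case the operative rule is an artefact of the data rather than a policy anyone adopted; real expert systems compound the difficulty many times over.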

An obvious solution would be to require that no final decision adverse to the person concerned is made without human intervention, in accordance with the somewhat ambiguous formulation of art 15 of the Data Protection Directive. The practical effect of this requirement is likely to be severely limited, in that it assumes that a single user of the data would be ready to override a recommendation presented as the accumulated wisdom of an enterprise or profession.

Clearly data protection regulation needs to come to terms with the challenge of automated decision making but this is an issue which requires extensive and informed discussion which is beyond the scope of this paper.

Conclusion

I am not suggesting that we should abandon the search for an effective set of principles which can animate or mould more detailed regulation. The argument in favour of principles is that they offer a means of combining uniformity and flexibility with more detailed application to specific cases. Data protection is not an area where detailed black letter law can sit comfortably with the variety of circumstances in which information is processed.

In the Explanatory Memorandum attached to the Guidelines it is stated that:19

There is a need for a continuing review of the Guidelines, both by member countries and the OECD. As and when experience is gained, it may prove desirable to develop and adjust the Guidelines accordingly.

The existing Guidelines reflect a political consensus between users, administrators and privacy advocates which is essentially grounded in the society, technology and culture of the late 1970s. We can hail the intention of their creators and the effectiveness of their response at the time. However, this does not require us to accept that the principles are sufficiently flexible to accommodate the extensive changes which have taken place since they were promulgated. I have identified a number of areas where the basic intention of the principles is still clearly relevant, but where the expression and emphasis needs to be altered to more clearly correspond with new processes. I do not accept that this is tantamount to abandoning the necessary technological neutrality of the Guidelines.

To the extent that the Guidelines attempt to frame collection, use and disclosure in accordance with a primary or overriding purpose and confine permissible uses by reference to this purpose, they reflect a limited or superseded model of data processing which we should be moving away from. Unless we are prepared to contemplate discarding the OECD principles and the data protection framework which has been built up on them, it is unlikely that we will ever commit sufficient resources to developing a new model capable of meeting the contemporary challenges for the protection of information privacy.

John Gaudin, Research Officer, NSW Privacy Committee (this article was written in his private capacity).


13. Secretary-General OECD, `Principles governing the protection of privacy in the creation and use of computerised personal data files', Paris 1988, Principle 10.

14. Council of the European Communities, `Draft proposal for a Council Directive approximating certain laws, regulations and administrative provisions concerning the protection of individuals in relation to the processing of personal data', version 8, 1990, art 16, Principle 1(e); the final version is expanded to read:

(e) kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected or for which they are further processed. Member States shall lay down appropriate safeguards for personal data stored for longer periods for historical, statistical or scientific use.

15. CJ Bennett, Regulating Privacy, Ithaca and London, 1992, p 103.

16. See, for example, Wayne Madsen's 1994 Conference on Law and Information Policy for Spatial Databases paper, `Protecting Indigenous Peoples' Privacy from `Eyes in the Sky'', and Privacy International's report, Big Brother Incorporated (http://www.privacy.org/pi/reports/big_bro/).

17. R Turn (ed), Transborder Data Flows: Concerns in Privacy Protection and Free Flow of Information, AFIPS, Arlington Va, 1979, p 62.

18. Ibid, p 76; OECD Expert Group on Transborder Data Barriers and the Protection of Privacy, `Second draft Guidelines governing the protection of privacy in relation to transborder flows of personal data', Paris, 15 February 1979, pp 4-5; OECD Expert Group, third meeting, summary statement by the chairman (uncorrected), pp 9-11.

19. OECD Guidelines, Explanatory Memorandum, p 27.

