Privacy Law and Policy Reporter
Lee A Bygrave
This is the second of a series of articles entitled ‘An international data protection stocktake @ 2000’, presenting a transnational perspective on the central features of data protection laws at the start of the new millennium. Part 1, ‘Regulatory trends’, appeared in (2000) 6 (8) PLPR 129 — General Editor.
The core principles applied by the data protection laws of many jurisdictions to the processing of personal data can be categorised under eight headings: fair and lawful processing; minimality; purpose specification; information quality; data subject participation and control; disclosure limitation; information security; and sensitivity. As we shall see, these categories are not always hard and fast; considerable overlap exists between them. Further, each of them is, in reality, a constellation of multiple principles.
This article analyses the constituent elements of these principles and discusses the main similarities and differences in their formal manifestation in the various data protection instruments. Detailed analysis of the scope and content of the principles and of the range of legal exemptions to their implementation is beyond the scope of the article.
The principles are primarily abstractions which denote the basic thrust of a set of legal rules. At the same time, they have a normative force of their own. This force is achieved in several ways. First, the principles (or a selection of them) have been expressly incorporated into certain data protection laws as fully fledged legal rules in their own right (though not always using exactly the same formulations as given in this article). Second, the principles function as standards which guide the balancing of interests by, for example, data protection authorities in the exercise of their discretionary powers. Finally (and closely related to the latter function), the principles help to shape the drafting of new data protection laws. This is most obviously exemplified by the considerable influence the 1980 OECD Data Protection Guidelines (OECD Guidelines) have had on the drafting of the legislation of certain OECD member states, particularly Australia and New Zealand.
The primary principle of data protection laws is that personal data should be ‘processed fairly and lawfully’. This principle is ‘primary’ because, as shown below, it both embraces and generates the other core principles of data protection laws. Concomitantly, the twin criteria of fairness and lawfulness are implicit in all of these principles, even if they are expressly linked in some instruments only to the means for collection of personal data or not specifically mentioned at all.
Of the two notions ‘fairly’ and ‘lawfully’, the latter is relatively self-explanatory. The notion of fairness is less obvious in meaning but potentially broader. An exhaustive explication of the notion of fairness probably cannot be achieved in the abstract. Moreover, general agreement on what is fair will inevitably change over time. Nevertheless, at a very general level, the notion of fairness undoubtedly means that, in striving to achieve their data processing goals, data controllers must take account of the interests and reasonable expectations of data subjects; controllers cannot ride roughshod over these. This means that the collection and further processing of personal data must be carried out in a manner that does not, in the circumstances, intrude unreasonably upon the data subjects’ privacy or interfere unreasonably with their autonomy and integrity. In other words, fairness requires balance and proportion. These requirements are applicable not just at the level of individual data processing operations; they are equally applicable to the way in which the information systems supporting such operations are designed and structured.
In light of these requirements, fairness also implies that a person is not unduly pressured into supplying data on him or herself to a data controller or accepting that the data be used by the latter for particular purposes. Arguably, fairness therefore implies a certain protection from abuse by data controllers of their monopoly position. While very few data protection instruments expressly address the latter issue, some protection from abuse of monopoly can be read into the relatively common provisions on data subject consent, particularly the requirement that such consent be ‘freely given’.
The notion of fairness further implies that the processing of personal data be evident to the data subject. Fairness not only militates against surreptitious collection and further processing of personal data; it also militates against deception of the data subject as to the nature and purposes of the data processing. Arguably, another requirement flowing from the link between fairness and transparency is that, as a point of departure, personal data shall be collected directly from the data subject, not from third parties. This requirement is expressly laid down in some but not the majority of data protection instruments.
As mentioned above, fairness implies that data controllers must take some account of the reasonable expectations of data subjects. This has direct consequences for the purposes for which data may be processed. It helps to provide the ground rules for the purpose specification principle (dealt with in more detail below) and sets limits on the secondary purposes to which personal data may be put. More specifically, it means that when personal data obtained for one purpose are subsequently used for another purpose which the data subject would not reasonably anticipate, then the data controller may have to obtain the data subject’s positive consent to the new use.
A second core principle of data protection laws is that there should be restrictions on the amount of personal data collected; the amount of data collection should be limited to what is necessary to achieve the purpose(s) for which the data are gathered and processed. This principle is summed up here as ‘minimality’, though it could just as well be summed up in terms of ‘necessity’ or ‘non-excessiveness’. In some data protection instruments, the principle is described as ‘proportionality’.
As with the principle of fair and lawful processing, the principle of minimality is manifest in a variety of provisions. It is most obviously manifest in provisions along the lines of art 6(1)(c) of the EC Directive on Data Protection which stipulates that personal data must be ‘relevant and not excessive in relation to the purposes for which they are collected and/or further processed’. It is also manifest in provisions such as art 6(1)(e) of the Directive, which requires personal data to be erased or anonymised once they are no longer required for the purposes for which they have been kept. The minimality principle is further manifest in the Directive’s basic regulatory premise — embodied in arts 7-8 — which is that the processing of personal data is prohibited unless it is necessary for achieving certain specified goals.
The minimality principle does not shine so clearly or broadly in all data protection instruments as it does in the Directive. For instance, the 1990 UN Data Protection Guidelines (UN Guidelines) and the OECD Guidelines omit an express requirement of minimality at the stage of data collection (though such a requirement can arguably be read into the more general criterion of fairness, as described above). The OECD Guidelines also omit a specific provision on the destruction or anonymisation of personal data after a certain period. Again, though, erasure or anonymisation may be required pursuant to other provisions, such as those setting out the principle of ‘purpose specification’ (see below). Most (but not all) national laws make specific provision for the erasure of personal data once the data are no longer required.
Rules encouraging transactional anonymity are also direct manifestations of the minimality principle. Currently, very few data protection laws contain rules expressly mandating or encouraging transactional anonymity. However, it is arguable that such requirements may be read into the more commonly found provisions (described above) in which the minimality principle is manifest, particularly when these provisions are considered as a totality.
Another core principle of data protection laws is that personal data should be collected for specified, lawful or legitimate purposes and not subsequently be processed in ways that are incompatible with those purposes. This norm is often termed the principle of ‘purpose specification’. Sometimes the terms ‘purpose finality’ or ‘purpose limitation’ are employed instead.
The principle has three separate components, each of which may be regarded as a principle in itself:
(1) the purposes for which data are collected should be specified/defined;
(2) these purposes should be lawful/legitimate; and
(3) the purposes for which the data are further processed should not be incompatible with the purposes for which the data are first collected.
The term ‘purpose specification’ denotes the first listed principle more aptly than the latter two. Nevertheless, the notion of purpose specification is used here to cover all three principles.
The requirement for purpose specification is prominent in all of the main international data protection instruments. It is also prominent in most (but not all) of the national laws and/or in administrative practice pursuant to them. Some laws stipulate that the purposes for which data are processed shall be ‘lawful’. Other laws, such as the EC Directive and Council of Europe Convention on Data Protection (the CoE Convention), stipulate that such purposes shall be ‘legitimate’.
Fairly solid grounds exist for arguing that the notion of ‘legitimacy’ carries the criterion of social acceptability — personal data should only be processed for purposes that do not run counter to predominant social mores. In other words, the purpose specification principle, insofar as it uses the legitimacy criterion, can arguably be said to harbour a ‘social justification principle’ similar to that proposed by, inter alia, the NSW Privacy Committee and Michael Kirby. Nevertheless, the question remains of how such mores are to be defined. Are they to be defined in terms of procedural norms which hinge on a criterion of lawfulness (for example, that the purposes for which personal data are processed should be compatible with or fall naturally within the ordinary and lawful ambit of the particular data controller’s activities)? Or do they also embrace more than a lawfulness criterion (for example, that the data controller’s activities are socially desirable in the sense that they promote or do not detract from some generally valued state of affairs constituted by, say, a particular balance between privacy related interests and economic interests)?
The bulk of data protection instruments seem prima facie to comprehend legitimacy in terms of procedural norms hinging on a criterion of lawfulness; very few expressly operate or have operated with a broader criterion of social justification. Nevertheless, the discretionary powers given by some laws to national data protection authorities have enabled them to apply a relatively wide-ranging test of social justification, particularly in connection with the licensing of certain data processing operations. While this ability is being reduced in line with reductions in the scope of licensing schemes, it will not disappear completely.
The principle of information quality stipulates that personal data should be valid and accurate with respect to what they are intended to describe, and relevant and complete with respect to the purposes for which they are intended to be processed. All data protection laws contain rules directly embodying the principle, but they vary considerably in their wording, scope and stringency.
Regarding the first element of the principle (the validity of data), data protection laws use a variety of terms to describe the stipulated data quality. Article 5(d) of the CoE Convention and art 6(1)(d) of the EC Directive state that personal data shall be ‘accurate and, where necessary, kept up to date’. The equivalent provisions of some other data protection instruments refer only to a criterion of accuracy/correctness, while still others supplement the latter with other criteria, such as completeness.
The EC Directive formulates the principle’s second element as a requirement that personal data be ‘adequate, relevant and not excessive in relation to the purposes for which they are collected and/or further processed’ (art 6(1)(c)). Some data protection instruments refer to the criteria of relevance, accuracy and completeness but not to non-excessiveness.
Finally, variation exists in terms of the stringency with which data protection instruments require checks on the validity of personal data. The standard set by the EC Directive, for example, is in terms of ‘every reasonable step must be taken’ (art 6(1)(d)). By contrast, the UN Guidelines emphasise a duty to carry out ‘regular checks’ (principle 2). Many other data protection instruments, including the OECD Guidelines and CoE Convention, do not explicitly address the issue of quality checks at all, although their requirements that personal data ‘should’ or ‘must’ be of a certain quality imply the need for some sort of checking system.
A core principle of data protection laws is that individuals should be able to participate in, and have a measure of influence over, the processing of data on them by other individuals or organisations. This principle embraces what the OECD Guidelines term the ‘Individual Participation Principle’ (see para 13), though the rules giving effect to it cover more than what is articulated in that particular paragraph.
Data protection instruments rarely contain one special rule expressing this principle in the manner formulated above. Rather, the principle manifests itself more obliquely in a combination of several categories of rules. First, there are rules which aim at making people aware of data processing activities generally. The most important of these rules are those requiring data controllers to provide basic details of their processing of personal data to data protection authorities, coupled with a requirement that the authorities store this information in a publicly accessible register.
Second, and arguably of greater importance, is a category of rules aimed at making people aware of basic details of the processing of data on themselves. This category of rules can be divided into three main subcategories:
(1) rules requiring data controllers to collect data directly from data subjects in certain circumstances;
(2) rules prohibiting the processing of personal data without the consent of the data subjects; and
(3) rules requiring data controllers to inform data subjects directly of certain aspects of their data processing operations.
As noted above, rules falling under the first subcategory are found only in a minority of data protection instruments, though such rules could and should be read into the more common and general requirement that personal data be processed ‘fairly’. Examples of the second subcategory of rules are provided below.
As for rules belonging to the third subcategory, influential examples of these are arts 10-11 of the EC Directive which, in summary, require data controllers to directly supply data subjects with basic information about the parameters of their data processing operations, independently of the data subjects’ use of their own access rights. None of the other main international data protection instruments lay down such requirements directly. National data protection laws often make this a requirement only in cases when data are collected directly from the data subject. Some national laws have a notification requirement for particular kinds of data processing, such as disclosure of customer data or health research, though notification in such cases has been independent of whether or not the data controller has collected the data directly from the data subject. It is expected that the current notification requirements pursuant to national laws of at least EU and EEA member states will be harmonised and expanded in accordance with the EC Directive. At the same time, some of the newly enacted national laws within Europe stipulate duties of information which go beyond the prima facie requirements of arts 10-11 of the Directive. These duties of information arise in connection with certain uses of personal profiles and video surveillance.
There are also rules which grant individuals the right to gain access to data kept on them by other individuals and organisations. Most, if not all, data protection instruments provide such a right. An influential formulation of this right is given in art 12 of the EC Directive. This provides individuals with a right of access not just to data relating directly to them but also to information about the way in which the data are used, including the purposes of the processing, the recipients and sources of the data, and the ‘logic’ involved in certain automated data processing operations. The right in art 12 is similar to, but more extensive than, the equivalent rights found in the other main international data protection instruments. None of the latter, with the exception of the UN Guidelines, specifically mention the subject’s right to be informed of the recipients of data. None mention the right to be informed of the logic behind automated data processing. Most national laws also omit specification of these rights, though the Directive should soon bring about a change in this situation — at least in Europe.
The third major category of rules are those which allow persons to object to others’ processing of data on themselves and to demand that invalid, irrelevant or illegally held data be corrected or erased. The ability to object is linked primarily to rules prohibiting various types of data processing without the consent of the data subjects. Such rules are especially prominent in the EC Directive, relative to older data protection instruments. Some older instruments make no express mention of a consent requirement, while others often stipulate consent in fairly narrow contexts — for example, as a precondition for disclosure of data to third parties. It is important to note that consent is rarely laid down as the sole precondition for the particular type of processing in question; consent tends to be one of several alternative prerequisites. This is also the case with the EC Directive. The alternative prerequisites are often broadly formulated, significantly reducing the extent to which data controllers are hostage to the consent requirement in practice.
A specific right to object is also laid down in some data protection laws. The EC Directive contains important instances of such a right, namely in art 14(a) (which provides a right to object to data processing generally), art 14(b) (which sets out a right to object to direct marketing) and, most innovatively, art 15(1) (stipulating a right to object to decisions based on fully automated assessments of one’s personal character). These rights to object are not found in the other main international data protection instruments. Neither are they currently found in the bulk of national laws, though this situation will change in the near future — at least in Europe — largely under the influence of the Directive.
With respect to rectification rights, most data protection instruments have provisions which give persons the right to demand that incorrect, misleading or obsolescent data relating to them be rectified or deleted by those in control of the data, and/or require that data controllers rectify or delete such data.
Another core principle of data protection is that data controllers’ disclosure of personal data to third parties should be restricted so that disclosure may occur only upon certain conditions. In practice, disclosure limitation means as a bare minimum that personal data ‘should not be disclosed ... except: (a) with the consent of the data subject; or (b) by the authority of law’.
The principle of disclosure limitation, like that of individual participation and control, is not always expressed in data protection instruments in the manner formulated above. Moreover, neither the CoE Convention nor the EC Directive specifically addresses the issue of disclosure limitation; both treat it as part of the broader issue of the conditions for processing data. Thus, neither instrument apparently recognises disclosure limitation as a separate principle; each incorporates it within other principles, particularly those of fair and lawful processing, and purpose specification. The OECD Guidelines incorporate the principle of disclosure limitation within a broader principle termed the ‘Use Limitation Principle’ (para 10), while the UN Guidelines specifically address the issue of disclosure under the principle of purpose specification.
Nevertheless, disclosure limitation is singled out here as a principle in its own right because it tends to play a distinct and significant role in shaping data protection laws. Concomitantly, numerous national statutes expressly delineate it as a separate principle or set of rules.
The principle of information security stipulates that data controllers should take steps to ensure that personal data are not destroyed accidentally or subject to unauthorised access, alteration, destruction or disclosure. Representative provisions to this effect are art 7 of the CoE Convention and art 17 of the EC Directive.
The principle of information security has occasionally manifested itself in relatively peculiar provisions. Especially noteworthy is s 41(4) of Denmark’s Personal Data Act of 2000. This states that for personal data which are processed for the public administration and which are of special interest to foreign powers, measures shall be taken to ensure that they can be disposed of or destroyed in the event of war or similar conditions.
The principle of sensitivity stipulates that the processing of data which are especially sensitive for data subjects should be subject to more stringent controls than apply to other data. The principle is primarily manifest in rules that place special limits on the processing of predefined categories of data. The most influential list of these data categories is provided in art 8(1) of the EC Directive, which includes ‘racial or ethnic origin’, ‘political opinions’, ‘religious or philosophical beliefs’, ‘trade union membership’, ‘health’ and ‘sexual life’. Further, art 8(5) makes special provision for data on criminal records and the like. Similar lists are found in other data protection instruments at both international and national level, but these vary somewhat in scope. For instance, the list in art 6 of the CoE Convention omits data on trade union membership, while the list in the UN Guidelines includes data on membership of associations in general (not just trade unions). The lists in some national laws also include, or have previously included, data revealing a person to be in receipt of social welfare benefits. References to this sort of data will, however, have to be dropped from the lists of the data protection laws of EU and EEA member states, given that the list of data categories in art 8(1) of the Directive is intended to be exhaustive.
Singling out relatively fixed subsets of personal data for special protection breaks with the otherwise common assumption in data protection discourse that the sensitivity of data depends on the context in which the data are used. Accordingly, attempts to single out particular categories of data for special protection independently of their context have not been without controversy. Further, not all data protection instruments contain extra safeguards for designated categories of data. This is the case with the OECD Guidelines and many data protection laws of the Pacific Rim countries. Similarly, the older data protection regimes of some European countries — notably Austria, Germany and the UK — have provided relatively little protection for such data.
The absence of such safeguards in the OECD Guidelines appears to be due partly to the failure of the Expert Group responsible for drafting the Guidelines to achieve consensus on which categories of data deserve special protection, and partly to a belief that the sensitivity of personal data is not an a priori given but dependent on the context in which the data are used. The absence of extra protections for designated categories of especially sensitive data in national data protection laws would appear to be due to many of the same considerations, together with uncertainty over what the extra protection should involve.
Lee A Bygrave, Research Fellow, Norwegian Research Centre for Computers and Law.
 At an international level, see for example art 5(a) of the 1981 Council of Europe Convention on Data Protection (CoE Convention) and art 6(1)(a) of the 1995 EC Directive on Data Protection (EC Directive). At a national level, see for example art 9 of Italy’s 1996 Law on Protection of Individuals and Other Subjects with Regard to Processing of Personal Data and Data Protection Principle 1 in Sch 1 to the UK Data Protection Act 1998.
 The case, for instance, with the OECD Guidelines (see para 7).
 The case, for instance, with the Norwegian Personal Data Registers Act 1978 (now repealed).
 The most notable exception is s 3(3) of the German Teleservices Data Protection Act 1997. Compare also principle 18 of the Australian Privacy Charter of 1994.
 See, for example, art 2(h) of the EC Directive.
 The link between fairness and transparency is made explicit in, inter alia, recital 38 of the EC Directive.
 The connection between fairness and non-deception is emphasised in, inter alia, s 1(1) of Pt II of Sch 1 to the UK Data Protection Act 1998.
 Examples of express provision are s 5(1) of Canada’s federal Privacy Act 1982, Information Privacy Principle 2 of the NZ Privacy Act 1993 and National Privacy Principle 1.4 in Sch 3 to Australia’s federal Privacy Act 1988 (as amended).
 This line has been taken by the UK Data Protection Tribunal. See especially the Tribunal’s decision of 24.3.1998 in British Gas Trading Limited v Data Protection Registrar (case reference unspecified). Compare also National Privacy Principle 2.1(a)-(b) in Sch 3 to Australia’s federal Privacy Act.
 This term is employed by the Council of Europe in several of its data protection instruments: see, for example, para 4.7 of Recommendation No R (97) 18 on the Protection of Personal Data Collected and Processed for Statistical Purposes (adopted 30 September 1997).
 A point noted in para 54 of the Guidelines’ Explanatory Memorandum.
 The US federal Privacy Act 1974 being an example. However, a requirement of erasure/anonymisation can arguably be read into other provisions of the Act: see 5 USC, s 552a(e)(1) and (5).
 The most far reaching requirements for transactional anonymity are laid down in ss 3(4), 4(1), 4(4) and 6(3) of Germany’s Teleservices Data Protection Act 1997. See also National Privacy Principle 8 in Sch 3 to Australia’s federal Privacy Act.
 See, for example, para 9 of the OECD Guidelines and Principle 3 of the UN Guidelines.
 See art 5(b) of the CoE Convention, art 6(1)(b) of the EC Directive, Principle 3 of the UN Guidelines and para 9 of the OECD Guidelines.
 Norway’s Personal Data Registers Act 1978 (now repealed) is an example here. Nevertheless, the principle was enshrined in chapters 2-3 (see especially s 3-1) of the main regulations to the Act. Compare also the principle’s relatively oblique manifestation in the federal privacy legislation of Australia and the US.
 As has been the case, for instance, with respect to the Norwegian legislation: see generally LA Bygrave, Personvern i praksis: Justisdepartementets behandling av klager på Datatilsynets enkeltvedtak 1980–1996 Cappelen Oslo 1997.
 See, for example, the OECD Guidelines and Data Protection Principle 2 in Schedule 1 to the UK Data Protection Act.
 See NSW Privacy Committee Guidelines for the Operation of Personal Data Systems Background Paper 31 Sydney 1977 p 3; M D Kirby ‘Transborder data flows and the “basic rules” of data privacy’ (1981) 16 Stanford Journal of International Law 27 at 46.
 A lonely example is s 4(2) of the Netherlands’ Registration of Persons Act 1988 (now repealed) which stated: ‘The purpose of a personal data file may not be in conflict with the law, the maintenance of public order or morality.’
 This has been the case, for example, pursuant to s 3(1) of Sweden’s Data Act 1973 (soon to be repealed) and s 10 of Norway’s Personal Data Registers Act (now repealed).
 See, for instance, s 33 of Norway’s Personal Data Act 2000 which maintains the possibility for the national data protection authority to undertake a relatively open ended assessment of licensing applications, albeit with respect to a narrower range of data processing operations than was the case under the 1978 legislation.
 Identical or near-identical requirements are set down in the provisions of several national laws, including art 9(1)(c) of Italy’s Law on Protection of Individuals and Other Subjects with Regard to Processing of Personal Data and Data Protection Principle 4 in Sch 1 to the 1998 UK legislation.
 See for instance art 5 of Switzerland’s federal Data Protection Act 1992.
 This is the case, for example, with para 8 of the OECD Guidelines.
 Similarly formulated requirements are found in several national laws: see, for instance, art 5(2) of Hungary’s 1992 Act on the Protection of Personal Data and on the Publicity of Data of Public Interest. The equivalent provision in the CoE Convention is almost identical except that it refers only to the purposes for which data are ‘stored’ (art 5(c)).
 See for example para 8 of the OECD Guidelines and ss 4-8 of Canada’s federal Privacy Act.
 See for example arts 18-19 of the EC Directive, arts 28-30 of the Hungarian Act and arts 11(2)-(3) of the Swiss Act.
 See for instance art 21 of the EC Directive, Information Privacy Principle 5 and s 27(1)(g) of the Australian federal Privacy Act and s 22 of France’s 1978 Law Regarding Data Processing, Files and Individual Liberties.
 The UN Guidelines’ ‘principle of purpose specification’ (principle 3) stipulates that the purpose of a computerised personal data file should ‘receive a certain amount of publicity or be brought to the attention of the person concerned’. Compare the more generally formulated ‘Openness Principle’ in para 12 of the OECD Guidelines:
There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller.
Articles 10-11 of the Directive are supplemented by art 21 which requires member states to ‘take measures to ensure that processing operations are publicised’ (art 21(1)) and to ensure that there is a register of processing operations open to public inspection (art 21(2)).
 See for example the US federal Privacy Act (5 USC s 552a(e)(3)), Information Privacy Principle 3 of the NZ Privacy Act and art 18(1) of the Swiss federal Data Protection Act (in relation to ‘systematic’ collection by federal government bodies).
 See for instance s 4b(2) of Denmark’s Private Registers Act 1978 (now repealed).
 See for instance s 40-5 of France’s 1978 legislation.
 Section 21 of Norway’s Personal Data Act 2000 states that when, on the basis of a personal profile, either the data subject is approached or contacted or a decision, directed at the data subject, is made then he or she must be automatically informed of the data controller’s identity, the data constituting the profile and the source of these data. A similar requirement is found in s 23 of Iceland’s Act on Protection of Individuals with regard to the Processing of Personal Data 2000.
 See s 40 of the new Norwegian Act and s 24 of the new Icelandic Act. These provisions extend to surveillance operations in which personal data are not actually registered or stored (for example, on film).
 See art 8 of the CoE Convention, paras 12-13 of the OECD Guidelines and principle 4 of the UN Guidelines.
 See especially art 7(a) of the Directive which stipulates consent as one (albeit alternative) precondition for processing generally.
 This is the case with the CoE Convention.
 See for example para 10 of the OECD Guidelines, s 4(2) of the Danish Private Registers Act (repealed) and art 19(1) of the Swiss Act.
 Compare principles 5.5, 5.6, 6.10 and 6.11 of the ILO’s 1997 Code of Practice on Protection of Workers’ Personal Data which seek to limit the use of automated decision-making procedures for assessing worker conduct.
 See for example art 12(b) of the EC Directive, Principle 4 of the UN Guidelines, s 14 of the UK Act, art 13(1)(c) of the Italian Act and Information Privacy Principle 7 of the NZ Act.
 Paragraph 10 of the OECD Guidelines.
 See especially arts 5(a), 5(b) and 6 of the Convention, and arts 6(1)(a), 6(1)(b), 7 and 8 of the Directive.
 See for example the US federal Privacy Act (5 USC s 552a(b)-(c)), s 8 of Canada’s federal Privacy Act, and Information Privacy Principle 11 in both the NZ Privacy Act and Australia’s federal Privacy Act.
 A similar rule was found in s 12(3) of Denmark’s Public Authorities’ Registers Act 1978 (now repealed) and s 29 of the Icelandic Protection of Personal Records Act 1989 (also repealed).
 See s 6(6) of Finland’s Personal Data Registers Act 1987 (now repealed), s 4(2) of Sweden’s Data Act 1973 (soon to be repealed) and art 3(c)(3) of the Swiss federal Data Protection Act.
 See further the discussion of this point in Bygrave L A Data Protection Law: Approaching Its Rationale, Logic and Limits Faculty of Law Oslo 1999, ch 18, section 18.4.3.
 For a forceful, highly persuasive critique of such attempts, see Simitis S ‘“Sensitive Daten” – Zur Geschichte und Wirkung einer Fiktion’ in Brem E, Druey J N, Kramer E A and Schwander I (eds) Festschrift zum 65. Geburtstag von Mario M. Pedrazzini Stämpfli & Cie Bern 1990, pp 469-493.
 See the Guidelines’ Explanatory Memorandum paras 43 and 51.
 See for example Law Reform Commission of Hong Kong, Report on Reform of the Law Relating to the Protection of Personal Data Government Printer Hong Kong 1994 pp 99 and following; Australian Law Reform Commission (ALRC) Privacy Report No 22 AGPS Canberra 1983 vol 2 paras 1218 and following.