
Privacy Law and Policy Reporter



Nye, Sharon --- "Internet privacy - regulating cookies and web bugs" [2002] PrivLawPRpr 26; (2002) 9(2) Privacy Law and Policy Reporter 21


Internet privacy — regulating cookies and web bugs

Sharon Nye

Until the introduction of the Privacy Amendment (Private Sector) Act 2000 (Cth), the Australian information economy had burgeoned on almost unfettered use of technology to collect information. The central argument of this article is that the Privacy Act 1988 (Cth) as amended (the Act), with its stated policy of encouraging Australian online commerce, has, in reality, the effect of permitting the continuation of such unfettered collection, only providing illusory privacy protection for the individual.

The discussion will centre on the use of cookies and web bugs in a typical internet marketing network to demonstrate how, through a combination of characteristics in the National Privacy Principles (NPPs) and a pro-business policy bias, the rules can be read restrictively by courts to permit cookie and web bug surveillance. Given the limited scope, the discussion will focus only on the gateway pillars of the NPPs under the Act, particularly NPP 1, which are the most significant first line of defence and also the most tested by the new technologies. NPP 1 provides:

1. Collection

1.1 An organisation must not collect personal information unless the information is necessary for one or more of its functions or activities.

1.2 An organisation must collect personal information only by lawful and fair means and not in an unreasonably intrusive way.

1.3 At or before the time (or, if that is not practicable, as soon as practicable after) an organisation collects personal information about an individual from the individual, the organisation must take reasonable steps to ensure that the individual is aware of:
(a) the identity of the organisation and how to contact it; and

(b) the fact that he or she is able to gain access to the information; and

(c) the purposes for which the information is collected; and

(d) the organisations (or the types of organisations) to which the organisation usually discloses information of that kind; and

(e) any law that requires the particular information to be collected; and

(f) the main consequences (if any) for the individual if all or part of the information is not provided.


1.4 If it is reasonable and practicable to do so, an organisation must collect personal information about an individual only from that individual.

1.5 If an organisation collects personal information about an individual from someone else, it must take reasonable steps to ensure that the individual is or has been made aware of the matters listed in subclause 1.3 except to the extent that making the individual aware of the matters would pose a serious threat to the life or health of any individual.

First, the meanings of ‘collection’, ‘personal information’ and ‘fairness’ will be examined to find that, notwithstanding prima facie broad protection construed by the Privacy Commissioner, the principles often apply awkwardly to the technology and are inched along further towards restrictive application by explicit judicial intention to claw back privacy. Second, discussion of the meaning and application of ‘necessary’ will reveal not only that NPP 1 has limited ability to curb the privacy invasiveness of cookies and web bugs as their functionality now stands, but that the requirement of being ‘necessary’ as a floating standard has the ability to permit future developments in the privacy invasiveness of the technologies.

Cookies and web bugs in a typical internet organisational relationship and the potential that courts will apply NPPs to give weak privacy protection

The typical website privacy policy only discloses the use of cookies in a simple request-reply interaction between the user’s browser and the website’s server to facilitate the display of the webpage and any transactions that occur therein.[1] In such a two party interaction, it is still within the intuitive awareness of the user that information is being passed along to the server to facilitate the operation of the website, even though the server is invisible to the user.

Controversy lies in the use of cookies and web bugs by third parties: the user assumes that the information is being passed on to the party facilitating the interaction, when in fact the information is also being passed on to a party uninvolved in the transaction and whose existence is unknown to the user. This occurs in the common internet marketing relationship between four players: the user, the company who wishes to advertise its products on third party host websites, the advertisement hosts whose websites display the advertisements, and the intermediary network advertiser who serves the ads on behalf of the company.[2] The user may believe that he or she could only receive cookies from the website that he or she is immediately viewing, but in fact the user is also receiving cookies, and hence being ‘tracked’, from the network advertiser serving the ads on the host website.

Cookies have been defined as ‘a piece of information that the internet website sends to your browser’[3] and web bugs as ‘1x1 clear gifs’.[4] The function of the technologies defies simple definition and will be revealed progressively in the discussion of NPP application. One preliminary distinction must be made however; the type of web bug pertinent to the advertisement context is the Type 1 web bug that relays machine information to the server in the downloading process, rather than more malicious types that execute files within the user’s computer itself to directly access personal information.[5]

The central contention of this article is that the NPPs do not decisively curb the privacy invasive potential of cookies and web bugs. The NPPs are a set of high level principles that do not specify what is required for compliance and therefore have elasticity to accommodate interpretation for stringent or weak privacy protection.[6] While the Privacy Commissioner’s Guidelines infer stronger protection to catch technological breaches, because no distinction is made between requirements for strict compliance and mere best practice,[7] the courts have some scope to claw back protection afforded by the NPPs. Internet business organisations using cookies and web bugs could rely on s 55A(5) of the Act to force a court hearing and avoid a ‘pro-privacy’ Commissioner’s determination.[8]

The NPPs give courts leeway to roll back protection by a combination of technological neutrality that creates ambiguity in application, and a pro-business bias that gives the opportunity to construe the ambiguity against the consumer.[9] The NPPs are worded generally[10] and do not specifically address the privacy implications of 21st century technology. As a result, technological innovation in e-commerce that redefines interaction with information will find loopholes in the static broad brushstrokes of the NPPs.[11] Furthermore, Attorney-General Daryl Williams has made it clear that the legislative intent of the Privacy Amendment (Private Sector) Act is not to protect privacy from the moral perspective of absolute human right[12] but to provide protection only insofar as is necessary to achieve the optimal advantage for Australian e-commerce.[13] Therefore, the stated object of the Act limits the scope of the NPPs by weighing privacy against business efficiency,[14] and downplaying its moral force in rights discourse, for example, by balancing an individual’s ‘interest’ in privacy against the ‘right’ of business efficiency.[15] Privacy will therefore have to fight for survival against profit maximisation in the information economy. American courts are already forcing big entertainment companies to monitor customer use of intellectual property online.[16] By analogy, the ‘big money’ interest[17] in network advertising would force the hand of the courts to take the restrictive approach to privacy when given half the chance. Indeed, the following discussion reveals many such chances in NPP 1.

NPP 1 examined

Meaning of ‘collection’

The NPPs only apply where there has been collection of information, and requirements for lawful collection are stated in NPP 1 (reproduced above). ‘Collection’ is defined broadly by the Federal Privacy Commissioner to mean to ‘gather’, ‘acquire’ or ‘obtain’ personal information from any source.[18] Despite the apparent breadth of these words, restriction arguably lies in the lowest common denominator of the synonyms that strongly suggests ‘gain’[19] by positive action,[20] connoting first, acquisition of information not previously held and second, a distinction between direct extraction and passive inference. To correctly assert that a cookie or web bug has ‘collected’, these requirements must be fulfilled by the technology within its operational transaction.

Cookies do not functionally fulfil these requirements within the immediate transaction wherein they operate. Upon receiving the request for a page from the user’s browser, the web page server sends back not just the requested page but also an HTTP header[21] with an additional ‘set-cookie: name=value’ line. The browser complies with the header and a cookie of a certain name and assigned value[22] is stored in the user’s browser. The value can be a random number that ‘identifies’ the user’s computer to the server[23] when the browser returns this cookie in the next visit.[24] Strictly speaking, what is actually extracted by the server is the existence of the identifier that the server itself set, and therefore there is no incremental ‘gain’ in information from the immediate cookie transaction. Any information about the user’s location is passively and subsequently inferred by matching the existence of the identifier with the URL of the web page viewed,[25] which is information collected by a web bug outside of the cookie transaction.[26] Granted, a convincing refutation would be to consider the location of the identification number to be sufficiently proximate to the user’s computer that subsequent inferences could be deemed ‘collected’ within the immediate cookie transaction.[27] However, even a slight lapse in the nexus[28] gives courts the opportunity for restriction.
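The set-and-return exchange just described can be sketched as a minimal Python simulation (the function names and the `id` cookie name are illustrative only, not real browser or server code). The point it illustrates is the one made above: on the return visit, the only thing the server ‘extracts’ is the identifier it assigned itself.

```python
import secrets

def server_response(cookie_jar):
    """Simulate the server: if the browser sent no cookie, set one via
    a 'Set-Cookie: id=<random value>' response header."""
    if "id" not in cookie_jar:
        return {"Set-Cookie": f"id={secrets.token_hex(8)}"}
    # On a return visit the server learns nothing new: the browser just
    # echoes back the identifier the server itself assigned.
    return {}

def browser_visit(cookie_jar):
    """Simulate one request/response cycle: the browser obeys any
    Set-Cookie header and stores the name=value pair."""
    headers = server_response(cookie_jar)
    if "Set-Cookie" in headers:
        name, value = headers["Set-Cookie"].split("=", 1)
        cookie_jar[name] = value
    return cookie_jar

jar = {}
browser_visit(jar)            # first visit: server assigns an identifier
first_id = jar["id"]
browser_visit(jar)            # return visit: same identifier comes back
assert jar["id"] == first_id  # no incremental 'gain' of information
```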

Whether or not the use of a web bug would be deemed sufficiently proximate depends on its specific functionality. In our marketing scenario, the general rule is that the user’s browser will only return cookie information to the domain where the cookie originated, implying that only the host website’s server which set the cookie can recognise the user’s computer.[29] The first functionality of the Type 1 web bug[30] is to circumvent this rule by permitting the third party network to set cookies.[31] This potentially stretches the proximity loophole to allow surveillance by unknown third parties outside the web page domain.[32] The second point to make about ‘collection’ is the apparent requirement for some positive act: in this case, the bug actively ‘gaining’ information by establishing a conduit between the user’s browser and the network advertiser’s server.[33] When the browser is triggered automatically to send an HTTP request to the network advertiser to place an ad in the pre-set banner space of the web page, other information is also sent along this conduit to enable the transaction, such as the user’s IP address, browser and operating system type and version, and the address of the host web page.[34]
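The conduit described above can be sketched in Python as a hedged illustration of the request headers a browser sends when automatically fetching a web bug. The ad network domain and all values here are hypothetical; the headers shown (Referer, User-Agent, Cookie) are standard HTTP request headers a browser includes when fetching an embedded third party image.

```python
def web_bug_request(host_page_url, user_agent, cookie_jar):
    """Build the headers a browser would send to the third party ad
    server when fetching a 1x1 web bug embedded in the host page."""
    headers = {
        "Host": "ads.example-network.com",  # hypothetical ad network domain
        "Referer": host_page_url,           # address of the host web page
        "User-Agent": user_agent,           # browser and OS type/version
    }
    # Any cookie the ad network previously set on its own domain travels
    # too, letting the network match this page view to its identifier.
    if "id" in cookie_jar:
        headers["Cookie"] = f"id={cookie_jar['id']}"
    return headers

req = web_bug_request("http://host-site.example/article",
                      "Mozilla/4.0 (compatible)", {"id": "a1b2c3"})
# The ad network now receives the host page URL and its own cookie
# identifier in a single request, enabling the matching described above.
```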

Even if web bugs can be said to ‘gain by positive action’ the URLs required to match with cookie identifiers, this may still be permitted through the ambiguity of ‘solicitation’. By analogy with the situation in Harder,[35] web bugs may not involve a solicitation or request, but can be equated to an open telephone line as the user’s browser automatically divulges information during the process of downloading web pages.

Gunning suggests that the requirement of solicitation can be implied into collection because NPP 1.1 ‘necessity’ and NPP 2.1 ‘purpose’ envisage collection with a preconceived objective, which is arguably a mental element ,that cannot exist prior to or during accidental receipt of unsolicited information.[36] Greenleaf disagrees with this construction; according to him the operation of s 16B of the Act postpones collection until such time as information is included in a record, therefore postponing also the critical time for the mental element of purpose to exist until point of inclusion.[37] However I would argue, contrary to Greenleaf, that unsolicited information included in a record is not ‘collected’;[38] Gunning is correct to emphasise the importance of purpose at the point of reception as a requirement distinct from subsequent s 16B retention. The NPP 1.3 requirement that collection of information be ‘from the individual’ does not necessarily require solicitation, because information can be collected from an individual even though it was not requested.[39] But in conjunction with NPP 1.3(c) arguably there is a requirement of a request for information because it envisages the formulation of purpose before collection for retention so that the individual can be informed of the purpose — preferably at or before collection for retention.

Personal information

While it is recognised that cookies can store other information in the name=value pair,[40] such as an email address or search string that has been divulged to the website by the user during registration or search,[41] this discussion will only focus on personal information collected by the covert operation of the technologies.[42] The information that can be received by covert use of web bugs is a user’s IP address, browser and operating system type and version, the address of the host web page, and any identification numbers stored on the advertiser’s cookies. The only information that can be ‘collected’ by covert use of cookies is the cookie identification number.[43]

Personal information has been given a broad conceptualisation by Bygrave, who perceives the crucial criterion to be identifiability, or the ability to distinguish a person by unique characteristics, not necessarily by name.[44] The concept of personal information can be restricted however, first by sub-issues of identifiability, second by the additional requirement in s 6 of the Act that the information be ‘about an individual’, and third by the judicial gloss of intention to identify.

First, the Privacy Commissioner considers that s 6’s use of the phrase ‘reasonably be ascertained’ construes a primary piece of information as personal if identity can be ‘fairly easily’ ascertained by the use of auxiliary information.[45] This prima facie permissive use of auxiliary information could mean that an IP address could be linked to a name in an internet log file, or a cookie identifier linked to an email address, to make such information personally identifiable.[46] The IP address-name link could be defeated by ‘ease’ of identity if the IP address is dynamic[47] or if there is no extant accessible log file.[48] The Act’s use of the term ‘reasonably’ arguably poses a higher standard than ‘fairly easily’, requiring lawful linkage to defeat the auxiliary use of email addresses collected by planting web bugs in email (which is possibly outlawed under the Cybercrime Act 2001 (Cth)).[49] Concomitantly, lack of individuation could extinguish the ‘personal’ quality of the link where the IP address and cookie identification attach to a machine with multiple users.[50]

While these issues are not necessarily fatal to successful qualification as personal information, they do pose obstacles. On the one hand, case law is not always a reliable guide, as it may be coloured by a court’s consideration of underlying policies pertinent to a particular case. For example, the New Zealand Privacy Act Casenote 12582[51] ruling that telephone numbers lack ‘personal’ quality because they fail to uniquely identify individuals was influenced by the Court’s desire to avoid the possibility of jeopardising the privacy of other innocent users of the phone in question if the number were to be disclosed. On the other hand, whether or not information crosses the obstacles to become ‘personal’ is a matter of fact; so, for example, information can be more easily linked if IP addresses become static with the technological evolution towards DSL;[52] information can potentially be linked lawfully if the advertiser purchases a profiling agency;[53] and linkage of a telephone number or IP address to an individual is generally not difficult, for example if the person is the subject of a police inquiry.[54]

The above ambiguities create the possibility of construing cookie identifiers and IP addresses as personal information, but if ‘reasonably be ascertained’ is given the meaning of probability of identification rather than the difficulty of identifying, then even if there is no intention to use the information as personal identification, the mere capacity to do so will render the information personal.[55] Restriction of the identification element can come with the second element of idiosyncratic connection. Section 6(1) of the Act prima facie conflates the requirement that information be about an individual with identifiability, because the meaning of ‘about’ is not specified other than by qualification of the latter.[56] However, Harder and Provincial Section Order 23 interpret the New Zealand and Ontario counterpart definitions[57] respectively to operate with ‘about an individual’ as a separate narrower element requiring some idiosyncratic connection. So in Harder, Ms C’s denial of possessing unspecified chattels was not personal information because even though she was identified in the information, no ‘statement’ was made in relation to her as an individual.[58] In Order 23, estimated market value combined with a municipal address was held not to be information about the inhabitants, but merely about the property.[59] By analogy, arguably, an IP address or cookie identifier — even linked to a name — lacks an idiosyncratic relationship with the user because the information is about an inanimate browser or computer, not the individual.
Conversely, a cookie identifier linked with the web page URL can constitute personal information about a person’s browsing tastes and therefore has the potential to be sensitive information if inferences can be made to intimate idiosyncrasies.[60] C v ASB is a decision which prohibited the use of auxiliary information to establish the idiosyncratic link that the information is about the individual.[61] (A small compromise was made that if the auxiliary information appears in the same document and the personal information is not intelligible without it, then the link will be permitted.)[62] Auxiliary use of web bug collected web page URLs would fall outside the purview of this concession for lack of proximity with the cookie transaction. There is a weaker argument here that the result was driven by a tangential policy of refusal to lift the corporate veil,[63] because the Court explicitly stated that its rejection of auxiliary information was grounded in the need to maintain the integrity of the definitional boundaries of personal information.[64]

This brings me to the third point — courts, attempting to restrict the reach of privacy protection,[65] can impose a judicial requirement which is not prima facie present in the legislation. Eastweek posed a radical constriction of requiring an intention to identify,[66] possibly to the extent of requiring an intention to identify by name.[67] This would exempt the combined use of an IP address and web page URL to personalise advertisements, because arguably the result of personal interaction with the user can be achieved without an intention to identify by name, relying instead upon identification by idiosyncratic browser behaviour. However, the decision in Equifax Europe Ltd v The Data Protection Registrar put the brakes on Eastweek’s radicalism by concentrating on the intention to find out about the individual rather than identification, which would catch a combination use of the IP and URL accessed.[68]

NPP 1.2: an organisation must only collect information by fair means

According to the Guidelines, a crucial element of fairness is lack of deception, interpreted broadly to include prima facie all forms of covert collection.[69] The American Federal Trade Commission (FTC) does not consider ‘deceptive practice’ to be mere covert collection of data without consent, but holds there must be the additional element of engaging in practices that betray the reliance placed by users on represented collection practices in the website privacy policy.[70] The Hong Kong Privacy Commissioner, New Zealand Privacy Commissioner and the recent UK Campbell case also adopt this reasoning of reliance — not on the parameters of the privacy policy, but on the illusion of being unwatched, so the individual divulges information where otherwise he or she would have remained reticent.

In New Zealand Case Note 16479[71] and Hong Kong Case 199804574,[72] covert recording was held to be unfair because were it not for reliance on the fact the conversation was off the record the respondent would have answered differently. Similarly in Campbell, the clandestine nature of filming was unfair because the subject was disempowered from taking alternative action.[73]

The interpretation of the FTC has been criticised for not taking into account what can be achieved in practice.[74] For example, in the use of web bugs and cookies, the collection process begins upon downloading the web page, because this is the moment the cookie is set on the user’s browser and the user’s browser divulges information to the server through the web bug. This makes the question of whether continued browsing is in reliance on the privacy policy irrelevant, since by the time the policy can be viewed the collection has already been completed.

Even in the case of persistent cookies, it is difficult to assert that continued browsing resulted from reliance upon a privacy policy, because general consumer practice is to ignore such notices.[75] If the interpretation of the Commonwealth countries is adopted, it could be argued that consumers would divulge information even if they knew cookies and web bugs were operating. This is because the potential privacy invasion from the technology in surveillance and personalisation is intimately tied to the technical capability of the website. For example, without cookies, shopping carts will not work properly, and without web bug collection of URLs, some graphics cannot be downloaded.[76] Although it could be argued that persistent cookies are not strictly necessary for the immediate function of the website, some consumer opinion suggests that personalised advertising is coming to be expected as integral to the internet experience. This would mean disclosure is not based upon a lack of expectation.[77]

NPP 1 collection: what is necessary?

NPP 1.1 states that an organisation must not collect personal information unless the information is necessary for one or more of its functions or activities. Article 8 of the European Convention for the Protection of Human Rights and Fundamental Freedoms (ECHR) states that there shall be no interference by a public authority with the exercise of the right to privacy except where this is ‘necessary’ in a democratic society.[78] ‘Necessary’ in art 8 of the ECHR has been interpreted by the Court of Human Rights to create a high standard, requiring pressing social need and proportionality to the legitimate aim pursued.[79] ‘Necessary’ has also been interpreted to mean pressing commercial need.[80] Interpretation of the ‘necessary’ criterion in art 8 of the ECHR by the Court of Human Rights is relevant to defining ‘necessary’ in the European Directive and therefore also in the data protection legislation of European party states.[81]

Therefore, in Europe, use of persistent cookies[82] to prolong the tracking of user habits long after the original session has ended may be considered unnecessary for the function of the website or personalisation of the browsing experience during any one immediate session; hence it would not qualify as a pressing commercial need.[83] Arguably, web bugs may be permitted as a pressing commercial need because the internet advertising industry is heavily dependent upon the revenue generated by personalised advertising[84] and the internet culture of free access that has popularised the online information economy would be jeopardised without subsidies flowing on from advertising revenue.[85]

However, the covert use of cookies and web bugs could be considered as use for illegitimate purposes if the arguments surrounding the ‘side effects’ of personalised advertising gain widespread acceptance. Such ‘side effects’ include:

However, it is contended that this argument has no interpretative application to the Australian NPPs: first, because without the refractory effect of the Directive, the focus of art 8 on public authorities means it is not directly applicable to the private sector; second, because although the ECHR may be influential on the UN Human Rights Committee in its interpretation of the International Covenant on Civil and Political Rights (ICCPR) to which Australia is a party, the ICCPR counterpart art 17 does not include the term ‘necessary’, thereby severing the link of interpretation;[90] and third, because even if it could be correctly asserted that the European ‘necessary’ standard has force as customary international law,[91] or at least constitutes an extrinsic interpretative aid under s 15AB of the Acts Interpretation Act 1901 (Cth),[92] the Australian Act explicitly creates a conflicting lower standard, and it is that lower standard which would be applied.[93]

Arguably the Australian Act creates a lower standard because it will only raise the standard of privacy protection given by the courts if it is incidental to achieving the apparent overall objective of the legislation of securing Australian economic advantage. Despite political rhetoric invoking Australia’s human rights obligation regarding privacy under ICCPR art 17,[94] the Attorney General’s Department and the Australian Law Reform Commission consider the ICCPR to be only an indication of international best policy and expressly deny that it creates any legally binding obligation which must be implemented through the Act.[95] Indeed, it is questionable if any ‘international obligation’ attaches to the ICCPR at all, as the Explanatory Memorandum only discusses obligations under the EU Directive.[96] Despite Australia’s clear concern with the EU trade barrier, the Government has displayed a US style hesitation in applying sweeping EU privacy standards,[97] as evidenced by the inclusion of the small business exemption in the Act despite the risk that this compromises Australia’s chances of being held to have met the EC Directive’s requirement of ‘adequacy’.[98]

Left to Australian judicial consideration, the meaning of ‘necessary’ varies from an absolute imperative, to a relative term of reasonably required,[99] to merely the exclusion of collection as a ‘fishing expedition’,[100] depending entirely upon judicial intention — which, in the case of the Act, has favoured the lower threshold. ‘Reasonably required’ could mean that alternative modes of collection that are less privacy invasive need not be considered, and certainly the breadth of ‘reasonably’ is such that the stated purpose is virtually unrestricted. If the stated purpose is future marketing in anticipation of the user returning to the site, then the use of persistent cookies would be ‘reasonable’.

The Privacy Commissioner’s interpretation raises the standard so that ‘necessary’ qualifies ‘legitimate’ functions or activities, rather than merely functions and activities.[101] I would argue that this is to no avail. Because the NPPs establish no absolute immutable right, but rather initiate a negotiation between the interest in privacy and the right to business efficiency, what constitutes ‘legitimacy’ is a floating standard, contingent upon changing social acceptance of the technology.[102] Already, some consumers socialised into an acceptance of surveillance on the internet scoff at the ‘hype’ surrounding cookie and web bug usage.[103] Arguably, not only is the floating standard no inhibition to present privacy invasive technologies that have become accepted, but it also gives the Act elasticity to encompass currently unacceptable technological applications that through socialisation may eventually become acceptable. At present the use of cookies and web bugs within the organisational relationship of a network advertiser and a web page host extends the privacy invasiveness of the technology by hiding the network advertiser as a collecting party.[104] However, developments are afoot that would take the problem to new heights by increasing the magnitude of surveillance exponentially and increasing the exposure of disclosed information to countless intermediate collectors. I am referring to KaZaa’s use of Gnutella’s peer sharing topology to permit the harnessing of personal home computers to the network advertiser hub[105] so that the individual surfer also becomes a collector. While presently controversial, this process has the potential to slip into internet culture because it promises even more efficient advertising[106] and potential remuneration for private party collectors.[107] Its ‘legitimacy’ is just one small step down the slippery slope to total privacy compromise.

Conclusion

Cookies and web bugs have a good chance of surviving the privacy legislation and, indeed, there is potential for the Act to allow even more intrusive privacy invasive practices. The economic imperative has seen the cutting back of fair use exemptions in anti-circumvention legislation in favour of powerful media interests, and it would seem that the law/technology divide in the privacy legislation will also be dominated by the colour of money.

Sharon Nye is a student researcher at the Baker & McKenzie Cyberspace Law Centre at the University of New South Wales. This article was first submitted as an essay for the elective course Data Surveillance and Information Privacy Law in the LLM program at UNSW.




[1] See the privacy policies on cookies provided by Microsoft and Dell, which are similar because they are so generic. Both are phrased in terms of the cookie helping the user and helping Microsoft or Dell. No other party, such as the companies advertising or the ad host, is disclosed. The disclosed function of the cookie is that it tells the server that the user has returned to the page. The broader implications of such a function are not stated: see <www.microsoft.com/info/cookies.htm> and <www.dell.com/us/en/gen/misc/policy_001_policy.htm>.

[2] Tsoi K ‘Web bugs and internet advertising’ 43 (2001) Computers and Law 17.

[3] Draft National Privacy Principle Guidelines 7 May 2001, p 22 (the Guidelines).

[4] Smith R ‘FAQ: web bugs’ <www.privacyfoundation.org/resources/webbug.asp> p 1.

[5] This taxonomy of web bugs has been developed by Intelytics, which categorises web bugs according to manner of ‘infection’, location of infection, co-operation of the user and co-operation of the website from which browsing information is being collected: Statement of GE Clayton, Dr SB Lucan and KG Coleman before the Congressional Privacy Caucus ‘Web bugs and the threat to online privacy and security for consumers and businesses’ 1 March 2001 at <www.house.gov/markey/iss_privacy_clayton.pdf> p 7.

[6] Private Sector Privacy Handbook chapter on National Privacy Principles, CCH Australia Ltd Sydney 2001, p 2042 [5-100].

[7] Above note 6.

[8] Section 52(1)(b) of the Act states that the Commissioner could make a determination that includes a declaration that offending conduct not continue or be repeated; damages be awarded; and costs be awarded. However, if a determination is entered into to the detriment of the business organisation, the organisation can refuse to comply, thereby forcing either the complainant or the Commissioner to commence de novo Federal Court proceedings against the organisation to enforce the determination: see ss 55(1)(a) and (b) and s 55A(5). However, a copy of the Commissioner’s determination can be received as evidence into the trial: s 55(6)(a).

[9] Gunning expresses a privacy advocate’s viewpoint that cl 3 should only have a restricted application to consideration of exemptions to the NPPs, but construction of the provision weighs in favour of the interpretation proposed here that the policy behind cl 3 also broadly affects the restrictive reading of the NPPs: Gunning P ‘Central features of Australia’s private sector privacy law’ [2001] CyberLRes 2 at 5 <www2.austlii.edu.au/~graham/CyberLRes/2001/2/>.

[10] For criticism as to lack of definition of terms key to the Act, see Australian Consumers’ Association Submission to the House of Representatives Standing Committee on Legal and Constitutional Affairs Inquiry into Privacy Amendment (Private Sector) Bill 2000 p 3, available from the Australian Parliament House website at <www.aph.gov.au/>. The NPPs are worded generally because the NPPs were derived from OECD guidelines and geared towards flexibility and industry wide coverage: see The Parliament of the Commonwealth of Australia Cookie monsters? Privacy in the information society Report by the Senate Select Committee on Information Technologies November 2000, p 58.

[11] Senator K Lundy ‘Privacy Amendment (Private Sector) Bill 2000’ In Committee Speech 30 November 2000 pp 1-2.

[12] Williams D MP ‘Privacy Amendment (Private Sector) Bill 2000 Second Reading Speech’ 12 April 2000, House Hansard p 4.

[13] Above note 12 p 1.

[14] Privacy Amendment (Private Sector) Act 2000 (Cth) s 3(b)(iii).

[15] Greenleaf G Privacy Amendment (Private Sector) Bill 2000, Submission to the House of Representatives Standing Committee on Legal and Constitutional Affairs 15 May 2000 p 2. The submission is available from the Australian Parliament House website at <www.aph.gov.au/>. This language is also used in the Explanatory Memorandum of the Privacy Amendment (Private Sector) Bill, which couches the individual’s ‘interest’ in privacy in terms of a consumer interest; see The Parliament of the Commonwealth of Australia Senate Privacy Amendment (Private Sector) Bill 2000 Revised Explanatory Memorandum pp 1, 6-10.

[16] Shim R ‘Sonicblue forced to spy on subscribers?’ <story.news.yahoo.com/news?tmpl=story&u=/zd/20020506/tc_zd/5107342>.

[17] In 2000, 48 per cent of all online advertising revenue was attributable to banner advertising; it was 36 per cent in 2001. This constitutes the largest portion of the revenue, with sponsorships and classifieds bringing in 26 per cent and 16 per cent respectively in 2001 and email advertising only bringing in 3 per cent in 2001: see Internet Advertising Bureau ‘Internet advertising revenue totalled $1.7 billion for Q4 2001’ <www.iab.net/>.

[18] Office of the Federal Privacy Commissioner Guidelines to the National Privacy Principles September 2001.

[19] ‘Gather’ is defined variously to mean ‘gain’ in Sykes J (ed) The Concise Oxford Dictionary (7th ed) Clarendon Press Oxford 1982 p 408. ‘Acquire’ can also have the meaning of ‘gain’ and is treated as synonymous with ‘obtain’ under the Income Tax Assessment Act 1936 (Cth): see s 136AA(2).

[20] ‘Gain’ in the context of the verbs ‘obtain’, ‘acquire’ and ‘gather’ suggests gain by positive action rather than passive reception. So ‘obtain’ means to procure or gain as the result of purpose and effort: see Re Woods, Woods v Woods [1941] St R Qd 129 at 137 per Philip J. In the context of s 50 of the Trade Practices Act 1974 (Cth), an acquisition has been described as an act of gaining: see Lipton J ‘The meaning of acquire in section 50 of the TPA and its impact on secured financing transactions’ (1993) 21 Australian Business Law Review 353 at 354. ‘Gather’ is defined to mean cause to assemble: see Sykes J above note 19 p 408.

[21] To understand the operation of cookies, a few brief words must be said about HTTP. Hypertext transfer protocol (HTTP) is a group of standards that governs the way web pages, graphics and other data should be transferred across the internet. An HTTP header communicates the request from the user’s browser as well as the server’s response, telling the receiving end exactly what it is receiving. The information transferred is therefore extremely limited — the server does not ‘know’ anything about the browser except that certain data was requested and dispatched. Cookies help the website function by expanding the ability of HTTP through including more information inside the HTTP header: see Slayton M ‘An introduction to cookies’ <hotwired.lycos.com/webmonkey/templates/print_template.htmlt?meta=/webmo...> p 1.

[22] Whalen D ‘The unofficial cookie FAQ version 2.54’ at <www.cookiecentral.com/faq/> p 5.

[23] Junkbusters ‘How web servers’ cookies threaten your privacy’ <www.junkbusters.com/ht/en/cookies.html> p 1; Negrino T and Smith D JavaScript for the World Wide Web: Visual QuickStart Guide (3rd ed) Peachpit Press California 1999 p 157.

[24] Slayton M above note 21 at p 2. When a cookie is sent from the browser to the server, the information in the HTTP header is changed slightly to read ‘cookie: name=value’ so that the server is made aware of a cookie with a unique value: Whalen D above note 22 at p 5.

[25] The cookie only stores information about parameters. It stores information about the path parameter setting the URL path the cookie is valid within, and it stores information about the domain parameter which makes the cookie accessible to different servers within the domain. This information governs the location at which the cookie can be read but does not indicate the location of the cookie as attached to the user’s browser at any one time: Zimmerman R ‘The way the cookies crumble’ (2000) 4 New York University Journal of Legislation and Public Policy 493 at note 17; Whalen D above note 22 at p 5.
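The header mechanics described in notes 21 to 25 can be sketched in a few lines of Python (a sketch only, using the standard http.cookies module; the ‘uid’ name, its value and the example.com domain are hypothetical, not drawn from any cited source):

```python
# Sketch of how cookie data rides inside HTTP headers, using
# Python's standard http.cookies module. The "uid" name, its
# value and the example.com domain are hypothetical.
from http.cookies import SimpleCookie

# Server side: a Set-Cookie response header carrying a unique value
# plus the path and domain parameters discussed in note 25.
server = SimpleCookie()
server["uid"] = "abc123"
server["uid"]["path"] = "/"
server["uid"]["domain"] = "example.com"
header = server.output(header="Set-Cookie:")
print(header)  # e.g. Set-Cookie: uid=abc123; Domain=example.com; Path=/

# Browser side: on later requests within that domain and path, the
# browser returns only the name=value pair in a "Cookie:" header.
browser = SimpleCookie()
browser.load("uid=abc123")
print(browser["uid"].value)
```

The point for the legal analysis is visible in the sketch: the server learns nothing about the browser beyond what the header carries, but the unique value lets it recognise the same browser on every return visit.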

[26] Above note 2.

[27] See Fraser H ‘Location, location, location’ Dec/Jan 2001 elawpractice.com.au Issue 9 at pp 29-31.

[28] Clickstream data has therefore been described as information about information — human inferences are imputed into mechanical crumbs, creating an indirect link between the user and what is actually collected: see Reidenberg J and Gamet-Pol F ‘The fundamental role of privacy and confidence in the network’ (1995) 30 Wake Forest Law Review 105 at 113.

[29] Slayton M above note 21 at p 2.

[30] Statement of GE Clayton, Dr SB Lucan and KG Coleman above note 5 ,at p 7.

[31] The bug can include additional information in the query string that the computer uses to download the webpage, instructing a cookie to be returned along with the webpage. The cookie is sent back to the network advertiser’s server with subsequent HTTP requests to the same internet domain: statement of GE Clayton, Dr SB Lucan and KG Coleman above note 5 at p 7.

[32] O’Harrow R ‘Fearing a plague of web bugs’ 13 November 1999 at <www.washingtonpost.com/wp-srv/Wplate/1999-11/13/0481-111399-idx.html> p 1; Olsen S ‘Nearly undetectable tracking device raises concern’ <news.com.com/2100-1017-243077.html?legacy=cnet> p 1.

[33] As mentioned above, host websites do not keep advertisements locally but subscribe to an intermediary network advertiser to place the ads for them. When the user’s browser requests a web page by an HTML call, the website server answers the call by transmitting information necessary to display the webpage. Downloaded with this information is a Type 1 web bug, which is HTML code. This HTML code is an invisible conduit between the user’s browser and the network advertiser: see BBC News ‘Pixel-high privacy spy’ at <news.bbc.co.uk/hi/english/sci/tech/newsid_842000/842624.stm> p 2; Whalen D above note 22 at p 4; Langa F ‘The web bug boondoggle’ at <www.informationweek.com/story/IWK20010621S0030> p 2. The HTML code is downloaded with the rest of the web page as part of the graphics because it attaches to an invisible graphic 1x1 pixel in size: see Online Profiling: A Report to Congress June 2000 <www.ftc.gov/os/2000/06/onlineprofilingreportjune2000.pdf> p 7; Edwards M ‘Your web browser is bugged’ <www.ntsecurity.net/Articles/Print.cfm?ArticleID=9543>.
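The 1x1-pixel technique described in this note can be illustrated with a short sketch (not from the cited sources) that scans HTML for such invisible images using Python’s standard html.parser; the page markup and the ad.example.net host are hypothetical:

```python
# Sketch of spotting a "Type 1" web bug: a 1x1-pixel <img> whose
# src points at a third-party ad server. The markup and the
# ad.example.net host are hypothetical.
from html.parser import HTMLParser

class WebBugFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.bugs = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        # A web bug is typically an invisible 1x1 graphic; the real
        # payload is the tracking URL (and any cookie it triggers).
        if a.get("width") == "1" and a.get("height") == "1":
            self.bugs.append(a.get("src"))

page = '''<html><body>
<img src="/logo.gif" width="88" height="31">
<img src="http://ad.example.net/track.gif?uid=abc123" width="1" height="1">
</body></html>'''

finder = WebBugFinder()
finder.feed(page)
print(finder.bugs)  # ['http://ad.example.net/track.gif?uid=abc123']
```

The visible logo is ignored; only the invisible pixel, which the user has no practical means of noticing, is flagged.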

[34] Online Profiling: A Report to Congress above note 33 at p 7; Edwards M above note 33.

[35] Harder v The Proceedings Commissioner [2000] 3 NZLR 80, available at <www.austlii.edu.au/nz/cases/NZCA/2000/129.html>.

[36] Above note 9 at 6.

[37] Greenleaf G ‘Private sector privacy: problems of interpretation’ (2001) CyberLRes 3 <www2.austlii.edu.au/~graham/CyberLRes/2001/3/> p 4.

[38] The wording of the New Zealand Privacy Act is unfortunate, in that it states in s 2 that ‘collect does not include receipt of unsolicited information’, a construction which leads to uncertainty as to whether the restrictions of the Australian Federal Privacy Act obligations concerning information collected (s 16B) require solicitation before they apply.

[39] In a lecture by Bygrave and Waters on collection, use and disclosure principles, Bygrave commented that NPP 1.3’s ‘collection from the individual’ suggests a requirement of solicitation. However, in Harder (above note 35), the information was arguably from the woman speaking on the other end of the telephone line, but nevertheless there was no collection because the information was from the individual but without request.

[40] Underhill S ‘Web bugs within web pages’ <www.infinisource.com/featuresweb-bugs2-pf.html> p 1.

[41] Note that the cookie value cannot contain anything that isn’t provided by the browser or by the user himself or herself, unless the information is the unique identifier assigned by the server setting the cookie: Australian Consumers’ Association, ‘What’s inside your cookie’ CHOICE at <www.choice.com.au/articles/printGenerator.asp?ID=102717> p 2. See also Slayton M above note 21 at p 1.
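The point in note 41, that the only cookie content not supplied by the browser or the user is the unique identifier the server itself assigns, can be sketched as follows (an illustration only; the ‘uid’ name and the use of a random UUID are assumptions, not drawn from any cited vendor’s practice):

```python
# Illustration: the server mints an opaque identifier itself; the
# cookie value carries nothing else the user did not already send.
# The "uid" name is hypothetical.
import uuid

def assign_tracking_cookie():
    # A 32-character hex ID assigned by the server, not the user.
    uid = uuid.uuid4().hex
    return "uid=" + uid

cookie = assign_tracking_cookie()
name, value = cookie.split("=")
print(name, len(value))  # uid 32
```

Everything else in the profile, such as pages visited or ads shown, is linked to this opaque ID server-side rather than stored in the cookie.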

[42] For a discussion on the distinction between overt and covert methods of collection and how the two methods can be used in combination, see Shimanek A ‘Do you want milk with those cookies? Complying with the safe harbor privacy principles’ [2001] 26 University of Iowa Journal of Corporation Law 456 at 460.

[43] Slayton M ‘The risks of cookies’ <hotwired.lycos.com/webmonkey/templates/print_templates/print_template.htmlt?meta=/webmo...> p 2.

[44] Bygrave L ‘Chapter 10: Existing safeguards for data on collective entities pursuant to data protection laws’ <www2.austlii.edu.au/privacy/secure/Bygrave/index-CHAPTER-10.html> p 10.

[45] Above note 44.

[46] Above note 2.

[47] The IP address is not usually personally identifiable information because most IP addresses are dynamic, changing every time the user connects to the internet, as opposed to a static address that is uniquely connected to the user’s computer: Federal Trade Commission Online Profiling: A Report to Congress Part 2 Recommendations July 2000 <www.ftc.gov/os/2000/07/onlineprofiling.htm> p 10 footnote 14.

[48] Above note 37 at p 13.

[49] The Commonwealth jurisdiction under the Cybercrime Act 2001 is limited to use of a telecommunications service in commission of an offence; with respect to the Commonwealth Act and the Crimes Amendment (Computer Offences) Act 2001 (NSW), there must be copying of data held in a computer and the access must be unauthorised. The web bug could possibly be said to copy data in emails stored in the user’s computer where such access is unauthorised: see Steel A ‘Boldly going where no-one has gone: the expansive new computer access offences’ (2002) 26(2) Criminal Law Journal 72 at 73, 77 and 80.

[50] Above note 44 at p 12.

[51] Case notes released by the Privacy Commissioner of New Zealand Casenote 12582 (1999) at <www.knowledge-basket.co.nz/privacy/people/cn12582.html>.

[52] Federal Trade Commission Online Profiling: A Report to Congress Part 2 Recommendations above note 47 at p 10 footnote 14.

[53] See Dixon T ‘2000 — a chronology of internet privacy debacles’ (2001) 3(9) Internet Law Bulletin 117 at 118 for a discussion of the DoubleClick purchase of Abacus. The complaint filed by EPIC against DoubleClick to the FTC was for misleading and deceptive conduct, as DoubleClick claimed in its privacy policy that information gleaned from surfers would not be personally identified. Therefore, it is fair to say that if a similar scenario occurred under the Australian Act, the parent and subsidiary exchange of information could be permissible. Where information disclosed is not sensitive, then s 13(1)(b) of the Act exempts intercorporate transferral of information. If information is sensitive then the requirements of NPP 2 must be proved: see US District Court for the Southern District of New York Official Court Notice of Pendency and Proposed Settlement of Class Action In re DoubleClick Inc Privacy Litigation Master File No 00-CIV-0641 at <settlement.doubleclick.net/settlement/> p 1 for statement of charges against DoubleClick.

[54] Above note 37 at p 13.

[55] European Directive recital 26 uses the phrase ‘likely reasonably’, which could conflate the meaning of probability into the word ‘reasonably’. Under the Ontario legislation, where the definition of personal information does not contain the word reasonably, judicial gloss has nevertheless been applied to give the legislation wider breadth. Provincial Order 230 provides that if there is reasonable expectation that the individual can be identified from the information, then the information is personal: see <www.ipc.on.ca/english/orders/orders-p/p-230.htm>.

[56] ‘Personal information is information or an opinion ... about an individual whose identity is apparent or can reasonably be ascertained’: see Pt II, s 6 of the Act.

[57] The New Zealand Privacy Act 1993 defines ‘personal information’ to be ‘information about an identifiable person’. While this is a simpler definition than that of the Australian Act, the two are comparable to the extent that the New Zealand definition also has the two elements of ‘about’ and ‘identifiability’. The Ontario legislation generally defines personal information to mean ‘recorded information about an identifiable individual’, making it similar to the New Zealand legislation except that the Ontario legislation explicitly provides that personal information can include an identification number: see Privacy Act 1993 (NZ) Pt I, s 2 (Interpretation) <www.knowledge-basket.co.nz/privacy/legislation/1993028/doc00004.html>; Freedom of Information and Protection of Privacy Act s 2(1) and (2); and Municipal Freedom of Information and Protection of Privacy Act s 2(1) and (2), <192.75.156.68/DBLaws/Statutes/English/90f31_e.htm#s002>.

[58] Above note 35 at [24].

[59] While the Ontario legislative definition includes ‘identifier assigned to an individual’, arguably in the case of IP addresses and cookie identifiers the number is not assigned to the individual but to a machine with potentially several users; therefore Order 230 still applies to distinguish the identifier as personal information in such cases.

[60] For an example of cookie information linked to web page URLs being sensitive information, see Kaplan C ‘Fighting to make a city’s cookie files public’ Cyber Law Journal 18 December 1997 <www.nytimes.com/library/cyber/law121897law.html>. An investigative journalist in Tennessee is fighting to access the cookie files of government employees under the charge that if government employees are surfing Ku Klux Klan sites, satanic sites or pornographic sites, then the public has the right to know and take action.

[61] Waters N ‘Cases and Complaints: C v ASB Bank Ltd’ [1997] PLPR 63.

[62] C v ASB Bank Ltd (1997) 4 HRNZ 303 at [44].

[63] Above note 44 at p 13.

[64] Above note 61.

[65] In Harder v The Proceedings Commissioner [2000] 3 NZLR 80, Tipping J made obiter remarks about the proper approach to the understanding of the concept of ‘personal information’. Tipping J said:

An unqualified approach to what constitutes ‘information about an identifiable individual’ will lead readily to breaches of one or more of the information privacy principles ... [The provisions of s 14(a)] require the Commissioner, and implicitly others involved in the administration of the Act, to have due regard for the protection of important human rights and social interests that compete with privacy, including the general desirability of the free flow of information and the recognition of the right of government and business to achieve their objectives in an efficient way.

[66] Eastweek Publisher Ltd v Privacy Commissioner For Personal Data [2000] HKCA 137 at <www.hklii.org/cgi-hklii/disp.pl/hk/cases/HKCA/2000/137.html> p 6.

[67] The discussion of identity in Ribeiro J’s judgment is focused on anonymity rather than the woman’s unique characteristics as revealed by the photograph.

[68] (1991) Case DA/90 25/49/7 at [49], cited at <www2.austlii.edu.au/privacy/secure/Bygrave/index-CHAPTER-2.html#fn201>.

[69] Guidelines to the National Privacy Principles September 2001.

[70] Hetcher S ‘The emergence of website privacy norms’ [2001] 7 Michigan Telecommunications and Technology Law Review 97, Lexis transcript p 19.

[71] See <www.privacy.org.nz/news3.html>.

[72] See <www.pco.org.hk/english/casenotes/case_enquiry2.php?id=13>.

[73] Campbell v Mirror Group Newspapers cited R v Broadcasting Standards Commission, ex parte BBC [2000] EWCA 59; [2001] QB 885, where Lord Woolf MR said:

The fact that it is clandestine can add an additional ingredient. The fact that it is secret prevents those who are being filmed from taking any action to prevent what they are doing from being filmed.

In this case, the plaintiff had no opportunity of evading being photographed or refusing consent to be photographed: see Campbell v Mirror Group Newspapers [2002] EWHC 499 (QB) at <www2.bailii.org/~jury/cases/EW/EWHC_QB_2002_499.html> at [100]-[104].

[74] Walker K ‘The Costs of Privacy’ [2001] 25 Harvard Journal of Law and Public Policy 87 at 105.

[75] See above note 74 at 107-113 for a discussion on the irrelevance of website privacy policies. For an American law student’s comment on the ineffectiveness of privacy policies in response to the US Department of Commerce Workshop on Online Profiling, see <www.ftc.gov/bcp/profiling/comments/ridder.htm>.

[76] Harper J and Singleton S ‘With a grain of salt: what consumer privacy surveys don’t tell us’ June 2001 <www.cei.org/PDFs/with_a_grain_of_salt.pdf> p 7.

[77] Seban L, ‘Who cares about internet privacy?’ 12 June 2001 <www.newsfactor.com/perl/printer/11161>.

[78] See Bygrave L, ‘Data Protection Pursuant to the Right to Privacy in Human Rights Treaties’ <www2.austlii.edu.au/privacy/secure/Bygrave/index-Appendix-2.html> p 3.

[79] Leander v Sweden 1987 <hudoc.echr.coe.int/hudoc> at [58]. The second requirement of proportionality to a legitimate purpose conflates the two criteria of legitimacy and proportionality. The legitimate purposes are listed in art 8 cl 2 and the Court considers the fulfilment of this legitimacy criterion in its own right before determining proportionality: see Leander v Sweden at [49] ‘Legitimate Aim’. See also Koelman K and Bygrave L Privacy, Data Protection and Copyright: Their Interaction in the Context of Electronic Copyright Management Systems IMPRIMATUR, Institute for Information Law, Amsterdam, June 1998, p 68; Bygrave L above note 78 p 14.

[80] See Koelman K and Bygrave L above note 79 at p 48.

[81] See Directive 95/46/EC cl 10 of the preamble: Koelman K and Bygrave L above note 79 where Bygrave says that art 8 is ‘very important from the normative perspective for the interpretation of the Directive’.

[82] Junkbusters above note 23 at p 7; Eichelberger L ‘The cookie controversy’ <www.cookiecentral.com/ccstory/cc3.htm> p 1. Because the life of the cookie can exceed the browsing time, the browser will save the cookie onto the user’s hard drive. The author notes that some cookies may be so persistent that they will last long after you change ISPs or upgrade your browser.

[83] For discussion that persistent cookies are unnecessary, see Garfinkel S ‘The persistence of cookies’ at <hotwired.lycos.com/packet/garfinkel/96/50/index2a.html>.

[84] Above note 17.

[85] Hetcher S above note 70 p 15; Weise E ‘A new wrinkle in surfing the net’ <www.usatoday.com/life/cyber/tech/cth582.htm> p 2.

[86] Gandy O ‘Legitimate business interest: no end in sight?’ (1996) University of Chicago Legal Forum 77 at 124-5; Kaplan C above note 60.

[87] Stepanek M ‘Weblining: Companies are using your personal data to limit your choices — and force you to pay more for products’ <www.businessweek.com/2000/00_14/b3675027.htm?scriptFramed> p 5. Venture Direct, a New York based company, sells a list of fat black women who are offered as targets for self-improvement products — an example of discriminatory commercial practice cited by J Reidenberg, Testimony before the Subcommittee on Courts and Intellectual Property, Committee on the Judiciary and United States House of Representatives Oversight Hearing on Privacy and Electronic Commerce 18 May 2000.

[88] Lewis P ‘Had your fill of spam yet?’ <seattletimes.nwsource.com/news/nation-world/html98/spam_032898.html>; Middleton J ‘Hotmail Users Face Mass Spamming’ <www.pcw.co.uk/Newa/1124709>.

[89] Even though the ECHR art 8 is focused on public authorities, it has an interpretative value for the application of the NPPs to the private sector because the European Directive applies the same standard of obligation to both public and private bodies: see Preamble cl 26. The UK Data Protection Act 1998 follows this approach by defining ‘data controller’ as a person who determines the purposes for which personal data is to be processed, without differentiation between public and private body controllers. Under such circumstances Bygrave’s gloss on the Human Rights Court’s interpretation of necessary as encompassing ‘pressing social or commercial need’ may be permissible: see Koelman K and Bygrave L above note 79 p 48. However, where the ECHR focus on public authorities is directly applied to the Australian Act, which differentiates between public and private sectors, there must be a clear judicial statement from the Court of Human Rights that art 8 has application to private bodies. A cursory browse through the European Court of Human Rights case database has revealed no such development from years 1999 to 2000, the recent case Amann v Switzerland merely reiterating that ‘the storing by a public authority of information relating to an individual’s private life amounts to an interference within the meaning of Article 8’: see <hudoc.echr.coe.int/hudoc>.

[90] Article 17 of the ICCPR is phrased in cl 2 to give a general right of protection against interference, whereas art 8 ECHR is phrased to include an exception. That is why the term ‘necessary’ does not appear in art 17 of the ICCPR. Even if the ICCPR serves as a link between the ECHR and the Australian Act, the High Court in Minister for Immigration and Ethnic Affairs v Teoh [1995] HCA 20; (1995) 183 CLR 273 said that the treaty should be used as an interpretative guide with due circumspection, having regard to the relationship between the treaty and the Act. See Balkin R ‘International law and domestic law’ Chapter 5 at 5.3.4 in Blay et al (eds) Public International Law: An Australian Perspective; Evatt E ‘Meeting universal human rights standards: the Australian experience’ Senate Occasional Lecture Series 1998 available at the Australian Parliament House website, p 4.

[91] In the Chow Hung Ching case, the Court suggested that Australian courts ought to apply at least those principles of international law that had been universally accepted: Balkin R, above note 90 at 5.2.1. It can be argued that the necessity standard does not have standing as a rule of customary international law because it is only accepted in European states. For example, the American Convention on Human Rights omits the necessity criterion: Bygrave L above note 78.

[92] It is not certain to what extent s 15AB(1) and (2) of the Acts Interpretation Act 1901 (Cth) authorises treaties as an aid to interpretation where the treaty is not expressly referred to in the statute. However, obiter dicta of the Federal Court suggests that it may be permissible to consider such treaties where they have been referred to in the Second Reading Speech: Balkin R, above note 90 at 5.3.4.

[93] Balkin R, above note 90 at 5.2.2:

Even where a rule of customary law (or for that matter a treaty provision) can be satisfactorily established, the courts will refuse to apply it in the face of clearly contradictory statutory provisions.

[94] For use of rights rhetoric in political debate, see Murphy J MP, Privacy Amendment (Private Sector) Bill 2000 Second Reading Speech 8 November 2000; Senator Stott Despoja N, Privacy Amendment (Private Sector) Bill 2000 Second Reading Speech 29 November 2000, Senate Hansard.

[95] Ms K Leigh Assistant Secretary of the Attorney General’s Department Information Law Branch, response to the Chairman, and Mr A Rose President of the Australian Law Reform Commission, response to Senator Abetz, Senate Legal and Constitutional References Committee, Official Committee Hansard Wednesday 5 August 1998 Canberra pp 212 and 227. Available at the Australian Parliament House website. Compare this to the Senate Legal and Constitutional Committee recommendation that the privacy legislation be introduced on a wider base of international obligations, including ICCPR obligations: see Senate Legal and Constitutional Committee ‘Privacy in the Private Sector: Chapter 3 Evaluating a Privacy System’ <www.aph.gov.au/senate/committee/legcon_ctte/privacy/Chap3.htm> p 14.

Clause 12B is a statement of Commonwealth legislative power rather than an undertaking of ICCPR obligation: The Parliament of the Commonwealth of Australia Senate, Privacy Amendment (Private Sector) Bill 2000 Revised Explanatory Memorandum, p 55 Item 48.

[96] The Explanatory Memorandum does not define what ‘international obligation’ encompasses in its discussion of the amendment itself: p 36. In the Memorandum’s discussion of the Government’s broader privacy policy, international obligation is only mentioned in regard to achieving compatibility with the EU Directive to remove barriers in international trade: p 10. Nowhere in the Memorandum’s general discussion of the rationale of introducing legislation is the ICCPR mentioned: see Regulation Impact Statement pp 6-10, Conclusion and Recommended Option p 31.

[97] Both the US and Australian governments share a kindred concern that the cost of implementing stringent EU privacy requirements could prove too much for businesses to remain competitively viable. For example, Orson Swindle, Commissioner of the FTC, said in dissent to the introduction of Fair Information Practice Principles that they would ‘impose costs or other unintended consequences that could severely stifle the New Economy’. The same rationale was responsible for introduction of the small business organisation exemption in Australia: see Orson Swindle, Prepared Statement of the Federal Trade Commission on ‘Privacy online: Fair information practices in the electronic marketplace’ Before the Committee on Commerce, Science, and Transportation US Senate, Washington DC 25 May 2000, at <www.ftc.gov/os/2000/05/privacyswindle.htm> p 2; and the Advisory Report on the Privacy Amendment (Private Sector) Bill 2000 House of Representatives Standing Committee on Legal and Constitutional Affairs, Chapter 2: Small Business Exemption, p 9, available at the Australian Parliament House website.

[98] European Commission Submission to the House of Representatives Committee on Legal and Constitutional Affairs concerning its inquiry into the Privacy Amendment (Private Sector) Bill 2000 pp 2 and 6, available from the Australian Parliament House website.

[99] R v Marwey [1977] Qd R 247 at 250; Smith v McCarron [1921] SAStRp 26; [1921] SASR 244 at 247.

[100] Lang v Australian Coastal Shipping Commission [1974] 2 NSWLR 70 at 72-3.

[101] The Draft National Principle Guidelines May 2001 define collection of personal information to be necessary if ‘an organisation cannot in practice effectively pursue a function or activity without collecting personal information’. Compare this to the Draft National Principle Guidelines September 2001, which define collection of personal information to be necessary ‘if an organisation cannot in practice effectively pursue a legitimate function or activity without collecting personal information’.

[102] Kang J ‘Information Privacy in Cyberspace transactions’ (1998) 50 Stanford Law Review 1193 at 1285-1286.

[103] Cookies are only rejected 0.68 per cent of the time per billion pages viewed: see Harper J and Singleton S above note 76 at p 9. Fred Langa calls the ‘web bug hysteria’ baloney: see Langa F ‘The web bug boondoggle’ above note 33 at p 2.

[104] Above note 2 at 18.

[105] The latest peer-to-peer system is a centralised system embedded in a decentralised system. This means that peers (private individuals acting as collectors) are connected to a supernode (network advertiser server), where the supernodes are not connected but the peers can communicate across nodes: see Minar N ‘Distributed systems topologies: Part 1’ <www.openp2p.com/lpt/a//p2p/2001/12/14/topologies_one.html> p 3.

[106] Borland J ‘Stealth P2P network hides inside KaZaa’ <news.com.com/2100-1023-873181.html>.

[107] Borland J ‘KaZaa exec defends sleeper software’ <news.com.com/2100-1023-875016.html>.


AustLII: Copyright Policy | Disclaimers | Privacy Policy | Feedback
URL: http://www.austlii.edu.au/au/journals/PrivLawPRpr/2002/26.html