Privacy Law and Policy Reporter
Until the introduction of the Privacy Amendment (Private Sector) Act 2000 (Cth), the Australian information economy had burgeoned on almost unfettered use of technology to collect information. The central argument of this article is that the Privacy Act 1988 (Cth) as amended (the Act), with its stated policy of encouraging Australian online commerce, has, in reality, the effect of permitting the continuation of such unfettered collection, only providing illusory privacy protection for the individual.
1.1 An organisation must not collect personal information unless the information is necessary for one or more of its functions or activities.
1.2 An organisation must collect personal information only by lawful and fair means and not in an unreasonably intrusive way.
1.3 At or before the time (or, if that is not practicable, as soon as practicable after) an organisation collects personal information about an individual from the individual, the organisation must take reasonable steps to ensure that the individual is aware of: (a) the identity of the organisation and how to contact it; and
(b) the fact that he or she is able to gain access to the information; and
(c) the purposes for which the information is collected; and
(d) the organisations (or the types of organisations) to which the organisation usually discloses information of that kind; and
(e) any law that requires the particular information to be collected; and
(f) the main consequences (if any) for the individual if all or part of the information is not provided.
1.4 If it is reasonable and practicable to do so, an organisation must collect personal information about an individual only from that individual.
1.5 If an organisation collects personal information about an individual from someone else, it must take reasonable steps to ensure that the individual is or has been made aware of the matters listed in subclause 1.3 except to the extent that making the individual aware of the matters would pose a serious threat to the life or health of any individual.
First, the meanings of ‘collection’, ‘personal information’ and ‘fairness’ will be examined to find that, notwithstanding prima facie broad protection construed by the Privacy Commissioner, the principles often apply awkwardly to the technology and are inched along further towards restrictive application by explicit judicial intention to claw back privacy. Second, discussion of the meaning and application of ‘necessary’ will reveal not only that NPP 1 has limited ability to curb the privacy invasiveness of cookies and web bugs as their functionality now stands, but that the requirement of being ‘necessary’ as a floating standard has the ability to permit future developments in the privacy invasiveness of the technologies.
Cookies have been defined as ‘a piece of information that the internet website sends to your browser’ and web bugs as ‘1x1 clear gifs’. The function of the technologies defies simple definition and will be revealed progressively in the discussion of NPP application. One preliminary distinction must be made, however: the type of web bug pertinent to the advertisement context is the Type 1 web bug that relays machine information to the server in the downloading process, rather than more malicious types that execute files within the user’s computer itself to directly access personal information.
The central contention of this article is that the NPPs do not decisively curb the privacy invasive potential of cookies and web bugs. The NPPs are a set of high level principles that do not specify what is required for compliance and therefore have elasticity to accommodate interpretation for stringent or weak privacy protection. While the Privacy Commissioner’s Guidelines imply stronger protection to catch technological breaches, because no distinction is made between requirements for strict compliance and mere best practice, the courts have some scope to claw back protection afforded by the NPPs. Internet business organisations using cookies and web bugs could rely on s 55A(5) of the Act to force a court hearing and avoid a ‘pro-privacy’ Commissioner’s determination.
The NPPs give courts leeway to roll back protection by a combination of technological neutrality that creates ambiguity in application, and a pro-business bias that gives the opportunity to construe the ambiguity against the consumer. The NPPs are worded generally and do not specifically address the privacy implications of 21st century technology. As a result, technological innovation in e-commerce that redefines interaction with information will find loopholes in the static broad brushstrokes of the NPPs. Furthermore, Attorney-General Daryl Williams has made it clear that the legislative intent of the Privacy Amendment (Private Sector) Act is not to protect privacy from the moral perspective of absolute human right but to provide protection only insofar as is necessary to achieve the optimal advantage for Australian e-commerce. Therefore, the stated object of the Act limits the scope of the NPPs by weighing privacy against business efficiency, and downplaying its moral force in rights discourse, for example, by balancing an individual’s ‘interest’ in privacy against the ‘right’ of business efficiency. Privacy will therefore have to fight for survival against profit maximisation in the information economy. American courts are already forcing big entertainment companies to monitor customer use of intellectual property online. By analogy, the ‘big money’ interest in network advertising would force the hand of the courts to take the restrictive approach to privacy when given half a chance. Indeed, the following discussion reveals many such chances in NPP 1.
The NPPs only apply where there has been collection of information, and requirements for lawful collection are stated in NPP 1 (reproduced above). ‘Collection’ is defined broadly by the Federal Privacy Commissioner to mean to ‘gather’, ‘acquire’ or ‘obtain’ personal information from any source. Despite the apparent breadth of these words, restriction arguably lies in the lowest common denominator of the synonyms that strongly suggests ‘gain’ by positive action, connoting first, acquisition of information not previously held and second, a distinction between direct extraction and passive inference. To correctly assert that a cookie or web bug has ‘collected’, these requirements must be fulfilled by the technology within its operational transaction.
Cookies do not functionally fulfil these requirements within the immediate transaction wherein they operate. Upon receiving the request for a page from the user’s browser, the web page server sends back not just the requested page but also an HTTP header with an additional ‘set-cookie: name=value’ line. The browser complies with the header and a cookie of a certain name and assigned value is stored in the user’s browser. The value can be a random number that ‘identifies’ the user’s computer to the server when the browser returns this cookie in the next visit. Strictly speaking, what is actually extracted by the server is the existence of the identifier that the server itself set, and therefore there is no incremental ‘gain’ in information from the immediate cookie transaction. Any information about the user’s location is passively and subsequently inferred by matching the existence of the identifier with the URL of the web page viewed, which is information collected by a web bug outside of the cookie transaction. Granted, a convincing refutation would be to consider the location of the identification number to be sufficiently proximate to the user’s computer that subsequent inferences could be deemed ‘collected’ within the immediate cookie transaction. However, even a slight lapse in the nexus gives courts the opportunity for restriction.
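The exchange just described can be sketched in miniature. In the following Python fragment the header names follow HTTP convention, but the identifier value and server logic are invented for illustration; the point is that on a return visit the server receives back only the opaque value it assigned itself.

```python
# Minimal sketch of the cookie transaction described above.
# The identifier value is hypothetical; a real server would generate it randomly.

def respond(request_headers):
    """Simulate a web server handling one HTTP request."""
    if "Cookie" not in request_headers:
        # First visit: the response carries a 'set-cookie: name=value' line.
        return {"Set-Cookie": "id=83746529"}
    # Return visit: the browser sends back 'cookie: name=value'.
    # The server 'gains' only the identifier it set itself earlier,
    # so no new information is extracted in this transaction.
    return {"Recognised-Id": request_headers["Cookie"]}

first_visit = respond({})                          # browser holds no cookie yet
second_visit = respond({"Cookie": "id=83746529"})  # browser complies and returns it
```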
Whether or not the use of a web bug would be deemed sufficiently proximate depends on its specific functionality. In our marketing scenario, the general rule is that the user’s browser will only return cookie information to the domain where the cookie originated, implying that only the host website’s server which set the cookie can recognise the user’s computer. The first functionality of the Type 1 web bug is to circumvent this rule by permitting the third party network to set cookies. This potentially stretches the proximity loophole to allow surveillance by unknown third parties outside the web page domain. The second point to make about ‘collection’ is the apparent requirement for some positive ‘act’; in this case, the bug actively ‘gains’ information by establishing a conduit between the user’s browser and the network advertiser’s server. When the browser is triggered automatically to send an HTTP request to the network advertiser to place an ad in the pre-set banner space of the web page, other information is also sent along this conduit to enable the transaction, such as the user’s IP address, browser and operating system type and version, and the address of the host web page.
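As a rough sketch of this conduit (the domain name and all values below are invented; in practice the IP address travels with the underlying connection rather than as a header), the record available to the network advertiser from one bug-triggered request might look like this:

```python
# Hypothetical reconstruction of what a third-party ad server can observe
# when a web bug triggers the browser's automatic HTTP request.

def bug_request(host_page_url, client_ip, user_agent, cookie=None):
    """Assemble the information divulged by a single bug-triggered request."""
    record = {
        "ad_server": "ads.example-network.com",  # third party, not the host site's domain
        "client_ip": client_ip,                  # visible from the connection itself
        "referer": host_page_url,                # address of the host web page being viewed
        "user_agent": user_agent,                # browser and operating system type/version
    }
    if cookie is not None:
        record["cookie"] = cookie                # the network's own identifier, if previously set
    return record

seen = bug_request("http://host-site.example/news.html",
                   "203.0.113.7", "Mozilla/4.0 (Windows 98)",
                   cookie="netid=5551212")
```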
Even if web bugs can be said to ‘gain by positive action’ the URLs required to match with cookie identifiers, this may still be permitted through the ambiguity of ‘solicitation’. Analogously to the situation in Harder, web bugs may not involve a solicitation or request, but can be equated to an open telephone line as the user’s browser automatically divulges information during the process of downloading web pages.
Gunning suggests that the requirement of solicitation can be implied into collection because NPP 1.1 ‘necessity’ and NPP 2.1 ‘purpose’ envisage collection with a preconceived objective, which is arguably a mental element that cannot exist prior to or during accidental receipt of unsolicited information. Greenleaf disagrees with this construction; according to him, the operation of s 16B of the Act postpones collection until such time as information is included in a record, therefore postponing also the critical time for the mental element of purpose to exist until the point of inclusion. However, I would argue, contrary to Greenleaf, that unsolicited information included in a record is not ‘collected’; Gunning is correct to emphasise the importance of purpose at the point of reception as a requirement distinct from subsequent s 16B retention. The NPP 1.3 requirement that collection of information be ‘from the individual’ does not necessarily require solicitation, because information can be collected from an individual even though it was not requested. But in conjunction with NPP 1.3(c) there is arguably a requirement of a request for information, because it envisages the formulation of purpose before collection for retention so that the individual can be informed of the purpose — preferably at or before collection for retention.
Personal information has been given a broad conceptualisation by Bygrave, who perceives the crucial criterion to be identifiability or the ability to distinguish a person by unique characteristics, not necessarily by name. The concept of personal information can be restricted however, first by sub-issues of identifiability, second by the additional requirement in s 6 of the Act that the information be ‘about an individual’, and third by the judicial gloss of intention to identify.
First, the Privacy Commissioner considers that s 6’s use of the phrase ‘reasonably be ascertained’ construes a primary piece of information as personal if identity can be ‘fairly easily’ ascertained by the use of auxiliary information. This prima facie permissive use of auxiliary information could mean that an IP address could be linked to a name in an internet log file or a cookie identifier linked to an email address to make such information personally identifiable. The IP address-name link could be defeated on the ‘ease’ of identification criterion if the IP address is dynamic or if there is no extant accessible log file. The Act’s use of the term ‘reasonably’ arguably poses a higher standard than ‘fairly easily’, requiring lawful linkage to defeat the auxiliary use of email addresses collected by planting web bugs in email (which is possibly outlawed under the Cybercrime Act 2001 (Cth)). Concomitantly, lack of individuation could extinguish the ‘personal’ quality of the link where the IP address and cookie identification attach to a machine with multiple users.
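The auxiliary-linkage point can be illustrated with a toy example (the log entries and names below are entirely invented): a bare IP address acquires ‘personal’ quality only where an extant, accessible log file lets identity be ascertained fairly easily.

```python
# Invented log data: (IP address, account holder) pairs.
access_log = [
    ("203.0.113.7", "J. Smith"),
    ("203.0.113.9", "T. Lee"),
]

def link_identity(ip_address, log):
    """Return the name linked to an IP address, or None when no log entry exists."""
    for addr, name in log:
        if addr == ip_address:
            return name
    return None  # dynamic or unlogged address: the link is defeated

# A logged (effectively static) address yields a link; an absent one does not.
```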
While these issues are not necessarily fatal to successful qualification as personal information, they do pose obstacles. On the one hand, case law is not always a reliable guide, as it may be coloured by a court’s consideration of underlying policies pertinent to a particular case. For example, the New Zealand Privacy Act Casenote 12582 ruling that telephone numbers lack ‘personal’ quality because they fail to uniquely identify individuals was influenced by the Court’s desire to avoid the possibility of jeopardising the privacy of other innocent users of the phone in question if the number were to be disclosed. On the other hand, whether or not information crosses the obstacles to become ‘personal’ is a matter of fact; so, for example, information can be more easily linked if IP addresses become static with the technological evolution towards DSL; information can potentially be linked lawfully if the advertiser purchases a profiling agency; and linkage of a telephone number or IP address to an individual is generally not difficult, for example if the person is the subject of a police inquiry.
The above ambiguities create the possibility of construing cookie identifiers and IP addresses as personal information, but if ‘reasonably be ascertained’ is given the meaning of probability of identification rather than the difficulty of identifying, then even if there is no intention to use the information as personal identification, the mere capacity to do so will render the information personal. Restriction of the identification element can come with the second element of idiosyncratic connection. Section 6(1) of the Act prima facie conflates the requirement that information be about an individual with identifiability, because the meaning of ‘about’ is not specified other than by qualification of the latter. However, Harder and Provincial Section Order 23 interpret the New Zealand and Ontario counterpart definitions respectively to operate with ‘about an individual’ as a separate narrower element requiring some idiosyncratic connection. So in Harder, Ms C’s denial of possessing unspecified chattels was not personal information because even though she was identified in the information, no ‘statement’ was made in relation to her as an individual. In Order 23, estimated market value combined with a municipal address was held not to be information about the inhabitants, but merely about the property. By analogy, arguably, an IP address or cookie identifier — even linked to a name — lacks an idiosyncratic relationship with the user because the information is about an inanimate browser or computer, not the individual. Conversely, a cookie identifier linked with the web page URL can constitute personal information about a person’s browsing tastes and therefore has the potential to be sensitive information if inferences can be made to intimate idiosyncrasies. C v ASB is a decision which prohibited the use of auxiliary information to establish the idiosyncratic link that the information is about the individual.
(A small compromise was made that if the auxiliary information appears in the same document and the personal information is not intelligible without it, then the link will be permitted.) Auxiliary use of web bug collected web page URLs would fall outside the purview of this concession for lack of proximity with the cookie transaction. It might be argued that the result was driven by a tangential policy of refusal to lift the corporate veil, but this argument is weak because the Court explicitly stated that its rejection of auxiliary information was grounded in the need to maintain the integrity of the definitional boundaries of personal information.
This brings me to the third point — courts, attempting to restrict the reach of privacy protection, can impose a judicial requirement which is not prima facie present in the legislation. Eastweek posed a radical constriction of requiring an intention to identify, possibly to the extent of requiring an intention to identify by name. This would exempt the combined use of an IP address and web page URL to personalise advertisements, because arguably the result of personal interaction with the user can be achieved without an intention to identify by name, relying instead upon identification by idiosyncratic browser behaviour. However, the decision in Equifax Europe Ltd v The Data Protection Registrar put the brakes on Eastweek’s radicalism by concentrating on the intention to find out about the individual rather than identification, which would catch a combination use of the IP and URL accessed.
In New Zealand Case Note 16479 and Hong Kong Case 199804574, covert recording was held to be unfair because, had the respondent not relied on the conversation being off the record, he or she would have answered differently. Similarly in Campbell, the clandestine nature of filming was unfair because the subject was disempowered from taking alternative action.
NPP 1.1 states that an organisation must not collect personal information unless the information is necessary for one or more of its functions or activities. Article 8 of the European Convention for the Protection of Human Rights and Fundamental Freedoms (ECHR) states that there shall be no interference by a public authority with the exercise of the right to privacy except where this is ‘necessary’ in a democratic society. ‘Necessary’ in art 8 of the ECHR has been interpreted by the Court of Human Rights to create a high standard, requiring pressing social need and proportionality to the legitimate aim pursued. ‘Necessary’ has also been interpreted to mean pressing commercial need. Interpretation of the ‘necessary’ criterion in art 8 of the ECHR by the Court of Human Rights is relevant to defining ‘necessary’ in the European Directive and therefore also in the data protection legislation of European party states.
Therefore, in Europe, use of persistent cookies to prolong the tracking of user habits long after the original session has ended may be considered unnecessary for the function of the website or personalisation of the browsing experience during any one immediate session; hence it would not qualify as a pressing commercial need. Arguably, web bugs may be permitted as a pressing commercial need because the internet advertising industry is heavily dependent upon the revenue generated by personalised advertising and the internet culture of free access that has popularised the online information economy would be jeopardised without subsidies flowing on from advertising revenue.
However, it is contended that this argument has no interpretative application to the Australian NPPs: first, because without the refractory effect of the Directive, the focus of art 8 on public authorities means it is not directly applicable to the private sector; second, because although the ECHR may be influential on the UN Human Rights Committee in its interpretation of the International Covenant on Civil and Political Rights (ICCPR) to which Australia is a party, the ICCPR counterpart art 17 does not include the term ‘necessary’, thereby severing the link of interpretation; and third, even if it could be correctly asserted that the European ‘necessary’ standard has force as customary international law, or at least constitutes an extrinsic interpretative aid under s 15AB of the Acts Interpretation Act 1901 (Cth), because the Australian Act explicitly creates a conflicting lower standard, it is that lower standard which would be applied.
Arguably the Australian Act creates a lower standard because it will only raise the standard of privacy protection given by the courts if it is incidental to achieving the apparent overall objective of the legislation of securing Australian economic advantage. Despite political rhetoric invoking Australia’s human rights obligation regarding privacy under ICCPR art 17, the Attorney-General’s Department and the Australian Law Reform Commission consider the ICCPR to be only an indication of international best policy and expressly deny that it creates any legally binding obligation which must be implemented through the Act. Indeed, it is questionable if any ‘international obligation’ attaches to the ICCPR at all, as the Explanatory Memorandum only discusses obligations under the EU Directive. Despite Australia’s clear concern with the EU trade barrier, the Government has displayed a US style hesitation in applying sweeping EU privacy standards, as evidenced by the inclusion of the small business exemption in the Act despite the risk that this compromises Australia’s chances of being held to have met the EC Directive’s requirement of ‘adequacy’.
Left to Australian judicial consideration, the meaning of ‘necessary’ varies from an absolute imperative, to a relative term of reasonably required, to merely the exclusion of collection as a ‘fishing expedition’, depending entirely upon judicial intention — which, in the case of the Act, has favoured the lower threshold. ‘Reasonably required’ could mean that alternative modes of collection that are less privacy invasive need not be considered, and certainly the breadth of ‘reasonably’ is such that the stated purpose is virtually unrestricted. If the stated purpose is future marketing in anticipation of the user returning to the site, then the use of persistent cookies would be ‘reasonable’.
Cookies and web bugs have a good chance of surviving the privacy legislation and, indeed, there is potential for the Act to allow even more privacy invasive practices. The economic imperative has seen the cutting back of fair use exemptions in anti-circumvention legislation in favour of powerful media interests, and it would seem that the law/technology divide in the privacy legislation will also be dominated by the colour of money.
Sharon Nye is a student researcher at the Baker & McKenzie Cyberspace Law Centre at the University of New South Wales. This article was first submitted as an essay for the elective course Data Surveillance and Information Privacy Law in the LLM program at UNSW.
 Tsoi K ‘Web bugs and internet advertising’ 43 (2001) Computers and Law 17.
 Draft National Privacy Principle Guidelines 7 May 2001, p 22 (the Guidelines).
 Smith R ‘FAQ: web bugs’ <www.privacyfoundation.org/resources/webbug.asp> p 1.
 This taxonomy of web bugs has been developed by Intelytics, which categorises web bugs according to manner of ‘infection’, location of infection, co-operation of the user and co-operation of the website from which browsing information is being collected: Statement of GE Clayton, Dr SB Lucan and KG Coleman before the Congressional Privacy Caucus ‘Web bugs and the threat to online privacy and security for consumers and businesses’ 1 March 2001 at <www.house.gov/markey/iss_privacy_clayton.pdf> p 7.
 Private Sector Privacy Handbook chapter on National Privacy Principles, CCH Australia Ltd Sydney 2001, p 2042 [5-100].
 Above note 6.
 Section 52(1)(B) of the Act states that the Commissioner could make a determination that includes a declaration that offending conduct not continue or be repeated; damages be awarded; and costs be awarded. However, if a determination is entered into to the detriment of the business organisation, the organisation can refuse to comply, thereby forcing either the complainant or the Commissioner to commence de novo Federal Court proceedings against the organisation to enforce the determination: see ss 55(1)(a) and (b) and s 55A(5). However, a copy of the Commissioner’s determination can be received as evidence into the trial: s 55(6)(a).
 Gunning expresses a privacy advocate’s viewpoint that cl 3 should only have a restricted application to consideration of exemptions to the NPPs, but construction of the provision weighs in favour of the interpretation proposed here that the policy behind cl 3 also broadly affects the restrictive reading of the NPPs: Gunning P ‘Central features of Australia’s private sector privacy law’ CyberLRes 2 at 5 <www2.austlii.edu.au/~graham/CyberLRes/2001/2/>.
 For criticism as to lack of definition of terms key to the Act, see Australian Consumers’ Association Submission to the House of Representatives Standing Committee on Legal and Constitutional Affairs Inquiry into Privacy Amendment (Private Sector) Bill 2000 p 3, available from the Australian Parliament House website at <www.aph.gov.au/>. The NPPs are worded generally because the NPPs were derived from OECD guidelines and geared towards flexibility and industry wide coverage: see The Parliament of the Commonwealth of Australia Cookie monsters? Privacy in the information society Report by the Senate Select Committee on Information Technologies November 2000, p 58.
 Senator K Lundy ‘Privacy Amendment (Private Sector) Bill 2000’ In Committee Speech 30 November 2000 1-2.
 Williams D MP ‘Privacy Amendment (Private Sector) Bill 2000 Second Reading Speech’ 12 April 2000, House Hansard p 4.
 Above note 12 p 1.
 Privacy Amendment (Private Sector) Act 2000 (Cth) s 3(b)(iii).
 Greenleaf G Privacy Amendment (Private Sector) Bill 2000, Submission to the House of Representatives Standing Committee on Legal and Constitutional Affairs 15 May 2000 p 2. The submission is available from the Australian Parliament House website at <www.aph.gov.au/>. This language is also used in the Explanatory Memorandum of the Privacy Amendment (Private Sector) Bill which couches the individual’s ‘interest’ in privacy in terms of a consumer interest; see The Parliament of the Commonwealth of Australia Senate Privacy Amendment (Private Sector) Bill 2000 Revised Explanatory Memorandum pp 1, 6-10.
 Shim R ‘Sonicblue forced to spy on subscribers?’ <story.news.yahoo.com/news?tmpl=story&u=/zd/20020506/tc_zd/5107342>.
 In 2000, 48 per cent of all online advertising revenue was attributable to banner advertising; it was 36 per cent in 2001. This constitutes the largest portion of the revenue, with sponsorships and classifieds bringing in 26 per cent and 16 per cent respectively in 2001 and email advertising only bringing in 3 per cent in 2001: see Internet Advertising Bureau ‘Internet advertising revenue totalled $1.7 billion for Q4 2001’ <www.iab.net/>.
 Office of the Federal Privacy Commissioner Guidelines to the National Privacy Principles September 2001.
 ‘Gather’ is defined variously to mean ‘gain’ in Sykes J (ed) The Concise Oxford Dictionary (7th ed) Clarendon Press Oxford 1982 p 408. ‘Acquire’ can also have the meaning of ‘gain’ and is treated as synonymous with ‘obtain’ under the Income Tax Assessment Act 1936 (Cth): see s 136AA(2).
 ‘Gain’ in the context of the verbs ‘obtain’, ‘acquire’ and ‘gather’ suggests gain by positive action rather than passive reception. So ‘obtain’ means to procure or gain as the result of purpose and effort: see Re Woods, Woods v Woods  St R Qd 129 at 137 per Philip J. In the context of s 50 of the Trade Practices Act 1974 (Cth), an acquisition has been described as an act of gaining: see Lipton J ‘The meaning of acquire in section 50 of the TPA and its impact on secured financing transactions’ (1993) 21 Australian Business Law Review 353 at 354. ‘Gather’ is defined to mean cause to assemble: see Sykes J above note 19 p 408.
 To understand the operation of cookies, a few brief words must be said about HTTP. Hypertext transfer protocol (HTTP) is a group of standards that governs the way web pages, graphics and other data should be transferred across the internet. An HTTP header communicates the request from the user’s browser as well as the server’s response, telling the receiving end exactly what it is receiving. The information transferred is therefore extremely limited — the server does not ‘know’ anything about the browser except that certain data was requested and dispatched. Cookies help the website function by expanding the ability of HTTP through including more information inside the HTTP header: see Slayton M ‘An introduction to cookies’ <hotwired.lycos.com/webmonkey/templates/print_template.htmlt?meta=/webmo...> p 1.
 Whalen D ‘The unofficial cookie FAQ version 2.54’ at <www.cookiecentral.com/faq/> p 5.
 Slayton M above note 21 at p 2. When a cookie is sent from the browser to the server, the information in the HTTP header is changed slightly to read ‘cookie: name=value’ so that the server is made aware of a cookie with a unique value. Whalen D above note 22 at p 5.
 The cookie only stores information about parameters. It stores information about the path parameter setting the URL path the cookie is valid within and it stores information about the domain parameter which makes the cookie accessible to different servers within the domain. This information governs the location at which the cookie can be read but does not indicate the location of the cookie as attached to the user’s browser at any one time: Zimmerman R ‘The way the cookies crumble’ (2000) 4 New York University Journal of Legislation and Public Policy 493 at note 17; Whalen D above note 22 at p 5.
 Above note 2.
 See Fraser H ‘Location, location, location’ Dec/Jan 2001 elawpractice.com.au Issue 9 at pp 29-31.
 Clickstream data has therefore been described as information about information — human inferences are imputed into mechanical crumbs, creating an indirect link between the user and what is actually collected: see Reidenberg J and Gamet-Pol F ‘The fundamental role of privacy and confidence in the network’ (1995) 30 Wake Forest Law Review 105 at 113.
 Slayton M above note 21 at p 2.
 Statement of GE Clayton, Dr SB Lucan and KG Coleman above note 5 ,at p 7.
 The bug can include additional information in the query string that the computer uses to download the webpage to instruct a cookie to be returned along with the webpage. The cookie is sent back to the network advertiser’s server with subsequent HTTP requests to the same internet domain: statement of GE Clayton, Dr SB Lucan and KG Coleman above note 5 at p 7.
 O’Harrow R ‘Fearing a plague of web bugs’ 13 November 1999 at <www.washingtonpost.com/wp-srv/Wplate/1999-11/13/0481-111399-idx.html> p 1; Olsen S ‘Nearly undetectable tracking device raises concern’ <news.com.com/2100-1017-243077.html?legacy=cnet> p 1.
 As mentioned above, host websites do not keep advertisements locally but subscribe to an intermediary network advertiser to place the ads for them. When the user’s browser requests a web page by an HTML call, the website server answers the call by transmitting information necessary to display the webpage. Downloaded with this information is a Type 1 web bug that is an HTML code. This HTML code is an invisible conduit between the user’s browser and the network advertiser: see BBC News ‘Pixel-high privacy spy’ at <news.bbc.co.uk/hi/english/sci/tech/newsid_842000/842624.stm> p 2; Whalen D, above note 22 at p 4; Langa F ‘The web bug boondoggle’ at <www.informationweek.com/story/IWK20010621S0030> p 2. The HTML code is downloaded with the rest of the web page as part of the graphics because it attaches to an invisible graphic 1x1 pixel in size: see Online Profiling: A Report to Congress June 2000 <www.ftc.gov/os/2000/06/onlineprofilingreportjune2000.pdf> p 7; Edwards M ‘Your web browser is bugged’ <www.ntsecurity.net/Articles/Print.cfm?ArticleID=9543>.
 Online Profiling: A Report to Congress above note 33 at p 7; Edwards M above note 33.
 Harder v The Proceedings Commissioner  3 NZLR 80, available at <www.austlii.edu.au/nz/cases/NZCA/2000/129.html>.
 Above note 9 at 6.
 Greenleaf G ‘Private sector privacy: problems of interpretation’ (2001) CyberLRes 3 <www2.austlii.edu.au/~graham/CyberLRes/2001/3/> p 4.
 The wording of the New Zealand Privacy Act is unfortunate, in that it states in s 2 that ‘collect does not include receipt of unsolicited information’, a construction which leads to uncertainty as to whether the Australian Federal Privacy Act’s obligations concerning information collected (s 16B) require solicitation before they apply.
 In a lecture by Bygrave and Waters on the collection, use and disclosure principles, Bygrave commented that NPP 1.3’s ‘collection from the individual’ suggests a requirement of solicitation. However, in Harder (above note 35), the information arguably came from the woman speaking on the other end of the telephone line, yet there was no collection, because the information came from the individual without being requested.
 Underhill S ‘Web bugs within web pages’ <www.infinisource.com/featuresweb-bugs2-pf.html> p 1.
 Note that the cookie value cannot contain anything that isn’t provided by the browser or by the user himself or herself, unless the information is the unique identifier assigned by the server setting the cookie: Australian Consumers’ Association, ‘What’s inside your cookie’ CHOICE at <www.choice.com.au/articles/printGenerator.asp?ID=102717> p 2. See also Slayton M above note 21 at p 1.
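As an illustration of the point that a cookie’s value is typically nothing more than a server-assigned opaque identifier, the following Python sketch (the header values are hypothetical) parses a Set-Cookie header with the standard library:

```python
from http.cookies import SimpleCookie

# A hypothetical Set-Cookie header such as a network advertiser's server
# might send: the value is nothing the user typed, merely an opaque
# identifier the server assigned when it first saw this browser.
header = "uid=3f9c2e7a; Path=/; Domain=.example.net"

cookie = SimpleCookie()
cookie.load(header)

# The browser stores and returns only what the server put there.
print(cookie["uid"].value)       # the opaque identifier
print(cookie["uid"]["domain"])   # the domain it will be returned to
```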
 For a discussion on the distinction between overt and covert methods of collection and how the two methods can be used in combination, see Shimanek A ‘Do you want milk with those cookies? Complying with the safe harbor privacy principles’  26 University of Iowa Journal of Corporation Law 456 at 460.
 Slayton M ‘The risks of cookies’ <hotwired.lycos.com/webmonkey/templates/print_templates/print_template.htmlt?meta=/webmo...> p 2.
 Bygrave L ‘Chapter 10: Existing safeguards for data on collective entities pursuant to data protection laws’ <www2.austlii.edu.au/privacy/secure/Bygrave/index-CHAPTER-10.html> p 10.
 Above note 44.
 Above note 2.
 The IP address is not usually personally identifiable information because most IP addresses are dynamic, changing every time the user connects to the internet, as opposed to a static address that is uniquely connected to the user’s computer: Federal Trade Commission Online Profiling: A Report to Congress Part 2 Recommendations July 2000 <www.ftc.gov/os/2000/07/onlineprofiling.htm> p 10 footnote 14.
 Above note 37 at p 13.
 The Commonwealth jurisdiction under the Cybercrime Act 2001 is limited to use of a telecommunications service in commission of an offence; with respect to the Commonwealth Act and the Crimes Amendment (Computer Offences) Act 2001 (NSW), there must be copying of data held in a computer and the access must be unauthorised. The web bug could possibly be said to copy data in emails stored in the user’s computer where such access is unauthorised: see Steel A ‘Boldly going where no-one has gone: the expansive new computer access offences’ (2002) 26(2) Criminal Law Journal 72 at 73, 77 and 80.
 Above note 44 at p 12.
 Case notes released by the Privacy Commissioner of New Zealand Casenote 12582 (1999) at <www.knowledge-basket.co.nz/privacy/people/cn12582.html>.
 Federal Trade Commission Online Profiling: A Report to Congress Part 2 Recommendations above note 47 at p 10 footnote 14.
 Above note 37 at p 13.
 European Directive recital 26 uses the phrase ‘likely reasonably’, which could conflate the meaning of probability into the word ‘reasonably’. Under the Ontario legislation, although the definition of personal information does not contain the word ‘reasonably’, judicial gloss has nevertheless been applied to give the legislation wider breadth. Provincial Order 230 provides that if there is a reasonable expectation that the individual can be identified from the information, then the information is personal: see <www.ipc.on.ca/english/orders/orders-p/p-230.htm>.
 ‘Personal information is information or an opinion ... about an individual whose identity is apparent or can reasonably be ascertained’: see Pt II, s 6 of the Act.
 The New Zealand Privacy Act 1993 defines ‘personal information’ to be ‘information about an identifiable person’. While this is a simpler definition than that of the Australian Act, the two are comparable to the extent that the New Zealand definition also has the two elements of ‘about’ and ‘identifiability’. The Ontario legislation generally defines personal information to mean ‘recorded information about an identifiable individual’, making it similar to the New Zealand legislation except that the Ontario legislation explicitly provides that personal information can include an identification number: see Privacy Act 1993 (NZ) Pt I, s 2 (Interpretation) <www.knowledge-basket.co.nz/privacy/legislation/1993028/doc00004.html>; Freedom of Information and Protection of Privacy Act s 2(1) and (2); and Municipal Freedom of Information and Protection of Privacy Act s 2(1) and (2), <184.108.40.206/DBLaws/Statutes/English/90f31_e.htm#s002>.
 Above note 35.
 While the Ontario legislative definition includes ‘identifier assigned to an individual’, arguably in the case of IP addresses and cookie identifiers the number is assigned not to the individual but to a machine with potentially several users; Order 230 therefore still applies to distinguish the identifier as personal information in such cases.
 For an example of cookie information linked to web page URLs being sensitive information, see Kaplan C ‘Fighting to make a city’s cookie files public’ Cyber Law Journal 18 December 1997 <www.nytimes.com/library/cyber/law121897law.html>. An investigative journalist in Tennessee is fighting to access the cookie files of government employees, arguing that if government employees are surfing Ku Klux Klan sites, satanic sites or pornographic sites, then the public has the right to know and take action.
 Waters N ‘Cases and Complaints: C v ASB Bank Ltd’  PLPR 63.
 C v ASB Bank Ltd (1997) 4 HRNZ 303.
 Above note 44 at p 13.
 Above note 61.
 In Harder v Proceedings Commissioner [2000] 3 NZLR 80, Tipping J made obiter remarks about the proper approach to the understanding of the concept of ‘personal information’. Tipping J said:
An unqualified approach to what constitutes ‘information about an identifiable individual’ will lead readily to breaches of one or more of the information privacy principles ... [The provisions of s 14(a)] require the Commissioner, and implicitly others involved in the administration of the Act, to have due regard for the protection of important human rights and social interests that compete with privacy, including the general desirability of the free flow of information and the recognition of the right of government and business to achieve their objectives in an efficient way.
 Eastweek Publisher Ltd v Privacy Commissioner For Personal Data [2000] HKCA 137 at <www.hklii.org/cgi-hklii/disp.pl/hk/cases/HKCA/2000/137.html> p 6.
 The discussion of identity in Ribeiro J’s judgment is focused on anonymity rather than the woman’s unique characteristics as revealed by the photograph.
 (1991) Case DA/90 25/49/7, cited at <www2.austlii.edu.au/privacy/secure/Bygrave/index-CHAPTER-2.html#fn201>.
 Guidelines to the National Privacy Principles September 2001.
 Hetcher S ‘The emergence of website privacy norms’  7 Michigan Telecommunications and Technology Law Review 97, Lexis transcript p 19.
 See <www.privacy.org.nz/news3.html>.
 See <www.pco.org.hk/english/casenotes/case_enquiry2.php?id=13>.
 Campbell v Mirror Group Newspapers cited R v Broadcasting Standards Commission ex parte BBC  EWCA 59;  QB 885 where Lord Woolf MR said:
The fact that it is clandestine can add an additional ingredient. The fact that it is secret prevents those who are being filmed from taking any action to prevent what they are doing from being filmed.
In this case, the plaintiff had no opportunity of evading being photographed or refusing consent to be photographed: see Campbell v Mirror Group Newspapers [2002] EWHC 499 (QB) at <www2.bailii.org/~jury/cases/EW/EWHC_QB_2002_499.html>.
 Walker K ‘The Costs of Privacy’  25 Harvard Journal of Law and Public Policy 87 at 105.
 See above note at 107-113 for a discussion on the irrelevance of website privacy policies. For an American law student’s comment on the ineffectiveness of privacy policies in response to the US Department of Commerce Workshop on Online Profiling, see <www.ftc.gov/bcp/profiling/comments/ridder.htm>.
 Harper J and Singleton S ‘With a grain of salt: what consumer privacy surveys don’t tell us’ June 2001 <www.cei.org/PDFs/with_a_grain_of_salt.pdf> p 7.
 Seban L, ‘Who cares about internet privacy?’ 12 June 2001 <www.newsfactor.com/perl/printer/11161>.
 See Bygrave L, ‘Data Protection Pursuant to the Right to Privacy in Human Rights Treaties’ <www2.austlii.edu.au/privacy/secure/Bygrave/index-Appendix-2.html> p 3.
 Leander v Sweden (1987) <hudoc.echr.coe.int/hudoc>. The second requirement of proportionality to a legitimate purpose conflates the two criteria of legitimacy and proportionality. The legitimate purposes are listed in art 8 cl 2, and the Court considers the fulfilment of this legitimacy criterion in its own right before determining proportionality: see Leander v Sweden under ‘Legitimate Aim’. See also Koelman K and Bygrave L Privacy, Data Protection and Copyright: Their Interaction in the Context of Electronic Copyright Management Systems IMPRIMATUR, Institute for Information Law, Amsterdam, June 1998, p 68; Bygrave L above note 78 p 14.
 See Koelman K and Bygrave L above note 79 at p 48.
 See Directive 95/46/EC cl 10 of the preamble: Koelman K and Bygrave L above note 79 where Bygrave says that art 8 is ‘very important from the normative perspective for the interpretation of the Directive’.
 Junkbusters above note 23 at p 7; Eichelberger L ‘The cookie controversy’ <www.cookiecentral.com/ccstory/cc3.htm> p 1. Because the life of the cookie can exceed the browsing time, the browser will save the cookie onto the user’s hard drive. The author notes that some cookies may be so persistent that they will last long after you change ISPs or upgrade your browser.
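The distinction this note draws between a session cookie (discarded when the browser closes) and a persistent cookie (written to the hard drive until an expiry date) can be sketched with Python’s standard library; the identifiers and the far-future expiry date below are hypothetical:

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()

# A session cookie: no expiry attribute, so it dies with the browser session.
cookie["session_id"] = "abc123"

# A persistent cookie: a far-future expiry tells the browser to write it
# to disk and keep returning it, mirroring the multi-year lifetimes the
# note describes.
cookie["uid"] = "3f9c2e7a"
cookie["uid"]["expires"] = "Thu, 31 Dec 2037 23:59:59 GMT"

for morsel in cookie.values():
    print(morsel.OutputString())
```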
 For discussion that persistent cookies are unnecessary, see Garfinkel S ‘The persistence of cookies’ at <hotwired.lycos.com/packet/garfinkel/96/50/index2a.html>.
 Above note 17.
 Hetcher S above note 70 p 15; Weise E ‘A new wrinkle in surfing the net’ <www.usatoday.com/life/cyber/tech/cth582.htm> p 2.
 Gandy O ‘Legitimate business interest: no end in sight?’ (1996) University of Chicago Legal Forum 77 at 124-5; Kaplan C above note 60.
 Stepanek M ‘Weblining: Companies are using your personal data to limit your choices — and force you to pay more for products’ <www.businessweek.com/2000/00_14/b3675027.htm?scriptFramed> p 5. Venture Direct, a New York based company, sells a list of fat black women who are offered as targets for self-improvement products — an example of discriminatory commercial practice cited by J Reidenberg, Testimony before the Subcommittee on Courts and Intellectual Property, Committee on the Judiciary and United States House of Representatives Oversight Hearing on Privacy and Electronic Commerce 18 May 2000.
 Lewis P ‘Had your fill of spam yet?’ <seattletimes.nwsource.com/news/nation-world/html98/spam_032898.html>; Middleton J ‘Hotmail Users Face Mass Spamming’ <www.pcw.co.uk/Newa/1124709>.
 Even though the ECHR art 8 is focused on public authorities, it has an interpretative value for the application of the NPPs to the private sector because the European Directive applies the same standard of obligation to both public and private bodies: see Preamble cl 26. The UK Data Protection Act 1998 follows this approach by defining ‘data controller’ as a person who determines the purposes for which personal data is to be processed, without differentiation between public and private body controllers. Under such circumstances Bygrave’s gloss on the Human Rights Court’s interpretation of ‘necessary’ as encompassing ‘pressing social or commercial need’ may be permissible: see Koelman K and Bygrave L above note 79 p 48. However, where the ECHR focus on public authorities is directly applied to the Australian Act, which differentiates between public and private sectors, there must be a clear judicial statement from the Court of Human Rights that art 8 has application to private bodies. A cursory browse through the European Court of Human Rights case database reveals no such development in the years 1999 to 2000, the recent case Amann v Switzerland merely reiterating that ‘the storing by a public authority of information relating to an individual’s private life amounts to an interference within the meaning of Article 8’: see <hudoc.echr.coe.int/hudoc>.
 Article 17 of the ICCPR is phrased in cl 2 to give a general right of protection against interference, whereas art 8 ECHR is phrased to include an exception. That is why the term ‘necessary’ does not appear in art 17 of the ICCPR. Even if the ICCPR serves as a link between the ECHR and the Australian Act, the High Court in Minister for Immigration and Ethnic Affairs v Teoh [1995] HCA 20; 183 CLR 273 said that the treaty should be used as an interpretative guide with due circumspection, having regard to the relationship between the treaty and the Act. See Balkin R ‘International Law and domestic law’ Chapter 5 at 5.3.4 in Blay et al (eds) Public International Law: An Australian Perspective; Evatt E ‘Meeting universal human rights standards: the Australian experience’ Senate Occasional Lecture Series 1998, available at the Australian Parliament House website, p 4.
 In the Chow Hung Ching case, the Court suggested that Australian courts ought to apply at least those principles of international law that had been universally accepted: Balkin R, above note 90 at 5.2.1. It can be argued that the necessity standard does not have standing as a rule of customary international law because it is only accepted in European states. For example, the American Convention on Human Rights omits the necessity criterion: Bygrave L above note 78.
 It is not certain to what extent s 15AB(1) and (2) of the Acts Interpretation Act 1901 (Cth) authorises treaties as an aid to interpretation where the treaty is not expressly referred to in the statute. However, obiter dicta of the Federal Court suggest that it may be permissible to consider such treaties where they have been referred to in the Second Reading Speech: Balkin R, above note 90 at 5.3.4.
 Balkin R, above note 90 at 5.2.2:
Even where a rule of customary law (or for that matter a treaty provision) can be satisfactorily established, the courts will refuse to apply it in the face of clearly contradictory statutory provisions.
 For use of rights rhetoric in political debate, see Murphy J MP, Privacy Amendment (Private Sector) Bill 2000 Second Reading Speech 8 November 2000; Senator Stott Despoja N, Privacy Amendment (Private Sector) Bill 2000 Second Reading Speech 29 November 2000, Senate Hansard.
 Ms K Leigh Assistant Secretary of the Attorney General’s Department Information Law Branch, response to the Chairman, and Mr A Rose President of the Australian Law Reform Commission, response to Senator Abetz, Senate Legal and Constitutional References Committee, Official Committee Hansard Wednesday 5 August 1998 Canberra pp 212 and 227. Available at the Australian Parliament House website. Compare this to the Senate Legal and Constitutional Committee recommendation that the privacy legislation be introduced on a wider base of international obligations, including ICCPR obligations: see Senate Legal and Constitutional Committee ‘Privacy in the Private Sector: Chapter 3 Evaluating a Privacy System’ <www.aph.gov.au/senate/committee/legcon_ctte/privacy/Chap3.htm> p 14.
Clause 12B is a statement of Commonwealth legislative power rather than an undertaking of ICCPR obligation: The Parliament of the Commonwealth of Australia Senate, Privacy Amendment (Private Sector) Bill 2000 Revised Explanatory Memorandum, p 55 Item 48.
 Both the US and Australian governments share a kindred concern that the cost of implementing stringent EU privacy requirements could prove too much for businesses to remain competitively viable. For example, Orson Swindle, Commissioner of the FTC, said in dissent to the introduction of Fair Information Practice Principles that they would ‘impose costs or other unintended consequences that could severely stifle the New Economy’. The same rationale was responsible for introduction of the small business organisation exemption in Australia: see Orson Swindle, Prepared Statement of the Federal Trade Commission on ‘Privacy online: Fair information practices in the electronic marketplace’ Before the Committee on Commerce, Science, and Transportation US Senate, Washington DC 25 May 2000, at <www.ftc.gov/os/2000/05/privacyswindle.htm> p 2; and the Advisory Report on the Privacy Amendment (Private Sector) Bill 2000 House of Representatives Standing Committee on Legal and Constitutional Affairs, Chapter 2: Small Business Exemption, p 9, available at the Australian Parliament House website.
 European Commission Submission to the House of Representatives Committee on Legal and Constitutional Affairs concerning its inquiry into the Privacy Amendment (Private Sector) Bill 2000 pp 2 and 6, available from the Australian Parliament House website.
 R v Marwey  Qd R 247 at 250; Smith v McCarron  SAStRp 26;  SASR 244 at 247.
 Lang v Australian Coastal Shipping Commission  2 NSWLR 70 at 72-3.
 The Draft National Principle Guidelines May 2001 define collection of personal information to be necessary if ‘an organisation cannot in practice effectively pursue a function or activity without collecting personal information’. Compare this to the Draft National Principle Guidelines September 2001 which define collection of personal information to be necessary ‘if an organisation cannot in practice effectively pursue a legitimate function or activity without collecting personal information’.
 Kang J ‘Information Privacy in Cyberspace Transactions’ (1998) 50 Stanford Law Review 1193 at 1285-1286.
 Cookies are rejected only 0.68 per cent of the time per billion pages viewed: see Harper J and Singleton S above note 76 at p 9. Fred Langa calls the ‘web bug hysteria’ baloney: see Langa F ‘The web bug boondoggle’ above note 33 at p 2.
 Above note 2 at 17-18.
 The latest peer-to-peer system is a centralised system embedded in a decentralised system. This means that peers (private individuals acting as collectors) connect to a supernode (network advertiser server), and the supernodes are connected to one another in a decentralised fashion, so that the peers can communicate across nodes: see Minar N ‘Distributed systems topologies: Part 1’ <www.openp2p.com/lpt/a//p2p/2001/12/14/topologies_one.html> p 3.
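A minimal Python sketch of the hybrid topology Minar describes, under the assumption that supernodes relay traffic between their attached peer groups (all class and node names below are hypothetical):

```python
class Supernode:
    """A hub in the hybrid topology: peers attach centrally to one
    supernode, while supernodes link to each other decentrally."""

    def __init__(self, name):
        self.name = name
        self.peers = set()        # peers attached to this supernode
        self.neighbours = set()   # other supernodes it links to

    def connect(self, other):
        self.neighbours.add(other)
        other.neighbours.add(self)

    def route(self, sender, recipient):
        """Return the path a message takes from sender to recipient."""
        if recipient in self.peers:
            return [sender, self.name, recipient]
        for node in self.neighbours:
            if recipient in node.peers:
                return [sender, self.name, node.name, recipient]
        return None  # recipient unreachable from here

a, b = Supernode("SN-A"), Supernode("SN-B")
a.connect(b)
a.peers.update({"alice", "bob"})
b.peers.add("carol")

print(a.route("alice", "bob"))    # same supernode
print(a.route("alice", "carol"))  # relayed across the supernode layer
```

Messages between peers on the same supernode stay local; messages to a peer on another supernode are relayed through the supernode layer, which is what lets the decentralised layer aggregate what the attached peers collect.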
 Borland J ‘Stealth P2P network hides inside KaZaa’ <news.com.com/2100-1023-873181.html>.
 Borland J ‘KaZaa exec defends sleeper software’ <news.com.com/2100-1023-875016.html>.