
Privacy Law and Policy Reporter


Clarke, Roger --- "P3P revisited" [2001] PrivLawPRpr 39; (2001) 8(4) Privacy Law and Policy Reporter 81

P3P revisited

Roger Clarke

This is a column in Roger Clarke’s series on privacy invasive and privacy enhancing technologies. This column, including hotlinks, is available at <>.

In the introductory article to this series, I distinguished privacy invasive technologies (the PITs) from three different categories of privacy enhancing technologies (PETs). Two of those are savage PETs, which deliver anonymity, and gentle PETs, which focus on pseudonymity. This article considers a technology that arguably belongs to the third category of PETs, which I refer to as ‘PIT countermeasures’.

The world wide web has delivered an explosion in access to information, and in the ability to publish; but it has been perceived by consumer marketers as a further opportunity to apply old style consumer manipulation techniques that worked well for them in the broadcast and mass marketing era. One important privacy invasive mechanism on the web is the capture of personal data through web forms, cookies and other devices, without consent, or with considerably less than informed consent. This article examines a proposed enhancement to web protocols that was originally intended to provide controls over those incursions into privacy.


The World Wide Web Consortium (W3C) is an association of large corporations that fund an organisation directed by the web’s inventor, Tim Berners-Lee, to refine existing protocols and develop new ones. The Platform for Privacy Preferences (P3P) is an initiative of a W3C Working Group that is claimed to provide ‘a simple, automated way for users to gain more control over the use of personal information on websites they visit’.

Readers of this publication were provided with one of the first published overviews of P3P (see PLPR 5(2) (July 1998) at 35-39). I followed that with a critique in PLPR 5(3) (August 1998) at 46-48.

In the earlier of those articles, I depicted the purpose of the P3P specification as being to enable:

I was positive about P3P’s prospects. I based that judgment on the belief that P3P compliance was to be embedded within web browsers and web servers, in order to establish the following process. Where the browser detected a mismatch between the user’s preferences statement and the site’s declared practices, it would:

— attempt negotiation with the web server, in order to achieve an outcome consistent with the preferences statement; or

— notify the web user (for example, notify of the nature of the mismatch, and/or of the site’s explanation why the practice is as it is), enabling the web user to:

— provide informed consent to provision of the data, despite the mismatch;

— attempt negotiation manually; or

— withdraw from further interaction with the website.
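The mismatch-handling process described above can be sketched in code. This is a minimal, hypothetical illustration of the intended user agent logic, not anything defined by the P3P specification itself; the data model (data categories mapped to sets of permitted purposes) and all names are my own.

```python
# Hypothetical sketch of the intended P3P user-agent process.
# All names and the data model are illustrative, not part of P3P.
from enum import Enum, auto

class Outcome(Enum):
    PROCEED = auto()      # site's declared practices satisfy the preferences
    NEGOTIATE = auto()    # mismatch: attempt automated negotiation
    NOTIFY_USER = auto()  # mismatch: surface the details so the user can
                          # consent, negotiate manually, or withdraw

def decide(preferences: dict, policy: dict, can_negotiate: bool):
    """Compare a site's declared practices against the user's preferences.

    Both sides are modelled as {data_category: set_of_purposes}.
    Returns the outcome plus the purposes the user has not accepted.
    """
    mismatches = {
        category: purposes - preferences.get(category, set())
        for category, purposes in policy.items()
        if purposes - preferences.get(category, set())
    }
    if not mismatches:
        return Outcome.PROCEED, mismatches
    if can_negotiate:
        return Outcome.NEGOTIATE, mismatches
    return Outcome.NOTIFY_USER, mismatches

# The user accepts email use for order fulfilment only; the site also
# declares direct marketing, so a mismatch must be surfaced.
prefs = {"email": {"order-fulfilment"}}
policy = {"email": {"order-fulfilment", "direct-marketing"}}
outcome, detail = decide(prefs, policy, can_negotiate=False)
```

The point of the sketch is that the protections depend on the browser actually performing the comparison and blocking until the user decides; as discussed later in this article, the specification imposes no such obligation on browsers.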

The scheme was intended to achieve what W3C referred to as ‘informed consent through user choice’. The W3C P3P Working Group maintains a substantial list of papers dealing with P3P, including (to their credit) the criticisms as well as the more positive reviews.

The gathering clouds

Privacy advocates adopted varying interpretations of P3P. Several activists, myself included, participated in the W3C Working Group in the belief that the initiative was capable of delivering real technological protections for web users. Several others were more sceptical, and preferred to stay outside the Working Group.

In my critique of early 1998, I identified four aspects of P3P that I was concerned about:

At the international privacy conference in Montreal in September 1997, EPIC’s Marc Rotenberg presented a classification scheme for technologies:

(1) technologies of surveillance (equivalent to my PITs);

(2) technologies for contracting (including P3P, which he saw as being neutral rather than a positive contribution to privacy);

(3) technologies for labelling and notice (such as ‘trust labels’); and

(4) privacy enhancing technologies (PETs).

To address some of his concerns about the limited contribution that he saw P3P as making, I suggested that some refinements were needed, including:

New York based Australian Jason Catlett, of Junkbusters Inc, expressed more serious concerns in an open letter to P3P’s designers in September 1999. He depicted P3P as part of the direct marketing lobby’s manoeuvres to convert privacy from the fundamental human right that it is into nothing more than a consumer preference. It diverted attention away from what is really needed (privacy protective law, complete with enforcement and redress), towards the US corporate view of privacy as merely notice of practices and consumer choice. Rather than a Platform for Privacy Preferences, he saw it as a Pretext for Privacy Procrastination.

P3P as a pseudo PET

I’ve had little to do with P3P during the 18 months since Jason’s open letter. I revisited P3P recently, and was very disappointed with what I found.

The descriptions of the now all but finalised specification make clear that the protocol specifies only the statement of a website’s use and disclosure policy. Worse, the exchange is actually depicted as though it were a push mechanism, rather than a communication initiated by a request from the browser. The accompanying diagrams even go so far as to imply that the browser submits personal data to the server irrespective of what the website’s policy statement says.
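For concreteness, a policy declaration under the specification looks roughly like the following. This is an illustrative sketch only: the element vocabulary is drawn from the W3C working drafts, the namespace and exact element names have varied between draft versions, and the URL shown is a placeholder.

```xml
<!-- Illustrative P3P policy fragment; element names per the W3C working
     drafts, namespace omitted as it varies by draft version -->
<POLICY name="general" discuri="http://www.example.com/privacy.html">
  <STATEMENT>
    <PURPOSE><admin/><develop/></PURPOSE>     <!-- why the data is used -->
    <RECIPIENT><ours/></RECIPIENT>            <!-- who receives it -->
    <RETENTION><stated-purpose/></RETENTION>  <!-- how long it is kept -->
    <DATA-GROUP>
      <DATA ref="#dynamic.clickstream"/>      <!-- what data is covered -->
    </DATA-GROUP>
  </STATEMENT>
</POLICY>
```

Note that the declaration is purely descriptive: nothing in the protocol binds the site to the stated practices, or obliges the browser to act on them.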

Critically, the specification contains no minimum requirements of web browsers. This had to be omitted in order to avoid constraining competition among browser providers. P3P therefore fails to create any momentum towards the inclusion of the necessary privacy sensitive features in the tools that users have at their disposal.


The original promise of P3P has been neutered. The judgements of Marc Rotenberg in 1997 and Jason Catlett in 1999, as updated in the EPIC report of June 2000, are fully vindicated. P3P is nothing more than a ‘privacy policy declaration’ standard. That’s a mere fraction of what it was meant to be, and of what the situation demands.

The key proponents of the P3P protocol have laboured long and hard in an endeavour to deliver a PET, but the interests of W3C’s members have resulted in it being watered down to a mere pseudo protection.

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra and Visiting Fellow, Department of Computer Science, Australian National University. He is also on the Editorial Board of PLPR.

Key references

Catlett J ‘Open letter 9/13 to P3P developers’ September 1999 at <>.

Clarke R ‘Platform for privacy preferences: an overview’ (April 1998), PLPR 5(2) (July 1998) 35-39, at <>.

Clarke R ‘Platform for privacy preferences: a critique’ (April 1998), PLPR 5(3) (August 1998) 46-48, at <>.

EPIC ‘Pretty poor privacy: an assessment of P3P and internet privacy’ Electronic Privacy Information Center and Junkbusters, June 2000 at <>.

W3C ‘Platform for privacy preferences (P3P)’ at <>.


This series is supplemented by a resource page that will be maintained on an ongoing basis. PLPR readers are invited, and actively encouraged, to contribute sources and suggestions for enhancement to, and to bookmark the page for their own use and for communication to others.

The resources page for the series is at <>.
