
Privacy Law and Policy Reporter (PLPR)

Gammack, John --- "Secondary uses of personal information - ethical responsibilities of IT professionals" [1999] PrivLawPRpr 63; (1999) 6(6) Privacy Law & Policy Reporter 89

Secondary uses of personal information — ethical responsibilities of IT professionals

John Gammack

This is the introduction from an article entitled ‘Ethical responsibility and the management of knowledge’, published in the Australian Computer Journal, vol 31, no 3, August 1999. It is reprinted here with the permission of the authors and the editor of the ACJ (Editor).

Surveying the codes of ethics for various professional bodies in engineering and computer science[1] indicates worthy values with which surely few would disagree. Taking the Australian Computer Society professional code as typical,[2] the general obligations stress personal integrity and respect for others, including applicable laws. Implicit in such codes are personally identified activities, and a recognition of the impact of information systems on quality of life. These, however, reflect a localised awareness of the potential impacts of a system development, and seem at odds with the potential of many information systems to be put to purposes other than those originally intended. ‘If I had known I would have become a watchmaker’ was Einstein’s rueful reflection on work that led to the atomic bomb — and parallels with the use of recorded data for unforeseen purposes are many, as we now illustrate.

The government of Iceland, for example, is selling the rights to the details of the entire population’s genetic code to an American biotechnology company, which has been approved (unusually) to hold a 12-year monopoly on the data marketing rights.[3] The project is widely opposed both within Iceland and internationally, and the Ethical Committee of the Icelandic Medical Council is advising doctors not to participate. Such a database, containing both medical and genealogical data, can potentially identify private details of individuals and their ancestors in terms of health and lineage. Despite stated assurances of anonymity, there is nothing, technically or politically, to prevent such identification in future uses of the database. In fact, the usual understanding of anonymity as ‘removal of personally identifying information’ has not been applied here. Instead, the information identifying individuals is being encrypted, and so remains available to authorised individuals and to hackers. Despite concern from the EU, this may lead to a Europe-wide database. Closer to home, Tasmania, as an island with a relatively stable population of extended families, shares many of Iceland’s relevant qualities and is increasingly recognised as a suitable laboratory for genetic research. The federal minister is also currently proposing an Australia-wide DNA database for use in crime scene forensics.
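
The technical point here is the difference between removing identifiers and merely encrypting them. The following sketch is illustrative only: it uses a hypothetical record and the third-party Python ‘cryptography’ package, not anything from the Icelandic system, to show why an encrypted identifier, unlike a deleted one, remains recoverable by anyone who holds or obtains the key.

    # Illustrative sketch (hypothetical record; requires 'pip install cryptography').
    from cryptography.fernet import Fernet

    record = {"name": "Jon Jonsson", "national_id": "120395-1234",
              "diagnosis": "type 2 diabetes"}

    # Anonymisation proper: identifying fields are dropped and cannot be recovered.
    anonymised = {k: v for k, v in record.items() if k not in ("name", "national_id")}

    # "Anonymity" by encryption: the identifier is transformed, not removed.
    key = Fernet.generate_key()            # held by the database operator
    cipher = Fernet(key)
    pseudonymised = dict(anonymised)
    pseudonymised["id_token"] = cipher.encrypt(record["national_id"].encode())

    # Anyone who obtains the key (an authorised insider, or an intruder) can reverse it.
    recovered = cipher.decrypt(pseudonymised["id_token"]).decode()
    assert recovered == record["national_id"]

    print(anonymised)     # no route back to the individual
    print(recovered)      # identity restored from the "anonymised" record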

Other examples indicate existing linkages to systems that do explicitly identify individuals for specific services, particularly in e-commerce applications. The detailed insights into personal reading habits, and the potential for related direct marketing, available to amazon.com are well known in this regard; books can even be commissioned to order, or proposals rejected, on the basis of identified customer projections. Supermarkets and other large retailers frequently use loyalty cards, linked to deals with allied corporations. These are often tied to direct marketing, but are now even linked to profit and loss statements drawn up for individual customers.[4] Such statements are already used, for example, in decisions about:

Banks, insurance companies and other corporations are doing similar things. Shared information systems allow simple transfer of data between (government) agencies, and can be used to alert those agencies to ‘suspicious transactions’. The criteria for suspicious transactions can always be redefined at some future point. Tomlin provides a good description, informed by a strong information systems awareness, of current American legislation under which personal financial profiling is used in this context.[5] Concerns particularly arise when this data is sold to and used by allied companies or other organisations for unanticipated purposes, including credit rating and undesired targeting. For example, the Safeway (supermarket) club in the US gives specific discounts only to those shoppers prepared to have their shopping habits tracked. No law prevents these files being shared or sold, and such information is known to have been subpoenaed by law enforcement bodies: establishing whether a suspected drug dealer has bought an abnormal quantity of plastic bags can, together with other snippets of information, build evidence for those agencies.
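
The observation that the criteria for ‘suspicious transactions’ can always be redefined later can be made concrete with a small sketch (hypothetical data and rules, in Python): the purchase records are linked to an identified customer once, at collection time, but the flagging rule lives outside the data and can be rewritten long afterwards.

    # Minimal sketch with invented data: loyalty-card records flagged by criteria
    # that can be redefined long after the data was collected.
    from dataclasses import dataclass

    @dataclass
    class Purchase:
        customer_id: str
        item: str
        quantity: int

    # Records are tied to an identified customer when collected.
    purchases = [
        Purchase("C-1042", "plastic bags", 40),
        Purchase("C-1042", "milk", 2),
        Purchase("C-2201", "plastic bags", 1),
    ]

    # The flagging criteria sit outside the data and can be changed at any time.
    def suspicious_1998(p: Purchase) -> bool:
        return p.item == "plastic bags" and p.quantity > 100   # flags nobody today

    def suspicious_1999(p: Purchase) -> bool:
        return p.item == "plastic bags" and p.quantity > 20    # quietly redefined later

    for rule in (suspicious_1998, suspicious_1999):
        flagged = sorted({p.customer_id for p in purchases if rule(p)})
        print(rule.__name__, flagged)
    # suspicious_1998 []
    # suspicious_1999 ['C-1042']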

Two more examples reinforce these points. A 59-year-old man shopping at a Los Angeles store broke his knee when he slipped on spilled yoghurt. Unable to drive or work, he sued, but a mediator allegedly tried to settle the claim by saying that the store had records showing he bought a lot of alcohol, and that it would use those records in court. Elsewhere, a Maryland company was exposed in 1998 as providing marketers with its customers’ prescription purchasing information, effectively betraying their medical records.[6]

These examples highlight uses to which information systems may be put which may or may not be contrary to the original designer’s understanding of the privacy levels involved. The system specification may have been met honestly, and in accordance with ethical codes, but where does responsibility lie beyond that? Tomlin observes:

[professional systems designers] will always allow for future expansion of ‘existing’ applications and processing capabilities and ... for ‘additional’ applications and processing capabilities to be added with a minimum of additional effort and cost.[7]

It would be professionally unethical not to do so, but is it the designer’s responsibility if undesirable or unanticipated uses are facilitated? Targeted junk mail and unsolicited phone calls may not enhance the quality of life for most, and this conceptualisation of individuals as techno-consumerist objects is an issue to which we return.
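
Tomlin’s point about allowing for ‘additional applications ... with a minimum of additional effort’ describes a familiar architectural pattern. The sketch below (hypothetical names, in Python) is one possible reading of it: a data flow to which later processors can be attached without the original designer’s involvement or foreknowledge.

    # Illustrative sketch (invented names): a data flow designed so that later
    # "applications" can be attached without changing the original system.
    from typing import Callable, Dict, List

    Record = Dict[str, object]
    processors: List[Callable[[Record], None]] = []

    def register(processor: Callable[[Record], None]) -> None:
        # Any later application can hook into the existing data flow.
        processors.append(processor)

    def handle(record: Record) -> None:
        for processor in processors:
            processor(record)

    # The originally specified purpose: billing.
    register(lambda r: print("billing:", r["customer_id"], r["amount"]))

    # Added later, by someone else, for a purpose never put to the original designer.
    register(lambda r: print("profiling:", r["customer_id"], r["item"]))

    handle({"customer_id": "C-1042", "amount": 12.50, "item": "plastic bags"})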

Acting Professor John Gammack is Head of the School of IT at Murdoch University in Perth. Paula Goulding is the Acting Programme Chair of Information Systems in the School of IT at Murdoch.


[1] IIT Codes of Ethics Online: Computing and Information Systems <http://csep.iit.edu/codes/computer.html> (accessed 14 May 1999).

[2] ACS 1997 <http://www.acs.org.au/national/pospaper/acs131.htm> (accessed 14 May 1999).

[3] Schwartz J, ‘Iceland: Parliament sells rights to genetic code’ Washington Post, 12 January 1999, at <http://www.corpwatch.org/trac/corner/worldnews/other/291.html> (accessed 14 May 1999).

[4] Judge P, ‘What have you done for us lately?’ Business Week 14 September 1998, pp 140-146.

[5] Tomlin C, ‘Big Brother and your bank account’ <http://www.vvm.com/~ctomlin/a63.htm> (accessed 14 May 1999).

[6] Vogel J, ‘Getting to know all about you’ <http://www.salonmagazine.com/21st/feature/1998/10/14featureb.html> (accessed 14 May 1999).

[7] Above note 5.

