Privacy Law and Policy Reporter
This is the introduction from an article entitled ‘Ethical responsibility and the management of knowledge’ published in the Australian Computer Journal vol 31 no 3 August 1999. It is reprinted here with permission of the authors and the editor of the ACJ (Editor).
A survey of the codes of ethics of professional bodies in engineering and computer science reveals worthy values with which surely few would disagree. Taking the Australian Computer Society professional code as typical, the general obligations stress personal integrity and respect for others, including applicable laws. Implicit in such codes are personally identified activities, and a recognition of the impact of information systems on quality of life. These, however, reflect a localised awareness of the potential impacts of a system development, and sit at odds with the potential of many information systems to be put to purposes other than those originally intended. ‘If I had known I would have become a watchmaker’ was Einstein’s rueful reflection on work that led to the atomic bomb — and parallels with the use of recorded data for unforeseen purposes are many, as we now illustrate.
The government of Iceland, for example, is selling the rights to the details of the entire population’s genetic code to an American biotechnology company, which has (unusually) been granted a 12-year monopoly on the data marketing rights. The project is widely opposed both internally and internationally, and the Ethical Committee of the Icelandic Medical Council is advising doctors not to participate. Such a database, containing both medical and genealogical data, can potentially identify private details of individuals and their ancestors in terms of health and lineage status. Despite stated assurances of anonymity, there is no technical or political barrier to such identification in future uses of the database. In fact, the usual understanding of anonymity as ‘removal of personally identifying information’ has not been applied here. Instead, information identifying individuals is being encrypted, and is thus available to authorised individuals — and to hackers. Despite concern from the EU, this may lead to a Europe-wide database. Closer to home, Tasmania, as an island with a relatively stable population of extended families, has qualities similar to Iceland’s in this respect, and is increasingly being recognised as a suitable laboratory for genetic research. An Australia-wide DNA database for use in crime scene forensics is also currently being proposed by the federal minister.
Other examples indicate the existing linkages to other systems that do explicitly identify individuals for specific services, particularly in e-commerce applications. The detailed insights into personal reading habits, and the potential for related direct marketing, available to amazon.com are well known in this regard; books can even be commissioned to order, or proposals rejected, on the basis of identified customer projections. Supermarkets and other large retailers frequently use loyalty cards, linked to deals with allied corporations. These are often tied to direct marketing, but are now even linked to profit and loss statements for individual customers. Such statements are already used, for example, in decisions about:
Banks, insurance companies and other corporations are doing similar things. Shared information systems allow simple transfer of data between (government) agencies, and can be used to alert those agencies to ‘suspicious transactions’. The criteria for suspicious transactions can always be redefined at some future point. Tomlin provides a good description, informed by a strong information systems awareness, of current American legislation concerning personal financial profiling used in this context. Concerns particularly arise when this data is sold and used by allied companies or other organisations for unanticipated purposes, including credit rating and undesired targeting. For example, the Safeway (supermarket) club in the US gives specific discounts only to those shoppers prepared to have their shopping habits tracked. No law prevents sharing or selling these files, and such information is known to have been subpoenaed by law enforcement bodies — establishing, for example, whether a suspected drug dealer has bought an abnormal quantity of plastic bags can, together with other information snippets, create evidence for these agencies.
Two more examples reinforce such points. A 59-year-old man shopping at a Los Angeles store broke his knee when he slipped on spilled yoghurt. Unable to drive or work, he sued, but a mediator allegedly pressed him to settle, saying the store had records showing he bought a lot of alcohol and would use those records in court. Elsewhere, a Maryland company was exposed in 1998 as providing marketers with its customers’ prescription purchasing information, effectively betraying their medical records.
These examples highlight uses to which information systems may be put — uses which may or may not be contrary to the original designer’s understanding of the privacy levels involved. The system specification may have been met honestly, and in accordance with ethical codes, but where does responsibility lie beyond that? Tomlin observes:
[professional systems designers] will always allow for future expansion of ‘existing’ applications and processing capabilities and ... for ‘additional’ applications and processing capabilities to be added with a minimum of additional effort and cost.
It would be professionally unethical not to do so, but is it the designer’s responsibility if undesirable or unanticipated uses are facilitated? Targeted junk mail and unsolicited phone calls may not enhance the quality of life for most, and this conceptualisation of individuals as techno-consumerist objects is an issue to which we return.
Acting Professor John Gammack is Head of the School of IT at Murdoch University in Perth. Paula Goulding is the Acting Programme Chair of Information Systems in the School of IT at Murdoch.