
Zhang, Constance --- "Regulation of the Internet - New Laws & New Paradigms" [2006] JlLawInfoSci 4; (2006) 17 Journal of Law, Information and Science 53

Regulation of the Internet –
New Laws & New Paradigms

CONSTANCE ZHANG[∗]

Abstract

Rapid development of the Internet has had revolutionary impacts upon all aspects of life. Many are benefiting from the communication superhighway and the myriad commercial opportunities it brings. At the same time, however, new conflicts have arisen and new crimes have emerged. Some have turned the Internet into profitable commercial enterprises, but at the expense of fair and open public access to information. Others have gone even further, exploiting the medium to commit acts of fraud and to disseminate obscenity that is both immoral and criminal. What is more, those operating in cyberspace seem to be beyond the reach of national governments and regional law enforcers. How can we regulate the Internet when a netizen can be simultaneously present in multiple jurisdictions governed by neither uniform rules nor mutual agreements, and when new technologies enabling new means of interaction emerge every day while a piece of legislation may take years to enact? This paper explores whether and how cyberspace may be effectively regulated, and the role of law in that process. It will be shown that while many technologies pose great challenges to Internet regulators, they may also turn out to be of assistance to them. The central argument put forward is that while new paradigms and mechanisms of control will be necessary for the governance of cyberspace, the rule of law still plays a crucial role in achieving the ultimate regulatory objective: to utilise technology to facilitate a fair, safe and efficient space for social and commercial interactions while maintaining a balance between competing interests and values.

1. Introduction – The Shock of the New

From the birth of its original predecessor, the ARPANET, in a US research lab nearly forty years ago, to its exponential development all over the world in the last two decades, the Internet has grown into a Brave New World wholly built by humankind. It has become much more than a tool for communication, a vast library of information, a trading ground for commerce or an open space for socialisation. The Internet is all of these things and much more. While its infrastructure mainly consists of a network of wires connecting millions of computers around the globe and a set of open protocols known as TCP/IP, it is the content the Internet generates and the activities it facilitates that make it the fascinating cyberspace we know today.

Although the Internet is a completely artificial universe, its evolution has taken a somewhat accidental, ‘decentralised and even haphazard’ course rather than following ‘some grand design or coherent plan’.[1] As Eric Schmidt puts it, ‘The Internet is the first thing that humanity has built that humanity doesn’t understand, the largest experiment in anarchy that we have ever had’.[2] An explosion of localised creative ideas, many in college dorms and living rooms, has led to the rapid development of new technologies associated with the Internet and, more importantly, new ways of utilising these technologies have given rise to new economies, new communities and new crimes. Not only do we have to re-conceptualise fundamental notions of reality, social interaction and the market economy; cyberspace also poses one of the greatest challenges regulators and law enforcers have ever faced.

The ‘Internet Holy Trinity’, comprising the technology of the medium, the geographical distribution of its users and the informational nature of its content,[3] indeed renders existing regulatory mechanisms, particularly the rule of law, ineffective and inadequate. However, this does not mean that cyberspace is beyond regulation as some Cyber-Libertarians claim. Much academic energy over the last decade has been devoted to formulating a regulatory framework capable of dealing with the various challenges the Internet has posed for traditional paradigms of governance. This paper seeks to critically examine some of these regulatory models while learning from their insights.

The thesis of this paper is that the regulatory difficulties posed by the Internet consist both of problems from the pre-Internet era, albeit in a different form, and of Internet-specific issues that have emerged with the medium itself. Regulators therefore need both to re-structure existing laws so that they are applicable and adaptable to the new cyber-environment, and to construct new paradigms of control and regulation.

Part 2 of this paper will briefly outline the models of governance propounded by several theorists so as to provide a macro-view of the regulatory landscape in cyberspace. How some of these regulatory frameworks may be implemented, and how effectively they may operate, will be analysed through several specific examples in Part 3. Issues of online copyright, freedom of speech and privacy are archetypal examples of the sorts of difficulties the Internet poses for regulators. This part will also briefly touch upon the types of regulatory challenges likely to arise in the context of virtual communities, a much less developed but highly controversial area. Finally, Part 4 will discuss both existing judicial approaches and theoretical proposals in relation to the issue of borders and jurisdiction in cyberspace, an issue particularly troubling for law enforcement.

2. A Sketch of Cyber-Regulatory Theories

Among the various attempts by cyber-scholars to extend traditional regulatory theories to cyberspace, Lawrence Lessig’s (1999) quadripartite model has been one of the most influential. He identifies four modalities of regulation: law, market, norms and architecture.[4] Each imposes constraints on the subject through different means: law through coercion and punishment; market through commercial incentives and imperatives; norms through social and communal pressure; and architecture through more direct behaviour-shaping techniques. Lessig emphasises the way in which these four distinct modes of regulation function interdependently and their potential to regulate one another in order to effect control on the ultimate subject.[5] Andrew Murray and Colin Scott (2002) have also put forward a similar model and identified the four elements as hierarchical control, competition-based control, community-based control and design-based control.[6] While agreeing with Lessig in relation to the regulator’s role in designing ‘hybrid forms of control’, Murray and Scott see a trend towards ‘the deployment of hierarchical controls as instruments to steer organic or bottom-up developments’.[7]

Centralised regulation through law and sanction is seen to be difficult to enforce in view of the cross-jurisdictional nature of the Internet and the diverse distribution of its end-users. Regulating architectures and intermediaries has therefore come to be regarded as the key to cyber-governance. Controlling behaviour through architecture is not a new concept. It is at the heart of the Foucauldian paradigm of disciplinary power. Foucault’s works demonstrate how disciplinary technologies such as surveillance, visibility and separation (eg Jeremy Bentham’s Panopticon,[8] transparent walls in barracks and architectural designs separating children’s bedrooms from their parents[9]) help authorities to normalise individuals’ behaviour and entrench prescribed values. Like architecture, semiotic and linguistic power is another example of the power of code. In George Orwell’s Nineteen Eighty-Four, the elimination of words such as ‘liberty’ from Newspeak meant that the corresponding concepts and values became impossible to express.

Lessig also borrows Yochai Benkler’s three-layer communication system and applies it to cyberspace.[10] The bottom layer consists of physical hardware such as telephone cables, routers and computers upon which the Internet is built. The code layer refers to the software that supports the Internet, including TCP/IP (transmission control protocol / Internet protocol), HTTP (hypertext transfer protocol), HTML (hypertext markup language), SMTP (simple mail transfer protocol), operating systems and browsers. This code layer is what Lessig and many other scholars perceive as the key to effecting control over the content layer, which encompasses all materials and information ‘stored, transmitted and accessed’ in cyberspace.[11] Since the Internet is an entirely artificial environment, the flexibility of its protocols makes architecture a particularly effective tool for control, and controlling the code-writers saves much of the trouble of controlling individual end-users. Hence ‘code is law.’[12]
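To make the layering concrete, the following minimal Python sketch (an illustration added here, not drawn from the article) fetches a web page: the request travels through the code layer (HTTP spoken over a TCP/IP connection) down to whatever physical infrastructure carries the packets, and what comes back is content. The host name is the reserved illustration domain example.com.

    import socket

    HOST = "example.com"  # reserved illustration domain, not from the article

    # Code layer: open a TCP/IP connection to port 80 ...
    with socket.create_connection((HOST, 80)) as conn:
        # ... and speak HTTP over it.
        conn.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        response = b""
        while True:
            chunk = conn.recv(4096)
            if not chunk:
                break
            response += chunk

    # Content layer: the HTML document carried by the protocols above.
    print(response.split(b"\r\n\r\n", 1)[1][:200])

Regulating the code layer, the protocols in the middle of this exchange, is precisely what gives regulators leverage over the content at the top.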

Figure 1: Benkler’s three-layer model: the content layer sits atop the logical infrastructure (code) layer, which sits atop the physical infrastructure layer. Effective vertical regulation occurs from a supporting layer up to the layers above it; it does not occur from the higher layers down to the lower layers.[13]

Spinello (2002) also recognises the importance of code, but believes that ‘the optimal form of regulation [of cyberspace] is self-regulation … facilitated by technology’ and carried out in an ‘ethical’ manner.[14] One reason prompting him to opt for decentralisation and self-regulation is to lower the ‘transaction costs’ (taxes, loss of privacy etc) necessarily generated by government regulation. A second is that technology makes it possible for individuals to deal with some ‘negative externalities’ (such as pornography and copyright infringement) on their own terms.[15] Believing that Lessig has underestimated the significance of norms, Spinello places a much greater emphasis on ‘ethics’ as a means to ensure responsible deployment of code. However, the result of placing so much weight on norms and ethics is contentious, as their success depends largely on the moral consciousness of Net-users rather than on any effective mechanism of regulation.

Despite certain difficulties in its enforcement, the rule of law arguably still plays a role in the governance of cyberspace no less significant than its role in real space. Even if code displaces law as the dominant form of direct regulation of behaviour, law nevertheless has a crucial indirect role to play in regulating the other three modes of control, particularly the architecture of cyberspace. (See Figure 2 below, a modified version of Lessig’s schematic reflecting this point.) Norms and ethics may be an adequate and appropriate mechanism to regulate activities in certain parts of cyberspace, for instance, activities primarily confined to virtual communities and online forums. However, where the ‘negative externalities’ created in cyberspace have a tangible impact on life in real space, the rule of law must step in to ensure that code is not manipulated or abused by cyber-criminals, anti-competitive merchants or authoritarian governments to jeopardise public interests.

Another objection to Spinello’s preference for self-regulation is that it is questionable whether the choice between competing values – freedom of speech versus protection against offensive material, intellectual property versus the creative commons, privacy versus security – should be left in the hands of individual Net-users. Arguably, it would be more appropriate for democratic legislatures and independent judiciaries to balance such important values on behalf of societies at large. Because of the power of code, it is essential for the future of the Internet that the development of its architecture follows a responsible path, and law’s coercive power has a particularly critical task in ensuring that code is developed in a way that facilitates the balancing of competing interests. All in all, law’s power to impact directly and indirectly on the other three modalities of regulation means that a strong and effective legal framework remains one of the preconditions for building a safe and prosperous cyberspace.

The next part of this paper will analyse the interaction between the various regulatory forces through several case studies, in particular how code and law may impact on cyber-governance.

Figure 2: A modified version of Lessig’s schematic. Law regulates the Internet both directly and indirectly: it acts upon architecture, the market and norms, each of which in turn acts upon the Internet.

3. Case Studies

3.1 Erosion of Intellectual Property or Its Fortification?

Intellectual property, and copyright in particular, is a good place to begin the analysis of the power of code over cyberspace and the need for law to regulate the use of code. The debate over how much protection for intellectual property is desirable reflects the different ideologies of two interest groups operating in cyberspace. Using the categories Christoph Engel (2003) developed from Mary Douglas’ cultural studies, the ‘individualists’ (who in fact mostly consist of corporations) see the Internet as an unprecedented commercial opportunity and seize upon it to make a profit. The ‘egalitarians’, on the other hand, like the idea of the Internet being a ‘gift economy’ where knowledge is freely shared and creativity openly fostered.[16]

Before dealing with copyright in cyberspace, a brief look at existing copyright laws will show that copyright has never been an absolute and unlimited right. Traditional copyright laws all contain ‘safety valves’[17] to preserve the public’s right to access knowledge and to promote the dissemination of ideas while offering protection for creators’ proprietary interests in their creative works. Copyright in almost every country is granted only for a limited term, after which the work becomes part of the public domain. The duration of copyright under the Berne Convention[18] and the TRIPS Agreement[19] is limited to the author’s lifetime plus 50 years. Various fair dealing defences can also be found in the copyright legislation of most countries. Under the Australian Copyright Act 1968 (Cth), for example, fair dealing with copyrighted works in reasonable portions does not constitute an infringement if it is for purposes such as research or study, criticism or review, news reporting, satire and parody, or the giving of professional advice by legal practitioners.[20] Section 109A, commonly dubbed the ‘iPod defence’, introduced by the Copyright Amendment Act 2006 (Cth), also allows the owner of a copy of a sound recording to make another copy for the sole purpose of private and domestic use on another device. Similar provisions can be found under US copyright law which, apart from making fair use a general and complete defence, also contains a ‘first-sale doctrine’ permitting the first purchaser of a copyrighted work to lend or sell that copy to someone else without the permission of the copyright holder.

With the fast development of digital technology – from DVD burners to MP3 files to Peer-to-Peer (P2P) protocols – media files can now be reproduced without reduction in quality, easily distributed in mass numbers and instantaneously transmitted across the world, and the interests of the copyright holders of music and film works could have been severely jeopardised. As the problem is compounded by the difficulty of detecting infringement and enforcing traditional copyright laws, some worry that the Internet will be the death of copyright. However, this is not the case. With some changes to the code, and assistance from both the legislature and the judiciary, the balance has in fact shifted in favour of copyright holders. Copyright holders have returned fire through two main ‘distinct but related’ avenues:

1. Development of the Digital Rights Management (DRM) systems to ‘refix creative content’; and

2. Re-establishment of their ‘exclusive legal rights over such content through the extension of traditional copyright laws into cyberspace’.[21]

There are many varieties of DRM, but in essence it is a system combining encryption technology with the use of ex ante licences. When purchasing a digital data file, the purchaser is granted a licence with limited rights of use and access. The right is not simply limited by the contractual arrangement; it is moreover limited by code, a highly complex double encryption system built into the file itself. The licence usually contains conditions such as ‘frequency of access, expiration date and restriction of transfer to other devices’.[22] For example, when buying a music track for $0.99 from BigPond, the purchaser can only download the file once to one stand-alone personal computer, copy the file onto a maximum of two portable devices and burn it up to three times onto CD-Recordables.[23]
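As a rough illustration of how such ex ante licence terms might be represented and enforced in code, consider the hypothetical sketch below. The field names and logic are invented for illustration and simplify away the encryption that real DRM systems couple with these rules, but the limits mirror the BigPond terms just described.

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class DrmLicence:
        # Hypothetical licence terms, mirroring the BigPond example above.
        max_pc_downloads: int = 1
        max_portable_copies: int = 2
        max_cd_burns: int = 3
        expires: Optional[date] = None
        burns_used: int = 0

        def may_burn_cd(self, today: date) -> bool:
            # The player consults the licence before decrypting for a burn:
            # the contractual term is enforced by code, not by a court.
            if self.expires is not None and today > self.expires:
                return False
            return self.burns_used < self.max_cd_burns

    licence = DrmLicence()
    print(licence.may_burn_cd(date.today()))  # True, until three burns are used

The point the sketch makes is the one in the text: whatever the statute says about fair dealing, the user gets exactly as much access as the distributor encodes.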

As proponents of the Creative Commons have pointed out, DRM provides excessive protection for copyright in cyberspace, to the extent of undermining statutory principles of fair use. The extent of access by the user is completely dependent on the extent of permission encrypted in the file by the distributor. Despite the rights given to consumers by legislative provisions such as those under the Copyright Act 1968 (Cth) mentioned above, if the distributor wishes, the user can easily be prevented from reproducing even small portions of a creative work for educational purposes, or from transferring and storing it in a different format for private use. With contractual terms contained in the DRM, set by the licensors, dictating access to and use of all creative works in digital formats, the result will be the ‘quasi-privatisation of copyright law’.[24]

While busy developing DRM, copyright holders have not forgotten to vindicate their rights under existing copyright laws, and courts have sided with them on many occasions. In a series of copyright infringement cases, individual downloaders of digital files, websites hosting downloads and P2P intermediaries have all been exposed to different degrees of liability. In BMG v Gonzalez[25] the defendant, who had downloaded 1370 music files from KaZaA, was found liable for copyright infringement. The defence of fair use was rejected by the US Court of Appeals, as were arguments of sampling and time shifting. In RIAA v MP3.com[26] Rakoff J held in favour of the Recording Industry Association of America (RIAA), finding that MP3.com’s conduct did not constitute ‘fair use’ because it copied musical works in their entirety and for a commercial purpose.

The Australian Federal Court went even further in Universal Music v Cooper[27]. The defendant in that case neither uploaded nor downloaded any copyrighted work, but merely allowed remote users to freely post on his website hyperlinks to music files on their own websites. Cooper was not found liable for direct copyright infringement, as he was not the one making the copyrighted works available for unlawful downloading, nor was he responsible for determining the content of the files. However, both Cooper and the Internet Service Provider (ISP) that hosted his website were held liable for authorising copyright infringement, as they had general knowledge of the type of operations carried out by the uploaders and downloaders of copyrighted music files.

Bearing this decision in mind, it is not surprising that Napster, the ingenious P2P service, did not escape a similar fate. Napster functioned as an intermediary: a central server facilitated direct file-sharing between its users while itself storing or caching no digital files, copyrighted or not. Napster was nevertheless held liable for contributory and vicarious copyright infringement.[28] The court’s finding that Napster had knowledge of its members’ unlawful activities was based in part on a feature of its code: the ‘implementation of a centralised database of files’.[29] To avoid this problem, successors of Napster, including the Gnutella (supported by clients such as BearShare and LimeWire), FastTrack (used by KaZaA), BitTorrent and eDonkey protocols, have ‘improved’ the original P2P code with decentralised temporary indexing systems and have removed the central servers altogether. Their operators have also decentralised their corporate structures. For instance, in the KaZaA litigation it became a Herculean task for plaintiffs and law enforcers to locate the defendant, as one of the providers, Sharman Networks, was incorporated in Vanuatu but managed from Australia, with its servers operating in Denmark, its source code last seen in Estonia, and the creator and controller of the underlying technology thought to be residing in the Netherlands.[30] Nevertheless, in MGM et al v Grokster et al[31] the US Supreme Court once again held the P2P providers liable for copyright infringement, finding that their aim was to profit and that they had knowledge of their members’ unlawful activities. The court introduced the ‘active inducement theory’ from patent law into copyright law,[32] reinforcing the liability of intermediaries. The Australian Federal Court’s decision in Universal Music Australia Pty Ltd v Sharman License Holdings Ltd[33] reiterated that a defendant cannot escape liability by relying on the display of warnings against copyright infringement and an end user licence agreement. The Federal Court urged the P2P service provider to implement technical measures to curtail copyright infringement. In other words, not only did the court clearly acknowledge the superiority of code over traditional legal mechanisms such as contract in controlling behaviour in cyberspace, it actively encouraged the use of code to facilitate more effective law enforcement.
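The legal significance of Napster’s centralised index is easier to see in code. The toy sketch below (illustrative names and structures only, not Napster’s actual implementation) shows how a central server mapping filenames to the peers sharing them necessarily gives its operator a global view of user activity; Gnutella-style successors replaced this single dictionary with queries passed between peers, so that no one node holds it.

    from collections import defaultdict

    central_index = defaultdict(list)  # filename -> peers sharing it

    def register(peer, filename):
        # Napster-style: every shared file is reported to the central server,
        # giving the operator a global view of what its users are sharing.
        central_index[filename].append(peer)

    def search(filename):
        # The server answers searches from its own index and could,
        # in principle, log, filter or block them.
        return list(central_index.get(filename, []))

    register("peer-a", "song.mp3")
    print(search("song.mp3"))  # ['peer-a']

A design with this dictionary supports a finding of knowledge (and the capacity to act on it); a design without it was precisely what Napster’s successors aimed for.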

While the judiciary applied existing principles of copyright law to cyberspace, the legislature also took steps to introduce new measures to protect copyright holders’ interests. In response to the Napster and KaZaA litigation, the US Congress devised the draft Protecting Intellectual Rights Against Theft and Expropriation Act 2004 (PIRATE Act), targeted directly at P2P service providers. More significantly, the Digital Millennium Copyright Act 1998 (DMCA) was enacted in the US, making it an offence to ‘offer to the public… any technology’ that is designed to circumvent ‘a technological measure that effectively controls access’ to copyrighted works. The Copyright Amendment (Digital Agenda) Act 2000 (Cth) introduced equivalent anti-circumvention provisions in Australia. As critics such as Benkler and Lessig have pointed out, the DMCA is a perfect example of how law, through regulating the logical or code layer, effects control over the content layer.[34] It goes beyond protecting ‘copyright in the underlying works by protecting the digital locks attached to the content and by prohibiting the trade’ in unlawfully decrypted content; it criminalises the ‘decryption keys themselves’.[35]

On the one hand, strong market incentives stimulate both the development of copyright circumvention technologies (one need only look at the rocketing advertising revenues generated by the Napster and KaZaA protocols) and the countervailing technology developed to combat the damage to copyright holders’ commercial interests. On the other hand, there are genuine public benefits in limiting copyright protection to an appropriate degree and fostering creative activity by offering free and open access to certain information and entertainment, which is the motivation behind the Creative Commons’ open source movement. Once law steps in, the balance shifts, through law’s power to regulate behaviour by traditional means of coercion and sanction, but more importantly through its power to control architecture, which in turn shapes behaviour and affects content. This is why both the legislature and the judiciary have a crucial role in striking the right balance between competing interests in cyberspace.

3.2 Freedom of Speech vs Content Control

Freedom of speech and expression has long been regarded as a fundamental human right. It is most famously embodied in the First Amendment to the US Constitution, and has in recent years become increasingly enshrined in various charters and bills of rights, such as the Bill of Rights Act 1990 (NZ) and the Human Rights Act 1998 (UK) implementing the European Convention on Human Rights.[36] However, all would agree that the right to free speech is not absolute and unconditional. Should one person’s free speech be protected when it invades another’s privacy, offends a community’s collective moral sensibility, or even incites hatred so as to undermine public security and social stability? Where should the line be drawn?

This issue did not newly emerge with the rise of the Internet, but the controversy has been intensified by the complexities created by the new technology. The ‘egalitarians’ built the values they wish to uphold into the very architectures of the Internet in the process of creating it.[37] Armed with open codes such as HTML that enable every netizen to become ‘a pamphleteer… a town crier with a voice that resonates farther than it could from any soapbox’,[38] they argue that as activities over the Internet are essentially speech-based, they ought to be free from restrictions. The ‘hierarchists’,[39] however, disagree, and some of them, a most notable example being the Communist Chinese government, go to extraordinary lengths to ensure that the existing equilibrium between freedom of expression and control of communication is not disturbed dramatically by the rise of cyberspace. With its state-owned centralised gateways and state-funded army of code writers, this totalitarian government has the ability to effectively filter out undesirable Internet content and to block access and communication. Again, through control of its architecture, the Internet has become both the subject of and a tool for panoptic surveillance.

Content control is by no means an issue for authoritarian states alone. In the hugely contentious Yahoo![40] case, a French County Court sought to restrain Yahoo! from allowing the sale of Nazi memorabilia to French citizens through its US-based auction site. Judge Jean-Jacques Gomez attributed great significance to the availability of user-identification and content-filtering technologies and ordered Yahoo! to implement them to block access by French citizens to content that is unlawful under French law.[41]

Government censorship of certain other activities over the Internet appears more justified. Even if the abuse of the new information superhighway by ‘individualists’ to make easy dollars through disseminating spam or selling pornography arguably remains within acceptable legal and moral boundaries, it is certainly imperative for governments to control web content and communication by the ‘fatalists’.[42] These are criminals who engage in such practices as obscenity, child pornography, extremist hate speech and the propagation of terrorism. Effective governance of the Internet in this respect is again a matter of striking the right balance between conflicting values. Both code and law have an important role to play, but each has its own inadequacies.

In the US, Congress passed the Communications Decency Act 1996 (CDA) to combat the explosion of pornography in cyberspace, particularly the dangers it poses for children. It criminalised the display or transmission over the Internet to a minor of ‘indecent’ or ‘obscene’ material, including any message ‘that, in context, depicts or describes, in terms patently offensive as measured by contemporary community standards, sexual or excretory activities or organs’. The CDA was soon challenged and held to be unconstitutional on the ground that, as a ‘content-based regulation of speech’, it was too vague and broad to meet the requirements of the First Amendment.[43] Congress then enacted the Child Online Protection Act 1998 and the Children’s Internet Protection Act 2000. While attempting to better define what is offensive by reference to ‘contemporary community standards’, these laws also impose obligations upon commercial websites to use ID-based systems to prevent access to pornography by minors. The Protecting Children from Peer-to-Peer Pornography Bill was introduced into Congress in 2003, aiming specifically to curtail children’s access to pornography via P2P protocol-based services.

In Australia, the Broadcasting Services Amendment (Online Services) Act 1999 (Cth) vests in the Australian Communications and Media Authority (ACMA) the power to investigate, upon receiving complaints from the public, offensive material on the Internet, and to rate the content under investigation according to the standards and guidelines set by the Office of Film and Literature Classification. ACMA is also given the power to issue take-down notices to content hosts (for Australian-hosted content only) and to ISPs, requesting the removal of ‘X’ rated (sexually explicit) or ‘RC’ rated (refused classification) Australian-hosted material and the restriction of access to foreign-hosted content. Cooperation between law and architecture, through regulation of both content hosts and intermediaries, is again at work.

‘Given the burdens imposed by laws on free speech and the threat of regulatory arbitrage’, some cyber-theorists prefer a ‘more decentralized approach’ and suggest that it may be more effective and appropriate to allow ‘parents, schools, libraries, and other organizations’ to regulate cyberporn through code.[44] Others disagree. Putting aside the question of the technical effectiveness of filtering programs, architectures such as the Platform for Internet Content Selection (PICS) all too easily lead to excessive censorship. A conservative Christian school can not only rely on PICS to filter out pornographic material from children; it can also restrict access to websites promoting the legalisation of abortion. If all dissenting views are blocked and, worse still, if the decision as to what content needs censoring is made by school principals, Church officials or leaders of local interest groups, the situation would not be much better than having a centralised gatekeeper appointed by the Chinese government. On the one hand, excessive content control will undermine what many view as the most valuable functions of the Internet: making knowledge more accessible, enabling freer communication and building a vibrant civil society. On the other hand, complete freedom of expression in cyberspace may also lead to crime, the disruption of civil discourse and attacks on fundamental moral values. It is therefore essential that regulators maintain a balance between competing interests through both legal sanctions and the utilisation of code.
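A crude sketch of keyword-based blocking illustrates how easily such filtering over-censors. The blocklist below is invented for illustration, but it shows how a local policy aimed at pornography can silently sweep in health or political pages as well.

    # An over-broad local blocklist of the kind a school might configure.
    BLOCKED_TERMS = {"porn", "xxx", "abortion"}

    def is_blocked(page_text):
        words = {w.strip(".,!?").lower() for w in page_text.split()}
        return bool(words & BLOCKED_TERMS)

    print(is_blocked("Free xxx downloads"))                       # True
    print(is_blocked("Health information about abortion"))        # True: over-blocking
    print(is_blocked("Journal of Law, Information and Science"))  # False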

3.3 Privacy, Surveillance & Security

This section in a sense builds on the last. Just as free speech is often taken (or mistaken) to be an intrinsic value built into the very architecture of the Internet, many believe that our online privacy is protected by the anonymity cyberspace offers. This is in fact quite a misconception. Advances in information technology have brought about unprecedented intrusions into our privacy, as the collection of personal data has never been more efficient and surveillance never more ubiquitous.[45] Whenever a Net-user subscribes to an online service, some personal information about him or her is collected and stored in a database somewhere. Every time a person visits a website, a cookie is created and retrieved (unless the default setting on the person’s web browser is adjusted) to allow the web host to check what he or she did during the last visit to its website.[46] Every time an employee sends an email at work, there is a chance that it is being monitored by the employer’s digital panopticon.
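The cookie mechanism described above can be sketched in a few lines. The session logic below is invented for illustration, but the pattern is the real one: the host labels the browser on its first visit and recognises the label on every later one, which is what allows visits to be linked together.

    import uuid

    sessions = {}  # cookie token -> what the host has recorded about the visitor

    def handle_request(cookie):
        if cookie in sessions:
            # Returning visitor: the cookie ties this request to everything
            # previously recorded under the same identifier.
            return "Welcome back: " + sessions[cookie], {}
        token = uuid.uuid4().hex
        sessions[token] = "first visited just now"
        # New visitor: issue a cookie the browser will return automatically.
        return "Hello, stranger", {"Set-Cookie": "session=" + token}

    body, headers = handle_request(None)          # first visit: cookie issued
    token = headers["Set-Cookie"].split("=")[1]
    print(handle_request(token))                  # second visit: recognised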

While some of these code-based privacy intrusions, such as cookies, are driven by commercial interests, others are compromises Net-users must make in exchange for security, and for privacy itself. For example, the password system is designed to protect individual users’ privacy so that online bank accounts will not be hacked into and private emails will not be accessed by unauthorised persons. But in order to obtain that protection, the account holder must first concede some privacy to the online banking service provider or the email host. Moreover, the ‘fatalists’[47] pose even more serious threats through their ‘fraudulent, unlawful, anarchic [and] dangerous’ activities[48] than the merely ‘inappropriate conduct’, like online pornography, discussed above. These include, among others, online fraud, cyber-extortion and cyber-terrorism. One of the mechanisms for detecting such conduct and enforcing legal sanctions against cyber-criminals lies in the deployment of identification technologies.[49] The Child Online Protection Act 1998 (US) mentioned in the previous section is one such example.

Anonymity is an illusion even under existing Internet protocols. Every computer connected to the Internet has a unique IP address, which is used every time a packet of data is transmitted to or from that address. Strictly speaking, although these virtual addresses do not reveal the identity of the Net-user, it is possible at least to trace the source of suspicious information to the particular machine that sent it. To facilitate regulation and law enforcement in cyberspace, it is possible to modify the Internet’s architecture and establish an ID-based, certification-rich environment. It is theoretically possible, through national legislation and international agreements, to make it mandatory for Net-users to carry digital IDs and for online service providers to condition access on authenticated certification. For instance, online gambling can be restricted based on a person’s state of residence, and access to pornography can be prohibited if the person is below the lawful age.[50] Digital IDs recorded by ISPs would also enable local police to trace cyber-criminals and help regulators to enforce their laws more effectively. However, while such an architecture may improve online security to some extent, Net-users would need to be prepared to sacrifice a great deal of privacy and be subjected to a much greater level of surveillance. Is this a price netizens are willing to pay? Opinions will no doubt differ. But if a stage is reached where the prevalence of cyber-crime severely endangers the prosperity of e-commerce and the safety of online communities, it might become necessary for national governments to increase their presence in cyberspace as law enforcers.
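A minimal sketch of this point, using only Python’s standard library and the reserved illustration domain example.com: a name resolves to an IP address, and a logged address can often be traced back at least to a machine or its network, even where it reveals no person.

    import socket

    ip = socket.gethostbyname("example.com")   # forward lookup: name -> IP
    print(ip)

    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse lookup: IP -> name
        print(host)
    except socket.herror:
        print("no reverse DNS record")         # the trail often stops here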

3.4 In and Out of Virtual Communities

Virtual communities are a growing phenomenon, and they come in different forms. There are online forums and bulletin boards for netizens with specific interests to exchange ideas. There are social websites such as Facebook, used primarily by members to maintain contact with people they already know in real space. Other websites, including online dating services, are mainly designed to enable their users to make new acquaintances across physical boundaries and to interact with people who are strangers in real space. There is also a type of virtual community where people can live the life of their alter egos and interact with one another through the avatars they create. Governance of these increasingly sophisticated virtual societies, such as Second Life, demands radically new regulatory approaches, which would take another paper to explore fully. Given its limited scope, this section will only offer a peek at some of the challenges presented.

In a ‘primitive’ virtual society, it may be that activities carried out within it have no impact on real space apart, perhaps, from the emotional reactions of the participants. In such cases, norms and code may suffice as the governing power within the virtual community, and it may not be appropriate for real space regulators and their laws to interfere. An example is the Mr Bungle affair in LambdaMOO.[51] LambdaMOO is a ‘MUD (acronym for Multi-User Dimension), Object Oriented’ (MOO): a program in which each user controls a virtual character or avatar and builds whatever reality they like by typing in verbal commands. One of its members interfered with the code of the program and manipulated his avatar, known as ‘Mr Bungle’, to perform a series of acts upon another avatar. These actions, described by the scripts appearing on the MOO inhabitants’ computer screens, would have amounted to such crimes as assault and rape if committed in real space. But they were only words. Whatever trauma the victim (or rather the victim’s puppeteer) felt, she could not take the matter to real space police. Inside the virtual community, however, the residents reacted strongly against Mr Bungle’s behaviour and wanted him expelled from their world. So one of the Wizards, ie those who created and controlled the program, closed the user’s account. He did return as a different avatar (ie the puppeteer simply signed up again under a new name) and there was no effective way to stop that. What was remarkable, however, was that after the incident the MOO’s residents gradually developed a norm-based dispute resolution process as well as a democratic legislative system to govern their community, alongside the Wizards, who were empowered by their technical control over code.

In this case, the only harm tangible in real space was the hurt felt by the real life person who owned and controlled the avatar that had been abused. However, even where the connection with real space is only emotional or psychological, it is only a matter of degree before we need to consider applying certain real space laws to virtual communities, as was manifest in the controversy surrounding the use of avatars to simulate child pornography. Some argue that although the pornographic activity is virtual and involves no children in real space, it affects the real psychological conditions and physical conduct of those engaged in it, potentially causing more crimes of child molestation to be committed in real space.

Furthermore, in sophisticated virtual worlds such as Second Life, the economic connection with real space has increased. Real space money-laundering through virtual casinos has become increasingly worrying for regulators. The exchangeability between Linden dollars and US dollars marks the point at which virtual life is no longer purely virtual (if it ever was). There has been one reported case of a Dutch teenager arrested for the alleged theft of virtual furniture from a virtual room in Habbo Hotel, another website hosting 3D virtual communities.[52] As the stolen virtual furniture had an approximate value of $4,000 in real money, the alleged crime did not seem all that different from hacking into online bank accounts and making unauthorised electronic transfers.

It is not hard to imagine the potential impact the virtual economy may have upon the real space economy when, for example, Second Life residents do business in Second Life, trading virtual stocks and virtual real estate (the ultimate oxymoron!) with a currency that is convertible to real space money.[53] Are netizens to be taxed by a real space government when they make a profit in real space money? If so, by which government? The government of the place where the server or host of the virtual community is located (eg the US, where Linden Lab, host of Second Life, is situated) or where the real space controller of the avatar lives? Can US tort law be applied where an avatar in Second Life commits the tort of conversion against another avatar controlled by someone living in Australia? Are damages to be assessed based on the value of property in real space? Existing laws clearly have trouble dealing with these questions, and there is an imminent need for regulators to develop a new paradigm of governance to establish law and order in virtual communities with problematic connections to real space.

4. The Jurisdictional Dilemma & International Cooperation

Manifest in what Cyber-Libertarians call the ‘Internet Holy Trinity’ is the notion that the trans-national nature of the Internet defies traditional border-based sovereign control by states.[54] If the rule of law is to preserve its influence in cyberspace, it must be capable of enforcement. Can a Western Australian statute banning online gambling be enforced against the owner of a server located in Tasmania? Can an Iranian court hold a Swedish uploader of pornographic material liable under Iranian law for making prohibited content available to Iranian residents? How can US police and courts protect citizens from cyber-extortion carried out by Nigerian criminals, or punish China-based individuals violating copyrights owned by American corporations? As illustrated above, architecture such as ID-certification systems and filtering software, as well as market imperatives, may be relied upon to prevent contraventions of national and state laws. But once a law has been breached, the national or state court must have legitimate jurisdiction to enforce it against the offender. The following discussion will look at how some courts have dealt with the issue of jurisdiction in Internet-centred cases so far, and then briefly examine an alternative conceptualisation of the bases for jurisdiction.

4.1 Judicial Responses So Far

Under US law, long-arm jurisdiction may be established under the principle of ‘specific jurisdiction’ where

1. a non-resident defendant purposefully avails itself of the privilege of conducting activities in the forum state, thereby invoking the protections of its laws; and

2. the plaintiff’s claims arise out of the defendant’s forum-related activities.[55]

In Zippo Manufacturing Co v Zippo.com Inc[56] a sliding scale test was formulated for applying the principle of specific jurisdiction in the Internet context. The court held that mere accessibility of, and passive advertising on, a website is not sufficient to establish jurisdiction, while an interactive website that reaches out and touches the jurisdiction will be sufficient.[57] Another test, propounded in a pre-Internet case, Calder v Jones,[58] has also been increasingly relied upon in recent Internet-related cases. Under this approach, specific jurisdiction is established if the defendant has engaged in intentional acts targeting or directed at the forum state and causing harm to be suffered within the forum state.[59] In Metro-Goldwyn-Mayer Studios Inc v Grokster Ltd[60] both tests were applied, but not without controversy. The court held that Sharman Networks, the provider of the KaZaA software, incorporated in Vanuatu with its principal business in Australia, had constructive knowledge of its Californian users through the software licensing agreement it entered into with each user, and had therefore ‘knowingly and purposefully availed itself of the privilege’ of carrying on business in California. The three elements of the Calder v Jones test were also held to have been made out. Sharman was therefore held by the court to be subject to Californian copyright law.

In Australia, the High Court in a landmark decision, Dow Jones & Company Inc v Gutnick[61], held that the defendant, an online publisher based in New Jersey, USA, could be held liable under the defamation laws of Victoria, Australia. The High Court stated that the tort of defamation was committed in Victoria, as that was where the defamatory material was downloaded, made available in a comprehensible form and caused damage to the plaintiff’s reputation. The defendant was therefore subject to the jurisdiction of the Victorian Supreme Court. This case shows that while US law requires a nexus stronger than mere accessibility, the Australian jurisdictional threshold is substantially lower.

In the even more contentious Yahoo![62] case, Judge Jean-Jacques Gomez of the County Court of Paris found that Yahoo! had violated French law by providing French citizens with access to its auction website where Nazi memorabilia were for sale. To Yahoo!’s US-based executives, the prospect of losing the lucrative French market was likely a more important factor in the decision to give in to the French court and comply with its orders than the threat posed by direct legal sanction. This was an instance where law, as a direct means of regulation and control, clearly encountered significant restrictions, including the jurisdictional barrier to subjecting foreign entities to local judgments. It turned out that the power of law was more effectively utilised through the mediation of another regulatory mode, the market.

These cases illustrate the attempt by various judiciaries around the world to uphold their local laws and protect their national interests by stretching existing principles of private international law. They clearly demonstrate the inadequacies presented in the cyber-era by the traditional notion of jurisdiction based on physical connections with a sovereign territory. New approaches need to be developed to deal with new technologies.

4.2 An Alternative Concept of Jurisdiction

Rejecting territoriality as the basis for asserting jurisdiction, be it the physical location of the network server or the place where the defendant carries on business, Darrel Menthe (1998) proposes to adopt nationality (of uploaders and downloaders) as the nexus for jurisdiction over Internet offences.[63] His theory draws upon the legal principles concerning jurisdiction over the three existing international spaces: Antarctica, the high seas and outer space. In these areas, governed by sui generis treaty regimes,[64] the nationality of the responsible individual or entity is the basis for national jurisdiction. This is embodied in the principle of the ‘law of the flag’ with respect to vessels on the high seas, and is expressly provided for under The Antarctic Treaty[65] and the Outer Space Treaty[66].

Menthe suggests that cyberspace be made a fourth international space, as it shares with the high seas, Antarctica and outer space their sovereignless quality.[67] He further notes the distinction between non-physical cyberspace and the three physical, recognised international spaces,[68] but has arguably underestimated the implications of that distinction. While Antarctica, for example, is not governed by any sovereign state, it is nevertheless clear to law enforcers when an act is carried out within its physical borders and when it is conducted outside them. The same is ascertainable in outer space and on the high seas because of their physical nature. In contrast, there is no clear border demarcating cyberspace from real space. This may seem like a metaphysical question, but it poses significant practical difficulties for regulators. It may be argued that one’s online activities are never completely confined to cyberspace and are always carried out in both cyberspace and real space. That may create situations where the national jurisdiction over a Net-user comes into conflict with the territorial jurisdiction a state has over its real space territory. To illustrate with an example: when a Nigerian cyber-extortionist carries out his or her criminal activities in South Africa, meeting the victims and receiving payments there, will a South African court be entitled to assert jurisdiction over the fraud committed on South African soil, or will Nigeria have jurisdiction based on the offender’s nationality? Apart from competing jurisdictions, choice of law and the enforcement of foreign judgments are also immensely frustrating problems yet to be resolved.

Despite such potential problems, Menthe’s theory offers an important insight into the possibility of redefining existing legal principles to adapt to cyberspace. One important element in this process is the negotiation of commonly acceptable doctrines between nations so as to render national laws enforceable. This is highlighted by Stuart Biegel (2001), who attributes a significant role to international agreements and cooperation as part of his overall model of Internet regulation, which extends beyond dispute resolution in cross-jurisdictional matters.[69] He advocates increasing cooperation at an international level, such as making regulatory bodies like ICANN (Internet Corporation for Assigned Names and Numbers) more representative of the international community. As the Internet was created by scientists in the US, the various Internet regulatory bodies, such as the Internet Society (ISOC) and the Internet Engineering Task Force (IETF), have also been primarily US-based. These organisations are responsible for, among other things, coordinating the technical standards and managing the development of new protocols for the Internet. Given the importance of architecture in cyberspace, constructive involvement by the UN and countries other than the US in the setting of standards and the designing of protocols with respect to issues such as online copyright and domain name management will be increasingly important for creating a democratic cyberspace. The trouble, of course, is that divergence of values and interests makes it difficult for nations to reach a consensus over such issues as the assertion of jurisdiction, the creation of a new international space or the universalisation of codes for cyberspace. As Jurgen Habermas has pointed out, for supra-national bodies to take up the challenges of modernity and globalisation they must be vested with legislative and regulatory powers as great as those national governments hold today.[70] Although Habermas was speaking of the political and economic ramifications of globalisation in general, his insight is equally relevant in the specific context of cyber-regulation.

5. Conclusion

Through analysing some of the approaches regulators have taken in response to the seeming unregulability of the Internet, including the application of existing legal rules and principles such as copyright law, and the indirect regulation of online conduct through control of Internet protocols, it is clear that cyberspace is not beyond regulation. In some respects, there is a real need to re-conceptualise the rule of law and the nature of code so as to develop an effective regulatory model to meet the challenges the Internet brings. But the more important question is one that has perplexed mankind since the dawn of civilisation: how to strike a balance between the competing interests of different social groups and resolve conflicts between fundamental values. As has been demonstrated, laws can be amended and codes can be modified; old borders can be erased but new walls can also be built. Technology is always changing, but human nature is not. While the disciplinary and normalising power of architecture is magnified in cyberspace, the rule of law still plays an important role in upholding fundamental rights such as freedom of speech and open access to knowledge. A top-down modality imposed by national legislatures and international regulatory bodies is necessary to supplement the localised, bottom-up power dynamics formed by market forces and social norms. Returning to Schmidt’s statement quoted at the beginning of this paper: regulators will eventually put order back into cyberspace (if they have not already done so in many parts of it), and the ultimate role of law is to ensure that, in the process, code is not manipulated by either ‘individualists’ or ‘fatalists’ to the detriment of the public good of netizens.


[∗] Final year Arts/Law student, University of Sydney. The author would like to thank Dr Isabel Karpin (Senior Lecturer, Faculty of Law, University of Sydney) for her advice on this paper, as well as the anonymous referees for their helpful suggestions.

[1] Richard A. Spinello, Regulating Cyberspace: The Policies and Technologies of Control (2002) 21.

[2] Andrew D. Murray, The Regulation of Cyberspace: Control in the Online Environment (2007) 233, quoting Eric Schmidt.

[3] James Boyle, Net Total: Law, Politics and Property in Cyberspace (2001) 29.

[4] Lawrence Lessig, Code and Other Laws of Cyberspace (1999) 86-8.

[5] Ibid 91-5.

[6] Andrew D. Murray and Colin Scott, ‘Controlling the New Media: Hybrid Responses to New Forms of Power’ (2002) 65 Modern Law Review 491.

[7] Ibid 505.

[8] See Michel Foucault, Discipline and Punish: The Birth of the Prison (1977).

[9] See Michel Foucault, ‘The Eye of Power’ in Colin Gordon (ed) Foucault Power/Knowledge (1980) 146-65.

[10] Lawrence Lessig, The Future of Ideas: the Fate of the Commons in a Connected World (2001) 23.

[11] Spinello, above n 1, 15; Murray, above n 2, 44.

[12] Lessig, above n 4, 6.

[13] Diagram reproduced from Figure 2.4 in Murray, above n 2, 45.

[14] Spinello, above n 1, x.

[15] Ibid 52-7.

[16] Christoph Engel, Governing the Egalitarians from Without: The Case of the Internet (Preprints of the Max Planck Institute, 2003/10) http://ssrn.com/abstract=462485 at 7 May 2007.

[17] P. Goldstein, Copyright’s Highway (1994).

[18] Berne Convention for the Protection of Literary and Artistic Works, opened for signature 9 September 1886, as revised [1978] ATS 5, art 7(1) (entered into force generally 1 August 1931).

[19] Agreement on Trade-Related Aspects of Intellectual Property Rights, opened for signature 15 April 1994 [1995] ATS 38, art 12 (entered into force generally 1 January 1995).

[20] Sections 40-43; 103A-103C. See also other defences in ss 43A-73 and 104-112E.

[21] Murray, above n 2, 176.

[22] Ibid.

[23] Telstra Limited, ‘BigPond Music Terms and Conditions – Terms of Service and Sale (Terms)’ http://bigpondmusic.com/Terms.aspx at 10 June 2007.

[24] D. Burk and J. Cohen, ‘Fair Use Infrastructure for Copyright Management Systems’ (2001) 15 Harvard Journal of Law & Technology 41, 49.

[25] US Court of Appeals 7th Cir No 05-1314 (2005).

No 00 Civ 0472 (JSR) (SDNY 2000).

[27] [2005] FCA 972.

[28] A&M Records Inc v Napster Inc, 114 F Supp 2d 896 (N D Cal 2000).

[29] Murray, above n 2, 186.

[30] Ibid.

[31] 125 S Ct 2764 (2005).

[32] Murray, above n 2, 188.

[33] [2005] FCA 1242.

[34] Murray, above n 2, 44-5.

[35] Ibid 195.

[36] European Convention for the Protection of Human Rights and Fundamental Freedoms, opened for signature 4 November 1950, Council of Europe, ETS No 005 (entered into force generally 3 September 1953).

[37] Engel, above n 16.

[38] Reno v ACLU, [1997] USSC 73; 521 US 844 (1997).

[39] Engel, above n 16.

[40] LICRA v Yahoo! Inc. et Yahoo France, T.G.I. Paris, 22 May 2000, N°RG: 00/05308.

[41] Jack Goldsmith and Tim Wu, Who Controls the Internet? Illusions of a Borderless World (2006) 7, 8; Michael A. Geist, ‘Is There a There There? Toward Greater Certainty for Internet Jurisdiction’ (2001) 16 Berkeley Technology Law Journal 1345, 1350.

[42] Ibid.

[43] Spinello, above n 1, 52-7.

[44] Ibid 116.

[45] Ibid 177-8.

[46] Lessig, above n 4, 34.

[47] Engel, above n 16.

[48] See the classification of problematic online conduct into four categories in Stuart Biegel, Beyond Our Control? Confronting the Limits of Our Legal System in the Age of Cyberspace (2001) 54.

[49] Spinello, above n 1, 221-7; Lessig, above n 4, 30-42, 48-58.

[50] Lessig, above n 4, 54-5.

[51] For a description of this virtual community and the Mr Bungle affair, see J. Dibble, Ch 1 ‘A Rape in Cyberspace’ in My Tiny Life: Crime and Passion in a Virtual World (1998).

[52] ‘Virtual Theft’ Leads to Arrest (2007) BBC News http://news.bbc.co.uk/1/hi/technology/7094764.stm at 14 November 2007.

[53] Jason Whittaker, The Cyberspace Handbook (2004) 272-3.

[54] Boyle, above n 3.

[55] Metro-Goldwyn-Mayer Studios Inc v Grokster Ltd, 243 F Supp 2d 1073 (2003), 1084.

[56] 952 F Supp 1119 (1997).

[57] Brian Fitzgerald, G. Middleton and A. Fitzgerald, Jurisdiction and the Internet (2004) 106-8.

[58] [1984] USSC 53; 465 US 783 (1984).

[59] Fitzgerald and Fitzgerald, above n 57, 106 and 110.

[60] 243 F Supp 2d 1073 (2003).

[61] [2002] HCA 56.

[62] LICRA v Yahoo! Inc. et Yahoo France, T.G.I. Paris, 22 May 2000, N°RG: 00/05308.

[63] Darrel C. Menthe, ‘Jurisdiction in Cyberspace: A Theory of International Spaces’ (1998) 4 Michigan Telecommunications and Technology Law Review 69.

[64] Ibid 84.

[65] The Antarctic Treaty, opened for signature 1 December 1959, [1961] ATS 12, art 8 (entered into force generally 23 June 1961).

[66] Treaty on Principles Governing the Activities of States in the Exploration and Use of Outer Space, including the Moon and Other Celestial Bodies, opened for signature 27 January 1967, [1967] ATS 24, art 8 (entered into force generally 10 October 1967).

[67] Menthe, above n 63, 84.

[68] Ibid 85.

[69] Biegel, above n 48, 157-86.

[70] Jurgen Habermas, ‘Learning by Disaster? A Diagnostic Look Back on the Short 20th Century’ (1998) 5(3) Constellations 307.

