University of New South Wales Law Journal Student Series

Pull ter Gunne, Radha --- "The End of Online Freedom of Expression? An Analysis of Intermediary Liability in Defamation Law in Europe and Australia and Its Implications for Freedom of Expression" [2020] UNSWLawJlStuS 15; (2020) UNSWLJ Student Series No 20-15


THE END OF ONLINE FREEDOM OF EXPRESSION? AN ANALYSIS OF INTERMEDIARY LIABILITY IN DEFAMATION LAW IN EUROPE AND AUSTRALIA AND ITS IMPLICATIONS FOR FREEDOM OF EXPRESSION

RADHA M PULL TER GUNNE[1]

I INTRODUCTION

Online intermediaries such as Facebook and Twitter are generally considered important platforms for freedom of expression. People can read, comment, share and retweet to their heart’s content. However, as the volume of communication is immense, the chances that someone feels defamed also increase.[2] In addition, it is argued that people approach online interaction differently from real-life interaction, in the sense that they feel freer to make certain, often hurtful, comments. This further increases the chances that defamation and damage occur.[3] Nevertheless, this does not mean all online interaction should be monitored and filtered. The right to freedom of expression is an important cornerstone of a democratic society. Currently, however, the courts in the European Union (EU) and Australia seem to be swayed by arguments that restrict freedom of expression in favour of the protection of reputation.

This essay aims to answer the following question: ‘Do the current legal frameworks for defamation law in the EU and Australia protect the right to freedom of expression sufficiently in intermediary liability cases?’ Australia and the EU are chosen because of recent cases that could have far-reaching consequences for freedom of expression. In addition, the two legal systems have significantly different histories and developments with regard to the right to reputation and the right to freedom of expression. Such differences are beneficial for a comparative analysis of legal systems. However, this research not only provides a comparative analysis of intermediary liability in defamation law in the EU and Australia; it also argues that freedom of expression is being curtailed in two recent cases, Eva v Facebook[4] in the EU and Voller[5] in Australia.

Additionally, the discussion of intermediary liability in defamation cases in Australia is timely in light of the upcoming review of the Defamation Act 2005[6]. In February 2019 the Council of Attorneys-General published a discussion paper on the review of the Model Defamation Provisions,[7] and more thought should be given to the balancing of the two opposing rights in defamation cases involving intermediaries.

This essay starts in Part II with an outline of the legal frameworks in Europe and Australia regarding the right to reputation (Part A), the right to reputation balanced against the right to freedom of expression (Part B) and intermediary liability (Part C). The understanding of these legal frameworks and their differences is then applied to two case studies in Part III: the EU in Part A and Australia in Part B. Part C provides an overview of the implications of these two cases and Part IV concludes.

II LEGAL FRAMEWORK

Defamation law seeks to reconcile the protection of reputation and freedom of expression.[8] To establish the basic principles underlying this essay, the legal framework is explained, starting with how the right to reputation is protected in Europe and in Australia respectively. The comparison immediately makes clear that Australia, through the development of its defamation law, provides much stronger protection to reputation than Europe. The opposite can be observed when comparing the right to freedom of expression. These differences remain apparent in the development of intermediary liability in Australia and the EU.

A Right to Reputation

1 Europe

In Europe the right to reputation is not as clearly protected as in Australia. The European Convention on Human Rights[9] (ECHR) does not mention a ‘right to reputation’ explicitly and refers to ‘reputation’ only as an exception to the right to freedom of expression.[10] The Charter of Fundamental Rights of the European Union (CFR) does not refer to reputation at all. However, the case law of the European Court of Human Rights (ECtHR) has determined that reputation forms part of the right to privacy in article 8 ECHR.[11] Under article 52(3) CFR, rights in the CFR that correspond to rights in the ECHR have the same meaning and scope as those laid down in the ECHR. Consequently, article 7 CFR corresponds to article 8 ECHR and thus protects the right to reputation as well.

Understanding which rights are protected in which instrument matters because of the different courts in Europe. The ECHR binds its contracting parties, the member states of the Council of Europe, and the ECtHR was set up to ensure that these contracting parties observe their obligations under the ECHR.[12] The CFR binds only the Member States of the European Union, and the court overseeing EU law is the Court of Justice of the European Union (CJEU). The task of the CJEU is to ensure uniformity and consistency in the interpretation of EU law.[13] The right to reputation thus exists in the European sphere but is not harmonised in EU law through further EU legislation. Whether the right to reputation of an individual has been infringed is therefore a matter for national courts.[14]

2 Australia

In contrast to the EU, the right to reputation in Australia is firmly established in the common law and has partly been translated into a statutory framework, the Defamation Act 2005.[15] Defamation law provides a cause of action to compensate a person for the damage suffered through a published attack on reputation.[16] To hold a defendant liable in a defamation claim, a plaintiff must establish three elements: firstly, that the defendant published matter; secondly, that the matter identifies, or is “of and concerning”, the plaintiff; and lastly, that the matter is defamatory of the plaintiff.[17]

3 Comparison

Though both legal frameworks recognise the right to reputation, the protection in Europe is complicated and based on case law. The right to reputation has a much stronger basis in the common law framework of Australia, which is also partly codified in the Defamation Act 2005.

B The Right to Reputation Balanced against the Right to Freedom of Expression

1 Europe

The right to freedom of expression is protected in articles 10 ECHR and 11 CFR. As the right to reputation is not harmonised in the EU, most insights can be drawn from the case law of the ECtHR in which article 8 ECHR (the right to privacy) is balanced against article 10 ECHR. According to the ECtHR, not every claim that relates to reputation gives rise to a claim under article 8 ECHR. The attack on a person’s reputation must attain a certain level of seriousness and be made in a manner that causes prejudice to the personal enjoyment of the right to respect for private life.[18] The right to freedom of expression touches the core of the ECHR and is, according to the ECtHR, an essential foundation of a democratic society. This means states must justify any interference with article 10 ECHR.[19] When it comes to the internet, the ECtHR has expressly emphasised that the level of seriousness is important.[20] This is because millions of users post content on the internet every day that may be regarded as defamatory. Many of those posts are trivial in character, or do not cause any significant damage to a person’s reputation because the extent of the publication is limited.[21]

2 Australia

Even though the common law has attached value to freedom of expression, Australia is known for providing extensive protection to reputation.[22] Australia has even been called the “defamation capital of the world”.[23] The reason for this may be the lack of explicit protection of freedom of expression in the Australian Constitution or in federal statute.[24] Or it may be that defamation law predates the recognition that freedom of expression is important in a democratic society.[25] In addition, Australian case law provides far-reaching protection to reputation, including in cases of internet defamation, as can be seen in the development of the concept of the “grapevine effect”. This refers to the characteristic of internet publications that damage to reputation is caused by ongoing publication to a wider audience than the one to which the matter was initially published.[26],[27]

3 Comparison

Where the right to reputation is clearly protected in Australia, the right to freedom of expression is less so. The opposite holds in Europe, where the right to freedom of expression receives prominent protection but the right to reputation is less clearly protected. The protection, or lack of protection, of these fundamental rights shines through in subsequent legislation and ultimately in case law. This becomes clear in the following section, where the framework for intermediary liability in the EU and Australia is outlined and compared.

C Intermediary Liability

This section outlines intermediary liability in both the EU and Australia. Although the legal framework is explained in general terms, the summary of case law focuses on the legal rules relating to hosting providers such as Facebook. Looking only at the statutory frameworks, both the EU and Australia provide conditional immunity to intermediaries (the common law also maintains a strict liability framework). This means an intermediary receives immunity but can lose it the moment it obtains knowledge of illegal content and decides not to remove it.

1 The EU

In EU law, intermediary liability is governed by the E-Commerce Directive (ECD).[28] Its scope and application are further explained in the case law of the CJEU.

(a) E-Commerce Directive

The ECD applies horizontally to any area and to any kind of illegal content.[29] The ECD is applicable to Information Society Services and provides a ‘safe harbour’ against liability, but not against injunctions.[30] The ECD distinguishes three types of Information Society Services: mere conduit (article 12 ECD), caching (article 13 ECD) and hosting (article 14 ECD). Mere conduit intermediaries are exempt from liability for information that they store or transmit, provided that their activity is merely technical, automatic and passive in nature.[31] Of importance is that they have ‘neither knowledge of nor control over the information’.[32] Mere conduit can be characterised as broad immunity[33] as the intermediary is exempt from liability.[34] Caching and hosting services receive less protection and enjoy immunity only if they lack actual knowledge of the illegal information or act expeditiously to remove it once they gain such knowledge.[35] Caching and hosting providers thus receive conditional immunity:[36] the intermediary is shielded from liability for third-party content only if certain conditions are met.[37] The other option is strict liability, where the intermediary is liable for third-party content even when it is not aware that the content is illegal, or even that it exists. In that case, an intermediary can avoid liability only by proactively monitoring and removing content that may be considered infringing.[38] Article 15 ECD ensures such a liability scheme cannot operate in the EU, as it prohibits the imposition of a general monitoring obligation. However, recital 47 ECD leaves room for Member States to impose a specific monitoring obligation on an intermediary.[39]
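
By way of illustration, the conditional ‘safe harbour’ structure described above can be sketched schematically. The following minimal Python sketch illustrates only the logic of the hosting exemption as summarised in this section; it is not a statement of the legal test, and the class and field names are invented for the example.

from dataclasses import dataclass


@dataclass
class HostingScenario:
    # Names invented for this illustration.
    role_is_passive: bool        # conduct merely technical, automatic and passive
    has_actual_knowledge: bool   # actual knowledge of the illegal content (eg via a notice)
    removed_expeditiously: bool  # acted expeditiously to remove or disable access


def safe_harbour_applies(s: HostingScenario) -> bool:
    """Schematic reading of the hosting exemption in article 14 ECD."""
    if not s.role_is_passive:
        # An 'active' host falls outside the hosting exemption altogether.
        return False
    if not s.has_actual_knowledge:
        # No knowledge or awareness of the illegal content: immunity is retained.
        return True
    # Knowledge was obtained: immunity survives only if the host acted expeditiously.
    return s.removed_expeditiously


# A passive host that is notified but leaves the content online loses its immunity.
print(safe_harbour_applies(HostingScenario(True, True, False)))  # prints: False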

(b) CJEU case law

To determine whether an intermediary’s activities fall under the ECD, the CJEU first considers whether its conduct is merely technical, automatic and passive in nature. If so, this implies that the intermediary had neither knowledge of nor control over the information stored.[40] If the intermediary has control, it will not fall under the ECD. Interestingly, the CJEU has decided that deriving a commercial benefit (payment) from hosting does not in itself mean that the intermediary has control over the content.[41] However, an intermediary can be considered to exercise control if it plays an active role in presenting hosted content.[42] Secondly, the CJEU examines whether the provider actually had knowledge or awareness of the illegal content, for instance by being notified.[43] If the intermediary knew of illegal content and refrained from taking it down, it can be held liable under article 14 ECD.

2 Australia

In Australia, intermediary liability is not as clearly set out in statute and has a different starting point compared to the EU. Liability in defamation law for publication is broad and strict.[44] As a consequence, the role of intent is limited and nearly everyone involved in publication can potentially be considered a publisher.[45] In order to mitigate the potential harshness of this position, certain defences have been developed,[46] namely the defence of innocent dissemination and the conditional liability framework in the Broadcasting Services Act 1992[47].

(a) Defence of Innocent Dissemination

An intermediary can rely on the defence of innocent dissemination (DoID) if it neither knew nor ought to have known that the matter being disseminated was defamatory, and that lack of knowledge was not due to its negligence.[48] The defence can be found in both common law and statute.[49] The DoID is not open to those responsible for the initial publication and can only apply to subordinate distributors.[50] Even though the defence predates the internet, it is argued to be technology neutral, and in Thompson v Australian Capital Television it was said that a mere distributor of electronic material can rely on the DoID.[51] The position changes when an intermediary monitors content, because the intermediary then knows what content it hosts, or when there are “grounds for supposing” that there is defamatory content.[52]

Even though the DoID has in principle been extended to new technologies, defendants have found it difficult to rely on the defence in practice.[53] The DoID can be categorised as a type of conditional liability but differs from the ECD. In the EU the host is presumed not to be liable where its conduct is merely technical, automatic and passive in nature, unless it gains knowledge of the illegal material.[54] In Australia, it is for the defendant to prove that it did not have knowledge, as this is its defence against being held liable as a publisher. This may be why it is harder for intermediaries to rely on the DoID.

(b) Broadcasting Services Act

In addition to the DoID, there is special protection for intermediaries under Commonwealth law.[55] This protection can be found in clause 91 of schedule 5 to the Broadcasting Services Act 1992 (BSA) and, like the ECD, it applies horizontally. The statute shields intermediaries from civil liability for hosting or carrying content where they were not aware of its nature. Additionally, just like article 15 ECD, it prevents a State or Territory law, or a rule of common law or equity, from imposing an obligation to monitor, make enquiries about or keep records of content that an intermediary hosts or carries.[56] Even though an intermediary cannot be held liable, it can be required to remove content when notified by the Australian Broadcasting Authority.[57]

The BSA provides another form of conditional liability, as the intermediary can only avoid liability while it is unaware of the content it hosts.[58] Similarly to the EU framework, under both the DoID and the BSA the immunity depends on the element of knowledge. However, the BSA differs from the DoID in that negligence on the part of the intermediary does not appear to be relevant under the BSA.[59]

In conclusion, intermediaries in Australia have two conditional liability frameworks to fall back on, but this does not necessarily mean there is more protection. The ‘safe harbour’ of the DoID is significantly narrower than the safe harbour in the EU, as an intermediary will also be liable if there are “grounds for supposing” that there is defamatory content. This comes close to insinuating a monitoring obligation.

(c) Common law

The common law on intermediary liability is quite complicated and no single line of reasoning can be identified. However, the most current line of reasoning is that a plaintiff may be able to hold an intermediary liable as a primary publisher if the intermediary was instrumental in the act of publication.[60] This can be the case where the intermediary was intentionally complicit in making or authorising the defamatory content, or where it failed to take reasonable steps to prevent its publication.[61] If the plaintiff cannot establish this, the plaintiff must allege that the defendant had control over the publication of the content.[62] If that is not possible, liability can still arise if the defendant is considered a secondary or subordinate publisher.[63] The plaintiff must then allege that the defendant had knowledge of the defamatory content, for instance by alleging that the intermediary was notified or had sufficient responsibility over the content to exert control.[64] If the intermediary then fails to remove the content, it is considered liable as it has ratified the continued publication of the defamatory content.[65]

Treating an intermediary as a primary publisher is a form of strict liability, which is in line with the general concept of strict liability for publication in defamation law.[66] Whether this is a beneficial line of reasoning for the online world, however, is the question. Alongside this strict liability regime, the case law also maintains a conditional liability regime by treating some intermediaries as secondary publishers.

3 Concluding remarks

In Australia the right to reputation is protected to a greater degree than freedom of expression, both through its defamation laws and in the way intermediaries are held responsible for publication. In the EU the right to freedom of expression is more at the forefront, which can be seen in the development of a conditional liability framework for intermediaries. However, both legal frameworks display a struggle to balance the protection of reputation and freedom of expression, and two recent judgements, analysed below, illustrate the consequences.

III CASE STUDIES

To fully understand how the frameworks outlined above, and their interpretation, are actually curtailing freedom of expression, two very recent cases are analysed: from the EU, Eva v Facebook (including the opinion on the case by the Advocate-General[67] (AG)), and from Australia, Voller.

A The EU: Eva v Facebook

The European legal concepts outlined above come together in the recent CJEU case Eva v Facebook. In this case the plaintiff sued Facebook after it failed to remove certain defamatory comments about her. The domestic courts decided that Facebook was liable once put on notice and must therefore delete the posts. Facebook appealed and the Supreme Court of Austria subsequently referred the case to the CJEU for clarification. The CJEU decided that an intermediary must remove not only information identical to the content previously declared unlawful, but also information which is equivalent to that unlawful content.[68] In this judgement the CJEU diverges from the opinion of the AG. Even that opinion was already extensively criticised[69] for restricting the right to freedom of expression, yet it appears more nuanced than the judgement of the CJEU.

The CJEU states that, even though there is a restriction on general monitoring,[70] specific monitoring is acceptable where certain content has already been declared illegal by domestic courts.[71] Damage can arise due to the internet’s rapidity and geographical reach,[72] and Member States must ensure action is taken to terminate ‘any’ alleged infringement and to prevent ‘any’ further impairment of the interests involved.[73] The AG comes to a similar conclusion but explains that if a Member State imposes a general monitoring obligation on an intermediary, the intermediary’s conduct will no longer be considered neutral, as it then has knowledge of the content it hosts. This means the intermediary loses its ‘safe harbour’.[74] Additionally, the AG concludes that no extra investment is necessary for a hosting provider to monitor identical content, as it can use simple software tools, which means the competing rights are balanced fairly.[75]

That identical information can be monitored and subsequently blocked makes sense, as it has already been determined to be illegal by a judge. The right to reputation of the individual being defamed must be fairly balanced against the right to freedom of expression. However, this balance appears to be off in the second part of the judgement, which concerns information that could be considered equivalent to the defamatory information, as it tilts the scale towards the right to reputation.

Firstly, this is because the CJEU adopts a broad definition of equivalent information, broader even than the one the AG considered. The CJEU explains that what is considered illegal is the meaning of the words used, not the actual combination of the words.[76] This is in line with the AG’s explanation that defamatory statements are rarely expressed by two people in the same way, but can still carry the same meaning. The AG therefore argues that previous case law from intellectual property cases may be guiding but is not directly applicable, since in those cases the illegal content is content that is clearly identical to, or resembles, the protected content that should be monitored.[77] This is different in defamation cases, and the distinction is important because the interpretation of ‘equivalent information’ determines the scope of the monitoring.[78] Monitoring tools cannot easily filter content on its meaning in such a way that they can recognise that it is defamatory. Content filters are not ‘Defamation Net Nannies’.[79]

The AG strives to protect the right to freedom of expression by allowing the intermediary to monitor only the content of the user who initially posted the illegal content.[80] Content from others cannot be monitored, because that would require general monitoring and sophisticated monitoring solutions. This would mean the intermediary is no longer considered neutral and would lead to censorship and an active role for the intermediary.[81] In addition, there would not be a fair balance, as the intermediary would be required to invest in costly monitoring solutions.[82]

The AG thus bases the distinction between specific and general monitoring on whether simple software tools or sophisticated monitoring tools are used. However, it is not clear what the basis for this differentiation is.[83] Additionally, the AG does not elaborate on the type of software tool that may be used to monitor identical content. One could assume the AG is referring to automated software tools, but these are not yet optimised, produce many false positives and, according to research, even result in “structural over-blocking”.[84] The sketch below illustrates the problem.
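
To illustrate in general terms why monitoring for ‘identical’ content is technically trivial while monitoring for ‘equivalent’ content is not, the following minimal Python sketch contrasts a literal comparison with a crude keyword filter. It is a hypothetical illustration only: the example post and keywords are invented, and the sketch does not describe any tool actually used by Facebook or contemplated by the AG.

import re

# Hypothetical content, invented for this illustration; not drawn from the facts of Eva v Facebook.
UNLAWFUL_POST = "Alice embezzled the club funds"
KEYWORDS = {"embezzled", "funds"}  # a crude attempt to widen the net to 'equivalent' wording


def normalise(text: str) -> str:
    # Lower-case, strip punctuation and collapse whitespace.
    text = re.sub(r"[^\w\s]", "", text.lower())
    return re.sub(r"\s+", " ", text).strip()


def is_identical(candidate: str) -> bool:
    # 'Simple software tools' in the AG's sense: a literal comparison with the
    # content already declared unlawful.
    return normalise(candidate) == normalise(UNLAWFUL_POST)


def crude_equivalent_filter(candidate: str) -> bool:
    # Widening the filter with keywords in an attempt to catch reworded content.
    return bool(KEYWORDS & set(normalise(candidate).split()))


print(is_identical("Alice embezzled the club funds!"))   # True: an identical repost is caught
print(is_identical("Alice stole money from the club"))   # False: same meaning, different words slip through
print(crude_equivalent_filter("Alice never embezzled any funds, the court found"))  # True: lawful speech is flagged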

For the CJEU, the right to reputation seems to outweigh freedom of expression. It emphasises that, in order to stop the impairment of the interests involved, content that conveys the same message but is worded slightly differently should still be considered illegal and thus filtered out. This argument is understandable but short-sighted. When is wording slightly different and when too different? And who decides this? If it is up to the defamed person, the scope of protection will be broader and more inclusive than if it is up to the intermediary.

The CJEU strives to counter this problem by requiring that the injunction contain specific elements, provided in such a way that the intermediary does not have to carry out an independent assessment of the content. The CJEU’s requirement that an intermediary should not undertake an “independent assessment” may stem from the criticism in the literature, which is two-fold. Firstly, it is asserted that intermediaries should not be tasked with determining what content can remain online and what cannot, because they lack the legal knowledge to assess the legality of content.[85] If there is uncertainty about the legality of content, an intermediary will be inclined to block it, as this is relatively risk free. However, blocking legal content is a form of censorship.[86] Secondly, a great number of cases will not fall squarely into either the legal or the illegal category. This is especially so in defamation cases, where it is often a matter of opinion whether something can be considered defamatory.[87] To determine legality, legal proceedings are necessary, which are often costly, lengthy and burdensome. The intermediary may therefore decide to take the safe way out, which may lead to preventive over-blocking of legitimate content and, again, private censorship.[88]

Thus, on the basis of this case, even more information online will be filtered and subsequently blocked. If an intermediary receives a notification of supposedly illegal content but decides to leave it online, it risks being confronted with an injunction requiring it to monitor not only for identical information but also for equivalent information. This will require investment in technology. When monitoring for equivalent information, it may be uncertain whether particular content falls within the scope, but out of fear of being held liable, since the information has already been held illegal, the intermediary will err on the side of caution and block it. This broadens the scope of what will be considered defamatory and will almost certainly lead to over-blocking. Therefore, to avoid complicated work and potential liability, intermediaries now have an even greater incentive to take down information the moment they are notified, without any further analysis of the legality of the content.

B Australia: Voller

The Voller case may not immediately seem applicable to this discussion, as it was not the intermediary, Facebook, that was held liable. However, the case still shows how the right to freedom of expression is being curtailed by imposing an obligation on commercial parties to monitor comments posted on Facebook. This obligation ultimately creates a framework of strict liability.

The case concerns a man who has been the subject of many articles which were shared on Facebook by various media companies. Mr Voller sued three media companies, alleging he was defamed by the comments posted by Facebook users in response to the stories published. The Supreme Court of New South Wales concludes that the media companies are publishers of the defamatory comments: they provided the platform for publication and encouraged the publication of comments in their own commercial interest.

The Court comes to this conclusion by determining that the media companies are primary publishers but holds that, if that conclusion does not stand, they can be considered secondary publishers. The Court bases its first conclusion on the reasoning in the Oriental Press case. In that case the defendant hosted a discussion forum and was considered a primary publisher because it could exercise editorial control over the comments posted.[89] The Court in Voller uses this as an analogy and argues that the media companies have control because they can hide all comments before they are published. However, the Court here seems to confuse the role of the intermediary that hosts the comments (Facebook) with that of the media companies using the platform. In Oriental Press it was the intermediary that was held liable. In Voller it is Facebook that created the software for the platform and decided what users, including the parties who operate a Facebook page, can do on it. Facebook is the party in control, not the parties who merely use the tools provided to them. By incorrectly using Oriental Press as an example, the Court extends liability to a party that should not be responsible for monitoring comments. A monitoring obligation is imposed on media companies, which leads to a framework of strict liability. The media companies are considered primary publishers[90] and thus responsible for the comments made by third-party Facebook users. However, such responsibility cannot be compared to the responsibility of an editor for the articles it publishes. Comments differ from the news in both nature and function.[91]

Secondly, the Court concludes that, if its analysis in considering the media companies primary publishers is wrong, they should still be considered secondary publishers. For this conclusion the Court relies on the reasoning in Byrne v Deane. The Court argues that the cases are comparable because the media companies commercially operate an electronic bulletin board and post material that will attract comments, which most probably will be defamatory.[92] In Byrne v Deane the defamatory material was posted on a bulletin board hanging in a golf club. According to the Court the cases are similar because the media companies create an opportunity for defamatory material to be posted and are thus “promoting” defamatory material and ratifying its presence and publication.[93] Thus, the creation of a public Facebook page for its own commercial benefit makes the media company a secondary publisher.[94] However, the comparison is doubtful. In Byrne v Deane liability arose because the defendants were notified of defamatory material on the bulletin board and neglected to remove it, signifying that they consented to the content. In Voller, by contrast, the media companies are immediately liable if there is a defamatory post, because they are expected to un-hide only those comments that cannot be considered defamatory.

The arguments for this conclusion have in common that the Court wants to hold the media companies liable because they operate the public Facebook page for their own commercial benefit. According to the Court this means that they do not contribute to freedom of expression or the exchange of ideas.[95] The Court seems to insinuate that the one excludes the other, but does not explain why. It of course makes sense that a media company operates a Facebook page in its own commercial interest; it is not a charity. However, this does not change the fact that the platform still creates a place for discussion. A comment section under a post can spark debate, as commenters can reply to each other, and such a comment section is consistent with the media’s role in fostering discussion.[96] Obliging a media company to hide comments will therefore have a negative consequence for freedom of expression.

This negative consequence is intensified by the fact that, when all comments are hidden in advance and moderators actively un-hide some, more comments will of course remain hidden than un-hidden. Where only comments considered defamatory are deleted, comments whose status is uncertain will most probably remain online. Where everything is hidden by default, however, comments that are not clearly legal will not be un-hidden, since it is safer to keep them hidden given the strict liability imposed. This will lead to private censorship, and it is notable that the Court does not consider this consequence. That may be because the right to freedom of expression is not as clearly protected in Australia as it is in Europe. In Europe the right to freedom of expression includes not only the freedom to hold opinions and to impart information and ideas, but also the freedom to receive information and ideas.[97]

Therefore, from a European perspective, it is surprising that the Court offers as a ‘solution’ to the strict liability framework the employment of moderators to check comments before un-hiding them. As outlined in the analysis of Eva v Facebook, requiring private companies to undertake an “independent assessment” is explicitly rejected by the CJEU.

The Court argues that such a strict liability regime is necessary and that requiring this investment is fair, since the Facebook page is operated for the media company’s own commercial interest.[98] However, the Court does not consider what this will cost society: firstly, because freedom of expression will be restricted, and secondly, because the cost of employing such moderators should not be underestimated, especially for news companies that continue to struggle to adapt their business models to the digital age. An increase in costs is likely to affect not only the quantity but also the quality of news.[99] This is exactly why the ECD, the CJEU and international experts indicate that a monitoring obligation is inconsistent with the right to freedom of expression.[100]

To conclude, it is clear that the lack of constitutional or federal protection for the right to freedom of expression is detrimental to online expression in Australia. Deciding that media companies are responsible for the comments posted by third parties on an intermediary’s platform such as Facebook is not only illogical but also provides a legal basis for private censorship. The suggestion to un-hide comments shows that the Court does not appreciate the consequences of private entities deciding on the legality of content.

C Implications

Even though the two cases have very different defendants, one being the intermediary and the other the media companies using the intermediary, the consequence for freedom of expression is similar. Where in Eva v Facebook the CJEU restricts freedom of expression by providing vague criteria for the ‘specific’ monitoring obligations that may be imposed by injunction, Voller provides a legal basis for private censorship.

If this line of reasoning is continued in Australia in a possible future case in which Facebook itself is taken to court, it is likely that Facebook will be considered a primary publisher. The situation that then arises is that multiple parties are responsible for monitoring one platform. Media companies operating a Facebook page must ensure that comments on their posts are monitored, and if a Facebook user shares a post on their own page, Facebook is tasked with monitoring it. This creates multiple layers of monitoring by different private companies, each with its own political agenda and without the legal knowledge needed to balance rights effectively.

Even though the CJEU in Eva v Facebook also allows the right to reputation to outweigh the right to freedom of expression, there is one thing the Supreme Court of New South Wales should take from that case: intermediaries should not be burdened with making “independent assessments”, as this will lead to private censorship. If they are obliged to make such assessments, private companies become the gatekeepers of information on the internet.[101] They decide what information should and should not be posted online, essentially holding a quasi-judicial position[102] and curtailing freedom of expression.[103]

IV CONCLUSION

Australia and Europe have significantly different legal systems, and the right to reputation and the right to freedom of expression could hardly have developed more differently in each. This essay has provided an outline of these systems and two case studies to answer the following question: ‘Do the current legal frameworks for defamation law in the EU and Australia protect the right to freedom of expression sufficiently in intermediary liability cases?’ The short answer is no, especially in Australia. The lack of constitutional or federal protection for the right to freedom of expression is detrimental to online expression in Australia. This becomes apparent in the Voller case, where the Court determines that media companies are responsible for comments posted by Facebook users on a third party’s platform. The outcome of Voller will undoubtedly create a system of private censorship operated by multiple private parties at different levels. The CJEU also places a restriction on freedom of expression online, even though freedom of expression enjoys greater protection in Europe. In the EU the CJEU has created vague criteria as to what type of information may be monitored where an injunction is ordered. This will give intermediaries an even greater incentive to take down content the moment they receive a notification of possibly defamatory content.

V BIBLIOGRAPHY

A Articles/Books/Reports

Aplin, Tanya and Bosland, Jason, ‘The uncertain landscape of article 8 of the ECHR: The protection of reputation as a fundamental human right’ in Andrew Kenyon (ed), Comparative Defamation and Privacy Law (Cambridge University Press, 2016)

Brunner, Lisl, ‘The liability of an online intermediary for third party content, the watchdog becomes the monitor: intermediary liability after Delfi v Estonia’ (2016) 16 Human Rights Law Review 163

Handbook on European data protection law (European Union Agency for Fundamental Rights, Council of Europe, 2018)

Eisenberg, Julie, ‘Safety out of Sight: The Impact of the New Online Content Legislation on Defamation law’, (2000) 23 UNSW Law Review 232

Fabbrini, Federico and Larik, Joris, ‘The Past, Present and Future of the Relation between the European Court of Justice and the European Court of Human Rights’ (2016) Yearbook of European law 1

Gelber, Katharine, ‘Diagonal accountability: freedom of speech in Australia’ (2017) 23(2) Australian Journal of Human Rights 203

George, Patrick, Social Media and the Law (LexisNexis Butterworths, 2nd ed, 2016)

Gray, Anthony, ‘Liability of Search Engines and Tech Companies in Defamation Law’ (2019) 27(1) Tort Law Review 18

Jones, Mariette, ‘Is EU law effective in preventing forum shopping for the pursuit of actions arising from online infringement of personality rights?’ in Russell Weaver, a.o. (eds). Privacy in a Digital Age: Perspectives from Two Continents (Carolina Academic Press 2017) 47

Kenyon, Andrew, “Defamation, Privacy and Aspects of Reputation” (2019) 56(1) Osgoode Hall Law Journal 59

Kuczerawy, Aleksandra, ‘Intermediary liability & freedom of expression: Recent developments in the EU notice & action initiative’ (2015) 31 Computer Law & Security Review 46

Kuczerawy, Aleksandra, ‘General monitoring obligations: a new cornerstone of regulation in the EU?’ (2019) Intersentia, (forthcoming)

Mackinnon, Rebecca, Fostering Freedom Online: The Role of Internet Intermediaries (UNESCO Internet Society, 1st ed, 2014)

Macovei, Monica, Freedom of expression: A guide to the implementation of Article 10 of the European convention of Human rights, Human rights handbook, no.2 (Council of Europe, 2nd ed, 2004)

Pappalardo, Kylie and Suzor, Nicolas, ‘The Liability of Australian Online Intermediaries’ [2018] SydLawRw 19; (2018) 40 Sydney Law Review 469

Rolph, David, ‘Publication, innocent dissemination and the internet after Dow Jones & Co Inc v Gutnick’ (2010) 33(2) UNSW Law Review 562

Rolph, David, Defamation Law (Thomson Reuters, 1st ed, 2016)

Schellekens, Maurice, ‘Liability of internet intermediaries: A Slippery Slope?’ (2011) 8(2) SCRIPTed 154

Svantesson, Dan, ‘Grading AG Szpunar’s Opinion in Case C-18/18 – A Caution Against Worldwide Content Blocking as default’ (2019) Expert opinion on behalf of Facebook Ireland Limited in Case C-18/18, SSRN https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3404385 (accessed 23 November 2019).

B Cases

Common law cases

Byrne v Deane [1937] 1 KB 818

Thompson v Australian Capital Television Pty Ltd (1996) 186 CLR 574; [1996] HCA 38

Tamiz v Google Inc [2013] EWCA Civ 68

Oriental Press Group Ltd v Fevaworks Solutions Ltd (2013) 16 HKCFAR 366; [2013] HKCFA 47

Google Inc v Duffy [2017] SASCFC 130

Voller v Nationwide News Pty Ltd; Voller v Fairfax Media Publications Pty Ltd; Voller v Australian News Channel Pty Ltd [2019] NSWSC 766

Judgement of the European Court of Justice

Google France SARL and Google Inc v Louis Vuitton Malletier SA (Joined cases C-236/08 to C-238/08) [2010] ECR I-02417

L’Oréal SA and Others v eBay International AG and Others (C-324/09) [2011] ECR I-06011

Papasavvas v O Fileleftheros Dimosia Etaireia Ltd et al (Court of Justice of the European Union, C-291/13, ECLI:EU:C:2014:2209, 11 September 2014)

Eva Glawischnig-Piesczek v Facebook Ireland Limited (Court of Justice of the European Union, C-18/18, ECLI:EU:C:2019:821, 3 October 2019)

Opinion of Advocate-General

Eva Glawischnig-Piesczek v Facebook Ireland Limited (Opinion) (Advocate-General Szpunar, ECLI:EU:C:2019:458, 4 June 2019)

Judgement of the European Court of Human Rights

Axel Springer AG v Germany [2012] I Eur Court HR 227

C Legislation

Australian

Broadcasting Services Act 1992 (Cth)

Defamation Act 2005 (NSW)

European Union

Council Directive 2000/31/EC of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market [2000] OJ L178/1

D Treaties

Convention for the Protection of Human Rights and Fundamental Freedoms, opened for signature 4 November 1950, 213 UNTS 221 (entered into force 3 September 1953), as amended by Protocol No 14 to the Convention for the Protection of Human Rights and Fundamental Freedoms, Amending the Control System of the Convention, opened for signature 13 May 2004, CETS No 194 (entered into force 1 June 2010)

Charter of Fundamental Rights of the European Union, ratified 7 December 2000 [2000] 2012/C 326/02, amended 26 October 2012

E Other

Council of Attorneys-General, ‘Review of Model Defamation Provisions’ (Discussion Paper, NSW Government, February 2019)

European Court of Human Rights, Guidance on article 8 ECHR (Last updated 31 August 2019)

Michael Pelly, ‘Australia – the defamation capital of the world’, The Australian Financial Review (Sydney, 4 September 2019)


[1] LL.B, LL.M (Cum Laude). I would like to thank Australian Barrister Lyndelle Barnett for her assistance and encouragement. All errors are my own.

[2] Anthony Gray, ‘Liability of Search Engines and Tech Companies in Defamation Law’ (2019) 27(1) Tort Law Review 18, 19.

[3] Ibid.

[4] Eva Glawischnig-Piesczek v Facebook Ireland Limited (Court of Justice of the European Union, C-18/18, ECLI:EU:C:2019:821, 3 October 2019) (‘Eva v Facebook’).

[5] Voller v Nationwide News Pty Ltd; Voller v Fairfax Media Publications Pty Ltd; Voller v Australian News Channel Pty Ltd [2019] NSWSC 766 (‘Voller’).

[6] Defamation Act 2005 (NSW).

[7] Council of Attorneys-General, ‘Review of Model Defamation Provisions’ (Discussion Paper, NSW Government, February 2019).

[8] Andrew Kenyon, “Defamation, Privacy and Aspects of Reputation” (2019) 56(1) Osgoode Hall Law Journal 59, 60.

[9] Convention for the Protection of Human Rights and Fundamental Freedoms, opened for signature 4 November 1950, 213 UNTS 221 (entered into force 3 September 1953), as amended by Protocol No 14 to the Convention for the Protection of Human Rights and Fundamental Freedoms, Amending the Control System of the Convention, opened for signature 13 May 2004, CETS No 194 (entered into force 1 June 2010).

[10] Tanya Aplin and Jason Bosland, ‘The uncertain landscape of article 8 of the ECHR: The protection of reputation as a fundamental human right’ in Andrew Kenyon (ed), Comparative Defamation and Privacy Law (Cambridge University Press, 2016) ch 13, 1.

[11] Mariette Jones, ‘Is EU law effective in preventing forum shopping for the pursuit of actions arising from online infringement of personality rights?’ in Russell Weaver, a.o. (eds). Privacy in a Digital Age: Perspectives from Two Continents (Carolina Academic Press 2017) 47, 47.

[12] Handbook on European data protection law (European Union Agency for Fundamental Rights, Council of Europe, 2018) 23.

[13] Federico Fabbrini and Joris Larik, ‘The Past, Present and Future of the Relation between the European Court of Justice and the European Court of Human Rights’ (2016) Yearbook of European law 1, 22.

[14] Eva Glawischnig-Piesczek v Facebook Ireland Limited (Opinion) (Advocate-General Szpunar, ECLI:EU:C:2019:458, 4 June 2019) [78].

[15] Council of Attorneys-General, ‘Review of Model Defamation Provisions’ (Discussion Paper, NSW Government, February 2019) 9.

[16] Patrick George, Social Media and the Law (LexisNexis Butterworths, 2nd ed, 2016) 144.

[17] David Rolph, Defamation law (Thomson Reuters, 1st ed, 2016) 92.

[18] European Court of Human Rights, Guidance on article 8 ECHR (Last updated 31 August 2019) [146].

[19] Monica Macovei, ‘Freedom of expression: A guide to the implementation of Article 10 of the European convention of Human rights, Human rights handbook’, no.2 (Council of Europe, 2nd ed, 2004) 7.

[20] Axel Springer AG v Germany [2012] I Eur Court HR 227 [83].

[21] European Court of Human Rights, Guidance on article 8 ECHR (Last updated 31 August 2019) [150].

[22] Rolph 2016 (n 17) 24.

[23] Michael Pelly, ‘Australia – the defamation capital of the world’, The Australian Financial Review (Sydney, 4 September 2019).

[24] Katharine Gelber, ‘Diagonal accountability: freedom of speech in Australia’ (2017) 23(2) Australian Journal of Human Rights 203, 203.

[25] George (n 16) 144.

[26] Ibid 151.

[27] This seems to be the opposite of the EU, cf (n 22).

[28] Council Directive 2000/31/EC of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market [2000] OJ L178/1.

[29] Aleksandra Kuczerawy, ‘General monitoring obligations: a new cornerstone of regulation in the EU?’ (2019) Intersentia, (forthcoming) 1.

[30] Rebecca Mackinnon, Fostering Freedom Online: The Role of Internet Intermediaries (UNESCO Internet Society, 1st ed, 2014) 51.

[31] Lisl Brunner, ‘The liability of an online intermediary for third party content, the watchdog becomes the monitor: intermediary liability after Delfi v Estonia’ (2016) 16 Human Rights Law Review 163, 165.

[32] Council Directive 2000/31/EC of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market [2000] OJ L178/1, art. 12.

[33] Maurice Schellekens, ‘Liability of internet intermediaries: A Slippery Slope?’ (2011) 8(2) SCRIPTed 154, 157.

[34] Mackinnon (n 30) 42.

[35] Council Directive 2000/31/EC of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market [2000] OJ L178/1, art. 13-14.

[36] Ibid.

[37] Mackinnon, (n 30) 40.

[38] Ibid.

[39] Schellekens, (n 33) 158.

[40] Council Directive 2000/31/EC of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market [2000] OJ L178/1, rec 42; Google France SARL and Google Inc v Louis Vuitton Malletier SA (Joined cases C-236/08 to C-238/08) [2010] ECR I-02417; L’Oréal SA and Others v eBay International AG and Others (C-324/09) [2011] ECR I-06011.

[41] Google France SARL and Google Inc v Louis Vuitton Malletier SA (Joined cases C-236/08 to C-238/08) [2010] ECR I-02417 [115]-[116].

[42] L’Oréal SA and Others v eBay International AG and Others (C-324/09) [2011] ECR I-06011 [123].

[43] Ibid [116]-[122].

[44] Rolph 2016 (n 17) 139.

[45] Kylie Pappalardo and Nicolas Suzor, ‘The Liability of Australian Online Intermediaries’ [2018] SydLawRw 19; (2018) 40 Sydney Law Review 469, 481.

[46] David Rolph ‘Publication, innocent dissemination and the internet after Dow Jones & Co Inc v Gutnick’ (2010) 33(2) UNSW Law Review 562, 569.

[47] Broadcasting Services Act 1992 (Cth).

[48] Defamation Act 2005 (Cth) s 32(1).

[49] Rolph 2016 (n 17) 293.

[50] Ibid.

[51] Thompson v Australian Capital Television Pty Ltd (1996) 186 CLR 574, 589; [1996] HCA 38.

[52] Julie Eisenberg, ‘Safety out of Sight: The Impact of the New Online Content Legislation on Defamation law’, (2000) 23 UNSW Law Review 232, 235.

[53] Rolph 2010 (n 46) 575.

[54] Council Directive 2000/31/EC of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market [2000] OJ L178/1, rec 42.

[55] Eisenberg (n 52) 232.

[56] Broadcasting Services Act 1992 (Cth) Sch 5 cl 91.

[57] Eisenberg (n 52) 235.

[58] Rolph 2010 (n 46) 577.

[59] Ibid.

[60] George (n 16) 163.

[61] Ibid.

[62] Tamiz v Google Inc [2013] EWCA Civ 68.

[63] George (n 16) 163.

[64] Google Inc v Duffy [2017] SASCFC 130 [184].

[65] George (n 16) 163.

[66] Rolph 2016 (n 17) 139.

[67] Eva Glawischnig-Piesczek v Facebook Ireland Limited (Opinion) (Advocate-General Szpunar, ECLI:EU:C:2019:458, 4 June 2019).

[68] Eva Glawischnig-Piesczek v Facebook Ireland Limited (Court of Justice of the European Union, C-18/18, ECLI:EU:C:2019:821, 3 October 2019) [53].

[69] See Dan Svantesson, ‘Grading AG Szpunar’s Opinion in Case C-18/18 – A Caution Against Worldwide Content Blocking as default’ (2019) Expert opinion on behalf of Facebook Ireland Limited in Case C-18/18, SSRN https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3404385 (accessed 23 November 2019).

[70] Council Directive 2000/31/EC of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market [2000] OJ L178/1, art.15.

[71] Eva Glawischnig-Piesczek v Facebook Ireland Limited (Court of Justice of the European Union, C-18/18, ECLI:EU:C:2019:821, 3 October 2019) [34].

[72] Council Directive 2000/31/EC of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market [2000] OJ L178/1, rec 52.

[73] Eva Glawischnig-Piesczek v Facebook Ireland Limited (Court of Justice of the European Union, C-18/18, ECLI:EU:C:2019:821, 3 October 2019) [30].

[74] Eva Glawischnig-Piesczek v Facebook Ireland Limited (Opinion) (Advocate-General Szpunar, ECLI:EU:C:2019:458, 4 June 2019) [36], [39].

[75] Ibid.

[76] Eva Glawischnig-Piesczek v Facebook Ireland Limited (Court of Justice of the European Union, C-18/18, ECLI:EU:C:2019:821, 3 October 2019) [41].

[77] Eva Glawischnig-Piesczek v Facebook Ireland Limited (Opinion) (Advocate-General Szpunar, ECLI:EU:C:2019:458, 4 June 2019) [69].

[78] Ibid [71].

[79] Eisenberg (n 52) 235.

[80] Eva Glawischnig-Piesczek v Facebook Ireland Limited (Opinion) (Advocate-General Szpunar, ECLI:EU:C:2019:458, 4 June 2019) [72].

[81] Ibid [73].

[82] Ibid [74].

[83] Kuczerawy 2019 (n 29) 5.

[84] Schellekens (n 33) 168.

[85] Aleksandra Kuczerawy, ‘Intermediary liability & freedom of expression: Recent developments in the EU notice & action initiative’ (2015) 31 Computer Law & Security Review 46, 48.

[86] Schellekens (n 33) 168.

[87] Kuczerawy 2015 (n 85) 48.

[88] Ibid.

[89] Oriental Press Group Ltd v Fevaworks Solutions Ltd (2013) 16 HKCFAR 366; [2013] HKCFA 47 [146].

[90] Voller v Nationwide News Pty Ltd; Voller v Fairfax Media Publications Pty Ltd; Voller v Australian News Channel Pty Ltd [2019] NSWSC 766 [228].

[91] Brunner (n 31) 171.

[92] Voller v Nationwide News Pty Ltd; Voller v Fairfax Media Publications Pty Ltd; Voller v Australian News Channel Pty Ltd [2019] NSWSC 766 [230].

[93] Ibid.

[94] Ibid [232].

[95] Ibid [207].

[96] Brunner (n 31) 171.

[97] Macovei (n 19) 8.

[98] Voller v Nationwide News Pty Ltd; Voller v Fairfax Media Publications Pty Ltd; Voller v Australian News Channel Pty Ltd [2019] NSWSC 766 [232].

[99] Brunner (n 31) 172.

[100] Ibid.

[101] Schellekens (n 33) 167.

[102] Mackinnon (n 30) 41.

[103] Schellekens (n 33) 156.

