University of Melbourne Law School Research Series
Regulatory options for controlling tobacco advertising, promotion and sponsorship under Article 13 of the WHO Framework Convention on Tobacco Control
Andrew T Kenyon[*] and Jason Bosland[**]
November 2007
Contents
Summary
1 Introduction
2 Domestic and cross-border digital communications: the internet
3 Terminology and regulatory options in relation to internet content
4 The general approach under Article 13
5 Domestic and cross-border digital communications: direct broadcast satellite
6 Bans for content producers and content providers
   UK
   Australia
   European Audiovisual Media Services Directive
7 Protection for content hosts, content navigators and access providers
   Protection in tobacco-related legislation
   Protection in legal instruments that are not tobacco-related: the example of the E-Commerce Directive
   Examples of implementation of the E-Commerce Directive
      Germany
      UK
      France
      Poland
8 Notice and takedown schemes for content hosts
   Notice and takedown schemes for illegal sexual content: legislative examples
      Australia
      Canada
      Malaysia
      Singapore
      South Korea
      Sweden
   Notice and takedown schemes for illegal sexual content: non-legislative example from the UK
9 User-generated content
10 Jurisdiction, enforcement and related entities
Appendix
Summary
This report examines regulatory options for controlling tobacco advertising, promotion and sponsorship under Article 13 of the WHO Framework Convention on Tobacco Control. Here, a simplified table of regulatory options related to the internet illustrates the possibilities.
Entities involved in internet communication and regulatory options for consideration

Entity | Definition | Regulatory options for consideration
Content producer | Content producers originate content or cause content to be produced. They include entities that produce content intended purely for internet distribution (including user-generated content). | Content producers could be banned from including tobacco advertising, promotion or sponsorship within the content they produce.
Content provider | Content providers publish content on the internet; that is, they make content available for internet users after content providers undertake a process of selection. | Content providers could be banned from making available content that includes tobacco advertising, promotion or sponsorship.
Content host | Content hosts store content—they control internet-connected computer servers on which content is stored. They include entities that aggregate content produced by others without pre-selecting the material. | Content hosts could be made liable when they have notice of illegal content. They could have a general obligation to remove or disable access to illegal content, once content hosts have notice of that content.
Content navigator | Content navigators, such as search engines, facilitate the location of content by internet users. | Content navigators could be made liable when they have notice of illegal content. They could have a general obligation to disable access to illegal content, once content navigators have notice of that content.
Access provider | Access providers provide end-user access to the communications service in question. For the internet, internet service providers (ISPs) provide access to end users. | Access providers could be made liable when they have notice of illegal content. They could have a general obligation to disable access to illegal content, once access providers have notice of that content.
End user | User of internet content. | Filtering software could be offered to users.
The paper considers these options and issues concerning direct broadcast satellites, the types of protection offered under many laws to content hosts, content navigators and access providers, user-generated content, and jurisdiction and related matters. This suggests that the general approach to controlling tobacco advertising, promotion and sponsorship in all forms of media and communication could be: content bans for content producers and content providers, combined with notice and obligation schemes for content hosts, content navigators and access providers.
1 Introduction
Article 13 of the Framework Convention on Tobacco Control (FCTC) obliges Parties to undertake a comprehensive ban of all tobacco advertising, promotion and sponsorship (or to apply restrictions where constitutional limitations prevent a comprehensive ban). The obligations apply to domestic advertising, promotion and sponsorship. They also apply to cross-border advertising, promotion and sponsorship—including communication by internet, satellite and mobile platforms—whether that content originates within a Party’s territory and is available in a foreign territory, or is generated outside a Party’s territory and is available within it.
A report produced in August 2006 at the Centre for Media and Communications Law (CMCL) at the University of Melbourne and the VicHealth Centre for Tobacco Control made recommendations about implementing Article 13.[1] The 2006 Report recommended a ‘multilayered’ approach combining four elements to curb the cross-border reach of tobacco advertising, promotion and sponsorship.[2] The four elements concerned:
• law and regulation;
• monitoring and enforcement mechanisms;
To some degree, all four elements overlap. However, they provide a useful conceptual model for Parties implementing their obligations under the FCTC. This document outlines measures that may be adopted in relation to two of these elements in particular: law and regulation, and monitoring and enforcement. It draws examples from other areas of content regulation (such as illegal sexual content and gambling). It also outlines some matters about international jurisdiction and enforcement.
The law and regulation element of this approach involves ensuring that an effective law against tobacco advertising, promotion and sponsorship content is in force within each Party. Article 13 of the FCTC is not media specific. Rather, it requires a comprehensive ban on tobacco advertising, promotion and sponsorship (subject to any constitutional limitation on such a comprehensive ban) irrespective of the medium, and irrespective of whether the content is provided in a manner that is linear (scheduled) or non-linear (on-demand). In addition, Article 13 is not entity specific; that is, it is not restricted to apply only to particular types of entities. Under Article 13, regulation could apply to a wide range of entities involved in the production and distribution of tobacco advertising, promotion and sponsorship.
The structure of digital media platforms such as the internet makes the range of entities that could be the target of regulation greater than for traditional media. For example, technological intermediaries involved in facilitating access to network infrastructure and distributing content could be the subject of regulation, in addition to entities that directly produce content or make it available to the public through publication.
Intermediaries could also be useful in monitoring and enforcing bans on tobacco advertising, promotion and sponsorship. For example, regulatory obligations could be applied to some intermediaries to takedown content constituting a tobacco advertisement, promotion or sponsorship when they become aware of its existence (for example, through a regulator alerting them to the content’s existence).
While a broad range of options exist for controlling content, not all options may be appropriate for adoption in all Parties. The aim is not to achieve legal uniformity between Parties on such matters, but to achieve consistency with the FCTC in a manner that allows strong domestic regulation and useful international cooperation.
2 Domestic and cross-border digital communications: the internet
In considering options for regulating domestic and cross-border digital communications, it is useful to outline aspects of the technologies’ operation. Internet communications are described here. Brief comments about satellite communications are made below in Part 5. Telecommunications technologies are not examined explicitly because many of the regulatory issues for tobacco advertising on telecommunications networks are similar to those for the network discussed below, the internet. For telecommunications networks, intermediaries—such as network operators—can be subject to analogous forms of regulation to intermediaries in internet communication. It is also becoming more common for the regulation of visual and audiovisual content delivered by internet, mobile or other platforms to be brought together under the same provisions.[3]
The internet is a network of interconnected computers, or a network of networks. It is hierarchically structured, with different tiers of service providers granting end-user access to the internet. In relation to internet regulation, it can be useful to think of the internet in terms of three distinct layers.[4]
The first layer is the physical layer. It includes such things as computer hardware, wires (telecommunication and cable networks), electromagnetic spectrum (wireless networks), routers and mobile telecommunications masts. It encompasses the tangible things that allow information to be transmitted from one computer to another, and from one network to another.
The second layer is commonly referred to as the code layer (or the ‘logical infrastructure’ layer). It is the software or computer code that enables and controls the flow of internet transmissions. The code layer comprises two ‘sub-layers’: content protocols and carrier protocols. The content protocol dictates the particular technical language in which the content is sent and received (examples include HTTP, Java, VoIP, MMS). It also determines the application (or program) in which content can be manipulated by an end-user (examples include Microsoft Word, Internet Explorer, Mozilla Firefox, iTunes, Adobe Reader). The carrier protocol, on the other hand, is the language of the internet itself rather than of the content that is communicated on the internet. The carrier protocol directs the sending and receiving of information between one internet user and another. It makes the different networks that comprise the internet interoperable. The carrier protocol for all internet content is called the transmission control protocol/internet protocol (or ‘TCP/IP’).
The third and final layer in thinking about regulation is the content layer. This is the digital content that is transmitted over the network in binary form, such as images, text, music and audiovisual content.
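To make the distinction between carrier protocol and content protocol concrete, the following minimal sketch (in Python; it is not drawn from the report, and the host name and request shown are purely illustrative) shows where each sits when one computer requests a web page from another:

    import socket

    # Carrier protocol: the operating system's TCP/IP stack moves raw bytes
    # between networks; the application simply opens a connection.
    with socket.create_connection(("example.com", 80)) as conn:
        # Content protocol: HTTP is the language in which the content itself
        # is requested and returned over that TCP/IP connection.
        conn.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        reply = conn.recv(4096)  # content layer: the transmitted bytes themselves

    print(reply.split(b"\r\n")[0].decode())  # eg 'HTTP/1.1 200 OK'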
Each of these layers provides a potential site for regulating internet content. Table 1 sets out these layers and ways that each can be regulated. It is worth noting that the general layers model can be adapted to other digital communications technologies, such as telephony and satellite systems. In addition, it should be noted that not all of the regulatory measures in the Table would be appropriate in relation to tobacco advertising, promotion and sponsorship. For example, digital rights management and technological protection measures appear unlikely to be significant in relation to advertising, promotion and sponsorship.
Table 1: Internet layers and regulatory measures

Layer | Regulatory measures
Physical layer | – access prevention / restriction
Code layer (or logical infrastructure) | – disabling access to content (such as blocking / filtering, when related to metadata)
Content layer | – disabling access to content
3 Terminology and regulatory options in relation to internet content
A wide range of entities is involved with each layer of internet communication, and the terminology used internationally to describe the entities varies. For the purpose of developing guidelines under Article 13 of the FCTC about controlling tobacco advertising, promotion and sponsorship on the internet and other digital platforms, the terminology and definitions in Table 2 could be useful.[5] Some internet-related companies and digital media-related companies combine the roles of several of these entities. For example, a content producer may also act as a content provider; a content provider may also act as a content host; and so forth.
Table 2: Entities involved in internet communication and regulatory options for consideration

Entity | Definition | Regulatory options for consideration
Content producer | Content producers originate content or cause content to be produced. They include not only traditional content owners and producers (eg movie studios, television production companies) who produce content that can be distributed online, but also newer types of entity that produce content intended purely for internet distribution (including peer-produced internet content and user-generated content). Under the definition used here, content producers do not distribute or make available the material that is produced. That is done by entities such as content providers. | Content producers could be banned from including tobacco advertising, promotion or sponsorship within the content they produce; similar to some legislation applicable to traditional media that bans the inclusion of tobacco advertisements in material that is available or intended to be seen by the public (eg Australia’s Tobacco Advertising Prohibition Act 1992), or the devising of tobacco advertisements (eg UK Tobacco Advertising and Promotion Act 2002).
Content provider | Content providers publish content on the internet; that is, they make content available for internet users after a process of selection by content providers. They include the internet arms of traditional publishers (such as newspaper companies and broadcasters) as well as internet-only publishers. Some content providers combine or aggregate content that has been produced by many others. Examples of such internet content aggregators include iTunes and Amazon, which are services that bring together a range of content available for download or purchase. These services pre-select the material they make available, which means their regulation with regard to tobacco advertising, promotion and sponsorship should be equivalent to other content providers: they all select the material that they make available. | Content providers could be banned from making available content that includes tobacco advertising, promotion or sponsorship; similar to legislation applicable to much traditional media, such as legislation banning the publication of tobacco advertisements (many examples exist, although they are often aimed at particular types of content such as print or broadcasting content, or the linear and non-linear audiovisual content dealt with under the EU Audiovisual Media Services Directive).
Content host | Content hosts store content. That is, they control internet-connected computer servers on which content is stored. Some entities, such as Facebook and YouTube, aggregate content that is produced by others without pre-selecting the material. However, they have the ability to remove material because they host it (or control the hosting of the material). Some content providers or content navigators (see below) operate their own servers, which means they also act as content hosts, while some content hosts are separate entities that do not provide content or offer content navigation services. Those content hosts act as intermediaries between content providers / navigators and end users. | Content hosts could be made liable when they have notice of illegal content. Content hosts could have a general obligation to remove or disable access to illegal content, once they have notice of that content. Such an obligation might be implemented through a notice and takedown scheme. Many examples exist in relation to illegal sexual content and content that is alleged to infringe copyright.
Content navigator | Content navigators, such as search engines, facilitate the location of content by internet users. | Content navigators could be made liable when they have notice of illegal content. Content navigators could have a general obligation to remove or disable access to illegal content, once they have notice of that content. Such an obligation might be implemented through a notice and blocking scheme.
Access provider | Access providers provide end-user access to the communications service in question. In relation to the internet, internet service providers (ISPs) provide access to end users. (An analogy could be drawn with telephone companies that provide their subscribers with access to the telephone network.) | Access providers, such as ISPs, could be made liable when they have notice of illegal content. Access providers could have a general obligation to disable access to illegal content, once they have notice of that content. Such an obligation might be implemented through a notice and blocking scheme.
Backbone provider | Backbone providers provide access to the physical infrastructure of the internet. They provide the point of contact to the internet for internet service providers (with some backbone providers also acting as internet service providers). | No regulatory measures are likely to be appropriate in relation to tobacco advertising, promotion and sponsorship.
Internet end user | User of internet content. | Filtering software could be offered to users.
The categories of content producers and content providers do not only include makers and distributors of news, entertainment or informational content. The categories encompass advertising producers and their agents (including media planners and media buyers) who are involved in producing and distributing tobacco advertising, promotion and sponsorship. In relation to the internet, this could include services like Google AdSense and AdWords, which are examples of advertising producers and agents that differ from the types used in earlier media. When involved in tobacco advertising, promotion and sponsorship, they should be subject to legal control.
4 The general approach under Article 13
Table 2, above, suggests that two types of obligation could be considered in relation to different entities involved in internet communication:
• content could be banned in relation to some entities (content bans); and
• other entities could be obliged to remove or disable access to content once they have notice of it (notice and obligation schemes).
Given ongoing change in communication technologies and the variety of platforms used for tobacco advertising, promotion and sponsorship, it is useful to conceptualise the types of regulatory option at the general level of ‘notice and obligation’, rather than at the level of a specific method of fulfilling an obligation to act once notice of the illegal content has been received. Thus, the approach should be that content is banned in relation to some entities, and a notice and obligation scheme exists in relation to other entities. Techniques that are reasonably common now—such as notice and takedown schemes in relation to illegal sexual content—illustrate this general obligation to remove or disable access to content.[6]
Therefore, in implementing Article 13 of the FCTC, the general approach could be: content bans for content producers and content providers, and notice and obligation schemes for content hosts, content navigators and access providers.
Notice and obligation schemes have been implemented in two main ways: first, notice and takedown schemes for content hosts; second, notice and blocking schemes for content navigators and access providers. Under a notice and takedown scheme, when content hosts have been notified about objectionable content, they remove it from the network—content hosts take the content down from the server on which they have been hosting it. Under a notice and blocking scheme, when content navigators or access providers have been notified about the content, they prevent user access to it; for example, by blocking a particular internet address. It is worth noting that content bans (for content producers and content providers) and notice and takedown schemes (for content hosts) should be easier to implement effectively than notice and blocking schemes (for content navigators and access providers).[7] For much content, notice and blocking schemes may not add greatly to the effectiveness of controls on tobacco advertising, promotion and sponsorship beyond the level achieved by controlling content producers, content providers and content hosts. However, if content producers, providers and hosts are made the initial focus of control, the situation should be reviewed regularly. Methods of disabling access to content may become significant in relation to some content that is hosted outside the territory of any Party to the FCTC; for example, some content on the social networking sites discussed below in Part 9. Thus the general approach suggested here is to consider notice and obligation schemes for all three types of intermediary—content hosts, content navigators and access providers—instead of only for content hosts.
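The difference between the two schemes can be summarised in a short conceptual sketch (in Python; it is not part of the report, and the entity and field names are invented for illustration): a host discharges its obligation by removing the content it stores, while an access provider or navigator discharges it by preventing access to content that remains stored elsewhere.

    from dataclasses import dataclass, field

    @dataclass
    class Notice:
        content_url: str  # location of the allegedly illegal content

    @dataclass
    class ContentHost:
        stored: set = field(default_factory=set)
        def on_notice(self, notice: Notice) -> None:
            # Notice and takedown: remove the content from the server
            # on which it has been hosted.
            self.stored.discard(notice.content_url)

    @dataclass
    class AccessProvider:
        blocklist: set = field(default_factory=set)
        def on_notice(self, notice: Notice) -> None:
            # Notice and blocking: the content stays on its (possibly
            # foreign) server, but user access to the address is prevented.
            self.blocklist.add(notice.content_url)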
5 Domestic and cross-border digital communications: direct broadcast satellite
Direct broadcast satellite (DBS) involves direct-to-home broadcasting from satellite communication systems. Using electromagnetic spectrum, content is sent from an earth station to a space satellite, usually located in geostationary orbit.[8] It is then broadcast from the satellite directly to domestic receiving dishes. The footprint (or reception area) of a particular DBS service is determined by a number of factors, including the location of the satellite, the capacity of the transponder on the satellite[9] and the size and technological standards of domestic receiving dishes. The location of the satellite—or its ‘orbital slot’—is allocated by the International Telecommunication Union (ITU). Orbital slots are allocated to various countries, which lease them to satellite companies. Satellite companies, who launch and operate satellites, in turn lease transponder capacity to private entities such as broadcasters and telecommunications companies.
Entities involved in the transmission of DBS services, which could be the target of various forms of regulation, can be classified in a similar manner to the categories used above in relation to the internet. DBS can involve:
• content producers (such as makers of programs and advertisements);
• content providers (such as DBS broadcasters and channels);
• satellite end users.
One key difference with DBS concerns its chain of control, which differs from that of the internet. Countries that have control over orbital slots lease those slots to satellite companies. The satellite companies then sublease transponder capacity to various private entities such as broadcasters. It is possible that countries in control of orbital slots could require those leasing the slots to abide by certain rules with regard to content. For example, countries in charge of orbital slots could require satellite companies—and all the entities those companies deal with—to use their best efforts to comply with the FCTC. It appears that a number of Parties to the FCTC have control of orbital slots. In addition, laws that target content producers and content providers should encompass content produced for, or delivered by, DBS.
6 Bans for content producers and content providers
Some existing measures that ban producing, publishing or broadcasting tobacco advertising in traditional media already apply to content producers and content providers who use the internet and other digital platforms—or they could be adapted in order to apply to them. Examples from the UK, Australia and Europe illustrate this.
As well as prohibiting the publication of tobacco advertising, UK legislation prohibits the printing, devising or distribution of tobacco advertising. Section 2(2) of the Tobacco Advertising and Promotion Act 2002 states that ‘a person who in the course of a business prints, devises or distributes in the United Kingdom a tobacco advertisement which is published in the United Kingdom, or causes such a tobacco advertisement to be so printed, devised or distributed, is guilty of an offence’. The use of the concept ‘devise’ means this provision should apply to content producers, or at least to some of them, as well as content providers. But the scope of the prohibitions in s 2 is limited to entities that carry on business in the UK: ss 2(4), 5(5)(c).
In addition, the definition of content distribution in s 2(3) of the UK law means that it applies to content that is distributed online. Section 2(3) states that distribution includes: ‘transmitting it in electronic form, participating in doing so, and providing the means of transmission’.[10] Exceptions that apply to distributors such as content hosts, who do not know the content contains a tobacco advertisement, are examined below in Part 7. Content producers and content providers, however, do know about the content in question (or they are reasonably taken to have that knowledge by law) because they originate content or make it available after they engage in a process of selection.
Australian law offers another example that applies to at least some producers of tobacco advertising, promotion or sponsorship. Content producers are not explicitly covered by the Australian Tobacco Advertising Prohibition Act 1992 (Cth). However, it is an offence to ‘publish’ tobacco advertisements and publication is defined to encompass a person who includes the advertisement ‘in a document...that is available, or distributed, to the public’ or ‘in a film, video, television program or radio program that...is intended to be, seen or heard by the public’.[11] This provision has the potential to reach some content producers—those who include the advertisement in certain categories of material. If it were broader in scope, it could cover producers of tobacco advertising, promotion and sponsorship that is either available to the public or intended to be made available to the public using any media and communications delivery technology.
The Australian Act also applies to those who bring the material to public availability—by ‘publishing’ or ‘broadcasting’ it. While those terms might appear to be media-specific, the definition of ‘publish’ is broad. As well as the aspects detailed above, it includes a person who ‘otherwise brings the advertisement, or something that contains the advertisement, to the notice of, or disseminates the advertisement, or something that contains the advertisement, to the public, or a section of the public, by any means (including, for example, by means of a film, video, computer disk or electronic medium)’.[12] The types of exception, which apply under many laws to entities that do not know the content contains a tobacco advertisement, are examined below in Part 7.[13]
Many other examples relevant to bans for content producers and content providers exist. One further example worth noting is the European Audiovisual Media Services Directive (which updates the Television Without Frontiers Directive). It applies controls to linear and non-linear audiovisual media services. The definition of audiovisual media services in the Directive covers only mass media services—that is, services which are intended for reception by a significant portion of the population. This includes both traditional television (linear) and on demand (non-linear) audiovisual services. While public service enterprises are covered by the Directive, it does not include activities which are not primarily economic and not in competition with television broadcasters (such as private users, blogs, sharing and exchange websites).[14]
The Directive also deals with ‘audiovisual commercial communication’. Audiovisual commercial communication means ‘images with or without sound which are designed to promote, directly or indirectly, the goods, services or image of a natural or legal entity pursuing an economic activity. Such images accompany or are included in a programme in return for payment or for similar consideration or for self-promotional purposes. Forms of audiovisual commercial communication include, inter alia, television advertising, sponsorship, teleshopping and product placement.’
The Directive has specific provisions about tobacco: it prohibits audiovisual commercial communications for cigarettes and other tobacco products, the sponsorship of audiovisual media services or programmes by undertakings whose principal activity is the manufacture or sale of tobacco products, and product placement of tobacco products.[15]
The Directive is a useful example for its breadth in dealing with advertising, sponsorship and product placement. But it does not directly deal with which entities are to be liable (and in what way) if an audiovisual commercial communication is made in breach of the Directive’s terms and, given the Directive’s focus on audiovisual media services, it does not deal with the full range of media and communications technologies. However, the Tobacco Advertising Directive[16] deals with many aspects of tobacco advertising and sponsorship on other media platforms, subject to them having cross-border effects. The Tobacco Advertising Directive also adopts broad definitions of advertising—‘any form of commercial communications with the aim or direct or indirect effect of promoting a tobacco product’—and of sponsorship—‘any form of public or private contribution to any event, activity or individual with the aim or direct or indirect effect of promoting a tobacco product’.[17]
The above examples suggest two key issues for content bans. Content bans should be broad in scope, both in terms of reaching all those involved in producing and providing the content and in terms of reaching all forms of media and communications technology. Therefore, content bans should cover content producers and content providers of tobacco advertising, promotion and sponsorship that is either available to the public or intended to be made available to the public using any media and communications delivery technology. Further issues for content producers and content providers, particularly related to jurisdiction and enforcement, are outlined below in Part 10.
7 Protection for content hosts, content navigators and access providers
The sort of direct liability that could be applied to content producers and providers is generally seen as inappropriate in relation to content hosts, content navigators and access providers. These intermediaries are usually unaware of the content to which they facilitate access. Thus, the general approach is to impose obligations only once these entities have notice of the content in question.
7.1 Protection in tobacco-related legislation
The notice and obligation approach can be seen in some law that relates directly to tobacco advertising, promotion and sponsorship. In the UK example, discussed above in Part 6, the definition of distribution in the Tobacco Advertising and Promotion Act 2002 includes online content. Section 2(3) states that distribution includes: ‘transmitting it in electronic form, participating in doing so, and providing the means of transmission’.[18] As well as covering content producers and content providers, this could make liable content hosts, content navigators and access providers. However, the offence requires the distributor to know about the content, as provided for in s 5(5):
In relation to a tobacco advertisement which is distributed as mentioned in section 2(3), a person does not commit an offence under section 2(2) of distributing it or causing its distribution if—
(a) [the person] was unaware that what [the person] distributed or caused to be distributed was, or contained, a tobacco advertisement, or
(b) having become aware of it, it was not reasonably practicable for [the person] to prevent its further distribution.
7.2 Protection in legal instruments that are not tobacco-related: the example of the E-Commerce Directive
The notice and obligation type of protection can also be seen in many laws that do not concern tobacco advertising, promotion and sponsorship. It is not rare for legal instruments to provide that intermediaries such as content hosts, content navigators and access providers cannot be required to monitor content. A notable example is the European E-Commerce Directive.[19]
The E-Commerce Directive applies to ‘information society services’, which are services normally provided for remuneration, at a distance, by electronic means and at the individual request of the recipient. The Directive provides that certain immunities must exist for particular entities involved in the provision of information society services. In particular, no general obligation to monitor content can be imposed on particular intermediaries. However, intermediaries who have notice of illegal content may be required to remove it or disable access to it.
This position is achieved by Articles 12 to 15 of the Directive. Each is outlined here. (It should be noted that the Directive does not use the language of requiring ‘reasonable efforts’ to remove or disable content, or requiring content to be removed or blocked only where that is technically feasible. This aspect of the Directive has proved controversial, and the law of some Member States does adopt these sorts of qualifications; see, for example, the German law discussed below in Part 7.3. The UK law discussed above in Part 7.1 also adopts that sort of qualification.)
The Directive provides that ‘mere conduits’ are not liable for content provided by users (nor for providing access to the service) where the mere conduit does not initiate the transmission of the content, does not select the recipient of the content, and does not select or modify the content: Art 12(1). However, a court or an administrative authority within a Member State is not precluded from requiring the ‘mere conduit’ to terminate or prevent an infringement of law: Art 12(3).
Similarly, the Directive provides that caching does not give rise to liability where the entity that does the caching (here called a ‘caching entity’) does not modify the content and complies with certain other provisions in Art 13(1).[20] In addition, however, the Directive states that immunity from liability must depend on the caching entity acting expeditiously to remove content from the cache when it has actual knowledge that the content has been removed at the source or that access to it has been disabled, or that a court or administrative authority has ordered removal or disablement: Art 13(1)(e). Again, a court or an administrative authority within a Member State is not precluded from requiring the caching entity to terminate or prevent an infringement of law: Art 13(2).
The Directive also provides immunity for entities that host content, where the entity lacks actual knowledge of illegal content and where it acts expeditiously to remove or disable access to the content when it gains such knowledge: Art 14(1). Again, a court or an administrative authority within a Member State is not precluded from requiring the hosting entity to terminate or prevent an infringement of law: Art 14(3). In addition, the Directive does not prevent a Member State establishing procedures to govern the removal or disabling of access to content: Art 14(3).
The Directive prevents a general obligation to monitor being imposed on mere conduits, caching entities or hosting entities: Art 15(1). However, entities may be required to inform public authorities of illegal content of which they become aware: Art 15(2).
The Directive also expressly encourages self-regulation in relation to the development of notice and takedown procedures (in Art 16) and most Member States have left the issue of notice and takedown to self-regulation.[21] However, Finland, Iceland, Hungary and Poland have enacted full statutory provisions relating to notice and takedown (Finland’s relates to copyright infringement only), while Belgium has implemented a co-regulatory scheme in relation to all illegal/harmful content, under which the takedown of content must be authorised by a state prosecutor.[22]
7.3 Examples of the implementation of the E-Commerce Directive
Aspects of the approaches taken in Germany, the UK, France and Poland are outlined here.
In Germany, the Federal Law to Regulate the Conditions of Information and Communications Services 1997 (the ‘Multimedia Law’) regulates the liability of intermediaries for internet content. The Multimedia Law applies to liability that arises in criminal and civil law. The basic position is that service providers can be liable for content that is prohibited under general law where it is their own content.[23] In addition, where control of the content is feasible, service providers can be liable for other content of which they have knowledge. The Multimedia Law provides:
5 Responsibility
(1) Service providers are responsible under the general laws for their own content which they make available for use.
(2) Service providers are only responsible for third-party content which they make available for use if they have knowledge of such content and blocking its use is both technically possible and can be reasonably expected...
(4) Any duties to block the use of illegal content according to the general laws remains unaffected, insofar as the service provider gains knowledge of such content [without breaching telecommunications secrecy law] ... and blocking is both technically possible and can be reasonably expected.
The UK has transposed the basic elements of the E-Commerce Directive into national law in the Electronic Commerce (EC Directive) Regulations 2002. Most of the provisions merely repeat the terms of the Directive. Regulation 22, however, is worth noting. It deals with providing notice to a service provider:
In determining whether a service provider has actual knowledge ... a court shall take into account all matters which appear to it in the particular circumstances to be relevant and, among other things, shall have regard to—
(a) whether a service provider has received a notice through a means of contact made available in accordance with regulation 6(1)(c),[24] and
(b) the extent to which any notice includes—
(i) the full name and address of the sender of the notice;
(ii) details of the location of the information in question; and
(iii) details of the unlawful nature of the activity or information in question.
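As a rough illustration only (this sketch is not drawn from the Regulations, and its field names are invented), the matters listed in Regulation 22 can be thought of as factors a court weighs rather than strict requirements:

    from dataclasses import dataclass

    @dataclass
    class NoticeToProvider:
        sender_name_and_address: str  # empty string if not supplied
        content_location: str
        unlawfulness_details: str
        via_published_contact: bool   # the reg 6(1)(c) means of contact

    def regulation_22_factors(n: NoticeToProvider) -> dict[str, bool]:
        # Each entry is a matter the court 'shall have regard to' in deciding
        # whether the provider had actual knowledge; under the Regulations,
        # no single factor is decisive on its own.
        return {
            "received via reg 6(1)(c) contact": n.via_published_contact,
            "includes sender name and address": bool(n.sender_name_and_address),
            "identifies location of the information": bool(n.content_location),
            "details the unlawful nature": bool(n.unlawfulness_details),
        }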
France also has statutory provisions for giving notice. The E-Commerce Directive has been transposed in the Loi pour la confiance dans l’Economie Numérique (Law for Confidence in the Digital Economy or ‘LEN’). Under this law, there is immunity for internet service providers for content of which they lack knowledge. The law also sets out provisions about providing notice, as part of a notice and takedown scheme. Once a service provider receives notification in a prescribed manner, it should suspend access to, or the hosting of, the allegedly illegal or infringing material. The required contents of the notification include:
• the date of the notification;
• a description of the content complained of and the location of the content;
Improper notifications can give rise to fines or imprisonment.
In Poland, the Act of 18 July 2002 on Providing Services by Electronic Means sets out circumstances in which content hosts will be exempt from liability because they have acted ‘expeditiously’ to remove or block access to offending content. A service provider who is aware of the unlawful content must disable access to the content without undue delay in order to be exempt from liability—but only where the awareness arises from official information or a ‘credible message’. In all other cases, the content host has no obligation to take down or block the content. When it receives notification by a credible message, the content host must notify the website owner concerning the takedown / blocking of the content. Upon notification, the website owner can request that the content be put back. The service provider then must make a decision, taking into account all the information received, whether or not to restore the content. (The content host is exempt from contractual liability towards a service recipient on whose website the content was displayed if the takedown / blocking of the content was not justified.)[25]
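The Polish procedure is essentially a small decision sequence. The sketch below (in Python; purely illustrative and not part of the Act, with invented names) traces the steps described above: takedown on credible notice, notification of the website owner, an optional put-back request, and a final decision by the service provider:

    from enum import Enum, auto

    class Outcome(Enum):
        NO_OBLIGATION = auto()  # awareness not from official information
                                # or a 'credible message'
        TAKEN_DOWN = auto()     # content disabled; owner made no put-back request
        RESTORED = auto()       # owner's put-back request accepted
        KEPT_DOWN = auto()      # owner's put-back request refused

    def polish_scheme(credible_message: bool,
                      owner_requests_put_back: bool,
                      takedown_justified: bool) -> Outcome:
        if not credible_message:
            return Outcome.NO_OBLIGATION
        # Host disables access without undue delay and notifies the website
        # owner, who may then request that the content be put back.
        if not owner_requests_put_back:
            return Outcome.TAKEN_DOWN
        # The host decides, on all the information received, whether to restore.
        return Outcome.KEPT_DOWN if takedown_justified else Outcome.RESTORED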
The above examples all illustrate the general approach suggested here. Content bans would apply to content producers and content providers; and notice and obligation schemes would apply to content hosts, content navigators and access providers. Issues for consideration would include whether an obligation to remove content or disable access to content, after notification, is expressed as an absolute obligation or as a qualified one; for example, an obligation qualified by what is technically possible and reasonable to expect. However, any qualification should be constrained in light of the obligations that Parties have accepted under the FCTC. The above examples also suggest the importance of notification procedures, and the option of making the obligation arise only on notification being received from a designated authority (rather than directly from the public). Identifying which authorities will have the remit to receive complaints is also important. As the 2006 Report stated:
No existing regulator in Australia or the UK has full power over both advertising and sponsorship on radio, television, mobile telephony and the internet (where remit does extend to the internet, such remit may not include all internet material).[26] This regulatory disparity is replicated widely internationally.
Such disparity may make public reporting and complaints procedures in relation to tobacco advertising confusing. By dividing responsibility for tobacco advertising among entities it also can dilute statistical and data recording capabilities as well as knowledge and experience in relation to tobacco advertising.[27]
This suggests that one entity within a Party should be designated as the contact point for all concerns about tobacco advertising, promotion and sponsorship. In addition, an internationally-based entity could be established under the FCTC (or the role could be taken on by the Conference of the Parties) so that complaints can be referred easily between states.
8 Notice and takedown schemes for content hosts
Notice and takedown schemes (as well as blocking and filtering schemes) were outlined in the 2006 Report. Where an entity that hosts content is notified that the content is illegal, the host may take down or remove the content from the computer server on which it is hosted, either voluntarily or subject to an order from an authority with legislative power to make one. This can be referred to as a notice and takedown (NTD) scheme.
Such schemes are quite common with regard to certain types of internet content, in particular illegal sexual content and content that allegedly infringes copyright. Similar schemes could be developed for tobacco advertising, promotion and sponsorship.
The 2006 Report detailed Australian and UK approaches to this issue in terms of illegal sexual content. Here, these are outlined again, along with several further examples of legislative and non-legislative approaches to notice and takedown.[28]
8.1 Notice and takedown schemes for illegal sexual content: legislative examples from Australia, Canada, Malaysia, Singapore, South Korea and Sweden
In Australia, Schedule 5 of the Broadcasting Services Act 1992 deals with some online sexual content. For internet content that is hosted in Australia, a notice and takedown scheme operates. (For content hosted outside Australia, a filtering scheme exists.) Similar approaches have been taken under Australia’s Spam Act 2003 and Interactive Gambling Act 2001. The 2006 Report described the legislative scheme in Australia:
Operating since 2000, it is administered by the national media regulator under Schedule 5 of the Broadcasting Services Act 1992, and specifically addresses the issue of cross-border content control.
The scheme involves regulating...content intermediaries...via industry-based codes of practice and a public complaints mechanism.[29]
Complaints can be made to the Australian Communications and Media Authority (ACMA) about internet content.[30] If ACMA finds the content to be ‘prohibited’—which is defined under the Act and relates to certain classification standards for sexual or violent material, drawn from standards applied to the public exhibition of films—then ACMA may order the content to be taken down if it is hosted in Australia.
ISP compliance with internet industry codes of practice, which are registered under the Act, is not compulsory as a default rule. (And membership of the industry association is voluntary.) However, the Act provides that once ACMA directs an ISP to comply with a registered industry code, the ISP must do so. Thus, ISP compliance with the scheme becomes mandatory and the NTD process can readily be enforced against Australian-based entities.
If the content is hosted overseas, ACMA refers details about the content to makers of internet filters that are approved under the Act and the filters are updated so they can block access to the content. However, the Australian legislation does not mandate blocking or filtering with respect to this prohibited foreign-based content. Instead it requires that approved filtering software be updated and requires ISPs to offer such user-level filtering to their users (with charges limited to a cost-recovery basis). Users may choose to use that filtering or not: the filtering is offered on an opt-in basis rather than being mandatory or offered on an opt-out basis to users.
In the case of ‘sufficiently serious’ illegal content, ACMA informs relevant law enforcement authorities or informs counterpart internet hotlines overseas (for reporting on to law enforcement authorities).[31]
Somewhat similar statutory schemes exist elsewhere. Examples include Canada, Malaysia, Singapore, South Korea and Sweden.
In Canada, the Criminal Law Amendment Act 2001 addresses the removal of child pornography from the internet. Under s 164.1(1) of the Act, a judicial order can be made against the ‘custodian’ of a computer system within the jurisdiction on which child pornography is stored and made available to:
(a) give an electronic copy of the material to the court;
(b) ensure that the material is no longer stored on and made available through the computer system; and
(c) provide the information necessary to identify and locate the person who posted the material.
Under s 164.1(2), the judge can then order the person who posted the material to appear in court. ‘If the person cannot be identified or located or does not reside in Canada, the judge may order the custodian of the computer system to post the text of the notice at the location where the material was previously stored and made available, until the time set for the appearance’. Canada has similar provisions for the removal of hate propaganda.
In Malaysia, internet content providers and service providers are regulated under the Communications and Multimedia Act 1998. Under the Act, the Malaysian Communications and Multimedia Commission is directed to create a ‘Content Forum’ to develop a code for offensive content.[32] As in Australia, the approach amounts to a co-regulatory system, in which compliance with the code can be ordered by the Commission.[33] Intermediaries are not required to rate content or monitor the actions of users, or to block content unless directed to do so under the complaints process.
Singapore provides another example of a legislatively-based notice and takedown scheme, although not one directed at tobacco advertising, promotion and sponsorship.[34] Singapore licenses internet content providers and service providers under its Broadcasting Act. Licensees must comply with an Internet Code of Practice, which is created and administered by Singapore’s Media Development Authority (MDA) under the Broadcasting Act.[35] The Internet Code of Practice is concerned primarily with material that is ‘against public interest or order, national harmony or which offends good taste or decency’.[36] This wide scope of content control is in line with the stricter restrictions on free speech that apply in Singapore in comparison with some other common law countries. However, the approach has similarities to that taken in other countries in requiring intermediaries to have knowledge of the content in question before an obligation arises. As the MDA Internet Industry Guidelines state:
Web publishers and server administrators are not required to monitor the Internet or pre-censor content. They are only required to deny access to prohibited materials when directed by MDA. The primary responsibility for the content remains with the author and not the publisher or server administrator.
Internet Content Providers who are not targeting Singapore as their principal market will not be subject to Singapore’s standards unless they are primarily in the business of distributing pornography.[37]
In South Korea, legislation provides that internet service providers are required to block access to websites on a list compiled by an internet regulator. The Telecommunications Business Act establishes the Korean Internet Safety Commission (KISCOM). Its role includes the eradication of subversive communications. KISCOM is empowered to define harmful content and recommend which websites should be blocked. It also operates a centre to which illegal and harmful information can be reported, which allows internet users to report various types of content (such as obscene, gambling or antisocial content, but not specifically tobacco advertising, promotion and sponsorship). KISCOM is required to investigate complaints and, once a complaint is verified, has power to add offending websites to a list which ISPs are required to block under the Internet Content Filtering Ordinance 2001.
In Sweden, under the Act on Responsibility for Electronic Bulletin Boards 1998, content hosts are legally responsible for content they host. In general, however, the obligation arises only once the existence of illegal content has been brought to their attention. Content on websites is included, but not personal communications such as email. There is an obligation to remove or make inaccessible content that is illegal (such as content related to child pornography and certain violent content). The obligations of content hosts can arise without notice when the content is obviously illegal.
8.2 Notice and takedown schemes for illegal sexual content: non-legislative example from the UK
Some countries have not specifically mandated notice and takedown schemes under legislation; the UK is an example. (The content in question is illegal under the general law, but there is no statutory scheme to facilitate notice and takedown.) The approach in the UK, with its Internet Watch Foundation (IWF), was discussed in the 2006 Report:
The IWF is a UK industry body...to which members of the public can report child pornography and some other illegal material on the internet.[38] It was formed following an agreement between the UK government, police and industry that a partnership approach was needed to tackle the distribution of child abuse images online...It is a not-for-profit company, funded by industry and includes ISPs, mobile network operators and manufacturers, content service providers, telecommunications and software companies and credit card bodies.
The IWF’s remit covers three types of content: child abuse images wherever they are hosted online; images that are criminally obscene under UK law, when they are hosted in the UK; and content that is criminally racist under UK law, when it is hosted in the UK.
The IWF operates a complaints hotline to which anyone can report material in the three categories they find online that they believe may be illegal. If the IWF agrees that the material is potentially illegal—under UK legislation and case law—it then seeks to determine the origin of the material and notifies the relevant UK or overseas law enforcement agency.[39] It also notifies the ISP when it is UK-based and asks it to remove the content (or to ask the user to do so). While acting as a hotline for the notification of complaints, the IWF also reports and notifies other hotlines internationally of potential child abuse images hosted on servers outside the UK.
The combination of a NTD scheme and notifying police has been highly effective in reducing UK-hosted child abuse content. Reports suggest that when the IWF began operating approximately 18% of material found to be illegal by the IWF was hosted in the UK. Nine years later that figure had dropped to nearly 1%.
In addition, the IWF’s annual reports provide a useful record and information source about this content issue. As well as supporting public education and awareness, such a style of centralised reporting would be extremely useful in relation to cross-border tobacco advertising control.
The IWF also explores partnerships with other industry bodies to strengthen the control of this illegal content. An example is an alliance with the Association of Payment Clearing Services (APACS). APACS is the UK trade association for institutions that deliver payment services and is the industry representative on many issues at the European level. APACS’ members reportedly account for approximately 97% of the total UK payments market.[40] The IWF and APACS have discussed developing initiatives designed to prevent the use of card payment services for the acquisition or transmission of illegal child abuse images over the internet.[41]
9 User-generated content
Varied questions exist about the treatment of user-generated internet content under the FCTC, such as content on social networking sites like Facebook and YouTube. One issue is considered here; namely, the situation where an individual has received a benefit from a tobacco company or advertising agency related to posting online the content in question as part of a ‘viral marketing’ campaign. Two points are noted about this issue.
First, posting such content online may breach the terms of use that are agreed to by users of social networking sites, and the content would be liable to be removed by the site’s operator. For example, the Facebook Terms of Use (7 November 2007) exclude ‘unsolicited or unauthorized advertising’ and uploading content that would ‘violate any local, state, national or international law’,[42] and the Facebook Advertising Guidelines (2 November 2007) state: ‘We do not accept advertising referencing, facilitating or promoting the following ... Tobacco products’.[43] To similar effect, the MySpace Terms and Conditions (11 April 2007) prohibit ‘commercial activities and/or sales without our prior written consent such as ... advertising’ and state that: ‘Prohibited activity includes, but is not limited to ... displaying an advertisement on your profile, or accepting payment or anything of value from a third person in exchange for your performing any commercial activity on or through the MySpace Services on behalf of that person, such as placing commercial content on your profile, posting blogs or bulletins with a commercial purpose, selecting a profile with a commercial purpose as one of your “Top 8” friends, or sending private messages with a commercial purpose’.[44]
Second, measures that control tobacco advertising, promotion and sponsorship by content producers and content providers should be formulated to include individuals or corporate entities that post such content on social networking sites, or cause the material to be posted. Legislative or regulatory schemes would need to provide sufficient penalties to deter involvement, and both corporate entities and their individual officers could be exposed to liability. Sufficient resources would also be required (either within individual Parties or through a central body) to fund the investigation of such content. In addition, measures that control content hosts could be applied to social networking sites. Thus, a combination of the content bans against individuals or corporate entities that post the material online, and notice and obligation schemes against the operators of social networking sites could operate.
10 Jurisdiction, enforcement and related entities
There are grounds on which Parties can claim a wide jurisdiction in relation to controlling tobacco advertising, promotion and sponsorship. In relation to jurisdiction, the 2006 Report noted how approaches to jurisdiction in international law have developed over time:
The ‘classical’ bases on which state jurisdiction may be exercised are the territorial principle and the nationality principle. The territorial principle allows a state to exercise jurisdiction over all activities occurring within its own territory. The nationality principle allows a state to exercise jurisdiction over its nationals wherever they may be[45]—thus allowing the extraterritorial exercise of jurisdiction.
While the general starting point is that jurisdiction is territorial, with extraterritorial jurisdiction a deviation from the basic norm of state territorial sovereignty, only justified in certain circumstances, a number of specific grounds for extraterritorial jurisdiction are generally accepted. In addition to the nationality principle, mentioned above, there are four bases: passive personality; protective security; universality; and the ‘effects’ doctrine. As the limits of these principles have generally not been subject to explicit consensus, common state practice, or judicial pronouncement at the international level, their boundaries are, in many respects, difficult to articulate precisely.[46]
The first important point to note is the breadth of the territoriality principle in relation to communications using the internet and other digital platforms. Communication that is available within a country’s territory can be treated as having been communicated within the jurisdiction. For example, Australian federal criminal law states that offences do not occur within the standard geographical jurisdiction of Australia unless ‘the conduct constituting the alleged offence occurs ... wholly or partly in Australia’.[47] It also states that:
if a person sends, or causes to be sent, an electronic communication:
(a) from a point outside Australia to a point in Australia; or
(b) from a point in Australia to a point outside Australia;
that conduct is taken to have occurred partly in Australia.[48]
Such an approach would deal with all material that is available within the territory of a Party. As the 2006 Report stated:
The communicative aspects of tobacco advertising may well be understood in legal terms to occur within the territory in which they are received. This approach exists in other areas of law, and would appear capable of application to some aspects of tobacco advertising. For example, in the defamation law of many states following the common law tradition, the very availability of defamatory material online within a state is sufficient to found jurisdiction.[49] [50]
The second point concerns the effects doctrine and jurisdiction. The 2006 Report stated:
The ‘effects doctrine’ is the most controversial of the bases for jurisdiction. It suggests that a state may exercise extraterritorial jurisdiction over offences committed by non-nationals outside its territory but having an effect within it. The precise qualities of the required ‘effects’ have not been resolved. Recent commentary suggests that there is a ‘growing recognition that an assertion of jurisdiction over offences abroad having an intended and substantial effect within a state may be justified’,[51] and there is some authority validating jurisdiction over conduct outside state territory ‘that has or is intended to have substantial effect within its territory’,[52] thus intention is not necessarily a requisite element. To date the effects doctrine has been invoked almost exclusively in the context of restrictive trade practices law,[53] though it is potentially very expansive.[54]
As outlined above, states could claim jurisdiction over communications that are available within their territory. States may, however, encounter difficulties in enforcing such laws outside their territory. Therefore, an additional measure would be to use the concepts of effect and targeting to impose liability on entities that target people within the territory of the Party. As the 2006 Report noted:
One of the responses to uncertainties about how to deal with activities taking place outside of a Party’s territory and engaged in by non-nationals, but having an effect within the Party’s territory, has been to focus on whether material can be said to be ‘targeted’ at persons within a state’s territory. An example of a law modelled on this basis is the Australian Interactive Gambling Act 2001, which is expressed to have extra-territorial application, with the offence of publishing an interactive gambling service advertisement in Australia applying to the activities of overseas entities engaged in the provision of online gambling services outside Australia where services are targeted at Australian users.[55]
The concept of ‘targeting’ also appears to be a useful way of addressing ... considerations relating to both non-intervention and comity. Where a state’s citizens are ‘targeted’ by an entity operating from within another state’s territory, the state has an understandable claim to exercise jurisdiction over conduct occurring within the other state’s territory.[56]
The above review of jurisdictional matters emphasises two points. First, the breadth of the territoriality principle means jurisdiction can be claimed over material that is available within the territory of a Party without any extra-territorial claim to jurisdiction, because the entities responsible for the content’s availability will be taken to have acted within the territory of the Party. Second, the effects principle could also be drawn on in drafting legislation as a way of respecting concerns about expansive claims to jurisdiction.
Beyond questions of jurisdiction, serious issues arise about the enforcement of judgments internationally for breaches of tobacco advertising, promotion and sponsorship controls. There is no international treaty on the recognition and enforcement of foreign judgments. This means the law of each country in which recognition is sought must be considered separately. The approach of countries varies. Some (such as Australia and the United States) are relatively liberal in recognising and enforcing foreign judgments; others (such as the Netherlands in relation to judgments from outside the EU, and Indonesia) recognise no foreign judgments.
In countries which are generally receptive to foreign judgments, especially common law jurisdictions, one issue above all will challenge any attempt to enforce judgments related to the breach of laws against tobacco advertising, promotion and sponsorship: the judgment sought to be enforced must not be penal in nature. This means that criminal sanctions such as fines cannot be enforced, even in countries which are generally receptive to foreign judgments. The exclusion also means that any judgment involving the vindication of ‘public rights’—for example deterrence, punishment or retribution as opposed to private compensation—will not be enforced. Parties could agree to enforce such judgments between themselves, but that would be a departure from general practice in areas of law outside the FCTC. The issue could be an important matter to address in a Protocol or in decisions of the Conference of the Parties.[57] The challenges that will exist for enforcing judgments in other states mean that Parties should also focus attention on applying controls to entities that are within their territory, through the combination of content bans for content producers and content providers, and notice and obligation schemes for content hosts, content navigators and access providers. In almost every instance of tobacco advertising, promotion and sponsorship, one or more of those entities will be within the territory of a Party.
Another way in which difficulties of enforcement could be tackled would be to hold domestic entities responsible for the conduct of related entities that are located outside the territory of the Party. The legal treatment of multinational enterprises is a complex and evolving area; it concerns enterprises that can coordinate activities in more than one country.[58] A great variety of legal and commercial relationships come within the field. The ownership of such enterprises may be state-based or private, and their relationships across territories can be implemented by equity measures (such as ownership of a domestic subsidiary by a foreign parent company), contractual measures (agreements between two or more contracting parties), joint ventures and informal alliances.
The question of when a foreign related entity can be made liable for the action or inaction of a domestic entity has received significant academic and judicial analysis. But that analysis has involved contexts different from the situation of tobacco advertising, promotion and sponsorship. For example, where the actions of a foreign parent lead a domestic subsidiary to become insolvent, creditors of the subsidiary may seek to make the foreign entity liable to compensate them. The relevant question for Article 13 of the FCTC, however, is the reverse; that is, imposing obligations on a domestic entity that is affiliated with a foreign entity where the foreign entity is the one responsible for tobacco advertising, promotion and sponsorship being available domestically.
Some types of obligation may be appropriate to impose on the domestic entity; for example, an obligation to use its best efforts to prevent entities with which it is affiliated (by measures such as equity, contract, joint venture or informal alliance) from being involved in tobacco advertising, promotion and sponsorship contrary to Article 13 of the FCTC. The foreign related entity is likely to have an increasing ability to prevent tobacco advertising, promotion and sponsorship from being available within the territory of the Party; for example, through the use of geolocation techniques.
This strategy to deal with related entities could be added to those considered earlier in this report. In that case, controls would be applied primarily to entities within the territory of a Party through:
• content bans for content producers and content providers; and
• notice and obligation schemes for content hosts, content navigators and access providers.
While this strategy could be considered by Parties, it should be noted that it does not appear to be an approach for which useful and directly comparable examples exist in other areas of law.
Appendix – Articles of the FCTC
Article 1
Use of terms
For the purposes of this Convention...
(c) “tobacco advertising and promotion” means any form of commercial communication, recommendation or action with the aim, effect or likely effect of promoting a tobacco product or tobacco use either directly or indirectly...
(g) “tobacco sponsorship” means any form of contribution to any event, activity or individual with the aim, effect or likely effect of promoting a tobacco product or tobacco use either directly or indirectly;
Article 13
Tobacco advertising, promotion
and sponsorship
1. Parties recognize that a comprehensive ban on advertising, promotion and sponsorship would reduce the consumption of tobacco products.
2. Each Party shall, in accordance with its constitution or constitutional principles, undertake a comprehensive ban of all tobacco advertising, promotion and sponsorship. This shall include, subject to the legal environment and technical means available to that Party, a comprehensive ban on cross-border advertising, promotion and sponsorship originating from its territory. In this respect, within the period of five years after entry into force of this Convention for that Party, each Party shall undertake appropriate legislative, executive, administrative and/or other measures and report accordingly in conformity with Article 21.
3. A Party that is not in a position to undertake a comprehensive ban due to its constitution or constitutional principles shall apply restrictions on all tobacco advertising, promotion and sponsorship. This shall include, subject to the legal environment and technical means available to that Party, restrictions or a comprehensive ban on advertising, promotion and sponsorship originating from its territory with cross-border effects. In this respect, each Party shall undertake appropriate legislative, executive, administrative and/or other measures and report accordingly in conformity with Article 21.
4. As a minimum, and in accordance with its constitution or constitutional principles, each Party shall:
(a) prohibit all forms of tobacco advertising, promotion and sponsorship that promote a tobacco product by any means that are false, misleading or deceptive or likely to create an erroneous impression about its characteristics, health effects, hazards or emissions;
(b) require that health or other appropriate warnings or messages accompany all tobacco advertising and, as appropriate, promotion and sponsorship;
(c) restrict the use of direct or indirect incentives that encourage the purchase of tobacco products by the public;
(d) require, if it does not have a comprehensive ban, the disclosure to relevant governmental authorities of expenditures by the tobacco industry on advertising, promotion and sponsorship not yet prohibited. Those authorities may decide to make those figures available, subject to national law, to the public and to the Conference of the Parties, pursuant to Article 21;
(e) undertake a comprehensive ban or, in the case of a Party that is not in a position to undertake a comprehensive ban due to its constitution or constitutional principles, restrict tobacco advertising, promotion and sponsorship on radio, television, print media and, as appropriate, other media, such as the internet, within a period of five years; and
(f) prohibit, or in the case of a Party that is not in a position to prohibit due to its constitution or constitutional principles restrict, tobacco sponsorship of international events, activities and/or participants therein.
5. Parties are encouraged to implement measures beyond the obligations set out in paragraph 4.
6. Parties shall cooperate in the development of technologies and other means necessary to facilitate the elimination of cross-border advertising.
7. Parties which have a ban on certain forms of tobacco advertising, promotion and sponsorship have the sovereign right to ban those forms of cross-border tobacco advertising, promotion and sponsorship entering their territory and to impose equal penalties as those applicable to domestic advertising, promotion and sponsorship originating from their territory in accordance with their national law. This paragraph does not endorse or approve of any particular penalty.
8. Parties shall consider the elaboration of a protocol setting out appropriate measures that require international collaboration for a comprehensive ban on cross-border advertising, promotion and sponsorship.
[*] Director, CMCL–Centre for Media and Communications Law; Associate Professor, Melbourne Law School, University of Melbourne; <http://www.law.unimelb.edu.au/cmcl>.
[**] Research Fellow, CMCL–Centre for Media and Communications Law, Melbourne Law School, University of Melbourne.
[1] Andrew T Kenyon and Jonathan Liberman, Controlling Cross-Border Tobacco Advertising, Promotion and Sponsorship–Implementing the FCTC (Melbourne: University of Melbourne, 2006) available at <http://papers.ssrn.com/sol3/papers.cfm?abstract_id=927551> (‘2006 Report’). The 2006 Report discussed options for a Protocol, but a great deal of its analysis applies equally to Guidelines to Article 13 of the FCTC.
[2] Ibid 41–56.
[3] One example is provided by Australia’s Communications Legislation Amendment (Content Services) Act 2007 (Cth), which inserts a new Schedule 7 into the Broadcasting Services Act 1992 (Cth). When it comes into force, it will replace the existing arrangements for internet content (restricted access and the notice and take down scheme) and mobile content (SMS, MMS and walled-garden premium services, and the notice and take down scheme in operation). See Part 8 below for a discussion of the earlier Australian approach to internet content.
[4] The use of a ‘layers’ analogy is reasonably common in the literature, with various approaches being adopted; see eg Yochai Benkler, ‘From Consumers to Users: Shifting the Deeper Structures of Regulation’ (2000) 52 Federal Communications Law Journal 561, 562–63; see also Andrew D Murray, The Regulation of Cyberspace: Control in the Online Environment (Abingdon: Routledge Cavendish, 2007) 74–89. The version described here is relatively simple, but could be useful for thinking about controlling tobacco advertising, promotion and sponsorship.
[5] The terminology and definitions expand upon the three core entities focussed on in the 2006 Report, above n 1; that is, advertising producers and their agents; content providers; and technological intermediaries.
[6] The higher level approach can be seen in existing legal instruments, such as the European E-Commerce Directive; see below Part 7.
[7] That is so at least in regard to content hosts located within the territory of a Party. The application of blocking or filtering in relation to other content—content that is not tobacco advertising, promotion or sponsorship—has often been contentious when it has been sought to be applied at network level rather than by end-users, due in part to the effects such blocking or filtering can have on network performance. For an example of legislative underpinning for user-level filtering see Kenyon and Liberman, above n 1, 51–52.
[8] A satellite in geostationary orbit is located above the equator and is stationary in relation to the earth. These are the only types of satellites that have operated commercially in the broadcasting and telecommunications sectors: see Monroe E Price, Satellite Broadcasting as Trade Routes in the Sky (1999) and its footnote 7; <http://www.transcomm.ox.ac.uk/working papers/traderouteshk.PDF>.
[9] A transponder is a device on a communications satellite that receives, amplifies and transmits signals. The amount of information that can be transmitted on a single transponder will vary. Digital compression technologies, however, have greatly increased the individual transponder capacity beyond the single-television-channel capacity of transmissions in analogue format.
[10] It is also worth noting s 7 of the Tobacco Advertising and Promotion Act 2002 (UK), which states that the relevant government minister may make orders to amend provisions of the Act related to electronic distribution in light of technological change.
[11] Tobacco Advertising Prohibition Act 1992 (Cth) s 10.
[12] Tobacco Advertising Prohibition Act 1992 (Cth) s 10.
[13] The Australian exception is not detailed in Part 7; for the Australian position see clause 91 of schedule 5 of the Broadcasting Services Act 1992 (Cth).
[14] The definition also excludes services where audiovisual media is merely incidental to the service and not its principal purpose.
[15] These articles replace the general prohibition contained in Article 13 of the Television Without Frontiers Directive: ‘All forms of television advertising and teleshopping for cigarettes and other tobacco products shall be prohibited.’
[16] Directive 2003/33/EC of 26 May 2003 of the European Parliament and of the Council on the approximation of the laws, regulations and administrative provisions of the Member States relating to the advertising and sponsorship of tobacco products [2003] OJ L152.
[17] Ibid Art 2.
[18] It is also worth noting s 7 of the Tobacco Advertising and Promotion Act 2002 (UK), which states that the relevant government minister may make orders to amend provisions of the Act related to electronic distribution in light of technological change.
[19] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on Certain Legal Aspects of Information Society Services, in particular Electronic Commerce, in the Internal Market [2000] OJ L178/1.
[20] A ‘cache’ is a temporary digital storage area used to facilitate faster access to data. A cache exists where original digital information—for example, data stored on one computer server—is duplicated and stored in another location. The reproduction is referred to as a ‘cached’ copy. It can be faster and cheaper to retrieve frequently accessed information by using caches, and they are part of the standard operation of many internet intermediaries.
[21] See eg Commission of the European Communities, Report from the Commission to the European Parliament, the Council and the European Economic and Social Committee (Brussels, 21 November 2003) 14.
[22] Ibid and see eg Lilian Edwards, ‘Articles 12-15 ECD: ISP Liability: The Problem of Intermediary Service Provider Liability’ in Lilian Edwards (ed), The New Legal Framework for E-Commerce in Europe (2005) 122.
[23] Service providers are ‘natural or legal persons or groups of persons who make either their own or third-party teleservices available for use, or who provide access for such use’.
[24] That regulation provides: ‘A person providing an information society service shall make available to the recipient of the service and any relevant enforcement authority, in a form and manner which is easily, directly and permanently accessible, the following information ... (c) the details of the service provider, including his electronic mail address, which make it possible to contact him rapidly and communicate with him in a direct and effective manner’.
[25] See eg Katarzyna Kryczka, ‘Ready to Join the EU Information Society? Implementation of E-Commerce Directive 2000/31/EC in the EU Acceding Countries–The Example of Poland’ (2004) 12 International Journal of Law and Information Technology 55.
[26] Eg in Australia, Schedule 5 of the Broadcasting Services Act 1992 applies to internet content of only some subject matter (as outlined above) and some technical forms; it excludes intranets, email, chat rooms, live audio or video streaming and material contained in newsgroups. ACMA’s remit does not extend to ‘content that is accessed in real time without being previously stored’.
[27] Kenyon and Liberman, above n 1, 52.
[28] Reference could also be made to schemes that apply to material that allegedly infringes copyright; see eg Australia’s Copyright Act 1968 (Cth) ss 116AA–116AJ and the US Digital Millennium Copyright Act.
[29] The complaints mechanism is provided for by Part 4 of Schedule 5 of the Broadcasting Services Act 1992.
[30] The government decided it would not be reasonable for ISPs to be the first point of contact for lodging and investigating complaints. Complaints are made directly to ACMA via an internet complaints hotline. It is also worth noting that Schedule 5 ensures that ISPs are not liable for content of which they are unaware. (And they are indemnified against liability under other Australian law if it would require them to monitor content on their services.)
[31] Kenyon and Liberman, above n 1, 51–52. While the Australian approach is being changed, in pursuit of greater consistency across internet and mobile delivery platforms, the general notice and obligation approach remains. See further, Communications Legislation Amendment (Content Services) Act 2007 (Cth), which inserts a new Schedule 7 into the Broadcasting Services Act 1992 (Cth) and is due to come into force on 20 January 2008. Schedule 7 covers content delivered via all carriage services and extends regulation to a broader range of content than the former scheme—live streamed content, links services and commercial content services—and extends the requirement for restricted access regimes to a wider range of sexual content.
[32] Communications and Multimedia Act 1998 (Malaysia) ss 212, 213. Offensive content is also prohibited under s 211.
[33] Under s 99 of the Communications and Multimedia Act 1998 (Malaysia).
[34] Singapore does have strong tobacco control legislation; see eg M Assunta and S Chapman, ‘“The World’s Most Hostile Environment”: How the Tobacco Industry Circumvented Singapore’s Advertising Ban’ (2004) 13 Tobacco Control 51.
[35] Broadcasting Act (Cap 28) (Singapore) s 6.
[36] Singapore, Media Development Authority, Internet Code of Practice (1 November 1997) paragraph 1.
[37] Singapore, Media Development Authority, Internet Industry Guidelines (nd, accessed October 2007) paragraphs 20–21.
[38] See <http://www.iwf.org.uk>.
[39] The IWF also works in partnership with the Internet Hotline Providers Association (INHOPE), which is a network of international hotlines and crime squads who pass relevant information to INTERPOL.
[40] See <http://www.apacs.org.uk/payment_options/plastic_cards_5_2.html>.
[41] Kenyon and Liberman, above n 1, 51–52.
[42] <http://www.facebook.com/terms.php>. The governing law is stated to be that applicable in the US state of Delaware and the courts to be used in relation to disputes are the state and federal courts of California.
[43] <http://www.facebook.com/ad_guidelines.php>.
[44] <http://www.myspace.com/index.cfm?fuseaction=misc.terms>. The governing law is stated to be that applicable in the US state of California. The Facebook and MySpace terms could be contrasted with the YouTube Terms of Use and Community Guidelines: <http://www.youtube.com/t/terms>, <http://www.youtube.com/t/community_guidelines>. (The governing law is stated to be that applicable in the US state of California.) These terms place some restrictions on commercial use of the YouTube site (eg they disallow uses for the primary purpose of gaining advertising revenue for another site) but they do not have the same direct limitations as Facebook and MySpace.
[45] Gillian Triggs, International Law, Contemporary Principles and Practices (2006) 351.
[46] Kenyon and Liberman, above n 1, 33.
[47] Commonwealth Criminal Code s 14.1.
[48] Commonwealth Criminal Code s 16.2.
[49] Eg Dow Jones v Gutnick (2002) 210 CLR 575 (Australian HC). An equivalent approach has been taken under French criminal law: La Ligue Contre Le Racisme et L’Antisemitisme v Yahoo! Inc, TGI Paris (20 November 2000).
[50] Kenyon and Liberman, above n 1, 38.
[51] See Triggs, above n 45, 367–73 (emphasis added).
[52] American Law Institute, Restatement of the Law (Third), The Foreign Relations Law of the United States (1987) [402] (emphasis added).
[53] The most notorious are the US Clayton and Sherman Acts 15 USC §§ 1-27 (1914), but see also the European extraterritorial antitrust law, Treaty Establishing the European Economic Community, opened for signature 25 March 1957, 298 UNTS 3, arts 85-6 (entered into force 1 January 1959); and eg Gillian Triggs, ‘Extraterritorial Reach of United States Anti-Trust Legislation: The International Law Implications of the Westinghouse Allegations of a Uranium Producers’ Cartel’ (1979) 12 Melbourne University Law Review 250; Paul Torremans, ‘Extraterritorial Application of EC and US Competition Law’ (1996) 21 European Law Review 280.
[54] Kenyon and Liberman, above n 1, 35.
[55] See sections 14, 61CA and 61EA in particular.
[56] Kenyon and Liberman, above n 1, 38–39.
[57] See eg Allyn L Taylor and Karen C Sokol, ‘The Power of the FCTC Conference of the Parties’ Obligations to Control Cross-Border Tobacco Advertising, Promotion and Sponsorship (Article 13)’.
[58] See generally Peter Muchlinski, Multinational Enterprises and the Law (Oxford: Oxford University Press, 2nd ed, 2007).