EUPLANT

Strengthened Liability for User-upload Platforms: Is Filtering Obligation Reasonable?

The past few years have seen heated debate on the European Union's ('EU') copyright reform, first proposed in September 2016. In April 2019, the new Directive on copyright and related rights in the Digital Single Market (hereinafter the 'CDSM Directive') was adopted. It entered into force on 6 June 2019 and must be transposed into national law by every Member State by 7 June 2021.


ZHU Xiaorui, Tsinghua Law School

What is worth noting and commenting on is Article 17 (formerly Article 13 in the European Commission's proposal) of the CDSM Directive, the so-called "upload filters" provision. Article 17 introduces a new liability regime for user-upload platforms: it makes certain qualified online platforms directly liable for content uploaded by their users and explicitly excludes them from the safe harbors for their users' copyright-infringing acts. It applies to "online content-sharing service providers" (OCSSPs) as defined in Article 2(6) and supported, inter alia, by recital 64. Well-known platforms such as YouTube, Facebook and other user-upload platforms that fall within this definition and are not carved out are likely to be regulated by this provision.

Article 17 has attracted widespread attention both within and outside the EU for altering the existing liability regime for online platforms and, consequently, the landscape of EU copyright law. The debate centers on the potential effects of the statutory obligations placed on OCSSPs to "ensure the unavailability" and "prevent future uploads" of specific protected content. Although references to the use of specific filtering technologies were removed from the enacted text, Article 17 can reasonably be interpreted as intended to achieve this result. Proponents consider that this design will strengthen the position of authors and help them obtain fair remuneration for the online use of their works, while opponents contend that the prohibitive costs of performing these obligations would hinder the development of the Internet industry. More importantly, opponents argue that this design is incompatible with existing EU directives as well as the EU Charter of Fundamental Rights, as interpreted by the CJEU, ultimately threatening the consistency of the whole EU legal order.

Strengthened Liability: From “Safe Harbors” to Filtering Obligation

From the outset, it is useful to set out the background and context of the EU's copyright reform. Before the enactment of the CDSM Directive, online intermediaries were protected by "safe harbors" and liability exemptions under the European Union's E-Commerce Directive. Safe harbor rules, originating in the United States' Digital Millennium Copyright Act, still dominate the existing liability regime. Under these rules, Internet intermediaries are exempt from copyright liability for their users' infringements if they have no knowledge of or control over the infringing content, or have taken it down upon notification. In most instances, online platforms are not required to prevent infringing content from being uploaded or stored on their platforms.

Over the past two decades, however, the dramatic development of digital and network technology has revolutionized the way copyrighted content is produced, accessed and distributed. As works are communicated ever more broadly, the enhanced reach of network communication also reduces copyright holders' control over their works and raises the cost of safeguarding their interests. Online infringement has thus greatly proliferated. Meanwhile, the increasingly active role played by online platforms in making copyrighted content available online for profit-making purposes has caught the attention of copyright holders and EU authorities. What is frequently referred to as the "value gap", the gap between the revenue made by user-upload platforms and the revenue returned to copyright holders, was identified and became the major rationale underpinning the copyright reform.

It should also be noted that the CDSM Directive and its Article 17 are part of a broader strategic push in the EU. As early as May 2015, the European Commission adopted its Digital Single Market (DSM) strategy, aiming to create the proper regulatory conditions and environment for digital networks and services to flourish. The trend towards enhanced liability or responsibility for online platforms was already present in 2017. The European Commission put forward the Communication on Tackling Illegal Content Online in September 2017 and the Commission Recommendation on measures to effectively tackle illegal content online in March 2018, both of which are intended to prevent the dissemination of terrorist content online. In addition, an upcoming revision of the Audio-visual Media Services Directive might also require online platforms to put in place measures to protect minors from harmful content and to protect everyone from incitement to hatred. It is thus clear that a package of online intermediary liability reforms is under consideration in the EU as part of the Digital Single Market Strategy.

The CDSM Directive is one of the efforts to modernize the rules governing online platforms to keep pace with technological change and market developments. Having undergone several amendments, the final legal framework that Article 17 entails is rather complex. Pursuant to this provision, an OCSSP performs an act of communication to the public, or an act of making available to the public, within the meaning of the Directive when it gives the public access to copyright-protected works uploaded by its users for profit-making purposes. An OCSSP shall therefore obtain an authorization from the copyright holders in order to perform its service. The Directive also provides that such an authorization covers acts carried out by the users of the service.

In the absence of such authorization, OCSSPs can be exempted from liability only if they meet the following three cumulative obligations:

(a) making best efforts to obtain an authorization;

(b) making best efforts to ensure the unavailability of specific works for which the copyright holders have provided them with the relevant and necessary information; and

(c) acting expeditiously, upon notice from copyright holders, to disable access to the notified infringing content and making best efforts to prevent its future upload. The provision also lists factors to be considered in determining whether OCSSPs have complied with these obligations where no authorization has been granted.

To put it simply, this design leaves the OCSSPs concerned with two options: they can either obtain an authorization or meet the three conditions above to avoid liability. Since it is impossible to obtain all the authorizations required for everything users might upload, the adoption of upload filters is highly likely to be unavoidable in order to fulfill the cumulative obligations stipulated therein. Under the new regulatory framework, online platforms are primarily liable for the content available on their platforms if they fall foul of the obligations stipulated in Article 17.

The Reasonableness of Filtering Obligation

Safe harbor rules were originally introduced to promote innovation and the then-emerging Internet market. Can they still serve that goal when confronted with radically changed technologies and market conditions? Can they still maintain the delicate balance between intermediaries, copyright holders and the public in a cost-effective way? Is it justifiable and feasible to make OCSSPs take on greater responsibility for helping to prevent copyright infringement? These critical questions need to be contemplated by policymakers. This post seeks to demonstrate the merits of strengthened liability and of introducing a filtering obligation for user-upload platforms. It suggests that it is reasonable to intensify the duty of care of OCSSPs, so that they cooperate with copyright holders to establish filtering mechanisms that prevent copyright infringement efficiently and without prejudice to the rights and interests of the public.

First, advances in content-recognition technology make it feasible to establish a filtering mechanism through cooperation between OCSSPs and copyright holders. By and large, content-based filtering technology has reached maturity, with content fingerprinting tools representing the mainstream of future development. Content fingerprinting tools examine characteristics of the underlying text and media files in order to identify them. More importantly, these tools are robust to alterations in the contents of the files and can be tailored to different types of protected content. Although incapable of fully eliminating copyright infringement, filtering technologies play an increasingly important role in the online ecosystem by identifying and removing infringing material. For example, YouTube has created a filtering system known as Content ID which seeks to protect video creators: YouTube scans authorized copyrighted content into a large database and compares it against content submitted by other uploaders. This automatic filtering system offers higher speed, a lower error rate and easier operation than the traditional notice-and-takedown procedure, which is both time-consuming and labor-intensive.
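Content ID's actual matching algorithms are proprietary, but the general fingerprint-and-compare idea can be sketched in a few lines. The following minimal, purely illustrative Python example (all names, chunk sizes and data are hypothetical, not YouTube's implementation) reduces a registered work to a set of hashed chunks and scores an upload by set overlap:

```python
import hashlib

def fingerprint(data: bytes, chunk_size: int = 4) -> set:
    # Reduce a work to a set of hashes of overlapping chunks;
    # this set acts as a crude content "fingerprint".
    return {
        hashlib.sha256(data[i:i + chunk_size]).hexdigest()
        for i in range(max(len(data) - chunk_size + 1, 1))
    }

def similarity(fp_a: set, fp_b: set) -> float:
    # Jaccard similarity between two fingerprints: a high score
    # suggests the upload substantially overlaps a registered work.
    return len(fp_a & fp_b) / len(fp_a | fp_b) if fp_a | fp_b else 1.0

# Hypothetical registered work and a near-identical user upload.
registered = b"all your base are belong to us"
upload = b"all your base are belong to me"

score = similarity(fingerprint(registered), fingerprint(upload))
# The upload shares most chunks with the registered work, so the
# score is high but below 1.0; a platform could flag uploads whose
# score exceeds some threshold for blocking or monetization.
```

Real fingerprinting tools work on perceptual features of audio and video rather than raw bytes, precisely so that they remain robust to the alterations (re-encoding, cropping, pitch-shifting) mentioned above.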

Second, under the new technical and market conditions, OCSSPs are the lower-cost avoiders, relative to copyright holders, of online copyright infringement. Copyright law has always sought a reasonable allocation of the cost of preventing infringement between copyright holders and the public. Safe harbor rules were tailored to the technical conditions and Internet ecosystems of two decades ago, when OCSSPs could not readily identify potentially copyrighted material or determine whether its use was authorized. It was cost-effective at that time to place the burden of policing online infringement squarely on copyright holders. However, technological progress has changed this premise, namely the relative cost of preventing infringement. At present, OCSSPs are well positioned to act as "gatekeepers" facing enhanced liability, not only because they possess the power to restrict infringers' access but also because they have the technical capacity to detect and prevent copyright infringement at a reasonable cost.

Third, imposing a filtering obligation on OCSSPs facilitates cooperation between online platforms and copyright holders by helping to overcome the obstacle of transaction costs. Both safe harbor rules and the new liability regime aim to promote cooperation between online platforms and copyright holders in fighting infringement. Although implementing a filtering mechanism could make this cooperation more efficient and profitable, copyright holders cannot readily persuade online platforms to establish filtering systems without the intervention of the law: the transaction costs are prohibitive because of the limited bargaining power of individual, dispersed copyright holders. Additionally, many online platforms in effect benefit from the communication of infringing content. This asymmetric bargaining position and the revenue gap are precisely the problems the new liability regime seeks to address. The design of enhanced liability and a filtering obligation is a reasonable response aimed at fostering both the copyright industry and the platform economy.

Admittedly, this new design deviates from the existing safe harbor rules and entails a heavier duty of care for online platforms, which might raise operating costs and undermine user privacy and free-speech interests. However, this does not mean that the safe harbor rules still hold true. Policymakers have to make a proper trade-off and strike a new balance between copyright protection on the one hand and the freedom to conduct business, freedom of expression, and the protection of personal data and privacy on the other. It seems to be only a matter of time before copyright law confirms the obligation to adopt filtering mechanisms.
