In the mid to late 90s the creative industries - music, film, TV, photography, video games, etc - were worried about new technology that was starting to facilitate more infringement than could ever happen with tape trading, VCRs, or floppy disks - and it was only going to get worse. But compared to today, the internet landscape was simpler. Most web content was either hosted by a user's internet service provider, or by a host that provided paid-for storage, or on educational networks. If these services and institutions could be compelled to cancel the accounts of repeat infringers, this would act as a strong deterrent.
The problem at that time was not content-sharing platforms in the form we know them today. By the end of the 90s a few early proto-platforms existed entirely to share material, like ShareYourWorld.com, but the physical limitations of the time meant this was a manageable problem without rampant infringement, and the grainy videos and overcompressed audio were generally a poor substitute for CDs and DVDs.
Similarly, there were a handful of free web hosting options, such as Geocities or Angelfire, but while popular, these were extremely limited. Storage space was barely in the double digits of megabytes, so you could only practically have stored 4 or 5 MP3s online anyway. The commercial implications of infringement on these platforms were minimal. However, the idea of a service that subsidised web hosting with advertising would live on, and change everything.
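As a rough sanity check on that "4 or 5 MP3s" figure, here's a back-of-the-envelope calculation. The quota and file-size numbers are my assumptions, chosen as typical for the era rather than taken from any specific provider:

```python
# Back-of-the-envelope: how many MP3s fit on a late-90s free host?
# Assumed figures (typical for the era, not from any one provider):
FREE_QUOTA_MB = 15   # free hosts offered quotas in the low double digits of MB
MP3_SIZE_MB = 3.5    # a ~3.5-minute song encoded at 128 kbps is roughly 3.5 MB

songs = int(FREE_QUOTA_MB // MP3_SIZE_MB)
print(f"~{songs} songs fit in a {FREE_QUOTA_MB} MB quota")  # prints "~4 songs fit in a 15 MB quota"
```

Even under generous assumptions, a free account could hold only a handful of songs, which is why infringement there was a commercial footnote.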
The USA's approach to the competing concerns of the internet's effect on the creative industries and its benefits for the technology industry was to attempt to strike a balance in the Digital Millennium Copyright Act, or DMCA for short. The legislation protects most internet services from being liable for infringing content placed on their system by users, providing that they don't know about the exact infringement and that, when they're made aware of it, they remove or disable access to it promptly. These are the 'safe harbors' which protect the provider as long as they follow a set of rules.
The first of these rules was the need to create and follow a repeat infringer policy, as hinted at above. This was intended to be the deterrent aspect. The second rule is that the service provider "accommodates and does not interfere with standard technical measures", where such measures would be used by copyright owners to identify their works, and be made available on reasonable terms and at reasonable cost.
The DMCA also places quite specific requirements on the nature and content of any notification to a host of infringing content. What is perhaps not well understood - perhaps even deliberately obscured by some - is that takedown notices are not a new thing created or enabled by the DMCA. Rather, now that the DMCA is in effect, the only effective form of legal notice remaining available to rightsholders is a DMCA-compliant one - hence the subsequent 'DMCA takedown' vernacular.
The corresponding law in the EU (or more precisely, a directive that required member states to implement such a law) is Directive 2000/31/EC. It has a very similar effect: hosts are not responsible for what is on their site provided they are merely hosting, caching, or relaying it, and provided they remove or disable access to infringing content shortly after being made aware of it. Given the 2000 EU directive's similarities to the DMCA provisions, I'll use 'safe harbours' to mean both pieces of legislation henceforth, except where otherwise noted.
In the years after the DMCA and Directive 2000/31/EC were passed, two key developments happened in parallel.
First, bandwidth and storage capacity continued to grow, meaning a typical user could store more and larger files on both paid and free hosting, and other users could download them far more quickly. Now that it no longer took 15 minutes to upload a single song over a dial-up modem, and you could store hundreds or thousands of songs rather than a handful, the capability for online copyright infringement grew dramatically.
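That "15 minutes per song" figure checks out arithmetically. A quick sketch, assuming a 3.5 MB MP3 and a 33.6 kbps modem (a common real-world dial-up speed; even "56k" modems uploaded slower than their headline rate):

```python
# How long did uploading one MP3 take over dial-up?
# Assumed figures: a ~3.5 MB song and a 33.6 kbps modem link.
MP3_SIZE_MB = 3.5
MODEM_KBPS = 33.6  # kilobits per second

bits = MP3_SIZE_MB * 1024 * 1024 * 8      # file size in bits
seconds = bits / (MODEM_KBPS * 1000)      # transfer time at line rate
print(f"{seconds / 60:.0f} minutes")      # prints "15 minutes"
```

On a broadband connection of the mid-2000s the same transfer dropped to seconds, which is the step change the paragraph describes.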
Second, technology companies learned how to make online advertising more effective. The 'dumb' banner ads of the 90s started to give way to smarter advertising that could be tailored to a user's direct needs - first by appearing next to their search queries, but soon afterwards appearing on third-party sites. By combining information about a user's search habits with the specifics of the site that user was currently visiting, it was possible to serve up more valuable advertising than ever before. This in turn meant that entirely ad-supported online businesses were viable, creating a predatory pricing effect that drove most subscription services out of the market.
This combination created new conditions which led inexorably to the situation we have today.
So, unlike the late 90s and early 2000s, when users essentially paid for the right to host their work, the system has been turned on its head: internet companies now have an incentive to encourage their users to upload more and more high-quality content to draw other people to their service, whether or not the users have the rights to the works they're uploading.
To add insult to injury, the 'standard technical measures' mentioned in the DMCA never came into being. Neal Turkewitz discusses this in detail elsewhere, but the essence of the matter is that the platforms somehow managed to retain their safe harbor protection without doing anything to accommodate standard technical measures. Perhaps mere unwillingness to engage in an industry-wide discussion on the matter meant that standards could never arise, and therefore there was never anything to accommodate. Whatever the reason, rightsholders were denied this capability, rendering the DMCA largely toothless for effective enforcement of copyright.
Since users on these platforms are practically anonymous, terminating a user's account has little effect, meaning a repeat infringer policy (as mandated by the DMCA) is no longer the deterrent it was originally intended to be. Unlike signing up for a new ISP or paying for web hosting, the user can create another account at no cost and in very little time, and re-upload exactly the same infringing material should they so choose. They don't risk losing access to the internet, or even to the service they were removed from.
Worse, the "you're only liable once you know about it" aspect of these laws has created a perverse situation where the online platforms have an incentive not only to turn a blind eye to what is on their site, but to actively make it harder for other people to bring illicit content to their attention. Hardly any sites offer a way to report copyright infringement on behalf of someone else - you have to find the rightsholder and get them to complain directly.
One example is YouTube - they used to have the ability for users to flag any video as a copyright infringement, and it was seeing a lot of use - but they decided to remove it because they preferred to hide behind the safe harbor provisions rather than make the effort to respond to the flags. (More recently some cases in the EU are starting to suggest that YouTube may not fully qualify for the EU protection from liability, but the outcome of that remains to be seen.)
It's hard to argue that this is really what US and EU lawmakers intended. The laws were an honest attempt to shield the burgeoning internet industry from lawsuits that could kill a company over one or two rogue users, but the effect has been to reward deliberately irresponsible companies that profit from encouraging users to infringe copyright while remaining shielded from any real liability for it.
The EU is considering passing a new Directive on Copyright in the Digital Single Market, including "Article 13" which, in conjunction with aspects of other articles, would see the safe harbor significantly reduced so that it would no longer apply to sites such as YouTube which actively share the uploaded content with users for profit-making purposes.
If a platform's exemptions from liability in the EU were to be reduced - whether via Article 13 or some other statute or case law - could the platform survive? Can it still provide a useful service, navigating between the risk of liability for infringing content at one extreme and the risk of being so draconian about content that it alienates its users at the other? I would argue that it can, if the platform uses some or all of the tools detailed in the next two posts, on non-algorithmic measures and algorithmic measures respectively.