As Germany gears up to transpose the European Copyright Directive into national law by the summer of 2021, the new draft tabled by Justice Minister Christine Lambrecht (SPD) provides for upload filters, although the government promised to do without them. EURACTIV Germany reports.
The bill was published by Justice Minister Christine Lambrecht (SPD) and has yet to be approved by the federal government. Lambrecht said the draft is a “fair balance of interests”, and an “important step towards meeting the implementation deadline of summer 2021”.
However, the government had promised that such upload filters would be avoided when it agreed to the EU copyright reform in April 2019.
In 2019, the EU directive was highly controversial as people all over Europe took to the streets under the motto “Save Your Internet”. The proposed reform obliged platforms (YouTube, Facebook etc.) to prevent illegal uploads of copyright-protected content, yet the directive left it open as to how this would be done.
Critics nevertheless fear that given the volume of daily uploads, such a system would only work with filter software that automatically searches for illegal content.
The problem, however, is that current algorithms cannot clearly distinguish between illegal and legal uploads. Parodies, for example in the form of memes, are allowed, but such cultural nuances are too sophisticated for current filters.
Even Germany’s justice ministry admits that “algorithms are not yet able to recognise context-related legally permitted uses, at least not currently.” It is therefore possible for legal uploads to be erroneously blocked. But this so-called “over-blocking” is prohibited by the same directive and interferes with freedom of speech.
Three-step check
The justice ministry is now trying to resolve this contradiction.
Platforms should be able to buy collective licences (for example, a large “Universal Studios” licence) so that they pay for material in advance and can show it legally afterwards. Filters should still be used, but should filter as little as possible. Uploads would go through a three-step procedure: filters decide twice, the user decides once.
The process would run roughly as follows: Users upload a video to YouTube, after which the platform runs the video through a filter to check whether it contains protected material. If it does not contain such material, or if the platform has the right licence, the video goes online.
But if the filter detects a potential copyright infringement, users receive a warning and decide for themselves whether to cancel the upload or continue. In the second case, they “flag” the video, i.e. mark it as legal – for example, because it falls under the parody exception.
The filter then verifies the flag, checking only whether it is “obviously incorrect”. If it is not, the video is finally published online.
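The three-step procedure described above can be summarised as a simple decision flow. The sketch below is purely illustrative: the function and parameter names (`filter_scan`, `user_flags_as_legal`, `flag_obviously_incorrect`) are hypothetical stand-ins for the filter and user decisions, not anything specified in the draft law.

```python
# Illustrative sketch of the draft's three-step upload check.
# All names and data structures here are hypothetical.

def three_step_check(video, platform_licences, filter_scan,
                     user_flags_as_legal, flag_obviously_incorrect):
    """Return 'published', 'cancelled', or 'blocked' for an upload."""
    # Step 1: the filter scans the upload for protected material.
    match = filter_scan(video)
    if match is None or match in platform_licences:
        # No protected material found, or the platform holds a licence for it.
        return "published"
    # Step 2: the user is warned and decides whether to flag the upload
    # as legal (e.g. because it falls under the parody exception).
    if not user_flags_as_legal(video, match):
        return "cancelled"
    # Step 3: the filter checks only whether the flag is "obviously incorrect".
    if flag_obviously_incorrect(video, match):
        return "blocked"
    return "published"
```

For example, an upload that matches licensed material goes straight online, while a flagged parody is only blocked if the filter deems the flag obviously incorrect.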
Flagging content
This procedure is “a major step backwards for freedom of expression”, according to Julia Reda, a former Pirate MEP (later independent) turned project manager for copyright at the Society for Civil Rights (Gesellschaft für Freiheitsrechte).
Both filter checks could result in “false positives”, i.e. blocking of legal content, precisely because algorithms do not understand contextual nuances, which is something even the justice ministry admits.
Reda also regrets that users can only “flag” their content as legal after a potential legal breach has been detected in their uploads. In the original draft, this was possible right from the first upload attempt.
As a result, users have no way to protect their uploads preventively against later blocking requests. Rights holders often only demand blocking once content has been available on a platform for a long time.
A ‘fair balance of interests’
When users “flag” their content as legal, the content is protected against future blocking requests because marked content can no longer simply be deleted. If a breach is suspected, rights holders must initiate a complaint procedure while the content remains online.
Unflagged content, which under the current draft goes straight onto the platform, would however be taken down immediately in the event of a complaint. It would then be up to users to fight for the content to be restored.
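The asymmetry between flagged and unflagged content under a complaint can be captured in the same illustrative style as before; this is a sketch of the draft's logic as described here, not an implementation of any real system.

```python
# Hypothetical illustration of how the draft treats rights-holder complaints,
# depending on whether the user flagged the content as legal at upload time.

def handle_complaint(flagged: bool) -> str:
    """Return the draft's outcome for content hit by a blocking complaint."""
    if flagged:
        # Flagged content is protected: it stays online while the rights
        # holder's complaint procedure runs.
        return "stays online pending complaint procedure"
    # Unflagged content is taken down at once; the user must then fight
    # to have it restored.
    return "taken down immediately"
```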
“As soon as the content is blocked, however, the damage to freedom of expression is already done,” said Reda.
[Edited by Zoran Radosavljevic]