From Friday (25 August), large online platforms and search engines will have to comply with the new EU Digital Services Act, a landmark law designed to combat hate speech and disinformation online. However, enforcing the new rules is likely to be challenging.
The Digital Services Act (DSA), which came into force in November 2022, introduces detailed new legal requirements on content moderation, the flagging of illegal content and regular risk assessments, all designed to protect users from online harms.
In February, the Act required online platforms to publish their active user numbers. Based on these figures, the European Commission designated 19 services – those with over 45 million monthly active users, equivalent to 10% of the EU’s population – as “very large online platforms” (VLOPs) or “very large online search engines” (VLOSEs).
While the DSA will apply in full from February 2024, from Friday VLOPs and VLOSEs will have to provide their risk assessments to the European Commission and adhere to much stricter rules.
The new legislation marks a legal turning point, introducing fines for non-compliance for the first time, Peggy Müller, a lawyer and partner at the German commercial law firm ADVANT Beiten, told EURACTIV.
Fines can reach up to “6% of [the company’s] total worldwide annual turnover”, the DSA stipulates. Further sanctions could also apply, including a ban from the EU’s Single Market.
Non-compliance may involve failing to counter illegal content online, failing to give users a way to challenge a platform’s content moderation decisions, or failing to prevent advertising from being targeted at minors.
To address enforcement challenges, a European Commission Spokesperson told EURACTIV that the Commission “has been in regular contact” with the 19 designated services and “offered to conduct stress tests” on a voluntary basis.
In June and July, Internal Market Commissioner Thierry Breton’s team conducted stress tests with social media platforms such as X (formerly known as Twitter), TikTok, Snapchat and Meta’s Facebook and Instagram.
Legal changes
National authorities remain responsible for monitoring platforms headquartered in their country, but the European Commission is now empowered to monitor and sanction the designated VLOPs and VLOSEs directly.
Platforms also remain exempt from liability for the content they host, but they must now follow detailed new requirements on content moderation systems, the handling of notifications of illicit content, the prohibition of dark patterns and cooperation with law enforcement authorities.
Censorship remains forbidden, but platforms now have the right to set up proactive moderation processes. As a result, hybrid models of content moderation could become the norm, with automated tools proactively identifying illegal content while the final decision rests with a human.
Risk assessments
From Friday onwards, the 19 designated services must submit their risk assessments to the Commission.
The systemic risks to be assessed vary greatly from one platform to another and “can range from protection of minors, to disinformation, or to integrity of electoral processes”, explained Commission spokesperson Johannes Bahrke at a press briefing on Tuesday.
However, Claire Pershan, EU Advocacy Lead at the non-profit Mozilla Foundation, told EURACTIV that some stakeholders see “a real risk of risk assessment washing”, since no third party has been involved in drafting the risk assessments.
Skill shortages
While the new legislation introduces regular independent audits of platforms’ algorithms, experts have raised concerns about possible skills shortages.
Jean-Sébastien Mariez, lawyer and founding partner of the French tech law firm Momentum Avocats, said that skills shortages could hit “third-party auditors, national regulators and also the European Commission”.
Müller pointed out that the issue might be even more acute in Germany, “which is already experiencing a lack of qualified personnel”.
Trusted flaggers
The DSA’s “trusted flaggers” are entities with proven expertise in identifying harmful or illegal content and reporting it to platforms. The new regulation requires platforms to prioritise their notices when moderating content.
The trusted flaggers are “a great idea that created a lot of expectations”, said Mariez.
Yet these entities, many of which are likely to be NGOs, will have to acquire digital skills on top of their fundamental-rights expertise to flag illicit content effectively – and will need financial backing to deliver their services.
Legal uncertainty
In the coming months, the Commission is expected to adopt a series of delegated acts – pieces of secondary legislation that spell out how a law is to be implemented – to clarify platforms’ obligations and provide the legal certainty companies need to draw up compliance plans.
Legal uncertainty has already prompted Zalando and Amazon to file complaints before the Court of Justice of the European Union challenging their designation as VLOPs.
Smaller platforms
From 17 February 2024, the Digital Services Act’s rules will also apply to platforms with fewer than 45 million monthly users.
Therefore, Mariez advises platform companies to start implementing “a compliance plan, as they did with the General Data Protection Regulation. They have six months left.”
[Edited by Alice Taylor/Nathalie Weatherald/Benjamin Fox]
Source: Euractiv.com