09/11/2023 - While the AI Act currently dominates the conversation, the rules of the Digital Services Act ("DSA") will soon be in full effect. From that point on, the regulation is actively applied, its provisions must be complied with, and enforcement can take place.
This was already the case for so-called VLOPs and VLOSEs (Very Large Online Platforms and Very Large Online Search Engines), such as Google, TikTok and X, but as of Feb. 17, 2024, the rules apply to all services covered by the DSA. It is therefore important for digital service providers to align their processes and operations with this new European regulation.
The Digital Services Act (DSA) is a European Union ("EU") regulatory framework designed to make the digital space safer and strengthen users' rights online. Together with the Digital Markets Act, it is the EU's attempt to create a digital environment that is more competitive, safer and more conducive to innovation. The DSA imposes obligations on all digital services operating in the EU, including social media platforms, online marketplaces, online travel and accommodation platforms and other types of hosting services.
For the "big players," certain obligations have already been in place since late August, but as of Feb. 17, 2024, this broadens to include all so-called "intermediary services," which include social media platforms, search engines, online marketplaces, Web hosting and Cloud services. This means that they, too, must comply with the DSA's rules or else face substantial fines ranging up to as much as 6% of global annual sales or penalties of up to 5% of average daily revenues.
A core component of the DSA is the requirement for transparent content moderation procedures, under which users must be able to challenge moderation decisions. Greater transparency about the algorithms that present content and advertisements to users is also part of this. The European Commission facilitates this transparency through the DSA Transparency Database, in which online platforms must publicly justify their content moderation policies and decisions. The deadline for this is, again, Feb. 17, 2024.
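For teams wiring up this reporting, a submission of a statement of reasons to the Transparency Database might look roughly like the minimal sketch below. It is an illustration only: the endpoint, authentication method and field names are assumptions made for the example and should be replaced with the Commission's official API specification.

```python
# Minimal, illustrative sketch of submitting a "statement of reasons" to the
# DSA Transparency Database. The endpoint, token handling and field names are
# assumptions for this example, NOT the Commission's official API schema.
import requests

API_URL = "https://transparency.dsa.ec.europa.eu/api/v1/statement"  # assumed endpoint
API_TOKEN = "YOUR_PLATFORM_TOKEN"  # assumed: token issued to a registered platform

statement_of_reasons = {
    # Hypothetical field names describing what was decided and why
    "decision_visibility": "CONTENT_REMOVED",
    "decision_ground": "INCOMPATIBLE_WITH_TERMS",  # or a legal ground for illegality
    "category": "SCAMS_AND_FRAUD",
    "content_type": ["TEXT"],
    "automated_detection": True,    # flagged by automated tooling
    "automated_decision": False,    # final call made by a human reviewer
    "territorial_scope": ["NL", "BE"],
    "application_date": "2024-02-17",
    "facts_and_circumstances": "Listing removed after fraud signals were confirmed by a reviewer.",
}

response = requests.post(
    API_URL,
    json=statement_of_reasons,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=10,
)
response.raise_for_status()  # surface submission errors to the moderation pipeline
```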
There are also several new obligations for online marketplaces. For example, they must collect and verify the identity of sellers before those sellers can offer goods or services, in order to curb the distribution of illegal goods and services. Marketplaces must also design their interfaces so that users can identify sellers, and they must inform users who have purchased illegal goods or services once they become aware of this.
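As a rough illustration of the trader traceability data ("know your business customer") a marketplace would need to collect and check before activating a seller, consider the sketch below. The class and field names are our own shorthand, not prescribed by the DSA, and the checks would have to be mapped onto the actual onboarding flow.

```python
# Illustrative sketch of the seller ("trader") details a marketplace collects
# before activating an account; field names are our own shorthand for the
# traceability requirements, not an official schema.
from dataclasses import dataclass

@dataclass
class TraderDetails:
    name: str
    address: str
    phone: str
    email: str
    id_document_ref: str        # reference to a copy of an identification document
    payment_account: str        # payment account details
    trade_register_number: str  # registration number, where applicable
    self_certification: bool    # trader certifies it only offers compliant products

def may_activate_seller(trader: TraderDetails) -> bool:
    """Only allow a seller to list products once the required data is complete."""
    required_fields = [
        trader.name, trader.address, trader.phone, trader.email,
        trader.id_document_ref, trader.payment_account,
    ]
    return all(required_fields) and trader.self_certification
```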
It is crucial for providers of intermediary services covered by the DSA to examine their current processes and systems and identify any gaps in DSA compliance. This includes reviewing content moderation procedures and ensuring that processes meet the new transparency requirements. It is also essential that staff be informed of the changes and receive adequate training.
In addition to complying with the DSA, it is critical to build trust with users. Develop clear communication strategies that inform users about how their data and content are handled, and provide a reliable complaint-handling process so that users feel heard and have confidence in how their concerns are dealt with. This proactive approach reinforces your reputation as a responsible and reliable digital service provider.
At Considerati, we understand the importance of these developments and are available for support and advice in preparing for DSA compliance. Do not hesitate to contact us for advice or more information.