
Understanding the DSA: 4 things to know about EU’s internet law

Teleperformance - 01.06.2023

The EU Digital Services Act (DSA) aims to answer the ongoing question of how to tackle harmful content in online spaces. Without standardized rules and policies for content moderation, users remain vulnerable to harmful content across a wide range of online platforms and services.

As such, the DSA was passed in the European Union (EU) to curb the spread of illegal content online. It sets out clearer, more standardized rules for digital service providers of all sizes operating in European markets: to limit online harm, the DSA applies not only to large platforms but also to small businesses, social networks, and many other online providers.

The DSA has been called a “gold standard” for content and platform governance in the EU. Published in the Official Journal of the European Union as an EU regulation in October 2022, it gives digital service providers until February 17, 2024 to comply with its provisions.

As with any new law, policy, or regulation, it’s essential to understand the who, the when, the how, and the what of the DSA. Here are answers to four of the most common questions about it:

1. Who does the EU Digital Services Act apply to?

The DSA will first apply to very large online platforms (VLOPs) and very large online search engines (VLOSEs), i.e., those with an average of at least 45 million active users per month. Social networks, online marketplaces, content-sharing platforms, ISPs and domain registrars, and cloud-hosting services, to name a few, all fall within the DSA’s scope. Platforms have until February 17, 2023 to publicly release information on their user numbers. The DSA will apply to all providers that offer digital services in the EU, regardless of their place of establishment.

The article “Here's how the DSA Changes Content Moderation” by Teleperformance’s Global President of Trust and Safety, Akash Pugalia, and Global Vice President of Trust and Safety Policy, Farah Lalani, highlights the broad impact of the DSA on both small and large platforms: there are more than 10,000 platforms in the EU, 90% of which are small or medium-sized enterprises. While the DSA strives to ensure that these smaller enterprises are not disproportionately burdened, it still expects them to remain fully accountable and to continue upholding user protection across online spaces.

2. When will the DSA be applied?

The DSA entered into force on November 16, 2022. It will apply across the EU fifteen months after entry into force or from January 1, 2024, whichever comes later. Each member state will assign a regulator, a Digital Services Coordinator (DSC), responsible for supervising and enforcing the DSA at the national level. The European Commission will oversee the largest platforms, such as Google and Facebook, which are subject to fines of up to 6% of a company’s annual global revenue if found non-compliant.

3. How will the DSA fit into the larger content moderation picture?

The same article notes that “with the EU Digital Services Act, the process by which platforms are notified and must take subsequent action on illegal content will be harmonized.” That process includes “trusted flaggers,” entities with recognized expertise that are tasked with detecting illegal content and submitting “notices” about it. Companies will then need to remove the flagged content swiftly. Although the DSA does not mandate specific timelines, companies must be proactive and have proper processes in place so they can act quickly once notified of violating content.

The DSA aims to bring cohesive standards to safeguarding online spaces, particularly through “notice and action” procedures that cover how notices are submitted and how platforms must act on them.
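To make the “notice and action” flow concrete, here is a minimal sketch in Python of how a platform might model an incoming notice and prioritize trusted-flagger submissions. The class and field names are hypothetical, not part of the DSA text or any real compliance API; the fields loosely mirror what a notice is generally expected to contain (an explanation, the content’s exact location, notifier contact details, and a good-faith statement).

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch of a DSA-style "notice and action" intake queue.
# Illustrative only -- not a real compliance API.

@dataclass
class Notice:
    content_url: str            # exact electronic location of the content
    explanation: str            # why the notifier considers the content illegal
    notifier_email: str         # contact details of the notifier
    good_faith_confirmed: bool  # notifier's good-faith statement
    trusted_flagger: bool = False
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class NoticeQueue:
    def __init__(self) -> None:
        self._queue: list[Notice] = []

    def submit(self, notice: Notice) -> None:
        self._queue.append(notice)
        # Trusted-flagger notices are handled with priority under the DSA,
        # so sort them ahead of ordinary notices (oldest first within each tier).
        self._queue.sort(key=lambda n: (not n.trusted_flagger, n.received_at))

    def next_for_review(self) -> Notice | None:
        return self._queue.pop(0) if self._queue else None
```

A real system would of course add authentication, deduplication, and audit logging; the point here is simply that notices become structured records and trusted-flagger submissions move to the front of the review queue.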

4. What does this mean for companies and platforms?

Complying with the DSA means that companies will have to meet a variety of new obligations spanning transparency, advertising, risk assessments, and more. They must document their efforts, actions, and decisions related to content moderation.

For example, when platforms remove or restrict user-generated content, they are obliged to provide a statement of reasons explaining what action was taken and on what basis. Additional requirements apply to specific services. E-commerce marketplaces must trace traders under the “know your customer” principle, hosting services must notify law enforcement when they suspect criminal activity or threats to the safety of individuals, and targeted advertising must be accompanied by more information, such as how and why ads target users and who sponsors them.
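As an illustration of the record-keeping this implies, the sketch below models the core fields a statement of reasons would typically capture: the decision taken, the facts relied on, whether automated means were involved, the legal or terms-of-service ground, and the redress options available to the user. The class and field names are hypothetical, chosen here only to show the shape of the data.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical model of a DSA-style statement of reasons.
# Names and structure are illustrative only.

class Decision(Enum):
    REMOVAL = "removal"
    VISIBILITY_RESTRICTION = "visibility_restriction"
    DEMONETIZATION = "demonetization"
    ACCOUNT_SUSPENSION = "account_suspension"

@dataclass
class StatementOfReasons:
    content_id: str
    decision: Decision
    facts_and_circumstances: str  # what was observed and relied upon
    automated_detection: bool     # whether automated means flagged the content
    automated_decision: bool      # whether the decision itself was automated
    legal_ground: str | None      # the law relied on, if the content was illegal
    tos_ground: str | None        # the terms-of-service clause, otherwise
    redress_options: list[str]    # e.g. internal complaint, out-of-court settlement

def example() -> StatementOfReasons:
    return StatementOfReasons(
        content_id="post-12345",
        decision=Decision.REMOVAL,
        facts_and_circumstances="Listing offered a product banned under EU law.",
        automated_detection=True,
        automated_decision=False,
        legal_ground="National consumer-protection statute (illustrative).",
        tos_ground=None,
        redress_options=["internal complaint system", "out-of-court dispute settlement"],
    )
```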

VLOPs and VLOSEs will also need to disclose pertinent information, including details about the technology they use to remove illegal content, the content they have removed, and public reports on how they mitigate online risks to users and society.

Adhering to the EU Digital Services Act will significantly impact the Trust and Safety landscape, as the regulation is set to become a global benchmark for unified, standardized approaches to protecting users online. Compliance ensures that safety policies meet legal requirements, paving the way for safer digital spaces that value the safety and rights of their users.

Click here to learn more about Teleperformance's Trust and Safety services.
