The European Union (EU) has given final approval to online safety-focused legislation that overhauls the region’s social media and e-commerce rules. Called the Digital Services Act (DSA), the law tightly regulates how intermediaries, especially large platforms such as Google, Meta, Twitter, and YouTube, moderate user content.
“It will give better protection to users and to fundamental rights online, establish a powerful transparency and accountability framework for online platforms and provide a single, uniform framework across the EU,” the European Commission said in a blog post.
The regulation will be published in the EU’s Official Journal on October 13, and the majority of its provisions will begin to apply 15 months after the DSA’s entry into force.
What are the key features of the Digital Services Act?
* Faster removals and provisions to challenge: As part of the overhaul, social media companies will have to add “new procedures for faster removal” of content deemed illegal or harmful. They will also have to explain to users how their content takedown policy works. The DSA also allows for users to challenge takedown decisions taken by platforms and seek out-of-court settlements.
* Bigger platforms have greater responsibility: One of the most crucial features of the legislation is that it avoids a one-size-fits-all approach and places increased accountability on Big Tech companies. Under the DSA, ‘Very Large Online Platforms’ (VLOPs) and ‘Very Large Online Search Engines’ (VLOSEs), that is, platforms and search engines with more than 45 million users in the EU, will face more stringent requirements.
* Direct supervision by European Commission: More importantly, these requirements and their enforcement will be centrally supervised by the European Commission itself — a key way to ensure that companies do not sidestep the legislation at the member-state level.
* More transparency on how algorithms work: VLOPs and VLOSEs will face transparency measures and scrutiny of how their algorithms work, and will be required to conduct systemic risk analysis and reduction to drive accountability for the societal impacts of their products. VLOPs must allow regulators to access their data to assess compliance, and must let researchers access their data to identify systemic risks of illegal or harmful content.
* Clearer identifiers for ads and who’s paying for them: Online platforms must ensure that users can easily identify advertisements and understand who presents or pays for the advertisement. They must not display personalised advertising directed towards minors or based on sensitive personal data, according to the DSA.
How does the EU’s DSA compare with India’s online laws?
In February 2021, India notified extensive changes to its social media regulations in the form of the Information Technology Rules, 2021 (IT Rules), which placed significant due diligence requirements on large social media platforms such as Meta and Twitter.
These included appointing key personnel to handle law enforcement requests and user grievances, enabling identification of the first originator of information on their platforms under certain conditions, and deploying technology-based measures on a best-effort basis to identify certain types of content.
Social media companies have objected to some of the provisions in the IT Rules, and WhatsApp has filed a case against a requirement that mandates it to trace the first originator of a message. One of the grounds on which the platform may be required to trace the originator is if a user has shared child sexual abuse material on its platform.
WhatsApp has, however, alleged that the requirement will dilute the encryption security on its platform and could compromise personal messages of millions of Indians.
This June, with a view to making the Internet “open, safe and trusted, and accountable”, the IT Ministry proposed further amendments to the IT Rules. One of the most contentious proposals is the creation of government-backed grievance appellate committees, which would have the authority to review and revoke content moderation decisions taken by platforms.
India is also working on a complete overhaul of its technology policies and is expected to soon come out with a replacement for its IT Act, 2000, which would look at ensuring net neutrality and algorithmic accountability of social media platforms, among other things.