EU’s Digital Services Act (DSA)
Jun 22, 2022
Q Why is it in the news?
A The European Parliament and European Union (EU) Member States announced that they had reached a political agreement on the Digital Services Act (DSA).
Q What is the Digital Services Act?
- The Digital Services Act is landmark legislation that seeks to force big Internet companies to act against disinformation and illegal and harmful content, and to “provide better protection for Internet users and their fundamental rights”.
- The Act, which is yet to become law, was proposed by the European Commission in December 2020.
- As defined by the EU Commission, the DSA is “a set of common rules on intermediaries’ obligations and accountability across the single market”.
- It seeks to ensure higher protection to all EU users, irrespective of their country.
- The proposed Act will work in conjunction with the EU’s Digital Markets Act (DMA), which was approved last month.
Q To whom will the Digital Services Act apply?
- Intermediaries: The DSA will tightly regulate the way intermediaries, especially large platforms such as Google, Facebook, and YouTube, function when it comes to moderating user content.
- Abusive or illegal content: Instead of letting platforms decide how to deal with abusive or illegal content, the DSA will lay down specific rules and obligations for these companies to follow.
- Scope of platforms: The legislation brings within its ambit platforms that provide Internet access, domain name registrars, and hosting services such as cloud computing and web-hosting services.
- Very large platforms: But more importantly, very large online platforms (VLOPs) and very large online search engines (VLOSEs) will face “more stringent requirements.”
- 45-million monthly user base: Any service with more than 45 million monthly active users in the EU will fall into this category; services below that threshold will be exempt from certain new obligations.
Q What are the key features of the Digital Services Act?
A A wide range of proposals seeks to ensure that the negative social impact arising from many of the practices followed by the Internet giants is minimised or removed:
- Faster removal of illicit content: Online platforms and intermediaries such as Facebook, Google, YouTube, etc will have to add “new procedures for faster removal” of content deemed illegal or harmful. This can vary according to the laws of each EU Member State.
- Introduction of trusted flaggers: Platforms will need to have a clear mechanism to help users flag content that is illegal, and will have to cooperate with “trusted flaggers”. Users will also be able to challenge these takedowns.
- Imposition of duty of care: Marketplaces such as Amazon will have to “impose a duty of care” on sellers who are using their platform to sell products online. They will have to “collect and display information on the products and services sold in order to ensure that consumers are properly informed.”
- Annual audit of big platforms: The DSA adds an obligation for very large digital platforms and services to analyse systemic risks they create and to carry out risk reduction analysis. This audit for platforms like Google and Facebook will need to take place every year.
- Promoting independent research: The Act proposes to allow independent vetted researchers to have access to public data from these platforms to carry out studies to understand these risks better.
- Ban on ‘dark patterns’: The DSA proposes to ban ‘dark patterns’ or “misleading interfaces” that are designed to trick users into doing something that they would not agree to otherwise.
- Transparency of Algorithms: It also proposes “transparency measures for online platforms on a variety of issues, including on the algorithms used for recommending content or products to users”.
- Easy cancellation of subscription: Finally, it says that cancelling a subscription should be as easy as subscribing.
- Protection of minors: The law proposes stronger protection for minors, and aims to ban targeted advertising for them based on their personal data.
- Crisis mechanism clause: In the event of a crisis, this clause will make it “possible to analyse the impact of the activities of these platforms” on the crisis, and the Commission will decide the appropriate steps to be taken to ensure the fundamental rights of users are not violated.
- Others: Companies will have to look at the risk of “dissemination of illegal content”, “adverse effects on fundamental rights”, “manipulation of services having an impact on democratic processes and public security”, “adverse effects on gender-based violence, and on minors and serious consequences for the physical or mental health of users.”
Q Will social media platforms be liable for user content?
- It has been clarified that the platforms and other intermediaries will not be liable for the unlawful behaviour of users.
- So, they still have ‘safe harbour’ in some sense.
- However, if the platforms are “aware of illegal acts and fail to remove them”, they will be liable for this user behaviour.
- Small platforms, which remove any illegal content they detect, will not be liable.
Q Are there any such rules in India?
- Last year, India notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
- These rules make the social media intermediary and its executives liable if the company fails to carry out due diligence.
- Rule 4 (a) states that significant social media intermediaries — such as Facebook or Google — must appoint a chief compliance officer (CCO), who could be booked if a tweet or post that violates local laws is not removed within the stipulated period.
- India’s Rules also introduce the need to publish a monthly compliance report.
- They include a clause on the need to trace the originator of a message — this provision has been challenged by WhatsApp in the Delhi High Court.