News

EU Introduces Stricter Rules on Internet Regulation

The Digital Services Act (DSA) is a welcome update to the legislation establishing the rights and obligations of digital service providers and their consumers. Building on the e-Commerce Directive (2000/31/EC), the act aims to keep pace with rapid technological innovation and the added risks that come with it. The push for an update has gained fresh momentum from the ongoing war in Ukraine, given the volume of disinformation circulating on Big Tech platforms.

The EU's approach under the e-Commerce Directive rested on three principles:

  1. Country of Origin Principle – information society service providers must comply with the laws of the Member State in which they are established.
  2. Limited Liability Regime – online intermediaries are exempt from liability for illegal content they convey or host, provided they satisfy certain conditions and remove such content once aware of it.
  3. No General Obligation to Monitor – Member States are prohibited from imposing on online intermediaries a general obligation to monitor the information they store. This protects the fundamental rights of users.

Due to the increased exposure to illegal and harmful activities online, the Commission put forward this proposal on the basis of Article 114 TFEU to guarantee the protection of rights and obligations for businesses and consumers across the digital market.

The DSA categorises online intermediaries based on their role, size, and impact. These are:

  1. Intermediary Services provided by network infrastructure providers, including mere conduit services and caching services.
  2. Hosting Services supplied by providers storing and disseminating information to the public.
  3. Online platform services by providers bringing together sellers and consumers.
  4. Very Large Online Platform (VLOP) services provided by platforms that have a particular impact on the economy and society and pose particular risks in the dissemination of illegal content and societal harms.

The obligations imposed on all Providers of Intermediary Services are to:

  1. Act responsibly in applying and enforcing restrictions, including algorithmic decision-making review.
  2. Report on the removal and disabling of illegal content, or of information contrary to the provider's terms and conditions.
  3. Establish a single point of contact for communication with Member States; providers established outside the EU must appoint a legal representative within the EU.

The obligations imposed on Online Platform and Hosting Service Providers are to:

  1. Create notice and action mechanisms for third parties to notify the providers of illegal content.
  2. Provide a statement of reasoning as to why they removed or disabled access to illegal information.
  3. Establish an internal complaint-handling procedure that is easily accessible and straightforward.
  4. Engage in out-of-court dispute settlement with their users.
  5. Give priority to the processing of reports from Trusted Flaggers (entities appointed by each Member State with expertise and competence in identifying illegal content).
  6. Inform competent authorities when they become aware of any information which raises the suspicion of serious criminal offences against a person’s life or safety.
  7. Obtain and verify identification information from traders before allowing them to use their services (the Know Your Business Customer principle).
  8. Give users information on the ads they see online, and why they were targeted with the specific advertisement.

The obligations imposed on Very Large Online Platforms (VLOPs) are to:

  1. Assess annually the systemic risks arising from the functioning and use of their services. Specifically, these three categories of risk must be assessed:
    • Potential misuse by users of their services
    • Impact of their services on fundamental rights
    • Intentional manipulation of their services.
  2. Take appropriate mitigating measures.
  3. Submit to external and independent audits.
  4. Compile and publicise detailed information on the adverts they display, enabling the competent authorities to monitor and assess compliance.
  5. Appoint compliance officers.

Member States must designate independent Digital Services Coordinators empowered to oversee providers of intermediary services and to receive complaints against them. These coordinators must also collaborate with their counterparts in other Member States and partake in joint investigations.

Furthermore, a European Board for Digital Services will be set up to coordinate and ensure the consistent application of the new legislation.

VLOPs will also be subject to enhanced supervision by the European Commission itself, which may intervene, conduct investigations and on-site inspections, and impose interim measures or binding commitments on them.

While broadly welcomed, the DSA has also drawn criticism. The Computer and Communications Industry Association argues that banning targeted advertising will harm small businesses and reduce consumer choice. In contrast, the European Data Protection Supervisor has called for stronger protections in relation to content moderation and online targeted advertising. Associations of small and medium-sized enterprises want size and scale to be factored into the obligations placed on providers.

This article was written by Dr Ian Gauci and Legal Trainee Ms Jodie Arpa.

For more information, please contact Dr Ian Gauci.

Disclaimer: This article is not intended to impart legal advice and readers are asked to seek verification of statements made before acting on them.