Tighter rules for digital services

Starting on 25 August 2023, the designated Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) must comply with the obligations laid down in Regulation (EU) 2022/2065, otherwise known as the Digital Services Act (DSA).

Context

The Digital Services Act updates the e-Commerce Directive (2000/31/EC). It regulates:

  • illegal content
  • transparent advertising and
  • disinformation

It was adopted on 19 October 2022 and published in the Official Journal of the European Union on 27 October 2022. Affected service providers have until 17 February 2024 to comply with its provisions.

Nevertheless, designated VLOPs and VLOSEs must comply with their specific obligations four months after they have been designated as such by the EU Commission.

According to the text of the Regulation, such designation is based on significant reach: the number of recipients of the service must exceed an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population.
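For orientation only, taking the Union population at roughly 450 million (an approximation that is not part of the legal text), the threshold works out as:

$$0.10 \times 450\ \text{million} \approx 45\ \text{million recipients}$$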

On 25 April 2023 the Commission adopted the first designation decisions under the DSA, designating 17 Very Large Online Platforms (VLOPs) and 2 Very Large Online Search Engines (VLOSEs) that reach at least 45 million monthly active users.

According to the Commission's official communication, the 17 VLOPs are: Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter, Wikipedia, YouTube and Zalando. The designated VLOSEs are Bing and Google Search.

 

Useful terminology

Information society services. This is any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient.

Intermediary services. The DSA applies to providers of intermediary services, in particular mere conduit, caching and hosting services, given their role in the intermediation and spread of unlawful or otherwise harmful information and activities (illegal content).

Illegal content should be understood as information, irrespective of its form, that under the applicable law is either itself illegal, such as:

  • illegal hate speech
  • terrorist content
  • unlawful discriminatory content
  • or content that relates to activities that are illegal, such as:

     – the sharing of images depicting child sexual abuse

     – unlawful non-consensual sharing of private images

     – online stalking

     – the sale of non-compliant or counterfeit products

     – the non-authorised use of copyright protected material

     – activities involving infringements of consumer protection law.

 

Providers of intermediary services have specific obligations, amongst which: to establish a single point of contact, to designate a sufficiently mandated legal representative in the Union, to report annually on their content moderation, to put in place user-friendly notice and action mechanisms, to notify recipients of content removals, to operate internal complaint-handling systems, to cooperate with trusted flaggers, and to publish their average monthly active recipients.

Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, at their request. Interpersonal communication services, such as emails or private messaging services, fall outside the scope of the DSA.

Exemption from liability for hosting services. To benefit from the exemption from liability for hosting services, the hosting services provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through its own-initiative investigations or through notices submitted to it by individuals or entities, in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and, where appropriate, act against the allegedly illegal content.

Certain providers of hosting services, namely online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers where they present the relevant information in a way that leads an average consumer to believe that it is provided by the platform itself or by a trader acting under its authority or control.


DSA objectives

The DSA aims to guarantee different public policy objectives, such as:

  • ensuring the safety and trust of the recipients of the service, including minors and vulnerable users;
  • protecting the relevant fundamental rights enshrined in the Charter of Fundamental Rights of the Union;
  • ensuring meaningful accountability of providers of intermediary services;
  • empowering recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.

The DSA sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms.


Obligations of providers of intermediary services:

  • To establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
  • Providers of intermediary services established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representative.
  • To ensure an adequate level of transparency and accountability, providers of intermediary services should report annually on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro or small enterprises.
  • All providers of hosting services, regardless of their size, must put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned (‘notice’), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content (‘action’). A minimal sketch of such a notice-and-action flow follows this list.
  • Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including using automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression.
  • Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes.
  • Trusted flaggers can be entities that have demonstrated, among other things, that they have expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (Europol) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
  • For the purposes of determining whether online platforms may be very large online platforms that are subject to certain additional obligations under this Regulation, the transparency reporting obligations for online platforms should include certain obligations relating to the publication and communication of information on the average monthly active recipients of the service in the Union.
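The DSA sets out what a notice must contain and what the ensuing statement of reasons must cover, but it prescribes no data format, interface or tooling. The following is a minimal, purely illustrative sketch of the notice-and-action flow referred to above; every class, field and function name here is hypothetical and not taken from the Regulation or any vendor API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
from typing import Optional


class Decision(Enum):
    """Possible outcomes of assessing a notice."""
    REMOVED = "removed"
    ACCESS_DISABLED = "access_disabled"
    NO_ACTION = "no_action"


@dataclass
class Notice:
    """A notice submitted through a notice-and-action mechanism (illustrative only)."""
    content_location: str            # exact electronic location of the allegedly illegal content
    explanation: str                 # why the notifier considers the content to be illegal
    notifier_name: Optional[str]     # identification of the notifier, where required
    notifier_email: Optional[str]
    good_faith_statement: bool       # confirmation that the notice is accurate and complete
    submitted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class StatementOfReasons:
    """Information sent to the affected recipient when a decision is taken."""
    decision: Decision
    facts_and_circumstances: str
    ground_for_decision: str         # legal ground or the terms-and-conditions clause relied on
    redress_options: list[str]


def handle_notice(notice: Notice, assessed_as_illegal: bool) -> StatementOfReasons:
    """Decide on a notice ('action') and produce the statement of reasons for the recipient."""
    if assessed_as_illegal:
        return StatementOfReasons(
            decision=Decision.ACCESS_DISABLED,
            facts_and_circumstances=(
                f"Content at {notice.content_location} was notified: {notice.explanation}"
            ),
            ground_for_decision="Assessed as illegal content under the applicable law",
            redress_options=[
                "internal complaint-handling system",
                "certified out-of-court dispute settlement body",
                "judicial redress",
            ],
        )
    return StatementOfReasons(
        decision=Decision.NO_ACTION,
        facts_and_circumstances=(
            f"Content at {notice.content_location} was reviewed following a notice"
        ),
        ground_for_decision="Not assessed as illegal or as incompatible with the terms and conditions",
        redress_options=["internal complaint-handling system"],
    )
```

A real implementation would also need to handle, among other things, priority treatment of notices from trusted flaggers, record-keeping for the annual transparency reports and routing of contested decisions into the internal complaint-handling system.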

Online advertising

  • Online platforms should be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed.
  • Recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling (see the sketch after this list).
  • The requirements of the DSA on the provision of information relating to advertisements are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679 regarding the right to object, automated individual decision-making (including profiling) and, specifically, the need to obtain the consent of the data subject prior to the processing of personal data for targeted advertising.
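The Regulation describes the information each recipient must be able to see for each advertisement, but not any particular data model. Purely as an illustration of that per-advertisement transparency information, a minimal sketch in which every name and value is hypothetical:

```python
from dataclasses import dataclass


@dataclass
class AdTransparencyInfo:
    """Per-advertisement information displayed to the recipient (illustrative only)."""
    is_advertisement: bool                      # clear marking that the content is an advertisement
    on_whose_behalf: str                        # natural or legal person on whose behalf the ad is presented
    paid_by: str                                # natural or legal person who paid for the ad, if different
    main_targeting_parameters: dict[str, str]   # main parameters used to determine the recipient
    based_on_profiling: bool                    # whether those parameters involve profiling


# Hypothetical example of what a recipient might be shown alongside an advertisement.
example = AdTransparencyInfo(
    is_advertisement=True,
    on_whose_behalf="Example Retailer B.V.",
    paid_by="Example Retailer B.V.",
    main_targeting_parameters={"location": "Netherlands", "interest": "running shoes"},
    based_on_profiling=True,
)
```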

Obligations of Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs)

  • Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means.
  • Very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuse by the recipients of the service, and take appropriate mitigating measures.
  • Three categories of systemic risks should be assessed in depth:

A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of:

  • child sexual abuse material
  • illegal hate speech
  • or the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products.

A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition.

A third category concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with foreseeable impact on health, civic discourse, electoral processes, public security, and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through:

  • the creation of fake accounts,
  • the use of bots and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is either illegal content or incompatible with an online platform’s terms and conditions.
  • Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. As mitigating measures, they should consider, for example:
  • enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content;
  • adapting their decision-making processes;
  • adapting their terms and conditions;
  • corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources.
  • Very large online platforms may reinforce their internal processes or supervision of any of their activities, as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures.
  • Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts, and civil society organisations.
  • Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by the DSA.
  • A core part of a very large online platform’s business is the way information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking, and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient (a minimal illustration of such options follows this list).
  • Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation, and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, where targeted advertising is concerned.
  • To appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of the DSA.
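The DSA requires that the main parameters of recommender systems be presented clearly and that very large online platforms offer at least one option that is not based on profiling, but it does not say how this should be implemented. A minimal, purely illustrative sketch of such user-facing options, in which all names and parameters are hypothetical:

```python
from dataclasses import dataclass


@dataclass
class RecommenderOption:
    """One user-selectable recommender configuration (illustrative only)."""
    name: str
    main_parameters: list[str]   # the criteria that most influence what is shown, stated plainly
    uses_profiling: bool


# Hypothetical set of options a very large online platform might expose to recipients.
RECOMMENDER_OPTIONS = [
    RecommenderOption(
        name="Personalised feed",
        main_parameters=["past interactions", "followed accounts", "predicted engagement"],
        uses_profiling=True,
    ),
    RecommenderOption(
        name="Chronological feed",
        main_parameters=["recency of posts from followed accounts"],
        uses_profiling=False,    # the alternative not based on profiling described above
    ),
]


def selectable_options(allow_profiling: bool) -> list[RecommenderOption]:
    """Return the options a recipient can choose from; at least one avoids profiling."""
    if allow_profiling:
        return list(RECOMMENDER_OPTIONS)
    return [option for option in RECOMMENDER_OPTIONS if not option.uses_profiling]
```

The point of the sketch is only that the main parameters are stated in plain language and that an option not based on profiling is always available for the recipient to select.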

 

Compliance officers

  • Given the complexity of the functioning of the systems deployed and the systemic risks they present to society, very large online platforms should appoint compliance officers, who should have the necessary qualifications to operationalise measures and monitor compliance with this Regulation within the platform’s organisation. Very large online platforms should ensure that the compliance officer is involved, properly and in a timely manner, in all issues which relate to this Regulation. In view of the additional risks relating to their activities and their additional obligations under this Regulation, the other transparency requirements set out in this Regulation should be complemented by additional transparency requirements applicable specifically to very large online platforms, notably to report on the risk assessments performed and subsequent measures adopted as provided by the DSA.

Enforcement

  • Given the cross-border nature of the services at stake and the horizontal range of obligations introduced by this Regulation, the authority appointed with the task of supervising the application and, where necessary, enforcing this Regulation should be identified as the Digital Services Coordinator in each Member State.
  • To ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level, which should support the Commission and help coordinate the actions of Digital Services Coordinators. That European Board for Digital Services should consist of the Digital Services Coordinators. The Commission, through the Chair, should participate in the Board without voting rights.

Author: Petruta Pirvan, Founder and Legal Counsel Data Privacy and Digital Law at EU Digital Partners