Support the regulation of digital platforms

The regulation of the digital space on issues such as disinformation, hate speech, and censorship is not only urgent but also complex. The complexity arises, in the first place, because any such regulation opens a debate about freedom of expression.

Freedom of expression is not an absolute right: the American Convention on Human Rights itself, in its Article 13, establishes limits on its exercise when speech constitutes “any advocacy of national, racial or religious hatred that constitutes incitements to violence or any other similar illegal action against any person or group of people, for any reason, including those of race, color, religion, language or national origin.” Even so, every regulatory proposal must be weighed against that right, and there is broad consensus that these limitations do not authorize a State to exercise prior censorship.

Secondly, the complexity lies in a mismatch of speeds: technological innovation advances at an accelerated pace, while regulation is rigid and slow and takes considerable time to consolidate. The result is a persistent lag in which legislation, by the time it finally arrives, is already outdated by newer technological advances.

Both debates can be navigated more or less successfully, and the European Union has taken significant steps in that direction with its innovative legislation on Digital Services and Digital Markets, set to take full effect in 2024. These regulations focus on making the operation of digital platforms and social networks transparent and on guaranteeing fair competition among the different actors in the digital economy. Europe’s pioneering role in proposing an Artificial Intelligence Law adds to this optimism.

The Digital Services Regulation is the first regulation in the world to hold digital services companies accountable, across the entire EU, for the content published on their platforms.

It focuses on creating a safer online environment and protecting fundamental rights in the digital sphere by establishing new rules on:

  • the fight against illegal online content, including goods, services, and information, in full respect of the Charter of Fundamental Rights;
  • combating online risks to society;
  • the traceability of traders in online markets;
  • transparency measures for online platforms;
  • reinforced supervision.

Among other novelties, the Regulation obliges very large online platforms and search engines to evaluate and mitigate the risks arising from the design and operation of their services, carrying out risk assessments subject to independent audits. Specifically, it identifies as one of the risks to be evaluated “any real or foreseeable negative effect concerning gender violence, the protection of public health and minors, and serious negative consequences for the physical and mental well-being of the person.” According to the Regulation, these risks may also arise from coordinated disinformation campaigns related to health issues.