
EU: Digital Services Act comes into force

Following the EU Parliament, the member states have now also approved the Digital Services Act, which imposes far-reaching monitoring obligations on online platforms. This is intended to make it possible to counter hate speech, terrorist propaganda and the like more quickly and more effectively. Opinions on the Digital Services Act differ widely.

Breakthrough regulation?

The Digital Services Act gives the EU far-reaching new powers vis-à-vis major online platforms. For example, they are required to disclose extensive information about their algorithms, give external reviewers deep insight for the purpose of risk assessment, and grant the EU broad access to their platforms in the event of a threat to public safety. In addition, the EU can impose fines of up to six percent of a platform's annual global revenue for violations.

In addition, there are numerous obligations intended to strengthen user rights on the one hand and to curb the spread of hate speech and terrorist propaganda on the other. In the future, platforms will have to disclose why they show certain advertisements to users or recommend certain content. Personalized advertising may no longer be served on the basis of sensitive categories of personal data (such as sexual orientation or religious beliefs), although this does not prohibit the collection of such data. Dark patterns will be banned, minors may no longer be shown personalized advertising, and – crucially – platforms will be obliged to operate easily accessible reporting systems, process reports promptly, delete illegal content and cooperate with civil society initiatives in identifying such content. Upload filters, on the other hand, are not permitted: content may not be automatically checked for potential illegality at the upload stage.

Jozef Síkela, the Czech Minister of Industry and Trade, sees the regulation as a milestone: “The Digital Services Act is one of the most groundbreaking horizontal regulations in the EU and I am convinced that it has the potential to become the ‘gold standard’ for other regulators in the world. By setting new standards for greater security and accountability in the online environment, the Digital Services Act marks the beginning of a new relationship between online platforms, users and regulators in the European Union and beyond.”

The law will enter into force once it has been published in the Official Journal of the EU and a transition period of several months has elapsed.

Criticism of the Digital Services Act

However, the new form of regulation is by no means viewed in a purely positive light. For example, there are fears that platforms required to delete illegal content will delete too much rather than too little and thus curtail freedom of expression. Although the text of the law gives users more options than before to challenge the deletion of their content, platforms are unlikely to risk high penalties by leaving potentially illegal content online; they can be expected to delete it preventively and, at most, restore it after a complaint. The absence of any provision for judicial review contributes to this: if the illegality of a piece of content is recognizable to the platform, it must delete it.

Another criticism is that the Digital Services Act allows the EU Commission to declare a kind of state of emergency, giving it even greater access to large platforms and their content. The digital rights organization European Digital Rights (EDRi) sees this as a violation of fundamental democratic principles: without a democratic decision, the EU Commission can grant itself the right to temporarily restrict access to information and the right to freedom of expression on the Internet. This ties in with broader criticism of the EU's democratic deficit, which is most visible in the EU Commission, a body that largely determines EU policy without having been directly elected.

In this context, European Digital Rights also criticizes the fact that it remains unclear when exactly a crisis threatening public security exists. The power of definition lies solely with the EU Commission, which can extend its own powers once it identifies such a situation – a state of affairs diametrically opposed to the principles of democratic oversight.

Patrick Breyer, MEP for the Pirate Party, also criticized the fact that the opportunity to enshrine digital rights in law had been missed. Instead, he said, the law had cleared the way for “arbitrary platform censorship as well as cross-border deletion orders from illiberal member states without a judge’s decision […], so that perfectly legal reports and information can be deleted.” Breyer is referring to the fact that each EU member state is to establish a Digital Services Coordinator responsible for enforcing the law – and will thus be able to make relatively far-reaching decisions that apply across the EU. Only the supervision of the largest platforms lies directly with the EU – and there, not with the elected Parliament, but with the Commission.

Simon Lüthje

I am a co-founder of this blog and am very interested in everything to do with technology, but I also like to play games. I was born in Hamburg, but now I live in Bad Segeberg.
