Newsroom / CNA
Major online platforms risk fines and sanctions if they violate the new European law on digital services, which came into force at the end of August and which, inter alia, makes online platforms legally responsible for the content posted on them. A competent EU source provided further clarifications on the digital services legislative package on Monday, during an informal technical briefing for journalists.
The source said that the Digital Services Act (DSA) began to apply at the end of August to the 19 largest online platforms and online search engines designated by the European Commission in April 2023. These 19 services range from social media such as TikTok, Facebook and Instagram, to online marketplaces such as Amazon Marketplace, as well as other platforms such as Google Maps and Wikipedia.
The source explained that the Digital Services Act aims to empower and protect online users, including minors, by requiring the designated services to assess and mitigate systemic risks, such as the spread of illegal content online, and to provide robust mitigation tools.
The Digital Services Act was proposed by the Commission in December 2020 together with the Digital Markets Act (DMA), the two forming the cornerstones of the Commission's digital legislation.
It was further explained that the Commission proposed the law because, while online platforms offer great benefits to users, they are also a source of significant risks, as has been demonstrated in the past, with reference made to platforms hosting illegal content or dangerous products.
It was added that online platforms will now be subject to democratically validated rules that are the same across the European Union. Therefore, the EU source continued, there will no longer be different rules depending on the Member State where a consumer lives, and everyone will enjoy the same rights online.
It was also noted that the Commission, as the regulator for these very large online platforms and search engines, will oversee their systems for tackling illegal content and disinformation and will protect users' rights. It was stressed that, to this end, the Commission is equipped with broad investigative and supervisory powers, including the power to impose sanctions and fines.
Furthermore, it was noted that these services had four months to comply with the obligations of the Digital Services Act, which include carrying out and providing the Commission with the first annual risk assessment. The competent source stated that, along with this obligation, the 19 platforms had further obligations concerning, for example, their terms of service and the protection of minors.
At the same time, it was said that for the remaining platforms, those with fewer than 45 million users in the European Union and therefore not designated as very large, the national Digital Services Coordinators in each Member State will be responsible for supervision.
It was also mentioned that Member States will have to appoint their Digital Services Coordinators by 17 February 2024, the date by which platforms with fewer than 45 million active users will also have to comply with all the rules of the Digital Services Act.
The EU source reiterated that the aim is to create a safer online space in which the fundamental rights of all users of digital services are protected and a level playing field is established for businesses across the European Union.