The European Commission is once again putting its powerful Digital Services Act (DSA) to full use, this time with Meta and TikTok in its crosshairs. The Commission has preliminarily found both companies in breach of key transparency rules under the DSA, a law designed to hold large online platforms accountable for how they operate. In a statement, the Commission said both companies failed to meet several legal obligations, including providing proper data access to independent researchers and ensuring that users can easily report illegal content. The findings are preliminary, but if confirmed, the EU could impose fines of up to 6% of each company’s global annual revenue.
Notably, the Digital Services Act (which came into effect last year) is one of the EU’s most significant attempts to regulate online platforms. It requires what the EU calls ‘Very Large Online Platforms’ (those with more than 45 million users in the region) to act responsibly when moderating content, to be transparent about how their systems work, and to allow public oversight of their impact on society. The law also bans manipulative design features, often called ‘dark patterns’, that can trick users into taking certain actions or hide key settings.
According to the Commission, Meta Platforms (the owner of Facebook and Instagram) and TikTok (owned by the Chinese company ByteDance) both failed to give qualified researchers sufficient access to public data about how their platforms function. EU officials said that while both companies technically offer some form of access, the systems are so restrictive and difficult to use that they prevent meaningful research into how content spreads online and how platform design affects users. The Commission said this lack of access violates one of the DSA’s central requirements: enabling independent scrutiny of how digital platforms affect society.
Importantly, the Mark Zuckerberg-led company faces additional allegations concerning the way its platforms handle user reporting and content moderation. The Commission said Facebook and Instagram’s tools for reporting illegal material (like hate speech, terrorism-related content, or child sexual abuse imagery) are neither easy enough to use nor clearly visible. Officials also said that Meta’s interfaces contain dark patterns. Meanwhile, TikTok’s case focuses primarily on transparency for researchers, with the EU claiming that the company has not provided the kind of data access required by law. If the violations are confirmed, the EU could order corrective actions and impose fines of up to 6% of each firm’s annual global turnover, a figure that, for Meta alone, could exceed $7 billion.
This is not the first time these social media giants have faced legal heat in Europe. In 2023, Meta was fined a record €1.2 billion by Ireland’s Data Protection Commission for transferring user data from the EU to the United States in violation of privacy laws. Later, in 2024, the European Commission hit Meta with another €797 million fine for abusing its market power by tying Facebook Marketplace to its main social media platform. TikTok has faced similar action as well: in 2025, Ireland’s Data Protection Commission fined the company €530 million for transferring European users’ personal data to China without proper safeguards.