TikTok and Meta have been formally told to provide the EU with information about the possible spread of disinformation on their platforms relating to the Israel-Gaza conflict.
Previously they were given 24 hours to provide answers to the bloc’s concerns.
But that request did not carry legal force, whereas this latest demand does.
Both firms have a week to respond. Under its new tech rules, the EU can open a formal investigation if it is unsatisfied with their responses.
The EU is concerned about the possible spread of terrorist and violent content, and hate speech, after Hamas’ attack on Israel.
A Meta spokesperson said: “Our teams are working around the clock to keep our platforms safe, take action on content that violates our policies or local law, and coordinate with third-party fact checkers in the region to limit the spread of misinformation. We’re happy to provide further details of this work, beyond what we have already shared, and will respond to the European Commission.”
The EU’s latest demand comes a week after it contacted X, formerly known as Twitter, over the same concerns.
X said at the time it had removed hundreds of Hamas-affiliated accounts from the platform.
Social media firms have seen a surge in disinformation about the conflict between Israel and Hamas, including doctored images and mislabelled videos.
The chief executives of Meta, TikTok, X and Google each received letters from EU commissioner Thierry Breton earlier in October, giving them 24 hours to respond.
But these letters were not formal, legally binding requests under new EU tech laws governing what kind of content is allowed online.
Now, under the Digital Services Act (DSA), the firms must respond by the set deadlines.
Failure to comply with the DSA can result in fines of as much as 6% of a company’s global turnover, or even suspension of the platform.
In this formal step under the DSA, the Commission has set Meta and TikTok two deadlines.
First, the firms must provide the requested information on “the crisis response” by 25 October; second, they must answer questions about protecting election integrity by 8 November.
TikTok has additionally been tasked with telling the European Commission how it is protecting minors online by the November deadline.
When the social media firms were previously asked to provide more information, Mr Breton said Meta must prove it had taken “timely, diligent and objective action”.
He later said TikTok “has a particular obligation to protect children & teenagers from violent content and terrorist propaganda”.