The UK's media regulator, Ofcom, has set a March 31 deadline for social media platforms such as Facebook, Instagram and TikTok, along with other online services, to submit risk assessments detailing the likelihood of users encountering illegal content on their sites.
According to a report by news agency Reuters, this mandate stems from the Online Safety Act passed last year, which requires companies like Meta (Facebook and Instagram) and ByteDance (TikTok) to proactively address criminal activity and enhance user safety on their platforms.
Under the new legislation, these companies must evaluate and mitigate the risks of various offenses, including terrorism, hate crime, child sexual exploitation, and financial fraud.
What the regulator has to say
Ofcom has explicitly instructed platforms to assess the risk of illegal content appearing on their services. “Specifically, they must determine how likely it is that users could encounter illegal content on their service, or, in the case of user-to-user services, how they could be used to commit or facilitate certain criminal offences,” Ofcom said in a statement.
Platforms that fail to submit their risk assessments by the March 31 deadline could face enforcement action from Ofcom.
This comes as the UK’s Information Commissioner’s Office (ICO) is investigating how TikTok, Reddit and Imgur handle children's personal data, focusing on their compliance with data protection laws and safeguards for young users.
The probe will specifically examine TikTok's use of personal information from 13- to 17-year-olds to curate content feeds, as well as how Reddit and Imgur verify the ages of child users and ensure adherence to age-related regulations.
“If we find there is sufficient evidence that any of these companies have broken the law, we will put this to them and obtain their representations before reaching a final conclusion,” the ICO stated.