Ofcom fines TikTok for failure to comply with information request
On 23 July 2024, Ofcom, the UK’s communications and online safety regulator, issued its final decision to TikTok Information Technologies UK Limited ("TikTok"), imposing a financial penalty of £1,875,000 (reduced from £2,500,000) for breaches of its obligations under the Communications Act 2003. The decision marks a significant step in enforcing regulatory standards on video sharing platforms ("VSPs") to ensure the safety and protection of users, particularly minors, and signals Ofcom's likely approach to enforcement under the Online Safety Act 2023.
Background and legal framework
Part 4B of the Communications Act 2003 (the "Communications Act") sets out the VSP regime, which requires platforms like TikTok to take appropriate measures to protect the public from harmful material and under-18s from restricted material. These measures include, among others, age assurance systems and parental controls. Ofcom's role within this framework is to ensure that VSPs have effective measures in place to protect users from such material; it includes the authority to request information from VSPs and to publish transparency reports about the measures taken to protect users from harmful material.
Under the Communications Act, Ofcom sought information from VSPs to inform a planned report highlighting the safety measures they have in place to protect children from harmful content. As part of this process, Ofcom asked TikTok to provide data on the uptake of its parental controls feature, “Family Pairing”, along with other information. The uptake data was crucial: not only would it help Ofcom assess the feature's effectiveness in protecting teenage users, it was also to be published in Ofcom's Child Safety Report (the "Report") to help inform and empower parents in deciding which platforms they and their children use.
After several delays in providing information about the parental control measures over 2022 and 2023, Ofcom issued a final information notice on 6 July 2023. On 4 September 2023, TikTok responded with information about "Family Pairing" and the number of UK parent or guardian accounts and UK teen accounts that had the feature activated. However, while Ofcom and TikTok were discussing Ofcom's intention to publish the data in the Report and TikTok's confidentiality representations, TikTok disclosed on 1 December 2023 that, as a result of a technical issue, the data it had provided was inaccurate and that it was conducting an internal investigation to understand the root cause of the inaccuracies. In light of this disclosure, Ofcom launched an investigation on 14 December 2023 to determine whether TikTok had failed to comply with its duty to respond to a statutory demand for information. Ofcom also considered whether TikTok had cooperated fully for the purposes of producing the Report, given the considerable delay in alerting the regulator to the issues.
The breaches
Ofcom identified a number of areas in which it considered TikTok had failed to comply with its obligations under the Communications Act, including failure to provide information in the manner and form, and within the reasonable period, specified by Ofcom (section 368Z10(6)); non-compliance with an information request (section 368Y(3)(b)); and failure to cooperate fully (section 368Y(3)(c)).
As TikTok took the option to settle the investigation, it waived its procedural rights, including its right to make written representations or to have an oral hearing on the substance of the provisional findings. TikTok also confirmed that it would not challenge or appeal the determination.
Similar previous Ofcom fines
Ofcom has a history of fining companies for providing inaccurate or incomplete information in response to its regulatory requests. In December 2021, for instance, Ofcom fined O2 £150,000 for failing to provide accurate and complete information during an investigation into billing issues; O2's incorrect and incomplete responses had delayed the investigation. Similarly, in October 2021, Ofcom fined the internet service provider Colt £15,000 for failing to comply with an information request. The substantially larger fine against TikTok underlines Ofcom's ongoing commitment to enforcing compliance among companies required to provide information for regulatory purposes, particularly in relation to online safety.
Similar powers under the Online Safety Act
Similar powers are available to Ofcom under the Online Safety Act 2023 ("OSA"). Ofcom is authorised to request information from entities to aid in fulfilling its duties, provided it acts proportionately. These duties are wide-ranging, from assessing compliance to preparing guidance and codes of practice, carrying out research or preparing a report. Various offences are associated with non-compliance, including failing to provide information or providing information that is knowingly or recklessly false. Ofcom can issue significant penalties for non-compliance under the OSA of up to 10% of annual global qualifying revenue or £18 million, whichever is greater. This is significantly higher than the maximum under the Communications Act, which is the greater of 5% of qualifying revenue or £250,000. In addition, "senior managers" can be held criminally liable for an entity's failure to comply with the OSA in specific circumstances, including where an entity fails to comply with an information notice and the individual failed to take all reasonable steps to prevent that offence being committed.
Key takeaways
Ofcom is currently heavily engaged in gathering information from entities as it develops the guidance and codes of practice needed to implement and enforce the OSA. This latest penalty against TikTok serves as a warning to other entities within Ofcom's regulatory remit about the serious consequences of failing to provide timely and accurate information and to cooperate with the regulator on information requests and investigations. It shows Ofcom's readiness to use its powers to take a firm approach to compliance, an approach that is likely to continue under the OSA and that solidifies its role as a determined enforcer of online safety standards.