TikTok faces watchdog probe after supplying inaccurate information


TikTok is under investigation by Ofcom, the UK communications watchdog, after supplying it with inaccurate information.

The regulator is examining whether the platform failed to comply with a legal information request.

The information fed into a report Ofcom published on Thursday examining what video-sharing platforms are doing to stop children from viewing harmful content.

TikTok has blamed a technical problem, and says it spotted the issue and informed Ofcom, triggering the probe.

The company also stressed that it was the accuracy of the information, not its parental controls, that was under scrutiny.

Ofcom asked TikTok, Snap and Twitch for information about how they complied with legal requirements to protect children from seeing videos that may harm “their physical, mental, or moral development”.

It found that while all three platforms have measures to prevent children from encountering harmful videos, children can still sometimes come to harm while using them.

Ofcom said it had asked TikTok about its parental control system but had “reason to believe the information it provided was inaccurate”.

The system in question, called “Family Pairing”, was introduced in April 2020. It allows parents to link their account with their child’s and manage settings such as screen time, direct messages, content filtering and privacy.

Users under 18 can deactivate Family Pairing at any time, but their parents receive a notification if they do.

Ofcom says it may update its report if it receives more accurate information from TikTok.

Fake age

Ofcom research suggests that more than a fifth of children aged between 8 and 17 have an adult online profile.

TikTok, Twitch and Snap all require users to be aged 13 and over, the report noted.

But the report found it was easy for younger users to gain access by entering a false age.

The platforms should “explore improving how they identify children and stop them encountering harm”, the report says.

This contrasted with the multi-layered checks of OnlyFans, a subscription-based platform known for its explicit content, which uses facial age estimation, ID checks and other systems to verify that users are adults.

The platforms use a range of methods, including AI and human moderators, to spot underage users who have created accounts.

But the regulator said that, based on the available figures, it was difficult to say how many underage users the platforms had.

Twitch concerns

Ofcom noted that on Twitch – a livestreaming site popular with gamers – content is open access, which means anyone of any age can watch its videos, regardless of whether they have an account and even if the videos are rated mature.

It also said the platforms’ content warnings could simply be dismissed.

And while TikTok and Snap have parental controls, Twitch’s terms and conditions instead require parents to supervise children in real time while they use the service, Ofcom found.

Parents could also request the removal of a child’s account if they submitted specific account details and returned a signed form verifying their relationship to the child. However, not a single parent had contacted Twitch to request the removal of a child’s account in the 12 months from August 2022, the platform said.

Twitch recently relaxed its rules to allow art featuring nudity to appear on the platform.

The report examined the firms’ compliance with regulations covering UK-based video-sharing platforms.

But firms will soon have to comply with the recently passed Online Safety Act, which requires that children be protected from harmful social media content.

Ofcom will consult in spring 2024 on guidance covering the new act’s broad child safety measures.
