In a substantial legal blow, TikTok, the Chinese-owned video-sharing platform, has been fined €345 million (£296 million) for breaching European Union (EU) data protection rules in its handling of children’s accounts. The Irish data watchdog, which oversees TikTok’s operations across the EU, found that the popular social media platform had committed a series of breaches of the General Data Protection Regulation (GDPR).
The breaches included setting child users’ accounts to public by default, permitting unrestricted public comments on those accounts, failing to verify that adults granted access to a child’s account through a “family pairing” scheme were legitimate, and inadequately addressing the risks posed to users under the age of 13 whose accounts were public.
The Irish Data Protection Commission (DPC) found that users aged between 13 and 17 were steered through the registration process in a way that left their accounts public by default. This public setting allowed anyone to view the content on these accounts and to comment on it without restriction.
Moreover, the DPC noted that TikTok’s “family pairing” feature, which gives an adult control over a child’s account settings, did not verify whether that adult was in fact a parent or legal guardian.
The DPC also concluded that TikTok, despite requiring users to be at least 13, had failed to properly consider the risks to younger users who ended up with public accounts. The public-by-default approach meant that anyone could view and interact with the content posted by these young users.
Furthermore, TikTok’s Duet and Stitch features, which allow users to create content collaboratively, were enabled by default for users under the age of 17. Notably, the DPC did not find any GDPR violations in TikTok’s methods for verifying users’ ages.
This considerable penalty follows a previous run-in with regulators. In April, the UK’s data regulator fined TikTok £12.7 million for unlawfully processing the data of 1.4 million children under the age of 13 who had been using the platform without parental consent. The Information Commissioner said TikTok had made little or no effort to check who was using its platform.
In response to the fine, TikTok emphasized that the DPC’s investigation examined the company’s privacy measures between July 31 and December 31, 2020, and said that changes had since been made to address the concerns raised during the inquiry. Since 2021, all TikTok accounts for users aged 13 to 15 have been set to private by default, limiting content visibility to approved viewers only.
TikTok said it disagreed with the DPC’s decision, particularly the size of the fine. The company stressed that the criticisms mainly concerned features and settings in place three years earlier, which had been changed well before the investigation began.
The European Data Protection Board, a body comprising data and privacy regulators from EU member states, raised objections to parts of the DPC’s draft findings. As a result, the DPC incorporated a finding proposed by the German regulator that TikTok’s use of “dark patterns”, manipulative website and app designs that steer users toward particular behaviours or choices, breached GDPR provisions on the fair processing of personal data.