Last Friday the Irish Data Protection Commission (DPC) decided to impose a €345 million fine on TikTok Technology Limited (TTL) for GDPR violations. The lengthy 126-page decision can be downloaded from the European Data Protection Board website.
At the core of the decision is the fact that TTL failed to implement appropriate technical and organisational measures (Art. 24 GDPR), exposing child users to significant harms without their knowledge or control.
- Regarding child users' accounts being set to public by default: while the child user was prompted during the registration process to select between “go private” and “remain public”, they could opt to simply skip this step. This use of language seems to incentivise skipping the choice, or even to trivialise the decision to opt for a private account.
- When a child user was about to post a video publicly for the first time, they were nudged to select between “post now” and “cancel”. The platform settings plainly incentivised posting videos publicly, given both the phraseology used and the difference in colour gradient between the two options. Where a video was posted publicly on a public account, it was viewable and accessible by an unlimited audience.
- Because child user accounts were public by default, comments on their videos were also enabled by default. This meant that any registered TikTok user, whether adult or child, could comment on a child user's video or interact with them via these comments. The potential for bad actors to abuse this platform setting is open-ended, as they could use this feature to contact child users directly. While comments are, of course, not comparable to direct messages, where users can privately message each other, the potential for ill consequences remains.
- Any person whatsoever, whether or not they were registered, could view a child user's account, videos and comments, and could therefore utilise and process the data therein beyond the control of the data subject and TTL.
With respect to the information provision obligations (Art. 13 GDPR), the decision notes that:
- The Privacy Policy states that content would be visible to “third parties such as search engines, content aggregators and news sites”. There is no mention here at all of non-registered users. This is the critical context as to why the use of the terms “public”, “everyone” and “anyone” was not sufficient.
- Even taking the Privacy Policy at its height, a prudent and privacy-conscious child user who consulted it would have been unable to determine that any non-registered user at all could view their content.
- TTL used the term “may” to refer to the recipients that it did mention. “May” is a conditional term, and its use indicates that TTL did not communicate in a clear, plain and transparent manner to child users the fact that recipients would definitively receive the child user's personal data in each case.
- TTL did not explain precisely who might constitute a third party in this context. The decision considers that the use of an imprecise umbrella term such as “third parties” is unclear and opaque, as it does not provide child users with specific information about the recipients of their personal data.
The above are just a few passages from the long list of failures. As TechCrunch notes, TikTok was found to:
“(…) have violated the following eight articles of the GDPR: 5(1)(a); 5(1)(c); 5(1)(f); 24(1); 25(1); 25(2); 12(1); and 13(1)(e) aka breaches of:
- lawfulness, fairness and transparency of data processing;
- data minimization;
- data security;
- responsibility of the controller;
- data protection by design and default; and
- the rights of the data subject (including minors) to receive clear communications about data processing; and
- to receive information on recipients of their personal data.
So, it’s quite the laundry list of failings.”
Author: Petruta Pirvan, Founder and Legal Counsel, Data Privacy and Digital Law, EU Digital Partners