Ireland: fine of EUR 345 million against TikTok – violations of the duty to inform and insufficient TOMs regarding children

On September 1, 2023, the Irish Data Protection Commission (DPC) issued a 125-page decision regarding TikTok, which found several violations and accordingly imposed a fine of EUR 345 million (media release). The DPC presents the investigation as follows:

Infographic: TikTok Decision Key Takeaways

The DPC, as the lead authority, initiated an investigation into TikTok's data processing on September 14, 2021. The other supervisory authorities were subsequently consulted in accordance with Art. 60 para. 3 GDPR. Objections from the Italian and Berlin authorities could not be resolved by mutual agreement, which is why the EDPB was involved in accordance with Art. 65 para. 1 lit. a GDPR. On August 2, 2023, the EDPB adopted a binding decision, on which the DPC's decision is based in part.

The subject of the investigation was the processing of personal data of registered users between the ages of 13 and 17 in the period between July 31 and December 31, 2020 (TikTok is open to persons over the age of 13, and "child" means a person under the age of 18 under the Data Protection Act 2018, in conjunction with Art. 8 para. 1 GDPR).

The DPC found that TikTok had committed several violations of the GDPR:

  • Content was set to "public" by default, including for children. All persons, even those not registered with TikTok, could see the corresponding content. Appropriate technical and organizational measures to ensure that only necessary data was processed by default were not in place. This violated privacy by design and the principle of data minimization and led to considerable risks for the children concerned. TikTok had also failed to assess the corresponding risks.
  • With the so-called "Family Link", third parties – e.g. parents – could link their account to that of a child. However, the third party could then also activate the direct messaging function for children over 16. This likewise constituted a breach of the requirement of appropriate technical and organizational measures, because there was no apparent reason why the third party should be able to apply not only stricter but also less strict data protection settings.
  • TikTok had taken measures to prevent the registration of children under 13. However, the risk that children under 13 could still gain access to the platform had never been assessed in a structured manner. A data protection impact assessment existed, but it had disregarded this risk.
  • TikTok had violated the duty to inform. TikTok did inform users that a public account setting gave third parties access to their content ("public", "everyone", "anyone"). However, it failed to communicate that this applied not only to other registered users, but that any internet user outside TikTok could also view the content (and terms such as "may" were too vague). The DPC found this to be a violation of Art. 13 et seq. GDPR. By contrast, the general principle of transparency was not violated. Following the EDPB, the DPC takes a more restrictive understanding of the principle of transparency:

    In the particular circumstances, I do not consider that TTL's informational deficits constitute an infringement of Article 5(1)(a). This is because, while the infringements of Articles 12(1) and 13(1)(e) GDPR are serious in nature, they are not of such a nature that they extend beyond the confines of those specific articles and are not sufficiently extensive to amount to an overarching infringement of the transparency principle. Specifically, and having regard to EDPB Binding Decision 01/2021, I do not consider that TTL's informational deficits are of the nature or extent described in EDPB Binding Decision 1/2021 such that it might be said that there has been an infringement of the Article 5(1)(a) GDPR transparency principle itself.

  • However, the principle of fairness was violated. The reason was "nudging": in the account settings, the user could select the "private" option, but was prompted in the corresponding dialog to skip the setting ("Skip").

The DPC therefore ordered TikTok to rectify the relevant violations, insofar as TikTok had not already done so. The DPC also imposed fines totaling EUR 345 million, noting in particular:

  • that for most violations no intent could be established, with the exception of the fact that accounts were public by default;
  • that, following recital 150 of the GDPR, the concept of an undertaking under antitrust law is decisive for determining the upper limit for fines (based on global annual turnover). There is a rebuttable presumption that group companies are under uniform management if a group parent company exercises a direct or indirect decisive influence. It is in particular not required that the parent company itself acts as a controller, i.e. determines the specific data processing. The presumption can be rebutted by proof that the company whose infringement is at issue acts with "real autonomy";
  • that the total fine results from the sum of the fines for the individual infringements. It is therefore not sufficient to determine only the most serious offense and increase its fine appropriately; rather, the fines for the individual violations are to be added together.
