The FDPIC today published the final report in the clarification of the facts opened in 2017 in the matter of Ricardo and TX Group:

The subject of the investigation was the transmission of data by Ricardo to TX, the use of this data for personalized marketing and the data protection declarations in this context.

The final report contains eleven recommendations, which are primarily addressed to Ricardo; recommendation B/2 concerns TX. TX and Ricardo consider these recommendations to be irrelevant because they relate to outdated facts and old law and are unfounded in substance. They have therefore neither accepted nor rejected the recommendations. The final report reflects the position of Ricardo and TX as follows:

252 Both Ricardo AG and TX Group AG essentially claim that the FDPIC’s recommendations refer to facts that no longer exist and to law that no longer applies. The FDPIC’s recommendations are therefore irrelevant and the clarification of the facts should therefore be written off. Ricardo AG and TX Group AG declare that they neither accept nor reject the recommendations of the FDPIC. From a material point of view, the content of the recommendations is untenable. The parties reject the finding of violations of the FADP and dispute the legal conclusions of the factual investigation: The data transmitted to TX Group AG is not personal data, which is why the FADP is not applicable to the data processing under investigation. Furthermore, TX Group AG does not process personality profiles. The general data protection principles, in particular the recognizability of the data processing, are complied with, so that there is no violation of personality rights. Although no justification would be required for the data processing in question, the Ricardo users’ consent is obtained and there is an overriding private interest.

Ricardo and TX submitted comments on the final report to the FDPIC and requested that these comments be published if the final report was published. The FDPIC did not comply with this request. However, the comments themselves are available here with the kind permission of Ricardo and TX:

Disclaimer: TX was represented in the investigation by Walder Wyss (including the author of this article), Ricardo and SMG by Vischer (David Rosenthal and his team).

Background

Ricardo is now part of the SMG Swiss Marketplace Group (SMG), in which TX Group holds a stake, and operates one of the most successful Swiss online marketplaces. As with all online offerings, certain data is collected from visitors, such as the IP addresses of the devices used and other usage data. Further data is collected during registration and subsequent use of the platform. Ricardo may disclose this data to TX in pseudonymized form. Based on a non-personal identifier, the data can then be linked to other data and further processed in aggregated form. Affinities derived from this can be used for target group-specific online advertising. As can be seen from the findings, TX only uses aggregated data without a personal identifier and only segments comprising at least 50 users. Ricardo and TX have classified re-identification as de facto impossible.
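Purely for illustration, the data flow described above (pseudonymization, linking via a non-speaking identifier, aggregation into affinity segments, release only of segments with at least 50 users) can be sketched roughly as follows. The function names, the keyed-hash pseudonymization and the data model are assumptions made for this sketch and are not taken from the final report.

```python
# Illustrative sketch only; not Ricardo's or TX's actual pipeline.
# It mirrors the flow described above: replace the account ID with a
# non-speaking pseudonymous identifier, group pseudonyms into affinity
# segments derived from usage data, and release only aggregated counts
# of segments containing at least 50 users.
import hashlib
from collections import defaultdict

MIN_SEGMENT_SIZE = 50  # threshold mentioned in the final report


def pseudonymize(user_id: str, salt: str) -> str:
    """Replace the speaking account ID with a keyed hash (assumed approach)."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()


def build_segments(events: list[dict], salt: str) -> dict[str, set[str]]:
    """Group pseudonymous IDs into interest segments derived from usage data."""
    segments: dict[str, set[str]] = defaultdict(set)
    for event in events:
        pid = pseudonymize(event["user_id"], salt)
        segments[event["interest"]].add(pid)  # e.g. "car_lovers", "diy_buyers"
    return segments


def releasable_segments(segments: dict[str, set[str]]) -> dict[str, int]:
    """Only aggregated counts of sufficiently large segments leave the platform."""
    return {
        name: len(ids)
        for name, ids in segments.items()
        if len(ids) >= MIN_SEGMENT_SIZE
    }
```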

The relevant data protection declarations were amended in the course of the clarification of the facts. This is a circumstance that the FDPIC cites as the reason for the extraordinarily long duration of the proceedings. However, it is obvious that such adjustments were not the sole reason for this duration. On the contrary, excessively long proceedings make changes unavoidable. The FDPIC has already complained about this several times, and rightly so. However, this problem is not solved by the companies concerned freezing their business activities at the level at which the proceedings were opened, but by streamlining the procedures.

The FDPIC’s recommendations are extraordinarily strict and are largely superficial, not supported by the facts of the case or in contradiction with established data protection law. However, this is not surprising, especially after the clarification of the facts in the Digitec Galaxus case and several démarches by the FDPIC since the new DPA came into force. The FDPIC’s handling of data protection law, to which he is bound, can only be described as creative, even leaving any polemics aside. The FDPIC’s task, however, is to supervise the application of data protection law (Art. 4 para. 1 FADP) and not to develop it further.

Transitional law

The clarification of the facts was initiated in 2017 under the old law. As with Digitec Galaxus, it therefore had to be concluded under the old DPA as a matter of transitional law, because the proceedings lasted beyond August 31, 2023. As a result, all of the FDPIC’s recommendations are based on the old law, which no longer applies in general and therefore also no longer applies to TX and Ricardo. Whether implementation of the recommendations would be necessary under the current DPA could not be the subject of the clarification of the facts, and the transferability of the FDPIC’s legal assessment to the new law cannot be assumed without a corresponding examination.

Accordingly, the FDPIC’s recommendations are not binding, not even within the scope of Art. 29 aDSG. It is therefore at least questionable whether the FDPIC may still issue recommendations after September 1, 2023 in the context of clarifications of the facts under the old law – but these are essentially legal-historical considerations.

Nor can they be enforced before the FAC in this or any other way, because an action by the FDPIC before the FAC under the old DPA would start a new procedure to which the transitional provisions of the new DPA do not apply. The FDPIC would, if at all, have to conduct a new investigation, which would re-examine the facts of the case under direct application of the FADP and conduct the legal review under the new FADP.

Processing of personal data

In order to be able to make recommendations at all, the FDPIC must assume that personal data is being processed. According to the FDPIC, a personal reference must be affirmed because a “reference between the aggregated data and the individual user” remains if an identifier is used. The concept of personal data must, he says, be interpreted “extensively”. This is wrong: it must be interpreted, but according to the usual methods of interpretation and not somehow “extensively”.

The FDPIC then correctly refers (in substance) to the Logistep case law, according to which not every theoretical possibility of identification is sufficient. It follows from this, among other things, that pseudonymized data is not personal data for a body that has no possibility of identification; for such bodies, pseudonymization is anonymization.

In principle, one could stop the examination here. However, in a very brief consideration the FDPIC comes to the conclusion that the data transmitted by Ricardo is personal data even for TX. Why? Because the personal reference does not require a conclusion as to the “civil identity”; a pseudonym is sufficient. Although TX has no interest in identification, this is not the only criterion; it “may” be sufficient if identification is possible.

The FDPIC firstly ignores the lack of an interest in identification and secondly does not examine the identification effort. He thereby omits the checks that he himself describes as necessary, as well as a survey of the facts – namely of the identification possibilities – which alone would make it possible to classify the data as personal data in the first place. The FDPIC’s statements therefore boil down to one of two variants: either he effectively allows singularization to suffice as identification – at least in the online area, but without stating this and contrary to the prevailing doctrine and his own statements – or he disregards clear data protection law.

The parties have also argued that TX only processes a derivative of the transmitted data in aggregated form and only uses segments comprising at least 50 users, i.e. sufficient k-anonymity, even if the identifiers were to be treated as personal data. The FDPIC does not say why he nevertheless assumes a personal reference.

Recommendation A/1: Data and purposes

Ricardo AG has adapted the Ricardo platform in such a way that

1. it is clearly recognizable for Ricardo users for which purposes which personal data is processed;

The FDPIC requires it to be made clear which personal data is processed for which purposes. Ricardo and TX had already fulfilled this point, by adapting their data protection declarations in line with general practice, before the corresponding recommendation was issued.

It is not clear from where the FDPIC derives this recommendation; probably from the principle of transparency and the duty to inform. However, there is no evidence in the materials, the case law or the literature that the FADP fundamentally requires an allocation of data to purposes, and such a general requirement would hardly be justifiable under the current FADP either. Here too, however, the FDPIC’s statements correspond to those in the Digitec Galaxus final report.

In practice, a certain mapping of data to purposes has nevertheless become widespread. However, the maintenance effort for the controller, the reading effort for the data subjects and the information needs of the data subjects must be kept in an appropriate relationship – the scope of the duty to inform is, of course, subject to the general principle of proportionality.

The FDPIC also suggests that without such a link, data subjects are prevented from objecting to data processing. However, more general information allows an objection to cover a greater number of processing operations: a generic list of data and purposes is more likely to lead to objections than a granular one, and such objections are broader – which is detrimental to the controller, not to the data subject. In addition, the duty to inform only requires generic information. It is a precursor to the right of access, which concretizes the information, on request of the data subject, with regard to the processing that concerns him or her.

This hierarchy explains why there is both a duty to inform and a right of access. This is also why the duty to inform requires that the information needed to exercise the data subjects’ rights be provided: this refers in particular to the right of access, which is meant to be facilitated. It expresses the data subject’s own responsibility, and whatever can be obtained via the right of access is thus exempt from the duty to inform. The FDPIC disregards this hierarchy.

Recommendation A/2: Personality profiles

Ricardo AG has adapted the Ricardo platform in such a way that

2. it is clearly recognizable for Ricardo users whether and, if so, which data processing leads to personality profiles;

The FDPIC sees in the processing by Ricardo and TX the formation of personality profiles, because the linking of data by TX results in at least a partial picture of the personality. However, no abstract standard applies to personality profiles. This must be put in the present tense because the personality profile lives on as the product of high-risk profiling. A personality profile, however, by no means entails “high-risk” profiling, as the FDPIC wrongly suggests. After all, the final report makes it clear that the personality profile must be the result of the profiling process and therefore refers to the output of the profiling and not to its input:

[…] “Profiling” refers to a particular form of processing. From the logical connection between the result and the form of processing, it can in turn be deduced, based on the relevant doctrine, that automated processing which results in a legal “personality profile” that allows an assessment of essential aspects of a person’s personality generally also fulfills the qualification criteria of “high-risk profiling” […].

However, qualification as a personality profile depends on whether there is a real risk of the data subject being restricted in their behavior or “self-presentation”. The decisive factor is therefore primarily the specific use of the data, as the FDPIC himself had noted in the clarification of the facts in the Bicicletta case. How a classification into interest categories such as “Car Lovers” or “Do-It-Yourself (DIY) Buyers” is supposed to give rise to the necessary risk remains open – in any case, such banal statements do not provide an actual partial picture. The final report, however, lacks both factual findings and legal explanations on this point. Such affinity grouping certainly does not lead to a change in a person’s behavior. And if there are objections to such personalization in marketing, the problem should be located in unfair competition law rather than in data protection law.

However, it is not only the classification as a personality profile that is incorrect; the conclusions derived from it are also incorrect: it was clear to the data subjects from the privacy policy that data is used for personalized advertising. There is no apparent basis for a more extensive transparency obligation, in particular no obligation to write “personality profile” in a privacy policy (and the same applies to profiling or high-risk profiling under the current DPA).

Recommendation A/3: Tracking platforms

Ricardo AG has adapted the Ricardo platform in such a way that

3. it is clearly recognizable for Ricardo users which platforms are involved in tracking or data linking for advertising purposes;

Here, the FDPIC recommends making it (more) clearly recognizable which platforms are involved in tracking or data linking for advertising purposes. Here too: the aDSG contained no such obligation (nor does the current DSG). It is sufficient to name categories of recipients, and the controller may choose whether to name individual recipients or only categories, as can be seen from the materials and the literature as well as from a Bern judgment (subject, of course, to the recipients’ own duty to inform as controllers in their own right, and subject to a contractual obligation of the sender to disclose the name of the recipient, which corresponds to practice in certain constellations). Nor is it necessary to name partners from whom data is obtained, outside of the right of access. The hierarchy from information to access (see above) is clearly regulated by law: information about sources need only be provided on request.

Recommendation A/4: Specification of justification grounds and objection options

Ricardo AG has adapted the Ricardo platform in such a way that

4. it is clearly recognizable for Ricardo users for which data processing Ricardo AG relies on which justifications and how the data processing can be objected to if necessary.

In the Digitec Galaxus case, too, the FDPIC requested information on which data processing is based on which justifications and how the data processing can be objected to. However, no such obligation exists, and the FDPIC does not justify this demand in any detail.

On the one hand, a justification is only required if there is a violation of personality rights, and that is not the case here. On the other hand, there is no legal obligation to state grounds for justification in a privacy policy. This is clear from the materials: under the old Data Protection Act it was disputed whether grounds for justification had to be stated in the context of the right of access, because the old law still required information on “the legal basis for processing” – but only in the context of the right of access. This obligation was deliberately not carried over into the new DPA, not even for the right of access, and it certainly cannot be derived from the duty to inform or from the principle of transparency, since the duty to inform is less far-reaching than the right of access. The FDPIC is aware of this: as part of the revision, he had suggested that the duty to inform should include information on the legal bases, but the legislator did not take up this concern. That this does not prevent the FDPIC from postulating such an obligation is remarkable.

The right to object also arises from the law, which is presumed to be known. The Swiss legislator has also refrained from providing for a simplification requirement for data subject rights, as the GDPR does. Deletion and objection options and other data subject rights do not have to be mentioned in data protection declarations. TX and Ricardo have nevertheless included references to erasure and objection rights in their privacy policies, as is common practice but not required by law.

Recommendation A/5: Accessibility of the privacy policy

Ricardo AG has adapted the Ricardo platform in such a way that

5. the privacy policy is easy to find, comprehensible and clear. One obvious implementation option is the multi-level information approach: at the top level, concise and easy-to-understand information provides an initial overview of the key aspects of data processing; the detailed privacy policy can then be accessed via a link;

As with Digitec Galaxus, the FDPIC includes his idea of implementation as a suggestion in the recommendation without directly requiring or recommending it – this is hardly permissible, as recommendations must themselves be dispositive. In any case, this requirement expresses the conflicting objectives of data protection declarations: a privacy policy should be detailed and, at a minimum, complete, but at the same time easy to read. This conflict requires a compromise. The FADP does not specify what this compromise may look like, and the FDPIC’s comments are somewhat contradictory here: he calls for a multi-level approach, but criticizes the reference from Ricardo’s privacy policy to that of TX (which is precisely “multi-level”) on the ground that it makes matters difficult to understand.

As neither the FADP nor the GDPR contain specific requirements, it is up to the controller to decide which information is placed where, which information is highlighted or moved to a first level, where reference is made to other information and where a summary is sufficient for the sake of readability. As long as data subjects are not misled and all necessary information is provided, the details are at the discretion of the controller – the FDPIC or the courts should not intervene without necessity.

Recommendation A/6: References to the DPA

Ricardo AG has adapted the Ricardo platform in such a way that

6. if reference is made to the legal basis, the privacy policy refers, where applicable, to the provisions of the applicable Data Protection Act (DSG) and not only to those of the GDPR;

With recommendation A/6, the FDPIC demands that the privacy policy be amended so that reference is made not only to the provisions of the GDPR but also to those of the DPA. This recommendation is not convincing, because Ricardo’s relevant privacy policy refers to the GDPR in only two places: where it states that the GDPR could be applicable and where it refers to the EU representative. A reference to the DSG is not possible in these places. In any event, this requirement would not be legally justified either, as neither the aDSG nor the current DSG requires references to legal provisions.

In practice, data protection declarations often contain references to provisions of the GDPR because such references are said to be required under the GDPR. As already mentioned, there is no such obligation under the FADP, and the position under the GDPR cannot be transferred to the FADP, because Swiss law does not recognize a prohibition principle for private controllers and therefore does not require any “legal bases” for the processing of data. The FDPIC may be trying to emphasize that the controller must state justifications – but, as already mentioned, this is incorrect.

Recommendation A/7: No excessive information

Ricardo AG has adapted the Ricardo platform in such a way that

7. the privacy policy reflects or lists the data processing actually carried out;

This recommendation, too, repeats one from the Digitec Galaxus case. In the FDPIC’s view, it violates the principle of transparency and the principle of good faith to also mention data processing that is not, or not yet, being carried out. This view cannot be endorsed either: the information should shape the expectations of the data subjects by stating how the controller intends to use personal data. Information about possible processing achieves this better than information only about processing that is already live at the time the information is provided. In this way, the data subject learns what to expect at the moment he or she enters into a relationship with the controller and takes note of the privacy policy for the first – and usually last – time. The dispatch on Art. 4 para. 4 aDSG already assumed that information about possible processing may be provided, and the literature is unanimous on this, even recommending information about possible processing.

Reference should again be made to the hierarchy between the duty to inform and the right of access: anyone who wants to know what data the controller is processing can request a copy of this data, together with the accompanying information, in accordance with Art. 25 FADP – this provides the necessary concretization. The legislator itself has thus provided for a multi-level information approach, and the FDPIC threatens to undermine this sensible assessment with excessive information requirements.

Recommendation A/8: Specification of deletion and objection options

Ricardo AG has adapted the Ricardo platform in such a way that

8. the privacy policy describes the correct deletion or objection option depending on the justification for the data processing, and its practice regarding deletion or objection requests is implemented correctly in this regard;

As mentioned, the rights to erasure and objection arise from the law. The legislator has also refrained from providing for a simplification requirement for data subject rights, as the GDPR does. Deletion and objection options and other data subject rights therefore do not have to be mentioned in data protection declarations. TX and Ricardo have nevertheless included references to rights of erasure and objection in their privacy policies, as is standard practice.

Recommendation A/9: Adaptation of the Consent Management Platform

Ricardo AG has adapted the Ricardo platform in such a way that

9. it is comprehensible and recognizable for users in the Consent Management Platform (CMP) which data processing takes place for which purposes, as well as the respective objection options. Ricardo must ensure that no data processing takes place if the selection in the CMP is set to “inactive”.

The FDPIC’s recommendation here relates to the concern that the setting options in the implementation of the CMP were not clearly understandable at the time the facts were established. There was a lack of clarity regarding the distinction between active consent, an objection and passive behavior, which, unlike an objection, does not prevent tracking. The use of a CMP is legally voluntary but a requirement of the market; and how CMPs are to look is determined by the advertising platforms, which, for example, allow the use of the IAB Transparency & Consent Framework (see also here). In line with the GDPR, this framework distinguishes between consent and legitimate interest with a corresponding option to object.
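Reduced to its core, the distinction just described can be expressed as a small decision rule: processing based on consent runs only after an active opt-in, while processing based on legitimate interest runs unless the user objects, so passive behavior does not stop it. The following minimal sketch illustrates this logic; the data model and names are hypothetical and not taken from the CMP at issue or from the IAB specification.

```python
# Hypothetical sketch of the consent / legitimate-interest decision logic
# described above; not the actual Ricardo/TX CMP implementation.
from dataclasses import dataclass


@dataclass
class PurposeChoice:
    legal_basis: str          # "consent" or "legitimate_interest" (assumed labels)
    consent_given: bool = False
    objected: bool = False


def processing_allowed(choice: PurposeChoice) -> bool:
    if choice.legal_basis == "consent":
        # active consent required; passive behavior is not enough
        return choice.consent_given
    if choice.legal_basis == "legitimate_interest":
        # processing runs unless the user actively objects
        return not choice.objected
    return False


# Passive behavior: no consent recorded, no objection recorded.
print(processing_allowed(PurposeChoice("consent")))              # False
print(processing_allowed(PurposeChoice("legitimate_interest")))  # True
```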

Recommendation B/1: Obtaining consent for data processing for advertising purposes

1. Ricardo adapts the Ricardo platform in such a way that in future it obtains the consent of Ricardo users to the processing carried out by Ricardo and TX for the advertising purposes of the TX data offering companies before it collects usage data and passes on personal data to TX for these purposes. This must be done voluntarily and expressly, after appropriate information has been provided (see recommendation A). Consent can be obtained, for example, by displaying a one-off pop-up at the next login, by adapting the registration form or by ticking a box in the CMP. As cross-platform tracking may only take place with the user’s consent, the button with the text “object to legitimate interests” should not be displayed.

The FDPIC derives a justification requirement from the “hint” (?). Moreover, he claims that the examination of proportionality is “very closely related in terms of content to the examination of the justification of overriding interest”, which is why it is “more appropriate” to carry out the examination at the level of justification. However, data processing only needs to be justified if it leads to a violation of personality rights. The FDPIC reverses this relationship and thus assumes that every data processing operation is in breach of data protection law.

After the FDPIC has weighed up the interests in this way, he comes to the conclusion that the interests of the data subjects prevail. However, the balancing of interests is doubly incomplete.

Firstly, the FDPIC sees a risk that consumers’ freedom of choice will be restricted and that “psychological characteristics and vulnerabilities” will be exploited. What these risks are supposed to result from remains unclear; no facts have been established in this respect, and the FDPIC cannot, of course, simply assume such a thing – quite apart from the fact that the protection of this freedom of choice would be a matter for competition law. That the FDPIC cannot enforce market behavior under the title of data protection has been clear since the Helsana ruling.

Secondly, one can only weigh up what one has first established and weighted. However, the interests of Ricardo, TX and the media industry in personalized marketing are neither assessed nor weighted, nor are the measures taken to protect the data subjects (removal of all speaking identifiers, aggregation, disclosure of only aggregated – and thus, from Ricardo’s and TX’s point of view, anonymized – segment data).

The result is the impression that the FDPIC wants to introduce, modo legislatoris, a general ban on personalized advertising within groups of companies without consent – according to the FDPIC’s ideas; consent was obtained via the CMP. The enactment of such a ban lies outside the FDPIC’s area of competence; his task is to enforce data protection law. It would be for the legislator to provide for such a ban – if at all; to date, however, no one apart from the FDPIC has called for one.

Recommendation B/2: Deletion of data without consent

2. TX must delete the existing data of Ricardo users that has already been collected for advertising purposes by the TX data offering companies, unless the Ricardo users have given legally valid consent or such consent is obtained.

As a consequence of the previous recommendations, the FDPIC requires TX to delete the data of Ricardo users unless consent is obtained for the processing. This is consistent if, like the FDPIC, one assumes unlawfulness, but of course it also presupposes it.

What remains?

The FDPIC’s final report in the Ricardo and TX Group case is remarkable for several reasons. Firstly, the duration of the proceedings was extraordinarily long, even in comparison with other clarifications of the facts, which also took years. As far as can be seen so far, this is changing in the investigations under the new law, which are progressing more quickly.

Secondly, the FDPIC takes too many liberties in his handling of data protection law. Of course, the FDPIC is entitled to interpret data protection law as he sees fit, and the fact that this interpretation is sometimes stricter than that of companies (and authorities) is in keeping with his role. Ignoring established doctrine and case law, however, is something else – the FDPIC does not have to follow them, but as a law-applying authority he should not simply disregard these sources. The examination of the relevant facts is also too superficial. For example, the FDPIC cannot come to the conclusion that personal data is being processed without first having established the identification possibilities and interests as facts of the case.

The VwVG was only applicable by analogy to the old-law clarifications of the facts, and because these could not lead to binding orders without judicial review, a certain freedom in the procedure is understandable. However, the FDPIC underestimates his role if he does not take into account the de facto binding effect of his statements.

As far as the author is aware, no decisions of the FDPIC under the new law have yet been challenged in court, but that is only a matter of time. We will then see whether the FDPIC adapts his approach under the new law. Everything, however, points in that direction. That may not be advantageous for companies in every case, but it is fundamentally to be welcomed, and even more so for data protection.

As with the clarification of the facts in the Digitec Galaxus case and elsewhere, this shows the particular importance that the FDPIC attaches to transparency. In principle, he is right to do so. Swiss data protection law relies in particular on transparency, since it does not require a legal basis for data processing by private parties. The emphasis is therefore placed, even more than under the GDPR, on the personal responsibility of the data subjects, and transparency is a prerequisite for this. Companies that draft data protection declarations should therefore do so with care, and what they would prefer not to say is precisely what should be printed in bold.

However, transparency is designed as a tiered system. If the entire weight is shifted onto the data protection declarations because it is believed that data subjects will not exercise their right of access, this contradicts not only experience but above all the will of the legislator. This aspect was not taken into account in the previous clarification of the facts, nor here. Of course, companies must provide the information necessary to understand their data processing. However, there is no need to assume that data subjects are incapable of finding their way through data protection declarations whose subject matter makes them somewhat more complex. And if data subjects do not understand something, they can simply ask – a lack of understanding on the part of individual persons may trigger an investigation by the FDPIC, but it is not evidence of a legal defect.