Takeaways (AI):
  • The Guide from the FDPIC and privatim has been updated and offers comprehensive instructions on elections and voting.
  • New legal requirements emphasize joint responsibility for data processing via tracking pixels and social media providers.
  • Significant questions remain unclear, in particular regarding commissioned processing and the obligations of processors.
  • The guide emphasizes the need for informed and self-determined consent to the processing of personal data.

The FDPIC and privatim have jointly updated the Guide to elections and voting. The previous version was dated June 1, 2019.

The guide is intended to be an aid, but in part goes beyond current as well as future law. References to the legal bases have been deliberately omitted in order to promote general comprehensibility, and the main aim is to “sensitize” (here it competes with the dozens of concerns for which the federal government also “raises awareness”). Nevertheless, the guide is not limited to principles, but contains detailed instructions for action. It is unclear where the guide is to be understood as a recommendation and where as an interpretation of binding rules; however, it is mostly formulated as an instruction. This is partly because it does not distinguish between the provisions of the FADP and those of the cantons. Overall, however, it is hardly generally comprehensible and not always legally convincing. A few comments on this:

Shared responsibility:

  • If a party embeds a third-party tracking pixel and thereby generates information about the visitors to its website, which is subsequently used for targeted addressing on social media, then the party as the operator of the website and the social media provider are jointly responsible. However, this view is not substantiated.
  • This is probably based on the case law of the ECJ. In the Fanpages decision, the ECJ saw a joint responsibility due to the influencing of the subsequent processing by Facebook for statistical purposes; there, such influencing was still required. In the Fashion ID decision, it was already sufficient to cause the data collection by Facebook in a common economic interest, even without further influence. The development of case law is thus moving towards an expansion of joint responsibility, whereby a conscious cooperation in knowledge of the essential circumstances of the joint processing is probably necessary, but also sufficient. However, this does not become clear even from the corresponding guidelines of the EDSA. It therefore remains open under which conditions and with reference to what the FDPIC and privatim see a joint responsibility, which is why their reference hardly helps practice.
  • Either way, shared responsibility requires that personal data is processed at all, which need not be the case with tracking pixels (because unique IDs are not always personal data under the FADP – including the revised FADP (nDSG)).
  • Jointly responsible third parties shall be required to “demonstrate that they are complying with all data protection requirements.” What is meant by this?
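For illustration only (nothing in the guide prescribes this mechanism): a third-party tracking pixel is typically a 1×1 image whose URL carries a unique identifier, so every page view transmits that identifier to the third party's server. A minimal Python sketch with hypothetical names shows why the questions above arise – the pixel conveys a unique ID whose status as personal data depends on whether visitors are realistically identifiable:

```python
import hashlib
from urllib.parse import urlencode

# Hypothetical third-party endpoint, for illustration only.
PIXEL_HOST = "https://pixel.example-provider.com/track"

def pixel_url(visitor_id: str, page: str) -> str:
    """Build the URL of a 1x1 tracking pixel embedded on the party's website.

    The hashed visitor ID is a unique identifier; whether it counts as
    personal data under the FADP depends on whether the visitor is
    realistically identifiable to whoever receives it.
    """
    hashed = hashlib.sha256(visitor_id.encode()).hexdigest()
    return f"{PIXEL_HOST}?{urlencode({'uid': hashed, 'page': page})}"

# Embedded as <img src="..." width="1" height="1">: each page view sends
# the ID to the third party, which can later match it against its own
# user base for targeted addressing on social media.
print(pixel_url("visitor-123", "/donate"))
```

The website operator chooses to embed the pixel, while the third party decides what happens to the collected IDs afterwards – which is precisely the constellation in which joint responsibility is debated.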

Commissioned processing:

  • The guide contains references to commissioned processing and provides in Appendix C that “data brokers and data analytics companies”, when acting as processors, must

    before concluding the contract, ensure that their client is willing and technically and organizationally capable of processing the data received in accordance with the set terms and conditions and the contract;

    Why would a processor have these obligations? While a processor may be jointly liable if it knowingly contributes to a data breach by the controller, it has no obligation to conduct a corresponding audit.

Personal data requiring special protection:

  • An exceptionally broad understanding of personal data requiring special protection is to apply:

    Data that allow conclusions to be drawn about political or ideological views are considered to be particularly worthy of protection

    The fact that data “allow” conclusions to be drawn can hardly be sufficient, especially since the question of personal reference itself depends not only on the abstract possibilities, but also on the realistically available means and the motivation of the body that has access to the personal data. When it comes to the qualification as particularly worthy of protection, it can hardly be otherwise. Here, too, the guide seems to follow the case law of the ECJ, which is in any case to be rejected under the FADP (the newer cantonal laws work with a general clause, which could, depending on the circumstances, be interpreted accordingly).

  • The FDPIC and privatim continue:

    Although there is no comprehensive case law on this yet, it can be assumed that digital data processing in connection with the political process is generally subject to the level of protection applicable to particularly sensitive personal data, if only because of its purpose of influencing the ideological views of many people.

    This statement is not correct either. Not all data “in connection with the political process” is a priori particularly worthy of protection. The decisive factor is also not the intention to influence views, but the significance of a data item in the context of its processing.

Consent:

  • Consent would have to be “self-determined”. Presumably, this refers to the voluntary nature of consent. This presupposes that the persons concerned can give their consent in a “differentiated” manner:

    Consent is self-determined if the data subjects can give differentiated consent with regard to the activation or deactivation of individual aspects and functionalities of the digital applications (e.g., by setting appropriate checkmarks) and thus have a real choice not only whether to make their data available, but also to what extent.

    And further down:

    Can visitors individually (“granularly”) choose whether or which of the web tracking tools used they want to allow?

    This reads as if, firstly, consent were required for tracking, which is not the case under Swiss law, and as if one had to consent to tracking measures individually, which is not even required by the German or French authorities under the GDPR and local regulations for cookies, etc.

  • In addition,

    data subjects must be able to revoke their consent and request deletion of their data at any time. Meeting these demands requires investment in data protection-friendly technologies on the part of the players,

    This is not the case. Data subjects generally have the right to revoke consent, but there is no requirement in Swiss law to make revocation as easy as consent, nor is there a general facilitation requirement for data subject rights as under the GDPR. And the Privacy by Design principle only requires proactive compliance with data protection; it does not provide for additional obligations.

  • Consent must also be informed. According to the guide, this should require, among other things, that those affected “also be informed about their rights” such as “that of revocation at any time”. This is taken one-to-one from the GDPR. Under Swiss law, consent does not become invalid if information about revocation is not provided – even under the cantonal laws that do require such information (e.g., § 12 of the IDG ZH).
  • In addition, the information would only be sufficient if it were to

    make the purposes and modes of operation of the digital processing methods […] accessible in several levels of explanation appropriate to the addressees and, in particular, provide information about the duration of the processing and the possible forwarding of the data. The cascade of information begins with a clearly visible brief information on the registration page, which explains the most important points of data processing. Each of these points contains further links that take the reader to the relevant passages in the relevant processing regulations and data protection provisions.

    It is obvious that this is not legally binding.

  • Explicit consent:

    For express consent, an active act of consent by the data subject is necessary. […] Declarations by which persons merely accept terms of use in a general manner, on the other hand, are not express consents.

    This is not generally correct. If the terms of use expressly provide for the processing of particularly sensitive data and the user actively agrees to the terms, this consent is express. Whether it is also voluntary is another question.
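Purely as an illustration (no such model appears in the guide): the “differentiated” consent and at-any-time revocation discussed above could be represented as per-purpose flags, sketched here with hypothetical names in Python:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

# Hypothetical per-purpose consent record illustrating "differentiated"
# (granular) consent: each tool or purpose is opted in or out
# individually, and any grant can be revoked at any time.
@dataclass
class ConsentRecord:
    choices: dict = field(default_factory=dict)
    updated: Optional[datetime] = None

    def grant(self, purpose: str) -> None:
        self.choices[purpose] = True
        self.updated = datetime.now(timezone.utc)

    def revoke(self, purpose: str) -> None:
        self.choices[purpose] = False
        self.updated = datetime.now(timezone.utc)

    def allowed(self, purpose: str) -> bool:
        # No entry means no consent was ever given for that purpose.
        return self.choices.get(purpose, False)

record = ConsentRecord()
record.grant("web_analytics")
print(record.allowed("web_analytics"), record.allowed("social_media_pixel"))
```

Whether Swiss law actually demands this level of granularity is, as argued above, doubtful; the sketch only shows what the guide's demand would amount to technically.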

Information and transparency:

  • On a website, transparency would require at least the following information, which can certainly only apply under the cantonal data protection laws and would go very far even there, for example with regard to information about “artificial intelligence”:
    • the identity of the responsible owner of the data collection
    • the categories of the processed data
    • the procurement of data, with reference to third-party sources
    • the current purpose and, if necessary, the justification of the processing
    • the processing methods, including the purpose and operation of the analysis methods used, including artificial intelligence
    • the categories of possible data recipients
    • the roles, duties, and responsibilities of data providers, data analytics companies, or data platforms
    • the applicable terms of use of third parties and references to them
  • During the initial contact after an indirect data collection, the data subjects are

    to be told who is responsible for the communication received, where further information on the related data processing can be obtained and how the data subject's rights can be asserted.

    If this means that this initial information must not only contain a reference to a privacy statement, but also information on the rights of the data subject, this is an attempt to come closer to the corresponding statements of the EDSA on minimum first-level information in the case of layered information (so-called “basic information”). However, there is no basis for this interpretation of the information obligation; a reference to the privacy statement is presumably sufficient.

  • Therefore, all data subjects must be able to exercise their rights to information, correction and deletion in an appropriate manner. This starts with informing them about their rights and how and where they can assert them.

    In any case, the nDSG does not stipulate that data subjects must be actively informed of their rights. However, the cantonal data protection laws may contain such information obligations.

Role of individuals:

  • Before a person discloses information about third parties to parties, interest groups, data brokers, data analytics companies or data platforms, they must obtain “their express consent in advance” to do so and make sure that “software accesses this data that comes from reliable sources.” The last sentence is somewhat sweeping, and the person in question is not, as a matter of principle, required to obtain consent before sharing either; rather, he or she must inform the data subjects about the sharing. Accordingly, the recipient may also assume that this information has been provided.

AI-generated takeaways can be wrong.