Consultation VE-DSG: Opinion of privatim (summary)

privatim, the association of Swiss data protection commissioners, today published a summary of its opinion on the preliminary draft of the DPA. The consultation on the preliminary draft remains open until April 4, 2017.

The opinion of privatim naturally focuses on the rules governing data processing by public bodies. From the perspective of private parties, however, the following points are noteworthy:

Data protection impact assessment

  • A data protection impact assessment (DPIA) would have to be carried out before each processing operation, because the “increased risk” could only be determined by means of the DPIA itself. This is logically understandable, but calls for a distinction:
    1. In a first step, a controller (and only the controller should have a duty to perform a DPIA, not also the processor!) must ask itself whether an increased risk is likely (the correct term would be: a “high” risk). This first step is already necessary to ensure data security, but can be done without any formalities and is not part of an actual DPIA.
    2. An actual DPIA, which will probably be the subject of the documentation requirement under Art. 19 lit. a VE-DSG, should only be required if an initial, form-free assessment has shown that there is a sufficient risk (which should be “high”, not merely “increased”). To always require such a DPIA would not only be impractical, but would also go far beyond Art. 35 para. 1 GDPR.
  • Protection of fundamental rights: It is clarified that, in the view of the cantonal data protection commissioners, the protection objective set out in Art. 16(1) VE-DSG only applies to public bodies (as is already the case under current law). Accordingly, private parties only have to protect the personality of data subjects.
  • Prior checking: The regulation of prior checking (i.e. consultation of the FDPIC) is said to be insufficient; at least in the case of federal bodies, the DPIA (including planned measures) should have to be submitted to the FDPIC. In any case, it would be welcome if the DPIA only had to be reported to the FDPIC where it shows that a high risk is likely to remain despite the measures taken or planned. Otherwise, there is nothing for the FDPIC to examine. The GDPR follows the same approach (Art. 36 para. 1).

Privacy by Design / by Default

Here, the cantonal data protection commissioners raise the very legitimate question of whether the obligations implied in Art. 18(1) VE-DSG (privacy by design) do not already exist under Art. 11 VE-DSG (data security). With regard to Art. 18(2) VE-DSG (privacy by default), privatim raises the analogous question with reference to the proportionality principle. Indeed, it is not evident whether, or in what way, Art. 18 VE-DSG requires more than Art. 4 and 11 VE-DSG.

privatim also doubts the appropriateness of the threat of punishment provided for a violation of Art. 18 VE-DSG. This is probably in view of the constitutional requirement of certainty, which such a threat of sanctions could in fact hardly withstand, at least not without a strong restriction by the courts in applying the law (that the necessary certainty of a norm can be brought about not only by the legislature but also by the courts may be surprising, but it is recognized).

Data portability and right to erasure

privatim recommends including a right to data portability along the lines of the GDPR (Art. 20 GDPR; cf. the working paper of the Article 29 Data Protection Working Party) in the law, but without going into detail about the content and nature of this right. The latter is a pity insofar as this right was highly controversial in the European legislative procedure; the European Parliament had even deleted it from its draft version. One reason for the criticism is that the right to data portability is rooted not in data protection law, but rather in intellectual property law and, above all, antitrust law. Secondly, it is feared that the structuring of data required for transfer would create new security risks and would even be counterproductive in this respect. There are also questions about the distinction between “provided data”, which has been transferred to the controller and is subject to portability, and “derived data”, which has been generated by the controller and is excluded from the claim to portability.

privatim further recommends, likewise following the example of the GDPR (Art. 17), establishing a right to data erasure. However, such a right already results from Art. 23(2)(b) of the preliminary draft, according to which processing contrary to the declared intent of the data subject is unlawful. The fact that options for justification remain does not change this; a right to erasure cannot be unconditional anyway, and is not under Art. 17 GDPR either.

Procedural issues

privatim also welcomes the proposed exemption of data protection proceedings from costs. What is also needed, however, is a reversal of the burden of proof, because in many cases it is not even possible for the persons concerned to provide evidence of unauthorized processing. The latter may be true. However, one must ask whether a reversal of the burden of proof, combined with proceedings being free of charge, does not virtually invite abuse. If there are no indications of data privacy violations, there is no need for an incentive to litigate. If there are indications, however, then the procedural duties of the processor to cooperate (Art. 160 ff. CCP) should actually suffice. A breach of the duty to cooperate may in practice come close to a reversal of the burden of proof (cf. Art. 164 and 167 CCP). At most, it would have to be examined whether an optional reversal of the burden of proof in individual cases, along the lines of Art. 13a para. 1 UWG, may be justified (although the initial situation there is different).

Sanctions

privatim's statement on the planned sanctions is to be welcomed. It is wrong to “pass on enforcement deficits to criminal law.”

With the new provisions, the criminal courts enter into competition with the data protection supervisory authority, which makes neither institutional nor practical sense. Many of the new criminal provisions lack specificity and thus contradict the principle of “nulla poena sine lege”. In addition, the criminal provisions described above do not fully implement the requirements of Directive (EU) 2016/680 and Art. 12bis para. 2 lit. c E‑SEV 108. Both the EU and the Council of Europe explicitly require administrative sanctions that the Commissioner can impose.

Instead, the FDPIC should be able to impose administrative sanctions, even if this requires an expansion of the FDPIC's resources. If necessary, the organization of the FDPIC should be adapted to that of ComCo.

Other provisions

The privatim statement addresses other points, including the following:

  • Scope: The FADP should not apply to the parties during ongoing proceedings – this is already the case today and is certainly correct; otherwise, the right to information would become even more of a substitute for the precautionary taking of evidence under Art. 158 CCP.
  • Special personal data: “Race” should be deleted. “Biometric” data, on the other hand, is too broad – a photograph, for example, should not be covered. privatim proposes an alternative definition.
  • Profiling: The definition of “profiling” is not criticized (not even the inclusion of non-automated profiling with factual data) – on the contrary: “However, it is completely insufficient if profiling is then virtually ‘waved through’ in the area-specific data protection law (in the federal laws to be adapted) with blanket authorizations. What is required is that clear and strict framework conditions for profiling be specified in the federal laws.”
  • Consent: The requirement of “unambiguousness” is welcomed. However, what “explicit” means should be clarified in the dispatch.
  • Good practice recommendations: This instrument, which tends to be welcomed in business circles as long as it is really the interested parties who draw up such recommendations, is viewed critically. Among other things, Art. 9 VE-DSG expresses too little “that compliance with good practice recommendations is merely a statutory presumption of compliance with data protection rules.”
  • Data security: Data protection goals should be defined.
  • Data of deceased persons: The provision of Art. 12 VE-DSG is welcomed in principle but criticized in detail. Among other things, the exclusion of official and professional secrets is “extremely problematic”.
  • Automated individual decisions: For the private sector, the proposed Art. 15 VE-DSG is welcomed.
  • Breach notification: It is requested that the data breach triggering the obligation to notify be defined. The following definition is proposed:

    “A data breach occurs when security is breached in such a way that processed personal data is irretrievably destroyed or lost, inadvertently or unlawfully altered or disclosed, or unauthorized persons gain access to such personal data.”

    This restriction to qualified breaches of data security is to be welcomed. An obligation to report other unauthorized processing, such as late deletion, would not only be impractical but also constitutionally questionable (“nemo tenetur”). The GDPR likewise limits the notification obligation to security breaches (Art. 34 para. 1 in conjunction with Art. 4 no. 12 GDPR).

  • Documentation requirement: Following its concern about shifting the burden of proof, privatim demands that the documentation obligation cover compliance with data protection as a whole. However, this is probably impossible. Even a certified data privacy management system cannot ensure compliance with data protection or corresponding documentation. Only definable and measurable parameters should be subject to documentation.
