Take-Aways (AI)
  • privatim identifies access by foreign authorities as the key cloud-specific data protection risk and recommends location, contract and encryption measures.
  • The author criticizes schematic categorizations (e.g. "particularly sensitive data") and calls for case-by-case risk analyses and risk-acceptance decisions that do not demand zero risk.

New fact sheet

privatim, the influential conference of Swiss data protection commissioners, published a new version of its fact sheet "Cloud-specific risks and measures" on February 14, 2022. It replaces the previous version of December 2019 (see also here).

The aim of the fact sheet is to identify the specific data protection risks of cloud and similar services and to show how public bodies can meet their responsibilities in this context. Accordingly, it addresses neither the general risks of commissioned processing nor concerns outside data and secrecy protection.

Basic information on cloud risks

privatim assumes that the specific cloud risks are determined by five factors:

  • Contract design
  • Locations of data processing, including access by foreign authorities
  • Confidentiality/secrecy protection, encryption and key management
  • Data about the users of the cloud services
  • Subcontracting relationships

In other words, privatim rightly considers possible access by foreign authorities to be the main cloud-specific (data protection) risk; the other risk factors are not cloud-specific but also arise in other commissioned processing arrangements.

In addition, however, privatim sees further risks that the use of a cloud can increase or that need to be considered:

  • Reporting requirements
  • Right and possibility of control
  • Information security measures
  • Obligations in the event of contract termination
  • Dependencies on the service provider (availability, migration effort in the event of a change)

Notes on the main factors

Contract design

Recommendations of privatim

  • in writing
  • cover proportionality and purpose limitation
  • duty to provide support for claims under data protection law
  • control rights with regard to contract compliance
  • enforcement before easily accessible courts in a familiar legal system; in principle, therefore, Swiss law and place of jurisdiction in Switzerland
  • Exception: the data are effectively protected from access by the provider and third parties by means of encryption, or the place of jurisdiction and the applicable law are those of a state with an adequate level of protection. However, this exception does not apply if the data concerned are particularly sensitive or contain personality profiles or secrets.

Notes

The recommendations of privatim are basically just that – recommendations. As such, they can hardly be wrong; they are just more or less convincing.

However, the view that the categorization of the data concerned as particularly sensitive or as a personality profile is a decisive factor in itself is not convincing. Access by authorities is neither more likely across the board for such data, nor is it necessarily more damaging (unlike access by a private attacker, where there is a risk of publication on the darknet and where an abstract assessment is therefore justified, in contrast to access by authorities).

Although data protection laws attach certain legal consequences to the status of data requiring special protection (or "special personal data" in the more recent cantonal enactments), they do so with respect to specific points (e.g. the required normative level of the legal basis) and not with respect to disclosure abroad. The category of special personal data can therefore at most have an indicative effect in the case of disclosure abroad.

This applies similarly to secrets. The decisive factor is not whether a secret exists, because the status of a secret as such says nothing about the risk of access. For example, there is no reason to assume that foreign authorities are generally particularly interested in secrets. On the contrary, a look at FISA and analogous legal bases in other countries makes clear that the focus is not on secrets but rather, for example, on communications data.

A separate, open-ended risk analysis must therefore be carried out for each type of secret, based on

  • the nature of the secret and especially the interests protected by the secret,
  • the likelihood of access by foreign authorities and
  • the negative consequences in the event of such access.

Only on this basis is a meaningful categorization of information assets according to risk possible.
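These three factors can be turned into a simple scoring sketch. The following Python snippet is purely illustrative and not part of the fact sheet; the scales, the multiplicative scoring and the acceptance threshold are assumptions that each institution would have to set for itself:

```python
from dataclasses import dataclass

@dataclass
class SecretRiskAssessment:
    """Open-ended risk analysis per type of secret (illustrative only).

    The 1-5 scales and the acceptance threshold are assumptions,
    not prescribed by privatim or by law.
    """
    secret_type: str          # nature of the secret / protected interests
    access_likelihood: int    # 1 (remote) .. 5 (probable) foreign-authority access
    damage_extent: int        # 1 (minor) .. 5 (severe) consequences of access

    def risk_score(self) -> int:
        # classic likelihood x impact matrix
        return self.access_likelihood * self.damage_extent

    def acceptable(self, threshold: int = 6) -> bool:
        # the competent body, not the supervisory authority, sets the threshold
        return self.risk_score() <= threshold

# Example: general official secrecy over routine administrative data
routine = SecretRiskAssessment("official secrecy (routine)", 1, 2)
assert routine.acceptable()
```

The point of the sketch is the structure, not the numbers: likelihood and damage are assessed separately for each type of secret, and the risk-acceptance decision is an explicit, documented step rather than an automatic consequence of a data category.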

In the case of general official secrecy, for example, it must be borne in mind that it has a dual protective purpose: it protects genuine state interests on the one hand and citizens' trust in their interactions with the state on the other. Genuine state interests, however, are by no means affected by all information subject to general official secrecy; citizens' trust is not always at stake either, and even where it is, it does not require zero risk.

With regard to the applicable law and the place of jurisdiction, the categorization as particularly sensitive is likewise not decisive for whether Swiss law or jurisdiction in Switzerland must be agreed. It is true that this is sensibly agreed (provided that a judgment is enforceable in the provider's state). But this point is likely to matter more for a small municipality than for a public body that has the means and know-how to litigate abroad.

Locations of data processing, including access by foreign authorities

Recommendations of privatim

  • declaration of the processing locations by the provider
  • a Swiss location is to be preferred (among other things because of the possibilities of control, but also for other security considerations)
  • any disclosure obligations under the CLOUD Act must be taken into account
  • processing abroad – including access from abroad – is only permissible if the level of protection is equivalent. This can be substituted by standard contractual clauses, provided that no access is possible in the target state on the basis of a legal basis that is insufficient under fundamental rights:

    Data processing at foreign locations is only permitted in countries that have an equivalent level of data protection or in which adequate data protection can be achieved by contract – namely through recognized standard contractual clauses. The latter is not the case if official access is possible in the state in question that does not satisfy the constitutional guarantees of fundamental rights (principle of legality, proportionality, rights of the persons concerned, access to independent courts). In this case, additional measures (in particular effective encryption) must be taken to ensure that the transfer of personal data abroad is permissible.

Notes

privatim proceeds on the basis of the general principles governing foreign transfers. It appears, however, that privatim assumes that an inadequate legal basis for access (in Schrems II language: "problematic legislation" such as FISA in the USA) requires zero risk. This is not correct – data protection law does not require zero risk. This applies not only to private-sector but also to public-sector data protection law, because the fundamental rights of affected individuals must be protected, but not absolutely. Public bodies, too, may and must ask themselves how they can run the administration efficiently and use their resources economically. In general, data security law does not require zero risk, even for public bodies, and this is no different for foreign disclosure than for data security measures in general.

If a country enables access by authorities by means of two laws, for example, one of which satisfies the requirements of fundamental rights (e.g. the CLOUD Act in the case of the USA) while the other does not (e.g. FISA in the case of the USA), this stands in the way of disclosure to the country concerned only if it is to be expected that the deficient law will actually be applied to the relevant data.

In terms of data protection law, the probability of access by the authorities must therefore be assessed when assessing the data location (see the Guidelines of the EDPB on Schrems II measures and the current standard contractual clauses, which explicitly require a probability-based approach in clause 14 and which the FDPIC has recognized). This applies not only where access via the CLOUD Act is at issue, but also where data is located in a state without an adequate level of protection or where corresponding access is possible. Of course, an access risk is higher if data is stored on site and not merely accessible – these are all factors that are relevant in the probability analysis. Incidentally, this does not say what risk is still acceptable – that is for the institution concerned to decide.

Interesting in this context is a remark to which privatim – deliberately? – devotes only a footnote:

Insofar as authorities can access data without informing the responsible public body, the probability of occurrence cannot be assessed (because it cannot be verified), so that the focus must be primarily on the extent of the damage, for which the quality of the data is decisive (personal data requiring special protection).

Even if the provider is prohibited from providing information (by a "gag order"), the question is with what probability this will be the case. That access occurs covertly is not in itself an expression of a problematic legal system, but rather a matter of course in certain cases. It does not mean that such access is not subject to clear rules, nor that it is frequent, that covert access is more harmful to the person concerned than open access, or that there are no figures at all on such access. It is therefore wrong to say that, where a gag order is possible, the probability of occurrence cannot or should not be taken into account. At most, it can be estimated with less certainty, so that the corresponding risk may be set somewhat higher on account of this fuzziness. Providers, moreover, often have a duty to take legal action against authority access even where the customer cannot issue a corresponding instruction because it knows nothing about the access. And again: whether a data item is particularly sensitive is not decisive in the present context.
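The argument that a vaguer probability estimate merely pushes the risk figure somewhat higher, rather than making the assessment impossible, can be illustrated by working with the upper bound of the likelihood interval. This is a hypothetical sketch, not from the fact sheet:

```python
def conservative_risk(likelihood: float, uncertainty: float, damage: float) -> float:
    """Risk under estimation uncertainty (illustrative sketch).

    When the probability of authority access can only be estimated
    vaguely (e.g. because gag orders hide individual accesses), use
    the upper bound of the likelihood interval instead of refusing
    to assess the probability at all.
    """
    upper = min(1.0, likelihood + uncertainty)  # probability capped at 1
    return upper * damage

# Covert access: vague estimate (5% +/- 5%) is assessed at the 10% bound,
# so the same scenario scores higher than with a firmer estimate (5% +/- 1%).
assert conservative_risk(0.05, 0.05, 100.0) > conservative_risk(0.05, 0.01, 100.0)
```

The fuzziness thus enters the analysis as a surcharge on the likelihood, exactly as the text argues; it does not remove the probability dimension from the assessment.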

Secrecy law, too, does not, as noted above, require zero risk of access by the authorities, at least not as a rule.

Confidentiality/secrecy protection, encryption and key management

Recommendations of privatim

  • data must be encrypted at least in transit
  • data must be adequately protected during further processing
  • for personal data requiring special protection, stricter requirements apply: the data must be encrypted by the public body, and the keys must be available only to the public body
  • decryption at the provider's premises requires the public body to demonstrate that there are "no unacceptable risks to the fundamental rights of the data subjects". The provider may only hold the keys in this case if it undertakes to use them only with the express consent of the institution, and access must be logged
  • secrets may only be made accessible to the provider to the extent that the secrecy rules permit the involvement of auxiliary persons, and, where applicable, in compliance with the corresponding requirements
  • an unjustified violation of secrecy rules should be regarded "as a legal barrier to outsourcing and not just as a risk"
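The stricter requirement – encryption by the public body itself, with keys available only to that body – amounts to client-side encryption before upload. A minimal sketch using the Fernet recipe of the third-party Python cryptography package; the data and the storage setup are assumptions for illustration:

```python
# Client-side encryption before cloud upload (illustrative sketch).
# The public body generates and keeps the key; the provider only ever
# sees ciphertext and cannot decrypt without the body's cooperation.
from cryptography.fernet import Fernet

# Key generated and stored exclusively by the public body (e.g. in an
# on-premises key store) -- never handed to the cloud provider.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"citizen file: contents requiring special protection"
ciphertext = cipher.encrypt(record)   # only this is uploaded to the cloud

# Decryption happens on-premises; the provider stores opaque tokens.
assert cipher.decrypt(ciphertext) == record
assert record not in ciphertext       # plaintext never leaves the body
```

Whether this level of protection is actually needed should, per the notes below, follow from a risk assessment of the concrete circumstances rather than from the abstract data category alone.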

Notes

The distinction between normal personal data and data requiring special protection is not the appropriate criterion here either. This cuts both ways: the measures outlined by privatim may also be necessary for personal data that is not particularly sensitive, and conversely they are not always necessary even for personal data that is particularly sensitive. The decisive factor is a risk assessment that takes the entire circumstances into account, not the abstract classification of the data. Moreover, as with all security measures, the public body may – indeed must – also take practicality considerations into account.

With regard to secrets, it is correct that the relevant secrecy provision must be interpreted. However, privatim's comment that a violation should be understood as a barrier and not only as a risk is unclear. Possibly privatim means that the public body must prevent an objective disclosure and cannot excuse itself by a subjective lack of fault. That is true. However, it would be a circular argument to deduce from this that the objective elements require zero risk. Rather, the starting point is the interest protected by the secret, and in the vast majority of cases – except perhaps in the case of actual state secrets – this will require no less, but also no more, than adequate safeguards. Not every residual risk of disclosure by the provider to an authority therefore makes outsourcing to the provider inadmissible.

Data about the users of the cloud services

Recommendations of privatim

  • if a provider generates data about users in the course of cloud use (e.g. edge, telemetry or logging data), this data must be treated with the same care as the data communicated to it. It must be subject to the same contractual provisions, and the protection of data subject rights and deletion must be ensured
  • such data may be used only for purposes that would also be permitted to the public body itself, e.g. for non-personal purposes
  • such data can be particularly sensitive and is then subject to the corresponding requirements

Notes

It is true that marginal data must be treated with the same care as other data, but this only means that the same standards must be applied to the risk assessment, not that such marginal data must necessarily be classified as requiring the same level of protection as other data. On the subject of personal data requiring special protection, see above. Furthermore, the statement that personal data requiring special protection exists whenever marginal data indicates "that a data subject is in a psychiatric or penal institution" is not correct as a general rule. Incidentally, when marginal data is used for the provider's own purposes, the provider is hardly a commissioned processor, but a controller.

Subcontracting relationships

Recommendations of privatim

  • prior to the conclusion of the contract, the provider must disclose its subcontracting relationships individually, in such a way that the institution can assess the permissibility of data transfers abroad and the associated risks
  • the contract must specify, for example, how the provider instructs and controls the subcontractors and how it deals with sub-subcontractors
  • subcontractors from countries without adequate protection are to be excluded if this cannot be effectively compensated for contractually
  • during the contract period, changes must be notified, and the institution must be able to terminate the contract if it does not approve a new subcontractor

Notes

It remains open what information the provider must disclose about subcontractors. In any case, the institution must at least be able to assess the risk of foreign authorities accessing contract data. It can, however, also call on the provider for this purpose, because in the case of a transfer by a provider in a secure state to a subcontractor in a state without an adequate level of protection, the provider must conduct a risk analysis under data protection law (because the provider is the exporter). The institution can rely on this risk analysis, but may need to know and understand it, depending on the risk profile of the data involved.

Other risk factors

  • Reports: The cloud service provider must report security incidents and corrective measures so that the institution can take timely action.
  • Control: The provider must be required to carry out regular checks on the service, and audit reports must be submitted on request to the institution and the competent data protection supervisory authority. Audits by the institution and the supervisory authority must also be possible.
  • Information security measures: The institution must be aware of the provider's security measures, and these must be adequate and, where necessary, certified.
  • Contract termination: The process on termination must be agreed (in particular, return and destruction of data).

Concluding remarks

All in all, the impression is that privatim distrusts the risk assessment by the responsible body of the public institution and therefore prescribes rather schematic criteria. There is nothing wrong with that in itself. But if these criteria are too schematic, they do not support the risk assessment, they distort it. This is particularly true if the category of special personal data is made a main factor for foreign disclosure, because the special need for protection of this data relates not to the risk of access by authorities but to the personal nature of the data. That, however, is too abstract a yardstick for a risk assessment, especially since this type of data is probably of below-average interest to foreign authorities.

The situation is similar on the question of risk acceptance: privatim's statements suggest, at least in places, that zero risk must be achieved. privatim is not that explicit, and there are also statements that sound the opposite (e.g. in point 5: "This risk analysis must […] show the […] measures with which those risks can be eliminated or reduced to a tolerable level"). In any case, it would be wrong to demand zero risk. There is no basis for the view that the risk of access by a foreign authority must be reduced to zero – zero risk is generally not required anywhere. Moreover, it is not the data protection authority that has to make the risk-acceptance decision, but the competent executive body (which privatim also states; however, if privatim requires zero risk, the risk-acceptance decision is effectively precluded).