- Privatim identifies access by foreign authorities as a key cloud-specific data protection risk and recommends location, contract and encryption measures.
- The author criticizes schematic categorizations (e.g. “particularly sensitive data”) and calls for case-based risk analyses and acceptable, non-zero-tolerance risk decisions.
New fact sheet
privatim, the influential conference of Swiss data protection commissioners, published a new version of its fact sheet “Cloud-specific risks and measures” on February 14, 2022. The older, now replaced version dated from December 2019.
The aim of the fact sheet is to identify the specific data protection risks of cloud and similar services and to show how public bodies can meet their responsibilities in this context. Accordingly, it addresses neither the general risks of commissioned processing nor concerns outside data and secrecy protection.
Basic information on cloud risks
privatim assumes that the specific cloud risks are determined by five factors:
- Contract design
- Locations of data processing including foreign authority accesses
- Confidentiality/secret protection, encryption and key management
- Data about the users of the cloud services
- Subcontracting relationships
In other words, privatim considers possible access by foreign authorities to be the main cloud-specific (data protection) risk – rightly so; all other risk factors are not cloud-specific but also arise in other commissioned processing arrangements.
In addition, however, privatim sees further risks that can increase through the use of a cloud or that need to be considered:
- Reporting requirements
- Right and possibility of control
- Information security measures
- Obligations in the event of contract termination
- Dependencies on the service provider (availability, migration effort in the event of a change)
Notes on the main factors
Contract design
Recommendations of privatim
- Conclusion in writing
- Cover proportionality and earmarking
- Duty to provide support for claims under data protection law
- Control rights with regard to contract compliance
- Enforcement before easily accessible courts in a familiar legal system; in principle therefore Swiss law and place of jurisdiction in Switzerland
- Exception: the data is effectively protected from access by the provider and third parties by means of encryption or the place of jurisdiction and the applicable law are those of a state with an adequate level of protection. However, this exception does not apply if the data concerned are particularly worthy of protection or contain personality profiles or secrets.
Notes
The recommendations of privatim are basically just that – recommendations. As such, they can hardly be wrong, they are just more or less convincing.
However, the view that the categorization of the data concerned as particularly worthy of protection or as a personality profile is decisive in itself is not convincing. Access by authorities is neither more likely across the board for such data, nor need it be more damaging. (This differs from access by a private attacker, where the risk of publication on the darknet justifies an abstract consideration, in contrast to access by authorities.)
Although the data protection laws attach certain legal consequences to the status of data requiring special protection (or “special personal data” in the more recent cantonal enactments), they do so with regard to specific points (e.g., the required normative level of the legal basis) and not with regard to disclosure abroad. The category of special personal data can therefore at most have an indicative effect in the case of disclosure abroad.
This applies similarly to secrets. The decisive factor is not whether a secret exists, because the status of a secret as such says nothing about the risk of access. For example, there is no reason to assume that foreign authorities are generally particularly interested in secrets. On the contrary, a look at FISA and analogous legal bases in other countries makes clear that the focus is not on secrets but rather, for example, on communications data.
A separate, open-ended risk analysis must therefore be carried out for each type of secret, based on
- the nature of the secret and especially the interests protected by the secret,
- the likelihood of access by foreign authorities and
- the negative consequences in the event of such access.
Only on this basis is a meaningful categorization of information assets according to risks possible.
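The open-ended, per-secret risk analysis sketched above can be illustrated schematically. The following sketch is my own illustration: the three-point scales, the multiplication of likelihood and impact, and the category thresholds are hypothetical assumptions, not values taken from the fact sheet.

```python
from dataclasses import dataclass

# Hypothetical 3-point scales; a real analysis would define these per secret type.
LEVELS = {"low": 1, "medium": 2, "high": 3}

@dataclass
class SecretType:
    name: str
    protected_interest: str   # the interest the secret protects
    access_likelihood: str    # estimated likelihood of foreign authority access
    damage_on_access: str     # negative consequences if access occurs

    def risk_score(self) -> int:
        # Classic likelihood x impact matrix (illustrative, not prescribed by privatim)
        return LEVELS[self.access_likelihood] * LEVELS[self.damage_on_access]

    def category(self) -> str:
        score = self.risk_score()
        if score >= 6:
            return "high risk"
        if score >= 3:
            return "medium risk"
        return "low risk"

# Example: official secrecy covering routine administrative data
routine = SecretType("official secrecy (routine data)",
                     "citizens' trust in the administration",
                     access_likelihood="low", damage_on_access="medium")
print(routine.category())  # -> low risk
```

The point of such a matrix is precisely what the text argues: two information assets covered by the same secrecy provision can land in different risk categories, because the category follows from likelihood and consequences, not from the label “secret”.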
In the case of general official secrecy, for example, it must be borne in mind that this has a dual protective purpose, in that it protects genuine state interests on the one hand and the citizen’s trust in his or her interaction with the state on the other. Genuine state interests, however, are by no means affected by all information subject to general official secrecy, and the citizen’s trust is not always at stake either, and even where it is, it does not require zero risk.
With regard to the applicable law and the place of jurisdiction, the categorization as particularly worthy of protection is likewise not decisive for whether Swiss law or jurisdiction in Switzerland must be agreed. It is true that this is sensibly agreed (provided that a judgment is enforceable in the provider’s state). But this point is likely to matter more for a small municipality than for a public body that has the means and know-how to litigate abroad.
Locations of data processing including foreign authority accesses
Recommendations of privatim
- Declaration of the locations by the provider
- A Swiss location is to be preferred (among other things due to the control possibilities, but also for other security considerations)
- Any obligations to disclose under the CLOUD Act must be taken into account.
- Processing abroad – including accesses from abroad – is only permissible if the level of protection is equivalent. This can be substituted by standard contractual clauses, provided that no accesses are possible in the target state on the basis of a legal basis that is insufficient under fundamental rights:
Data processing at foreign sites is only permitted in countries that have an equivalent level of data protection or in which adequate data protection can be achieved by contract – namely through recognized standard contractual clauses. The latter is not the case if official accesses are possible in the state in question that do not satisfy the constitutional guarantees of fundamental rights (principle of legality, proportionality, rights of the persons concerned, access to independent courts). In this case, additional measures (in particular effective encryption) must be taken to ensure that the transfer of personal data abroad is permissible.
Notes
privatim proceeds on the basis of the general principles governing foreign transfers. It appears, however, that privatim assumes that an inadequate legal basis for access (in Schrems II language: “problematic legislation” such as FISA in the USA) requires zero risk. This is not correct – data protection law does not require zero risk. This applies not only to private but also to public data protection law, because the fundamental rights of affected individuals must be protected, but not absolutely. Public bodies, too, may and must ask themselves how they can structure the administration efficiently and use their resources economically. Data security law in general does not require zero risk, even for public bodies, and this is no different for foreign disclosure than for data security measures in general.
If a country enables access by authorities, for example, by means of two laws, one of which satisfies the requirements of fundamental rights (e.g., the CLOUD Act in the case of the USA), but the other does not (e.g., FISA in the case of the USA), this only stands in the way of disclosure to the corresponding country if it is to be expected that the deficient law will be used at all for the relevant data.
In terms of data protection law, the probability of access by the authorities must therefore be assessed for the relevant data processing location (see the EDPB Guidelines on supplementary measures following Schrems II and the current standard contractual clauses, which explicitly require a probability-based approach in clause 14 and which the FDPIC has recognized). This applies not only where access via the CLOUD Act is at issue, but also where data is located in a state without an adequate level of protection or where corresponding access is possible. Of course, the access risk is higher if data is stored in that state rather than merely being accessible from it – these are all factors that are relevant in the probability analysis. Incidentally, this says nothing about what risk is still acceptable – that is for the institution concerned to decide.
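A probability analysis of this kind weighs several factors against each other. As a rough sketch only: the factors below, their weights, and the three qualitative outcome levels are hypothetical illustrations of a clause-14-style assessment, not criteria taken from the SCC or the fact sheet, and a real assessment is qualitative rather than numeric.

```python
# Hypothetical factor model for the probability of foreign authority access.
# Weights and thresholds are illustrative assumptions only.
def access_probability(stored_in_state: bool,
                       access_from_state_possible: bool,
                       problematic_law_applies_to_data: bool,
                       documented_prior_requests: bool) -> str:
    score = 0
    score += 2 if stored_in_state else 0              # local storage weighs more ...
    score += 1 if access_from_state_possible else 0   # ... than mere remote access
    score += 2 if problematic_law_applies_to_data else 0
    score += 1 if documented_prior_requests else 0
    if score >= 4:
        return "substantial"
    if score >= 2:
        return "limited"
    return "negligible"

# Data merely accessible from the state, problematic law not practically
# applicable to this category of data, no known prior requests:
print(access_probability(False, True, False, False))  # -> negligible
```

Note that the model deliberately stops at a probability estimate: whether a “limited” or even “substantial” probability is still acceptable remains, as the text says, a decision for the institution concerned.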
Interesting in this context is a remark that privatim – deliberately? – relegates to a mere footnote:
Insofar as access by authorities can take place without informing the responsible public body, the probability of occurrence cannot be assessed (because it is not verifiable), so that the main focus is on the extent of the damage, where the nature of the data is decisive (sensitive personal data).
Even if the provider is prohibited from providing information (by a “gag order”), the question remains with what probability this is the case. That access occurs covertly is, after all, not in itself an expression of a problematic legal system, but in certain cases a matter of course. Nor does it mean that such accesses are not subject to clear rules, that they are frequent, that covert access is more harmful to the data subject than open access, or that there are no figures at all on such accesses. It is therefore wrong to say that, in the case of a possible gag order, the probability of occurrence cannot or should not be taken into account. At most, it can be estimated with less certainty, so that the corresponding risk may be set somewhat higher to account for this fuzziness. Providers, moreover, often have a duty to take legal action against authority access even where the customer cannot issue a corresponding instruction because it knows nothing of the access. And again: whether a data item is particularly worthy of protection is not decisive in the present context.
Secrecy law, too, does not, as noted above, require zero risk of access by the authorities, at least not as a rule.
Confidentiality/secret protection, encryption and key management
Recommendations of privatim
- Data must be encrypted at least in transit
- Data must be adequately protected during further processing
- For personal data requiring special protection, more stringent requirements apply: it must be encrypted by the public body, and the keys must be available only to the public body
- Decryption at the provider’s premises requires the public body to demonstrate that there are “no unacceptable risks to the fundamental rights of the data subjects”. The provider may retain the keys only if it undertakes to use them solely with the express consent of the institution, and accesses must be logged
- Secrets may only be made accessible to the provider to the extent that the secrecy regulations permit the involvement of auxiliary persons, and, if necessary, in compliance with the corresponding requirements
- The unjustified violation of secrecy rules should be considered “as a legal barrier to outsourcing and not just as a risk”
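The key-management rules in the list above can be restated as a small decision sketch. This is my own simplification for illustration: the parameter names and the boolean logic are hypothetical, and the fact sheet's conditions are naturally richer than four flags.

```python
# Schematic restatement of the decryption conditions listed above.
# Parameter names and decision logic are an illustrative simplification.
def decryption_at_provider_permissible(risks_acceptable: bool,
                                       provider_keeps_keys: bool,
                                       use_only_with_consent: bool,
                                       access_logged: bool) -> bool:
    # "no unacceptable risks to the fundamental rights of the data subjects"
    if not risks_acceptable:
        return False
    if provider_keeps_keys:
        # provider may retain keys only if it undertakes to use them solely
        # with the institution's express consent, and accesses are logged
        return use_only_with_consent and access_logged
    return True
```

Even in this compressed form, the structure shows that the gating question is the risk assessment itself, which (as argued below) should turn on the overall circumstances rather than on the abstract classification of the data.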
Notes
The distinction between normal personal data and data requiring special protection is not the appropriate criterion here either. This cuts both ways: the measures outlined by privatim may also be necessary for personal data that is not particularly worthy of protection, and conversely they are not always necessary even for personal data that is. The decisive factor is a risk assessment that takes the entire circumstances into account, not the abstract classification of the data. Moreover, as with all security measures, the public body may – indeed must – also take practicality into account.
With regard to secrets, it is correct that the relevant secrecy provision must be interpreted. However, privatim’s comment that a violation should be understood as a barrier and not merely a risk is unclear. Privatim may mean that the public body must prevent an objective disclosure and cannot excuse itself by a subjective lack of fault. That is true. However, it would be circular to deduce from this that the objective elements of the offence require zero risk. Rather, the starting point is the interest protected by the secret, and in the vast majority of cases – except perhaps for actual state secrets – this will require neither less nor more than adequate safeguards. Not every residual risk of disclosure by the provider to an authority therefore makes outsourcing to the provider inadmissible.
Data about the users of the cloud services
Recommendations of privatim
- If a provider generates data about users during cloud use (e.g., marginal, telemetry, or logging data), it must be treated with the same care as the communicated data. It must be subject to the same contractual provisions, and the protection of data subject rights and deletion must be ensured
- Such data may be used only for purposes that would also be permitted to the public body, e.g. for non-personal purposes
- Such data can be particularly worthy of protection and is then subject to the corresponding requirements
Notes
It is true that marginal data must be treated with the same care as other data, but this only means that the same standards must be applied to the risk assessment, and not that such marginal data must necessarily be classified as requiring the same level of protection as other data. On the subject of personal data requiring special protection, see above. Furthermore, the statement that personal data requiring special protection exists if marginal data indicate “that a data subject is in a psychiatric or penal institution” is not correct as a general rule. Incidentally, when marginal data are used for the provider’s own purposes, the provider is hardly a commissioned processor, but a controller.
Subcontracting relationships
Recommendations of privatim
- Prior to the conclusion of the contract, the provider must disclose its subcontracting relationships individually in such a way that the institution can assess the permissibility of data transfers abroad and the risks.
- The contract must specify how the provider instructs and controls the subcontractors and how it deals with sub-subcontractors
- Subcontractors from countries without adequate protection are to be excluded if this cannot be effectively compensated for contractually
- During the contract period, changes must be notified, and the institution must be able to terminate the contract if it does not approve a new subcontractor
Notes
It remains open exactly what information the provider must disclose about subcontractors. At a minimum, the institution must be able to assess the risk of foreign authorities accessing contract data. It can, however, also call on the provider for this: in the case of a transfer by a provider in a secure state to a subcontractor in a state without an adequate level of protection, the provider must conduct a risk analysis under data protection law (because the provider is the exporter). The institution can rely on this risk analysis, but may need to know and understand it, depending on the risk profile of the data involved.
Other risk factors
- Notifications: The provider must report security incidents and corrective measures so that the institution can take timely action.
- Control: The provider must be required to carry out regular checks on the service, and audit reports must be submitted to the institution and the competent data protection supervisory authority on request. Audits by the body and the supervisory authority must also be possible.
- Information security measures: The institution must be aware of the service provider’s security measures, and these must be adequate and certified, if necessary.
- Contract termination: The process on termination must be agreed (in particular, return and destruction of data).
Concluding remarks
All in all, the impression is that privatim wants to steer the risk assessment by the responsible public body and therefore prescribes rather schematic criteria. There is nothing wrong with that in itself. But if these criteria are too schematic, they do not support the risk assessment, they distort it. This is particularly true when the category of special personal data is made a main factor for foreign disclosure, because the special need for protection of such data relates not to the risk of access by authorities but to the personal nature of the data. That is too abstract a yardstick for a risk assessment, especially since this type of data is probably of below-average interest to foreign authorities.
On the question of risk acceptance, the situation is similar: privatim’s statements suggest, at least in places, that zero risk must be achieved. Privatim is not that explicit, and there are also statements that sound the opposite (e.g., in point 5: “This risk analysis must […] show the […] measures with which those risks can be eliminated or reduced to a tolerable level”). But in any case, it would be wrong to demand zero risk. There is no basis for the view that the risk of access by a foreign authority must be reduced to zero – zero risk is generally not required anywhere. Moreover, it is not the data protection authority that has to make the risk acceptance decision, but the competent executive body (as privatim also states; if privatim nevertheless required zero risk, the risk acceptance decision would effectively be foreclosed).