Today, August 31, 2023, the long revision of data protection law in Switzerland comes to a provisional end – provisional because the DPA will foreseeably share the fate of the UCA: it lends itself to all manner of data-related concerns and is therefore likely to be supplemented continuously. But that is speculation; for now, on September 1 at 00:00, the revised DPA enters into force.
For many companies, the revision phase meant a thorough examination of their handling of personal data and of the internal organizational framework – naturally also, or especially, against the background of the GDPR and other international developments. More or less independent positions were filled (independence being less a question of freedom from instructions, conflicts of interest, and reporting lines than of having sufficient resources, so that data protection functions can do some agenda-setting instead of being relegated to answering internal requests). Privacy contacts have been designated in units such as marketing, HR, and IT, and business leaders have been briefed on legal risks, including in particular criminal risks, on the front lines as well as at executive and board level.
The legal risks here are both over- and underestimated – underestimated by companies that still perceive data protection as an imposition and as further woke restrictions on legitimate profit-making and are unwilling to invest in more than a privacy statement, and overestimated by companies that fear that even negligence could lead to penalties.
Overall, however, the risks under criminal law are likely to be overestimated. For example, it is hardly possible to involve a processor without fulfilling the minimum requirements of Article 9(1) and (2) DPA, the violation of which is punishable. To a large extent, these requirements already arise as ancillary obligations under mandatory law from any contract-like relationship, at least on a criminal-law reading that does not allow overly broad interpretations or analogies. With regard to data security, we have already explained that, and why, correctly understood, the DSV contains no justiciable minimum requirements for data security. And with regard to transfers abroad, penalties are conceivable if, for example, standard contractual clauses are forgotten, but the omission of a Transfer Impact Assessment (TIA), for example, cannot in itself lead to criminal liability.
The greatest risks certainly exist in connection with the duty to inform and the right to information. In the case of the duty to inform, however, the preliminary question arises whether personal data has been collected at all (because not every occurrence of data is a collection) and whether a gap in the privacy statement really is a gap, because information must be provided only about those purposes and disclosures that are at least foreseeable, if not planned, at the time of collection. And in general, the question arises how detailed the information in a privacy statement must be. In view of the more extensive right to information, one certainly cannot expect exhaustive detail in the context of the duty to inform, even if many companies use detailed privacy statements for reasons of prudence and reputation.
How active prosecution authorities will be remains to be seen anyway – but judging by enforcement practice under the Unfair Competition Act, and especially the ban on spam (we know of decisions not to open proceedings with sometimes adventurous reasoning), no activism is to be expected.
The FDPIC is also not expected to engage in broad investigative activity. Although it will be under a certain pressure to succeed and to make use of its extended powers, it is likely to exercise restraint for political reasons, because of its self-image (“not a regulator”), and because of a shortage of resources.
Companies will continue to be busy with implementation work after September 1 – on the one hand with the obvious requirements of the duty to inform, but also with internal organization and the usual long-tail tasks such as the retention and deletion of personal data. A certain amount of additional work will also remain, e.g., in collaboration with service providers and partners, where data protection contracts and the corresponding negotiations have increased.
On balance, however, the impact of the new DPA on corporate practice is likely to remain manageable. Even the representatives of the data protection bubble know that data protection law is not the only regulation that companies have to deal with. Apart from sector-specific regulations, depending on the industry, companies may have to comply with antitrust law, anti-corruption law, money laundering law, etc., and a one-sided focus on data protection law would only lead to other obligations and risks being neglected – something that the data protection authorities are probably also aware of.
Has the revision been successful? Yes and no. The new DPA is the result of a long political tug-of-war. This has advantages and disadvantages – on the one hand, the DPA is less paternalistic than the GDPR. On the other hand, many technical errors in the DPA and the DSV will continue to cause legal uncertainty. Nor does it help the acceptance of data protection law that criminal liability is individualized and that the selection of obligations subject to penalties seems arbitrary (why should it be punishable to transfer personal data to third countries without standard contractual clauses, but not to omit a data protection impact assessment or a data breach notification?), or that a consistent approach to assessing legal risks is missing. Then again: this gives companies the freedom they always demand when implementing data protection law.
What data protection law cannot change: the continued advance of technology and its penetration of even the smallest ramifications of everyday life. It will not be possible to escape generative AI even if one wanted to, and there would be good reasons to want to (no opportunity to outsource thinking has ever gone unused). What effects this will have is difficult to foresee, and this is not the place to speculate. But one consideration suggests itself: means, once deployed, act back on their users; what you own, owns you. An AI that claims to be human-like is not only human-like AI but also a blueprint for human behavior, just as other enabling technologies have been – the Internet is not only a place of freedom, but can replace knowledge with information and thinking with googling. It is therefore not only possible but likely that sooner or later AI will no longer be seen as a deficient human, but humans will be seen as deficient AIs. We will then deal, for example, with the question of whether what is really needed is a right to be heard by a human in the case of automated individual decisions, or rather a right to be heard by a machine in the case of human decisions.
However, data protection law is hardly in a position to take up such questions if it is not to become even more of a “Law of Everything”. More than philosophical questions, it requires craftsmanship, a constant engagement with data processing in the machine room of companies, so to speak, and a change in understanding – away from a necessary evil toward a thoroughly sensible framework for a digitized world.
In this sense – out with the old, in with the new! We will continue to lovingly accompany data protection law at this point.