Prof. Dr. Isabelle Wildhaber and Dr. Isabel Ebert have written a legal opinion on ADM systems (short for "automated decision-making") in the workplace on behalf of the trade union syndicom and under the project management of AlgorithmWatch CH; it is available here. The NZZ has reported on it.
The report examines
- the current legal framework for the use of ADM systems in Switzerland and internationally, at the level of the EU (GDPR; Directive 89/391 on the protection of workers' health), the Council of Europe (Convention 108; ECHR), the UN (UN Covenants I and II; UN Sustainable Development Goals; UN Guiding Principles on Business and Human Rights), the ILO (ILO Conventions 155 and 187) and the OECD (Guidelines for Multinational Enterprises; Recommendation on Artificial Intelligence), as well as the regulation of the participation of employees and their representatives and the relevant case law,
- and the legal need for action when using ADM systems.
The experts identify gaps in particular
- in the Participation Act (MitwG), which is not sufficiently well known, provides for no sanctions, does not prevent termination for economic reasons (apparently seen as a loophole in the MitwG) and does not sufficiently clarify when ADM systems are health-related (and thus subject to participation);
- in the involvement of the trade unions, which do not make sufficient use of collective bargaining instruments;
- in the individual enforcement of rights, because those affected by the "structural-systemic" effects of ADM systems "in the context of surveillance and discrimination" are often not individually identifiable, and both substantive and procedural hurdles have to be overcome;
- in collective enforcement ex post, because the FADP is designed for individual enforcement despite the FDPIC's authority to issue orders, labor inspectorates often intervene only when ADM systems have harmful effects on health, and procedural barriers stand in the way.
To remedy at least some of these gaps, the report postulates (the NZZ writes somewhat maliciously: "as desired") the following measures at the legislative level; the experts make concrete suggestions for implementation in each case:
- a strengthening of the rights of employee representatives and associations. In particular, employers should be obliged to inform employees not only individually in accordance with the FADP, but also collectively. When ADM systems are introduced, and in the event of subsequent changes that may have negative effects, information and consultation rights should be ensured;
- the objection options of employees and their representatives with regard to ADM systems should be improved;
- structures for supervision and control should be created, for example in the area of risk management or through impact assessments;
- solutions should be drawn up with the social partners before any revision efforts at the legislative level are considered.
At the company level, the report postulates a duty of care for employers in accordance with the UN Guiding Principles on Business and Human Rights (UNGPs), which require stakeholder involvement as part of due diligence.
The expert opinion draws on empirical results from the Swiss National Science Foundation study "Big Data or Big Brother? – Big Data HR Control Practices and Employee Trust" (March 2017 to February 2021). However, one wonders to what extent the results of this study can actually serve as a basis for the report. The study apparently revealed an increase in people analytics tools, but it counted even online satisfaction surveys, computer-based exit surveys and software for controlling video cameras, i.e. basically harmless measures. According to the dissertation by Gabriel Kasper, the most popular applications of people analytics were, in this order:
- web-based satisfaction surveys
- video surveillance systems
- exit surveys
- RFID badges
- feedback instruments
In contrast, robots for recruitment and sentiment analysis are currently hardly used. This alone hardly indicates a need for action. Moreover, the report does not quantify the proportion of actual ADM systems among these people analytics tools. Gabriel Kasper, for his part, stated that automated individual decisions within the meaning of the FADP hardly ever occur in people analytics.
In this respect, one can only conclude that the problem described remains poorly documented, despite the study conducted at the time. The suggested improvements may well close gaps, but whether there is a real need for them remains to be seen.
However, more sensitive ADM tools are likely to rely increasingly on AI systems. Perhaps one should therefore await the discussion on the regulation of AI before reacting to problems in this area.