Take-Aways (AI)
  • Clear compliance requirements, involvement of data protection officers, and provision of secure functional accounts instead of private accounts.
  • Prohibition of input/output of personal data, opt-out from training/history, and checking results for accuracy and discrimination.
  • No automated final decisions; sensitize employees; observe legal aspects (copyright, trade secrets, AI Act).

The Hamburg Commissioner for Data Protection and Freedom of Information (HmbBfDI) has published a checklist, in German and English, for the use of chatbots based on Large Language Models (LLMs) such as ChatGPT.

The checklist includes the following points:

  1. Specify compliance regulations (instructions etc.)
  2. Involve the Data Protection Officer
  3. Provide functional accounts (so that employees do not have to use a private account; ideally the account should be without a name)
  4. Secure authentication (so that attackers cannot misuse the account)
  5. No entering of personal data (customers, business partners, the employee’s own personal data)
  6. No output of personal data (because the AI can incorporate information from the Internet; prompts should therefore not target personal data, e.g. no “who” questions)
  7. Caution with personal data (i.e. no input of data that allows a personal reference in context)
  8. Opt out of AI training (e.g. switching off “Chat history and training”)
  9. Opt out of the history (especially if several people use the same account)
  10. Check results for correctness
  11. Check results for discrimination (for example, a recommendation to fill a vacancy with “male spectacle wearers” should not be followed (?))
  12. No automated final decisions (no adoption of recommendations, e.g. because it is not clear how a recommendation is made)
  13. Sensitize employees (regarding the permissible use of such tools)
  14. Data protection is not everything (copyrights, trade secrets, etc. must also be taken into account)
  15. Track further developments (e.g. the AI Act, which also regulates users)
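
Items 5 to 7 amount to keeping personal data out of prompts before they reach the chatbot. As a minimal sketch of what such a pre-submission filter could look like (the patterns, labels, and function name here are illustrative assumptions, not part of the HmbBfDI checklist, and simple regexes will never catch all personal references):

```python
import re

# Illustrative, deliberately incomplete patterns for common personal data.
PII_PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"\+?\d[\d /-]{7,}\d"),
}

def mask_personal_data(prompt: str) -> str:
    """Replace matches of the patterns above with neutral placeholders."""
    for label, pattern in PII_PATTERNS.items():
        prompt = pattern.sub(f"[{label}]", prompt)
    return prompt

masked = mask_personal_data(
    "Summarize the complaint from jane.doe@example.com, tel. +49 40 1234567."
)
print(masked)  # → Summarize the complaint from [EMAIL], tel. [PHONE].
```

Such a filter can only reduce, not eliminate, the risk of entering personal data, so it complements rather than replaces the organizational measures (instructions, training, account setup) in the checklist.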