LfDI Baden-Württemberg: Questions about ChatGPT

The LfDI Baden-Württemberg joins other authorities in Europe that critically assess, if not prohibit, ChatGPT. According to a media release from April 24, 2023, the LfDI has asked OpenAI for a statement. In addition, the European Data Protection Board (EDPB) is working toward a uniform European approach.

The focus seems to be on how customers of OpenAI – i.e., the controllers; OpenAI offers a data processing agreement (ADV) – know which data are processed, how and for what purpose; how appropriate technical and organizational measures can be taken and verified; how data requiring special protection are specially protected; and how the rights of data subjects can be safeguarded. Further information on data protection pain points can be found in the LfDI's most recent activity report, to which it refers in the media release.

So far, several authorities have already dealt with the topic of AI, ChatGPT, or Large Language Models, e.g.:

  • the European Data Protection Board (EDPB), which decided to set up a task force for concerted action by the European data protection authorities (on April 13, 2023),
  • the HBDI Hesse (April 13, 2023), which shares the concerns of the Italian Garante,
  • the Office of the Privacy Commissioner of Canada (April 4, 2023), which is conducting an investigation,
  • the FDPIC, which has published guidance on the use of ChatGPT and similar applications (on April 4, 2023),
  • the Italian Garante, which not only investigated the use of ChatGPT but prohibited it due to a lack of information and legal basis, concerns about children's data, and the unreliability of ChatGPT's generated statements (on March 30, 2023); after a meeting with representatives of OpenAI on April 6, 2023, it decided (on April 12, 2023) to lift the ban as of April 30, 2023, provided that OpenAI implements certain requirements by then to address the aforementioned concerns;
  • the UK National Cyber Security Centre in a statement (March 13, 2023).

At the same time, regulation in the AI space is moving forward. The European Parliament's agreement is expected on April 26, 2023, after which the trilogue could begin.

However, the progress of ChatGPT and similar systems seems to be triggering a rather fundamental debate about the distinction between

  • General Purpose AI (GPAI), also called Foundation AI or Foundation Models, on the one hand, i.e., AI systems that are capable of performing a wide range of tasks and have corresponding adaptive capabilities – ChatGPT falls into this category –, and
  • on the other hand, specialized applications for individual tasks that are less “intelligent” (“Narrow Intelligence”; the use of the word “intelligence” for AI would be worth a separate post, but it expresses an understanding of man that makes him at least as much like the machine as the machine is like him).

An information sheet for the attention of the European Parliament from March 2023 contains further details on this. In any case, there is some pressure to also regulate GPAI in the AI Act, which the Commission's draft did not address, or addressed only insufficiently (depending on the interpretation of the term “artificial intelligence system” under Art. 3 No. 1 and Annex I of the Regulation – cf. the corresponding recommendations of the Future of Life Institute).



