Take-Aways (AI)
  • 61 data protection authorities demand protective measures against the abusive AI generation of realistic-looking images and intimate depictions.
  • Organizations must provide transparency about capabilities, safeguards, permitted uses and sanctions for misuse.
  • Legal requirements: the AI labeling requirement under the AI Act from August 2026 and national criminal law initiatives against deepfakes.

On February 23, 2026, 61 data protection authorities, including the FDPIC, the EDPB, the CNIL (FR) and the ICO (UK), published a joint statement on AI-generated images:

The declaration is aimed at developers and operators of generative AI. The background is that it is becoming increasingly easy to generate realistic images and videos of people without their knowledge, including intimate depictions and defamatory content. The declaration therefore formulates four expectations of organizations that develop or use generative AI:

  • Protective measures against the misuse of personal data and the creation of non-consensual intimate images, in particular depictions of children
  • Transparency regarding capabilities, protective measures, permissible uses and the consequences of misuse of AI
  • Deletion mechanisms so that those affected can quickly request the removal of harmful content
  • Measures for the special protection of children

The AI Act contains a labeling requirement for deepfakes (applicable from August 2026). In Germany, for example, there is a draft bill for a new Section 201b of the Criminal Code that would criminalize deepfakes:

Section 201b Violation of personal rights through digital forgery

(1) Anyone who violates the right of personality of another person by making accessible to a third person media content, produced or modified by computer technology, that creates the appearance of a true-to-life image or sound recording of the external appearance, behavior or verbal statements of that person, shall be punished with imprisonment of up to two years or a fine. The same applies if the offense pursuant to sentence 1 relates to a deceased person and their personal rights are seriously violated as a result.

(2) Anyone who, in the cases referred to in subsection (1) sentence 1, makes the media content accessible to the public, or who makes media content accessible that relates to an event in the highly personal sphere of life, shall be liable to imprisonment of up to five years or a fine.

(3) Subsection (1) sentence 1, also in conjunction with subsection (2), shall not apply to acts performed in the exercise of overriding legitimate interests, in particular for art or science, research or teaching, reporting on current or historical events, or for similar purposes.

(4) The image or sound carriers or other technical means used by the offender or a participant may be confiscated. Section 74a shall apply.

[Amendment of Section 205 and the Code of Criminal Procedure]

In Switzerland, the National Council rejected the Motion Mahaim (23.3563) on the regulation of deepfakes in public spaces in May 2025. Depending on the subject matter of a deepfake, however, criminal law provisions, the civil law protection of personality rights, unfair competition law, etc. may still apply.