Take-Aways (AI)
  • The EDPB emphasizes proportionality and requires a case-by-case assessment as well as high requirements for legitimate interests in video surveillance in accordance with Art. 6 para. 1 lit. f GDPR.
  • Biometric and particularly sensitive data fall strictly under Art. 9 GDPR; collection, storage or template matching generally requires explicit consent.

The European Data Protection Board (EDPB; German abbreviation: EDSA) has now published, following the earlier draft, the definitive version of its guidelines on video surveillance (Guidelines 3/2019 on processing of personal data through video devices), adopted on January 29, 2020 (for the time being available only in English). In just under 30 pages, the guidelines address, among other things, the permissibility of video surveillance (legal basis), disclosure to third parties, the processing of special categories of personal data, the rights of data subjects in connection with video recordings, the information of data subjects, the storage of recordings, the necessary security measures, and data protection impact assessments.

Proportionality

The EDPB emphasizes the proportionality of the measures, which must be assessed case by case. It also sets relatively high requirements for the specific legitimate interest of the controller insofar as the controller relies on Art. 6 (1) (f) GDPR (cf. Case C‑708/18).

Personal data requiring special protection

The comments on personal data requiring special protection are interesting. Here, the EDPB confirms the general view that recordings which potentially show particularly sensitive features (e.g. glasses as a potential health datum) are not per se specially protected. Only when such sensitive inferences are actually drawn from the recordings is personal data requiring special protection processed:

However, if the video footage is processed to deduce special categories of data Article 9 applies.

This applies not only to health data, but to other categories as well:

Video surveillance capturing a church does not per se fall under Article 9.

However, even data that is only potentially sensitive remains delicate, which is why the principle of proportionality is of particular importance here.

Biometric data

Also noteworthy are the comments on biometric data. Biometric data within the meaning of Art. 4 No. 14 GDPR is only processed if

  • the data relates to physical, physiological or behavioral characteristics,
  • it is obtained by specific technical processing, and
  • it is used to uniquely identify a person.

This is not the case, for example, when a camera in a store automatically detects the gender or age of a person, as long as the system cannot identify the person.

However, if the system creates and stores a biometric template in order to recognize a specific person, this is said to constitute processing of biometric data even if the person in question is not known by name, with the consequence that Art. 9 GDPR applies. Here, the EDPB relies, without saying so, on the concept of singularization, which is apparently meant to be equivalent to identification here:

If a controller wishes to detect a data subject re-entering the area or entering another area (for example in order to project continued customized advertisement), the purpose would then be to uniquely identify a natural person, meaning that the operation would from the start fall under Article 9. This could be the case if a controller stores generated templates to provide further tailored advertisement on several billboards throughout different locations inside the store. Since the system is using physical characteristics to detect specific individuals coming back in the range of the camera (like the visitors of a shopping mall) and tracking them, it would constitute a biometric identification method because it is aimed at recognition through the use of specific technical processing.

However, the EDPB takes an even stricter view: biometric data is processed not only for those individuals for whom a template has been created, but also for all those persons whose characteristics are matched against the template. If, for example, the facial features of VIPs are stored in a hotel so that they can be recognized immediately upon check-in, not only the consent of the VIPs is required, but also that of all other guests whose faces are scanned, even though those guests cannot be identified for lack of a template of their own:

A hotel uses video surveillance to automatically alert the hotel manager that a VIP has arrived when the face of the guest is recognized. These VIPs have given their explicit consent to the use of facial recognition before being recorded in a database established for that purpose. These processing systems of biometric data would be unlawful unless all other guests monitored (in order to identify the VIPs) have consented to the processing according to Article 9 (2) (a) GDPR.
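To make the notion of template matching more concrete: a minimal, purely illustrative Python sketch, in which all names and numeric values are hypothetical (real systems derive templates as embedding vectors from a face-recognition model). The point the EDPB makes is visible in the code: every fresh capture is compared against the stored templates, so every scanned face is itself drawn into the biometric processing.

```python
import math

# Hypothetical stored VIP templates. In a real system these would be
# embedding vectors produced by a face-recognition model, not hand-picked numbers.
vip_templates = {"guest_42": [0.11, 0.52, 0.33], "guest_77": [0.80, 0.10, 0.45]}

def euclidean(a, b):
    """Distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match(capture, threshold=0.2):
    """Return the ID of the closest stored template within the threshold, else None.

    On the EDPB's reading, every capture compared here is itself processing
    of biometric data, not only the stored templates.
    """
    best_id, best_dist = None, threshold
    for vip_id, template in vip_templates.items():
        d = euclidean(capture, template)
        if d < best_dist:
            best_id, best_dist = vip_id, d
    return best_id

print(match([0.10, 0.50, 0.35]))  # a capture close to guest_42's template
print(match([0.50, 0.50, 0.50]))  # no template within the threshold: None
```

Whether a match succeeds or fails, the comparison itself is performed on every passer-by, which is exactly why the guidelines require consent from all monitored guests, not just the VIPs.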

The EDPB further comments on the voluntary nature of the consent required and on data minimization in biometric systems.

Data subject rights

There is also a right of access to video recordings. However, the controller should not necessarily be forced to hand over copies of recordings in which third parties also appear; in any case, the controller is not required to purchase systems to, for example, pixelate other persons. The EDPB leaves open how access is then to be provided; (edited) still images are a possible option (see the guidelines of the Irish supervisory authority). The EDPB further addresses the identification of the person requesting access and the other data subject rights.

Transparency

To inform data subjects, the EDPB recommends, in some detail, a layered approach, with a notice at the cameras and the further information available e.g. at a reception desk or similar. The guidelines also contain a model information sign.

Storage duration

Video recordings are to be deleted once they are no longer required. For security systems, the EDPB recommends a retention period of 1 to 2 days, because acts of vandalism, for example, are usually discovered within one or two days. Longer retention periods must be justified:

Taking into consideration the principles of Article 5 (1) (c) and (e) GDPR, namely data minimization and storage limitation, the personal data should in most cases (e.g. for the purpose of detecting vandalism) be erased, ideally automatically, after a few days. The longer the storage period set (especially when beyond 72 hours), the more argumentation for the legitimacy of the purpose and the necessity of storage has to be provided.
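The automatic erasure the guidelines call for can be sketched as a small retention job. This is purely illustrative: the flat directory layout, the `.mp4` extension and the 72-hour default are assumptions for the sketch, not taken from the guidelines.

```python
import time
from pathlib import Path

# 72 hours as the default: beyond that, the EDPB expects stronger justification.
RETENTION_HOURS = 72

def purge_old_recordings(directory, retention_hours=RETENTION_HOURS):
    """Delete .mp4 recordings whose modification time exceeds the retention
    period and return the names of the deleted files (sorted)."""
    cutoff = time.time() - retention_hours * 3600
    deleted = []
    for recording in Path(directory).glob("*.mp4"):  # assumed naming scheme
        if recording.stat().st_mtime < cutoff:
            recording.unlink()
            deleted.append(recording.name)
    return sorted(deleted)
```

A job like this would typically run daily from a scheduler (e.g. cron), which is what "erased, ideally automatically, after a few days" amounts to in practice.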