
This article was published on February 16, 2019

Taylor Swift is watching you watching her

Story by Brahim Bénichou

Brahim Bénichou is a part-time legal researcher at the KU Leuven Centre for IT and IP law, Of Counsel in the IP/IT team of NautaDutilh, and founder of My Privacy Specialist. He focuses on privacy and the legal issues raised by technology and innovation.

In 2018, face recognition technology (“FRT”) stumbled from childhood into adolescence in commercial applications. It evolved from being used primarily as a gimmick to unlock your phone or retrieve pictures of specific persons among your 60K pictures, to being used in an almost infinite number of commercial applications on a local scale (mostly in China, a frontrunner in FRT).

Examples are checking in to hotels, buying chicken wings, monitoring signs of distraction or fatigue while driving, detecting problem gamblers in casinos and scanning for unwanted guests in a Taylor Swift audience.

What does this mean for the EU? Will you be scanned too at the next concert you attend and have your face end up in one or more databases without you even knowing it?

A dream comes true

How many “Swifties” wouldn’t dream that Taylor Swift would notice them and recognize their face? That is exactly what happened at a Taylor Swift concert in Los Angeles on 18 May 2018. Well, not exactly, but when the fans entered the concert venue, they had to pass next to a kiosk on which unreleased rehearsal clips of the pop star were shown.

While the fans were staring at the screens, facial recognition cameras filmed them, their faces were scanned with FRT, and the scans were cross-referenced against hundreds of Taylor Swift’s “known stalkers.”
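The matching step behind such a setup can be sketched in a few lines: face images are reduced to numerical embeddings, and each scanned face is compared against a watchlist using a distance threshold. The function names, embedding values, and threshold below are illustrative assumptions, not details of the actual system used at the concert.

```python
import math

def euclidean(a, b):
    """Distance between two face embeddings (smaller = more similar)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_against_watchlist(probe, watchlist, threshold=0.6):
    """Return the IDs of watchlist entries whose reference embedding
    lies within `threshold` of the probe embedding scanned at the venue."""
    return [pid for pid, ref in watchlist.items()
            if euclidean(probe, ref) <= threshold]

# Toy 3-dimensional embeddings (real systems use 128+ dimensions).
watchlist = {"known-stalker-17": [0.10, 0.90, 0.30]}
probe = [0.12, 0.88, 0.31]  # face scanned at the kiosk
print(match_against_watchlist(probe, watchlist))  # ['known-stalker-17']
```

Note that every visitor who looks at the kiosk gets embedded and compared this way, matched or not, which is exactly why the GDPR treats the operation as processing biometric data of everyone scanned.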

You don’t have to know Taylor Swift’s songs to know that she is an (at least self-declared) advocate of women’s and LGBTQ rights and that she supported a Democratic candidate during the last US mid-term elections. That, and probably also the mere fact that she is a (well-known) woman, have made the singer of “Safe & Sound” a beloved target of fake news, trolling, public attacks, and severe threats from persons who deem such actions a suitable way to express their “disagreement” with, or their “love” for, her.

Among these “fans and haters” are reportedly hundreds of persistent stalkers, against whom Taylor Swift wishes to protect herself by using FRT, without the persons being scanned even knowing it.

Will face scanning at concerts be the new normal in the EU?

With Live Nation, the world’s largest event organizer and owner of Ticketmaster, investing in FRT to test whether it can replace concert tickets with facial recognition, it seems only a matter of time before your face will also be scanned at the next European summer festival you attend.

It will not (or should not) go that fast, however. Under the General Data Protection Regulation (“GDPR”), using FRT qualifies as processing biometric data, and biometric data is considered sensitive data. Its processing is therefore subject to strict conditions, in addition to the other (general) conditions that must be met for any processing of personal data.

This is because biometric data such as a face scan can be used to identify a person in many different situations, allowing the holder of such a scan to gain insights into the private life of the person concerned, to a much greater extent than that person may anticipate. This would especially be the case if different data sets were merged or if FRT were broadly applied in public places.

What is required to use face scanning at concerts?

Practically, for a concert organizer to be able to use FRT at its event in the way it was used at the Taylor Swift concert above, it would essentially be required to:

  • obtain the unambiguous, specific, and explicit consent of all persons whose faces will be scanned (art. 9 GDPR – see our previous blog posts on this subject, here and here), which can be withdrawn at any time;
  • duly and transparently inform these persons about the intended face scanning (art. 12, 13 and 14 GDPR);
  • conduct a data protection impact assessment or DPIA (art. 35 GDPR and the Guidelines on Data Protection Impact Assessment (DPIA), wp248rev.01, p. 9);
  • pay specific attention to protect such data and strictly limit access to such data;
  • give the persons concerned the right not to be subject to decisions based solely on automated processing of their biometric data (art. 22 GDPR);
  • consider national legislation in the country where the event takes place, as member states may impose further conditions on the processing of biometric data (art. 9(4) GDPR);
  • make sure the processing of the biometric data is strictly limited to what is required.

This means that in practice, a private organization cannot face scan the public at a large concert without infringing the law. It would be practically impossible to obtain informed and explicit consent from every single visitor to the concert, and a single withdrawal of that consent would make the entire operation unworkable.

It is possible to use face recognition as a replacement for the entry ticket insofar as the conditions above are strictly respected. This includes letting visitors (freely) choose whether they want to use face recognition and giving those who choose to use FRT the possibility to withdraw their consent. The question remains, however, whether that would be a good idea.
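To illustrate what “freely given, withdrawable consent” means at the implementation level: an opt-in entry system would have to check a consent record before every match and route everyone else to an ordinary ticket check. The data structure and names below are purely hypothetical, a sketch of the logic rather than any real ticketing system.

```python
# Hypothetical consent register for an opt-in FRT entry lane.
# Visitors who never enrolled, or who withdrew consent, must be
# sent to the ordinary ticket check instead of being scanned.
consents = {"alice": True, "bob": True}

def withdraw(visitor_id):
    """Withdrawing consent must be as easy as giving it (art. 7(3) GDPR)."""
    consents[visitor_id] = False

def entry_method(visitor_id):
    return "face-scan lane" if consents.get(visitor_id, False) else "ticket check"

withdraw("bob")
print(entry_method("alice"))  # face-scan lane
print(entry_method("bob"))    # ticket check (consent withdrawn)
print(entry_method("carol"))  # ticket check (never enrolled)
```

The key design point is the default: anyone not explicitly in the register falls back to the ticket check, so nobody is scanned merely for showing up.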

And what about the reliability of FRT?

Another reason to be careful in both using and relying on FRT is that the technology is not yet fully reliable. There are known cases where FRT has been fooled by a picture or by sunglasses. More importantly, FRT has been reported to be “racist”: it is less accurate (producing more false matches) when recognizing people with darker skin, due to an underrepresentation of darker-skinned faces in the datasets used to train the AI behind FRT.
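False matches are ultimately a threshold trade-off: a stricter match threshold rejects more genuine matches, while a looser one accepts more impostors, and biased training data shifts where those errors fall. The distance scores below are made up purely to illustrate the trade-off, not taken from any real FRT benchmark.

```python
# Toy distance scores between face-embedding pairs (hypothetical data).
genuine = [0.21, 0.30, 0.35]   # distances for true same-person pairs
impostor = [0.40, 0.55, 0.80]  # distances for different-person pairs

def error_rates(threshold):
    """Fraction of genuine pairs rejected and impostor pairs accepted."""
    false_reject = sum(d > threshold for d in genuine) / len(genuine)
    false_accept = sum(d <= threshold for d in impostor) / len(impostor)
    return false_reject, false_accept

print(error_rates(0.25))  # strict: rejects 2 of 3 genuine pairs
print(error_rates(0.45))  # loose: accepts 1 of 3 impostor pairs
```

No single threshold eliminates both error types, and if one demographic group’s genuine scores sit systematically closer to the impostor range, that group bears more of the errors whichever threshold is chosen.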

So, if face recognition becomes a general thing, it could be a good thing after all that you posted your #10yearchallenge and helped train Facebook’s facial recognition AI.


From a legal point of view, using face recognition technology on a broad scale or in day-to-day situations to recognize people other than yourself must clear several hurdles before it can be applied, if it can be applied at all. It is clear, however, that the technology is here to stay, and how fast it enters our lives on a broad scale will mainly depend on how states, regulators, consumer organizations, and the public itself react to its first uses. It is equally clear that regulation that is not enforced will not stop any technology, no matter how strict it is.

On a personal note: be careful when accepting a service that uses face recognition. Make sure you know where your face scan is stored and that what you gain is worth it. Your face scan contains sensitive information that will remain usable, and once it is out there, you may never be able to re-contain it.
