Ah, face recognition, that nifty little feature that's already making the still-new fingerprint scanners (kind of) obsolete. Let's face it, any technology that lets us skip a step is a welcome move. But since these features often collect and use massive amounts of very personal information, there will always be that nagging question of just how it's being used.
That question was what dragged Facebook to court in April this year, after it was discovered the digital behemoth was using facial data without user consent. Apparently, Facebook had resorted to using "face templates" in its tag suggestions without letting anyone know.
The debacle has served as a warning shot to other companies that use biometric data to deliver services. Dating sites use facial data to match people, casinos use it to track problem gamblers, face recognition is fast becoming a standard smartphone feature, and some credit card companies now let us pay with our smiling faces.
As privacy laws become tougher the world over (particularly in the EU), companies that collect and use such data need to up their game and find ways to comply with legislation such as the GDPR. Here's a quick summary of how they can do just that…
Where does face recognition fall under GDPR?
The GDPR is the biggest overhaul of EU data privacy and security legislation since 1995, giving users more power over their data than they've ever had. Given the frantic pace at which technology has been evolving, such a measure was inevitable. But it has also left companies guessing how to catch up.
Since face recognition technology (or FRT) collects information about a person's facial features, it's classed as biometric data, which is labeled "sensitive personal data." The verbatim definition of biometric data in GDPR is…
[Biometric data] means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data.
Clearly, the GDPR breaks biometric information into two categories…
- Physical characteristics: facial features, fingerprints, iris characteristics, weight, etc.
- Behavioral characteristics: habits, actions, personality traits, quirks, addictions, etc.
The regulation also gives member states further powers to add restrictions on sensitive data as they see fit.
Ensuring your face recognition policies comply with GDPR
Even though GDPR is quite restrictive, it contains exceptions that allow the collection and use of sensitive data. These include…
- If the user has freely given their consent
- If biometric information is required for carrying out employment, social security, or social protection obligations
- If biometric data is required to protect the vital interests of an individual who is incapable of giving consent
- If it's required for the establishment, exercise, or defense of legal claims
- If biometric data is necessary for reasons of substantial public interest, such as public health
Based on these, there are certain steps you can take to ensure your FRT data policies stay within GDPR's purview…
1) Always focus on getting user consent
At its crux, GDPR is all about increasing transparency and letting users know how their data will be used. Facebook's FRT lawsuit, for instance, could have been avoided altogether had the company simply popped up a message telling users about the feature and asking whether they'd like it turned on. Consent as per GDPR requires…
- A positive opt-in with no pre-ticked boxes; the user must be free to agree or not agree to having their information collected
- The ability to withdraw consent at any time
- The names of all third parties you intend to share the information with
- Specific, granular information on what will be collected and why, presented separately from the terms and conditions
- A clear statement of how long the data will be stored
- A record of the user's consent(s), updated whenever the technologies or features the consent relates to change (a simple way to structure such a record is sketched below)
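As an illustration, here is a minimal sketch of such a consent record in Python. The class and field names are hypothetical, not prescribed by GDPR; the point is that each consent event captures the purposes, recipients, retention period, and policy version in force when the user opted in or out.

```python
# A minimal, illustrative consent record; field names are assumptions,
# not GDPR-mandated terms.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ConsentEvent:
    timestamp: datetime
    granted: bool              # True = opt-in, False = withdrawal
    purposes: list[str]        # e.g. ["face recognition for tag suggestions"]
    third_parties: list[str]   # every recipient named at collection time
    retention_days: int        # how long the data will be stored
    policy_version: str        # bumped whenever the feature or tech changes

@dataclass
class ConsentRecord:
    user_id: str
    history: list[ConsentEvent] = field(default_factory=list)

    def record(self, event: ConsentEvent) -> None:
        self.history.append(event)

    def is_active(self) -> bool:
        # Consent only counts if the most recent event is a grant,
        # since users can withdraw at any time.
        return bool(self.history) and self.history[-1].granted
```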
A sticky part of GDPR's consent definition is that consent is not considered freely given if there is an imbalance of power between the data controller and the user. So if you ask a new hire to opt into your FRT policy by implying they may not have a job if they don't, it won't count as consent. Likewise, denying users service unless they accept your conditions can land you in trouble.
For instance, here’s Facebook’s FRT opt-in page for Europe and Canada…
The page clearly lays out what data will be collected, how it will be used, and gives the user freedom to opt-out. But, some people have pointed out the blue, highlighted “Accept and Continue” button sort of goes against GDPR rules. It’s a good idea to stay away from such practices and stick to an explicit yes or no.
Here’s a complete GDPR Consent Checklist for a more thorough analysis.
2) Include an FRT-specific DPIA policy
A DPIA, or Data Protection Impact Assessment, is a series of steps to identify and minimize the risks around captured and stored data. DPIAs are mandatory for processing that poses a high risk to individuals, such as handling biometric information. A DPIA must…
- Describe the nature, scope, context and purposes of the processing
- Assess necessity, proportionality, and compliance measures
- Identify and assess risks to individuals
- Identify any additional measures to mitigate those risks
If, after carrying out a DPIA, you conclude that the risk cannot be sufficiently mitigated, you need to consult your regulator, who has to respond with advice within eight weeks (which may be extended by another six). If the risk still cannot be mitigated, you will be barred from collecting and using biometric data.
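To make those four steps concrete, here's a rough sketch of a DPIA captured as a data structure. The 1-5 risk scale and the threshold are illustrative assumptions; GDPR prescribes the assessment itself, not any particular scoring scheme.

```python
# An illustrative DPIA record mirroring the four required elements;
# the 1-5 risk scale and the threshold below are assumptions.
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    severity: int          # 1 (low) .. 5 (high), illustrative scale
    likelihood: int        # 1 (low) .. 5 (high)
    mitigations: list[str]

    def residual_high(self) -> bool:
        # Treat severe, likely, unmitigated risks as high residual risk.
        return self.severity * self.likelihood >= 15 and not self.mitigations

@dataclass
class DPIA:
    processing_description: str   # nature, scope, context, and purposes
    necessity_assessment: str     # necessity, proportionality, compliance
    risks: list[Risk]             # identified risks to individuals

    def requires_prior_consultation(self) -> bool:
        # If high residual risk remains, consult your regulator first.
        return any(r.residual_high() for r in self.risks)
```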
3) Anonymize and/or pseudonymize the data
One method to protect FRT data is to anonymize it altogether, making it impossible to determine whom the information points to while keeping the data useful. You can consider removing names from data sets before they are logged into a database, and image data anonymization software can add another layer of security. While anonymizing can work to an extent, it's not foolproof in the real world.
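As a sketch of what one such layer might look like, the snippet below uses OpenCV's bundled Haar cascade to find faces and blur them before an image is stored. The file names and blur strength are arbitrary choices, and in practice detection will miss some faces, which is part of why anonymization isn't foolproof.

```python
# Illustrative face-blurring pass; file names and kernel size are arbitrary.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("input.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Blur every detected face region before the image is logged anywhere.
for (x, y, w, h) in detector.detectMultiScale(gray, scaleFactor=1.1,
                                              minNeighbors=5):
    img[y:y + h, x:x + w] = cv2.GaussianBlur(
        img[y:y + h, x:x + w], (51, 51), 0)

cv2.imwrite("anonymized.jpg", img)
```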
In situations where anonymizing isn't practical, pseudonymized data can be used. Pseudonymizing data is covered in GDPR, where it is defined as processing personal data in a way that makes it impossible to attribute it to its subject without the aid of additional information, which must be kept separately in a secure environment.
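One common way to implement this (a sketch, not the only approach) is keyed hashing: the HMAC key plays the role of GDPR's "additional information" and lives in a separate, secured store, so records can't be linked back to a person without it.

```python
# Keyed pseudonymization sketch; the key must live in a separate,
# secured store (e.g. a secrets manager), never alongside the data.
import hmac
import hashlib

SECRET_KEY = b"load-me-from-a-secrets-manager"  # placeholder for the sketch

def pseudonymize(user_id: str) -> str:
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

# The face template is stored against the pseudonym, never the identity.
record = {"subject": pseudonymize("jane.doe@example.com"),
          "face_template": "..."}
```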
4) Ensure data security
Unlike the EU's previous data protection directive, the GDPR requires stricter measures to ensure personally identifiable information is never compromised. The exact text given under GDPR (Article 32) is…
Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk.
There are certain steps you can take to comply here…
Have a well-thought-out Data Loss Prevention strategy: DLP tools and strategies are well suited to meeting GDPR's requirements. As organizations that allow breaches to occur can face heavy penalties under GDPR, DLP makes for an ideal investment.
Make an error-proof Disaster Recovery plan: Article 32 of GDPR states that companies should have "the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident." There are quite a few disaster recovery solutions on the market that can help you out here.
Have a GDPR-compliant data backup policy: Backup is a double-edged sword vis-à-vis GDPR, as not only do you need to provide for data portability as mentioned under Article 20, you also need to honor the right to erasure as stated in Article 17. In other words, your backups need to restore information quickly if requested, and should a person opt out, you need to remove all their data from backups as well (one way to square this circle is sketched after this list).
Consider robust Identity and Access Management controls: IAM, as it is known, helps enforce a least-privilege policy and restrict access to sensitive information (a minimal example follows below).
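On the backup side, one technique worth knowing (a sketch rather than a prescription) is crypto-shredding: encrypt each user's data with a per-user key, and honor an erasure request by destroying the key, which renders every backup copy unreadable at once. This example uses Python's cryptography package; the in-memory dict stands in for a real key management service.

```python
# Crypto-shredding sketch; the dict stands in for a real, separately
# secured key management service.
from cryptography.fernet import Fernet

key_store: dict[str, bytes] = {}

def encrypt_for_user(user_id: str, data: bytes) -> bytes:
    key = key_store.setdefault(user_id, Fernet.generate_key())
    return Fernet(key).encrypt(data)

def erase_user(user_id: str) -> None:
    # Without the key, the user's ciphertext in every backup becomes
    # unreadable, effectively honoring an Article 17 erasure request.
    key_store.pop(user_id, None)
```

And on the access-control side, least privilege can start as simply as mapping roles to explicit permissions, with everything else denied by default. The roles and permission names below are purely illustrative.

```python
# Minimal least-privilege check; role and permission names are illustrative.
ROLE_PERMISSIONS = {
    "support_agent": {"read_profile"},
    "ml_engineer":   {"read_profile", "read_face_template"},
}

def can_access(role: str, permission: str) -> bool:
    # Anything not explicitly granted is denied.
    return permission in ROLE_PERMISSIONS.get(role, set())

assert can_access("ml_engineer", "read_face_template")
assert not can_access("support_agent", "read_face_template")
```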
Conclusion
GDPR should come as no surprise: the last massive regulatory endeavor to control technology, the Data Protection Directive, was released in 1995. Since then, few changes were made to the legislation even as newer technologies continued to evolve.
What also evolved as a result of tech breakthroughs is cybercrime. With the number of hacks and ransomware attacks on the rise, companies have to do everything in their power to ensure the data they collect is never compromised, doubly so if the data is of a highly personal nature such as facial features.
As long as companies adhere to the highest ethical and security standards, they will never have to fear any upcoming legislation.