I had the opportunity to interview Mark Singer and Raf Sanchez, both at Beazley, about profiling and the GDPR. Mark Singer is a member of Beazley's Cyber & Executive Risk Group, where he handles insurance coverage issues arising out of cybersecurity, technology errors and omissions, data privacy, intellectual property, and media and advertising liabilities. Raf Sanchez leads the international Beazley Breach Response Services team and is responsible for incident response in all territories outside the US and Canada.
SOLOVE: What does profiling mean for GDPR purposes?
SINGER: Profiling is where an individual’s personal information is used to build up a picture of what type of person they are and how they behave. The GDPR defines profiling as “automated processing of personal data…to evaluate certain personal aspects relating to a natural person…”.
It is important to distinguish profiling from automated decision-making; the two concepts are linked in the GDPR’s wording. Profiling is a tool, which can lead to decisions being made on an automated basis (such as the online approval of a loan). Automated decision-making is restricted under the GDPR, save in particular circumstances. That restriction does not necessarily extend to the activity of profiling itself. The key is what organisations do with the intelligence generated by the process.
Profiling will only be restricted (and consent required) where an organisation conducts profiling using an individual’s sensitive personal data (such as health or racial data) or the profiling is used for automated decision-making (i.e. there is no human review element in the decision-making process). Consent is not always required for profiling, and profilers will likely seek to rely on non-consent grounds such as legitimate interests under Article 6(1)(f).
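To make that distinction concrete, here is a minimal, hypothetical Python sketch of the loan example. The scoring logic, threshold and function names are invented for illustration only; they are not drawn from the GDPR or any real lender. The point is simply that the same profile can either feed a solely automated decision (the restricted activity) or be handed to a human reviewer.

```python
def profile_applicant(income: float, existing_debt: float) -> float:
    """Profiling: automated processing that evaluates a personal aspect (here, creditworthiness)."""
    # Toy score in [0, 1]: the share of income not already committed to debt.
    return max(0.0, min(1.0, (income - existing_debt) / max(income, 1.0)))


def decide_loan(score: float, human_review: bool) -> str:
    """How the profile is used determines whether the Article 22 restrictions are engaged."""
    if human_review:
        # Profiling feeds the decision, but a human reviewer makes the final call.
        return "refer to a loan officer with the score attached"
    # A solely automated decision based on profiling -- the restricted activity.
    return "approve" if score >= 0.5 else "decline"


score = profile_applicant(income=40_000, existing_debt=12_000)
print(decide_loan(score, human_review=False))  # automated decision-making
print(decide_loan(score, human_review=True))   # profiling with human review
```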
SOLOVE: How might US technology companies in the business of profiling be implicated by GDPR?
SINGER: The GDPR applies to organisations involved in the processing of personal data of individuals located in the EU. If a US organisation monitors the behaviour of individuals in the EU, then it will likely be implicated by the GDPR. An organisation does not need to offer goods or services to individuals in the EU in order to be implicated; the profiling alone of individuals in the EU will be sufficient to trigger obligations.
The concept of “monitoring” indicates that a controller has a specific purpose in mind for the collection and use of the collected data – that use will be key in determining the applicability of the regulation. Whether or not the individuals are in the EU is assessed at the moment the profiling takes place. Such profiling could therefore open up a US organisation to the various obligations under GDPR (such as data breach notification and record keeping) and the potential fines and penalties available (at its most severe, up to 4% of annual global turnover).
The European Data Protection Board (“EDPB”) provides this helpful example:
“A marketing company established in the US provides advice on retail layout to a shopping centre in France, based on an analysis of customers’ movements throughout the centre collected through Wi-Fi tracking.
The analysis of a customer’s movements within the centre through Wi-Fi tracking will amount to the monitoring of individuals’ behaviour. In this case, the data subjects’ behaviour takes place in the Union since the shopping centre is located in France. The marketing company, as a data controller, is therefore subject to the GDPR in respect of the processing of this data for this purpose as per its Article 3(2)(b).”
SOLOVE: Does the output generated by profiling constitute personal data for the purposes of GDPR regulation?
SINGER: It depends. Personal data includes information relating to natural persons who: (a) can be identified, or who are identifiable, directly from the information in question; or (b) can be indirectly identified from that information in combination with other information. In the EDPB’s example above, a customer’s movements within a shopping centre may not be classed as personal data if the data is truly anonymised – i.e. it is just information that “someone” walked through the shopping centre at a particular time. However, if the information collected includes a name or some kind of identification number, then it is likely to be considered personal data and therefore subject to the obligations of the GDPR. Ultimately, whether the output is considered personal data will depend on the specific nature of the information collected about the behaviour of the relevant individuals.
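A minimal, hypothetical Python sketch may help illustrate where that line falls. The record fields and the strip_identifiers helper below are invented for illustration; the idea is that the same Wi-Fi tracking observation sits on different sides of the personal-data boundary depending on whether a linkable identifier travels with it.

```python
from dataclasses import dataclass, asdict


@dataclass
class MovementRecord:
    """One Wi-Fi tracking observation inside the shopping centre (illustrative)."""
    device_mac: str   # a device identifier that can indirectly identify a person
    zone: str         # e.g. "food court"
    timestamp: str    # ISO 8601


def strip_identifiers(record: MovementRecord) -> dict:
    """Keep only 'someone walked through zone X at time T' -- no linkable identifier."""
    data = asdict(record)
    data.pop("device_mac")
    return data


raw = MovementRecord(device_mac="3c:22:fb:12:34:56",
                     zone="food court",
                     timestamp="2019-05-08T14:32:00")

# With the device identifier attached, the record can be linked back to an
# individual, so it is likely personal data under the GDPR.
print(asdict(raw))

# With the identifier removed (and no other linkable data retained), the record
# is closer to truly anonymised footfall data.
print(strip_identifiers(raw))
```

Note that merely hashing or truncating the identifier is generally pseudonymisation rather than anonymisation, and pseudonymised data remains personal data under the GDPR (Recital 26).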
SOLOVE: Could wearable tech or other smart devices also attract profiling/behavior monitoring implications?
SANCHEZ: If A.I. was the technology buzzword of 2018, it’s clear that wearables are the 2019 equivalent. During Apple’s March 2019 earnings call, CEO Tim Cook and other executives confirmed that the wearables business set a new quarterly revenue record of $5.1 billion and is now the size of a Fortune 200 company.
Wearables are such an important product category because they allow individuals to track personal wellness metrics, and they allow businesses to track employees’ use of resources, deliver better customer service and provide customised training. Perhaps most importantly of all, wearables with bio-sensors such as heart-rate monitors, pulse-ox sensors and accelerometers allow information to be shared with health-care providers and insurers to permit earlier detection of health problems, provide remote diagnosis capabilities and even drive significant health-care cost savings.
It is important to note, from a profiling perspective, that these wearables must be context-aware in order to deliver meaningful insights to their users. For example, in order to tell a user whether they are taking sufficient exercise each day, the wearable will need to know whether that user is a 20-year-old male or a 50-year-old female. It may also need to be given information, either by the user or through other devices such as smart scales, about, say, the user’s weight, height and body mass index.
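As a purely illustrative sketch of what that context data looks like (the field names, step-target rule and thresholds below are hypothetical and not taken from any particular device), a context-aware wearable effectively combines sensor readings with user-supplied attributes such as age, sex, weight and height:

```python
from dataclasses import dataclass


@dataclass
class UserContext:
    """User-supplied attributes a wearable needs in order to interpret its sensor data (illustrative)."""
    age: int
    sex: str          # "male" / "female"
    weight_kg: float  # could be supplied by the user or a paired smart scale
    height_m: float

    @property
    def bmi(self) -> float:
        # Standard body mass index formula: weight (kg) divided by height (m) squared.
        return self.weight_kg / (self.height_m ** 2)


def daily_step_target(user: UserContext) -> int:
    """Hypothetical rule of thumb; real devices use far richer, proprietary models."""
    baseline = 10_000
    return baseline - max(0, (user.age - 30) * 50)


user = UserContext(age=50, sex="female", weight_kg=70.0, height_m=1.65)
print(f"BMI: {user.bmi:.1f}, suggested daily steps: {daily_step_target(user)}")
```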
As is obvious from these use-case scenarios, the benefits are only really present if these wearables are able to share information with third-party systems, so that users benefit from the massive data sets being created by the various organisations selling these devices, which then enrich the data received from the devices with other data sources.
From a privacy perspective, the difficulty is that much of this information sharing is happening with little user transparency, without much concern for the geo-location of the data (as most of these services are hosted in the cloud), and without incorporating key privacy concepts such as the GDPR’s “privacy by design”.
Organisations can clearly obtain huge value from these data sets but, as with many of the technological advances from which society is benefiting, they risk becoming a digital panopticon. The question that must be asked, as ever, is: “we can, but should we?”
SOLOVE: What should US profilers be doing to protect against these risks?
SANCHEZ: Scott McNealy, then CEO of Sun Microsystems, famously observed in 1999 that “You have zero privacy anyway. Get over it.”
Contrast this with Sundar Pichai’s comment to the House Judiciary Committee in December 2018 that “…protecting the privacy and security of our users has long been an essential part of our mission”, and Facebook CEO Mark Zuckerberg’s remark at this year’s F8 developer conference that “…the future is private”, and it is obvious that organisations see the importance of stressing the privacy rights of their users.
This change in approach is not born of an altruistic desire to protect users. A 2017 Pew Research Center survey found that just 9% of the social media users polled were “very confident” that social media companies would protect their data.
However, the Cambridge Analytica scandal and the more recent revelations that Google’s ‘Purchase History’ tool tracks Gmail users’ online purchases and real-world transactions made using a card linked to a Gmail account show that there are still problems.
The main issue for US profilers is that they cannot define privacy on their own terms and must take note of legislation such as the GDPR that is defining user expectations in a world where geographic barriers are increasingly irrelevant.
The New York Times columnist Kevin Roose very succinctly summarised the issue in a tweet on 8th May 2019: “The Valley-wide scramble to redefine privacy as ‘we take your data and don’t give it to anyone else’ instead of ‘we don’t take your data in the first place’ is fascinating to watch.”
US profilers should be ensuring that their data protection practices are in line with, and perhaps even ahead of, the privacy obligations mandated in laws such as the GDPR that are increasingly being adopted globally.
With huge untapped markets of potential users in Latin America, South East Asia and China, it’s clear that a strong emphasis on privacy rights could be used as an effective marketing tool as well as an important compliance obligation.
SOLOVE: Thanks Mark and Raf for your great insights!
* * * *
This post was authored by Professor Daniel J. Solove, who through TeachPrivacy develops computer-based privacy and data security training. He also posts on his LinkedIn blog, which has more than 1 million followers.
Professor Solove is the organizer, along with Paul Schwartz, of the Privacy + Security Forum and International Privacy + Security Forum, annual events designed for seasoned professionals.
NEWSLETTER: Subscribe to Professor Solove’s free newsletter
TWITTER: Follow Professor Solove on Twitter.
Our New Privacy Awareness Training Course
Click here to see a demo or to learn more about the course.