Privacy by design — or “Data Protection by Design” as it is referred to in the General Data Protection Regulation (GDPR) — is essential to meaningful privacy protection. Yet, as practiced, it is often quite thin and incomplete. As I wrote a few years ago about privacy by design, “The ‘privacy’ the designers have in mind might be so focused on one particular dimension of privacy that it might overlook many other dimensions.”
R. Jason Cronk has a tremendously thoughtful approach to privacy by design that helps avoid this pitfall. He is the author of the forthcoming book Strategic Privacy by Design and has been working in the fields of privacy and information security since 2004. Cronk is one of the rare privacy lawyers who also has a sophisticated technology background. Last year, Cronk and I worked together on a privacy notice generator, which won the Department of Health and Human Services Office of Technology Innovation prize. Now, Cronk focuses mostly on bringing his expertise and unique spin on privacy by design to organizations through his boutique consulting firm, Enterprivacy Consulting Group.
I had a chance to read Cronk’s manuscript, and I am particularly impressed by how thorough his approach to privacy by design is, as well as how well he illustrates the issues with concrete examples. When his book comes out, I strongly recommend that anyone with an interest in privacy read it.
Later this year he’ll be offering a Data Protection by Design and Default intensive day in Washington, D.C. at the Privacy and Security Forum (on Wed., Oct. 3). This will be an all-day event about privacy by design. A few weeks later, he’ll be speaking about these issues in New Zealand. For those of you who can’t make it to New Zealand, definitely join Jason for his intensive day event at the Forum.
Below, I discuss with Jason his views about privacy by design and his approach to it.
SOLOVE: How far has privacy by design progressed in terms of its acceptance, inclusion in laws, and implementation in practice? What challenges remain for privacy by design?
CRONK: Dr. Ann Cavoukian tirelessly promoted privacy by design, leading to worldwide interest in it. Without her effort, we wouldn’t have seen a unanimous resolution adopted by the International Conference of Data Protection and Privacy Commissioners calling privacy by design a necessity in data protection. We wouldn’t have seen the FTC call for its adoption by companies in protecting consumer privacy. We wouldn’t have seen the inclusion of Article 25, Data Protection by Design and Default, as part of GDPR. We wouldn’t see other countries, like India, moving to include similar concepts in their laws as well. Unfortunately, part of the strength of her 7 Foundational Principles of Privacy by Design is also their weakness. She purposefully made them robust and flexible to allow organizations to find their own methods to achieve them.
However, privacy by design has remained frustratingly vague – its flexibility might be a virtue in some respects, but it is a curse in other respects. For engineers, the ambitious goals of the principles lacked concreteness. Yes, we need to “embed privacy into designs,” as principle 3 requires, but how do we, as engineers, accomplish that?
In my work, I’ve spent a long time trying to make privacy by design more comprehensive and concrete. I hope I will be able to advance the ball.
SOLOVE: What are some of the problems with current approaches to data protection by design and privacy by design?
CRONK: First off, I would say most companies are not even attempting data protection by design. They are still in the early stages of privacy program development: trying to understand what it is they do, developing high-level policies, creating a vendor management program, and so on. In some respects, it’s backwards, because if they were designing privacy into their products, services and, importantly, business processes, that would address so many of the major privacy program elements. The underlying problem is that most companies, at least those I’ve encountered, don’t have existing design processes. They may have a development life-cycle for their core products or services, but even major companies sometimes lack formality in every aspect of their development. Even amongst those with formal product development processes,
I have yet to find a company that conscientiously designs all or most of its business processes, despite available tools and methodologies. I’m not saying they aren’t out there; I just haven’t encountered one yet. Usually that level of formality resides in consulting firms hired to come in and optimize specific business processes. This means that when an HR director decides it would be a good idea to offer employees a parking space registration system for the employee parking lot, nobody is formally designing that process. That, in turn, means there isn’t a process for privacy by design to plug into, so the privacy office is tasked with instituting a design process, period, over and above privacy by design.
The other problem is that some companies label what they do privacy by design, but it really isn’t. They aren’t systematic, and most address only a limited set of controls, namely encryption, access controls, and notice and choice. They don’t sufficiently map the risks to individuals onto the available organizational and technical controls. They simply ask the developers (usually software developers): “Have you encrypted data at rest? Have you encrypted data in transit? Are you providing notice? What choices does the user have?”
This is such a limited view of the design space. The result is that companies add some encryption, a few opt out buttons, and voila . . . they claim they’ve done privacy by design. A comprehensive privacy by design approach encompasses so much more.
SOLOVE: What have you done to make privacy by design more “comprehensive and concrete”?
CRONK: There exists a chasm between the academic world of privacy and the professional or corporate world. I’m very comfortable crossing that divide, and for most of my career I have attended both academic and professional events. Most of my work leverages and pulls from the outstanding work of academics in the field, work that remains mostly obscure to the privacy professional community. I’ve built on efforts by those at the forefront of privacy thought, including Lorrie Faith Cranor, Sarah Spiekermann, Jaap-Henk Hoepman, Ryan Calo, Alessandro Acquisti, and you, with your taxonomy. My main contribution has been to piece them together into a cohesive and systematic methodology: one that’s easy enough for non-technical privacy professionals but detailed enough for engineers to act on.
SOLOVE: Can you provide a concrete example of how data protection by design can help a company avoid a privacy incident?
CRONK: Let’s say you’re building a mobile game. You’re worried that players might interrogate other players with inappropriate questions and, through the answers, be able to identify them. The consequence could be that the targeted individuals end up getting swatted (where a fake call to the police results in a SWAT team raiding the target’s house). Yes, this happens, sometimes with deadly consequences.
Now that you’ve identified the potential violations and consequences, what can the game developer do? They can exclude collection of information by not having a chat feature. They could selectively process information by removing identifying information before it is sent to other players. They could isolate gamers to play on servers only with known friends. They could create a policy against asking questions about identity. They could log chats and audit them, then ban customers who violate the policy. They could provide only pre-selected questions, phrases, and emojis via the chat rather than allow free-form text. They could inform players of the risks (of being swatted) if they reveal information. They could give users control to block other players or grant them access only if they know them.
The preceding list of controls may seem haphazard, but I’m actually following my very standardized method. I’ve reduced the opportunity of the threat actor, in this case other players, through architecture. I’ve increased the difficulty of using information by obfuscating and abstracting data. I’ve reduced the probability of other players acting through policy enforcement. Finally, I’ve reduced the potential impact of a chat feature by putting information and control in the hands of the players. This is the systematic approach to reducing specific privacy risk factors (opportunity, difficulty, probability of action, and impact) through specific strategies and tactics that ultimately can yield privacy . . . by design.
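To make one of these controls more tangible for engineers: the “pre-selected phrases rather than free-form text” tactic can be implemented with only a few lines of code. The sketch below is a hypothetical illustration (not from Cronk’s book); the phrase list and function name are invented for this example.

```python
# Hypothetical sketch of a restricted chat feature: players may only send
# pre-approved phrases by ID, so free-form text (and thus identifying
# questions like "Where do you live?") can never enter the chat.

ALLOWED_PHRASES = {
    1: "Good game!",
    2: "Nice move!",
    3: "Want to team up?",
}

def send_chat(phrase_id: int) -> str:
    """Return the phrase to broadcast, or raise if it is not pre-approved."""
    if phrase_id not in ALLOWED_PHRASES:
        # Rejecting anything outside the allowlist removes the opportunity
        # for players to solicit identifying information.
        raise ValueError("Only pre-approved phrases may be sent")
    return ALLOWED_PHRASES[phrase_id]
```

Architecturally, this control eliminates the risky data flow entirely rather than trying to filter it after the fact, which is the kind of design-level decision privacy by design asks for.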
SOLOVE: Be sure to keep an eye out for Jason’s book when it is published soon. It’s a definite must-read. And to learn more about Jason’s approach to privacy by design, please attend his intensive day event on Data Protection by Design and Default (Wed., Oct. 3) at the Privacy+Security Forum. He’ll be joined by Stuart Shapiro of MITRE for this event.
* * * *
This post was authored by Professor Daniel J. Solove, who through TeachPrivacy develops computer-based privacy and data security training. He also posts at his blog at LinkedIn, which has more than 1 million followers.
Professor Solove is the organizer, along with Paul Schwartz, of the Privacy + Security Forum (Oct. 3-5, 2018 in Washington, DC), an annual event designed for seasoned professionals.
NEWSLETTER: Subscribe to Professor Solove’s free newsletter
TWITTER: Follow Professor Solove on Twitter.