PRIVACY + SECURITY BLOG

News, Developments, and Insights

Hot off the press is Professor Woodrow Hartzog’s new book, Privacy’s Blueprint: The Battle to Control the Design of New Technologies (Harvard Univ. Press 2018). This is a fascinating and engaging book about a very important and controversial topic: Should privacy law regulate technological design?

Many have argued that the law should avoid regulating technological design, as this impedes freedom to develop new technology and is a heavy-handed way to regulate. The law should focus on addressing harms arising after technology is developed and shouldn’t meddle early on in the development stage. In contrast, Hartzog makes a compelling case for why the law must become involved with design. These days, as concerns about Facebook and Cambridge Analytica boil over, a discussion about the law’s fundamental approach to privacy regulation could not be more timely. Hartzog’s book is just the book we need right now! It’s a superb book, the kind that changes the way you see and think about things. Privacy’s Blueprint is one of the most important books about privacy and technology, and it’s a definite must-read.

Woodrow Hartzog is a professor of law and computer science at Northeastern University School of Law and the College of Computer and Information Science and an affiliate scholar at the Center for Internet and Society at Stanford Law School. He has written extensively on privacy, media, and robotics.

Below is a short interview with Professor Hartzog. We just scratch the surface of the many great issues he tackles in his book. Privacy’s Blueprint is essential reading, and I urge you to put it at the top of your reading list.

SOLOVE: Can you provide some examples of how design affects privacy?

HARTZOG: Design affects nearly every aspect of our privacy these days, but three areas stand out. The first is the design of the user interfaces of apps and websites, particularly social media. Every aspect of the user experience is designed to extract data out of you, to get you to never stop sharing, and to make you feel good about it in the process. Companies say they respect your privacy by asking for your permission before collecting certain kinds of data. But what’s not made clear to you is that they are going to engineer an environment that all but assures most of us say yes. Companies use relentless pop-ups, nudges to change your current privacy settings, and buttons buried so deeply and presented in such a confusing way that your exposure is almost preordained.

This sort of adversarial interface is part of what enabled the incident involving Facebook and Cambridge Analytica. Facebook said that the third-party app that collected the information on over 50 million Facebook users “requested and gained access to information from users who chose to sign up to his app, and everyone involved gave their consent.” But for the friends of those users, people who never even interacted with the third-party app in question, that consent was buried where few would be likely to find it, and the nature of the permission was so vague that it was hardly meaningful.

The next area where design matters for our privacy is in technologies that are built to make things significantly easier to find—things like biometrics and search functions. Each of these seeking technologies erodes the obscurity that people rely upon every day. Most people do not assume that when they are walking about in public, their every move is being sensed, critiqued, and stored to be used against them later. They are obscure, and they make decisions about what to share and where to go based on that obscurity. Design changes that, often dramatically and without people knowing it. Facial recognition can destroy the obscurity of people’s whereabouts instantly. Making things searchable can expose to a mass audience what was previously likely to be found only by people who knew where to look and what to look for. Design decisions like whether to use biometrics or make things searchable should matter more in law and policy.

Finally, design matters regarding the processors, sensors, and software that are getting jammed into every device that isn’t nailed down. (And many that are!) Companies have been remarkably cavalier in their approach of taking an object, outfitting it with surveillance tools, and connecting it to the Internet. Consumers can now choose to connect wine openers, basketballs, toilets, and even their underwear (!) to the Internet. Sensors in the home increase both the perception of surveillance and the risk of unwanted and unknown surveillance. Outdated, unpatched, and uncared-for software embedded in ignored and recklessly designed IoT devices can be the Achilles heel of an entire, otherwise relatively secure network. And it is nearly impossible for companies to give users meaningful notice of their data collection and use practices.

SOLOVE: To what extent is design currently regulated?

HARTZOG: There aren’t many meaningful design interventions or boundaries in privacy law. Essentially, the privacy rules that govern companies around the world are built around three major ethics: 1) Do not lie; 2) Do not harm; and 3) Follow the Fair Information Practices, or “the FIPs.” The rules that are built around these three ethics fail to adequately address how the design of information technologies affects people and can frustrate the goals of privacy law.

For example, companies too often are allowed to bury the truth in dense and unreadable boilerplate contracts that nobody should be expected to read or understand. While they are technically being truthful, users whose expectations are shaped by the design of their environment often form misconceptions about how sites work and what companies are collecting and plan to do with their personal information.

Additionally, modern privacy harms that flow from design are often diffuse and incremental, yet courts and lawmakers usually demand concrete, financial, and visceral harm before they will recognize a legal violation or let people recover damages. Companies are allowed to aggregate information, make it searchable, and design their systems to expose people to all kinds of risk without much fear of ever being held accountable for it.

Finally, the Fair Information Practices functionally operate as the common language of privacy around the world. They form the basis of most data protection regimes worldwide, including Europe’s General Data Protection Regulation and many data protection statutes in the US. But they don’t really address the design of information technologies. The FIPs were built around the threat of databases, and they do a relatively decent job addressing the risk of aggregated data. But privacy threats these days come from more than databases. Automated decision-making systems, biometrics, artificial intelligence, and manipulative user interfaces are all woven into the design of information technologies. The FIPs were also built around the threat from platforms like Facebook or organizational actors like the government. But in the age of Facebook, Snapchat, and Twitter, individuals also harass, betray, and expose each other. And design facilitates all of this. Lawmakers have not created many legal boundaries to guide how these tools are built.

SOLOVE: How ought design be regulated without being too heavy-handed?

HARTZOG: It’s not easy. But when people think about design rules and interventions, they often think the only option is to heavily regulate technology. But taking design seriously doesn’t mean pulling the plug on all digital technologies. Nor does it mean passing micro-managing, ham-fisted rules that risk burdening companies for minimal privacy gains. It means embracing the full range of legal responses from soft, to moderate, to robust. It also means a general preference for more flexible standards of reasonableness unless more targeted, technology-specific rules are necessary. And above all, it is about matching the right responses to the right problems.

Sometimes, funding, educational efforts, or standards coordination is what’s needed. Sometimes courts simply need to be more aware of the role that design plays in shaping people’s expectations about how a technology works and what has been promised to them. Changes in the common law can better recognize design-based promises and inducements to breach confidentiality. Of course, certain problems deserve robust responses. Spyware is a scourge. The Internet of Things is a runaway train headed for catastrophe. More significant regulation might be needed for these technologies, but a broad spectrum of regulatory and policy options will help make sure legal responses across the board are proportional.

SOLOVE: Critics of regulation express grave concerns that regulators lack a sufficient understanding of technology and might inhibit innovative new technologies. Critics say that it is dangerous to have regulators tell engineers how to build things. How would you respond to such critics?

HARTZOG: I totally understand the intuitive appeal of these arguments. We certainly have plenty of examples where lawmakers mucked things up because they acted in short-sighted ways. In the past, lawmakers might not have understood how certain technologies work or all the different considerations that needed to be taken into account. We’re still trying to figure out how to fit modern surveillance problems into a regulatory regime built around business models for computing in the 1980s. And sometimes our privacy is best served when lawmakers avoid interfering with certain kinds of technological design. The battle over encryption backdoors is a good example of this.

But I think people’s concerns about lawmakers guiding the design of information technologies are often unjustified. First, ignoring design is a matter of ethics. To ignore design is to leave power unchecked. When courts and lawmakers ignore design, they functionally sanction the way that companies leverage design to affect people’s lives.

Second, lawmakers and courts have imposed boundaries upon design in many different contexts for quite a long time. Cars and planes must be built safely. This is why we have seatbelts, airbags, and gas tanks that won’t explode on impact. Buildings must be built in structurally sound ways. There are even certain rules about how guns must be designed. When confronted with technological complexity, there are things courts, lawmakers, and regulators can do to mitigate uninformed, ineffective, and unjust rules. They should hire more technologists, ethicists, psychologists, sociologists, economists, and experts from every discipline to confront the ways in which design decisions distribute power and affect our relationships, our economy, and our personal wellbeing. They should make sure to articulate all of the different values at stake and use all their different tools to serve those values.

The Internet is exceptional in many ways, but it is still of this Earth. Digital technologies are still just tools created by people. And like other tools, companies can use them to deceive, abuse, manipulate, and harm us. Until lawmakers take that power seriously, our privacy rules will remain incomplete.

SOLOVE: Thanks, Woodrow, for your terrific insights. The book is Privacy’s Blueprint, and it is available at Harvard University Press or Amazon.

Related Posts

Daniel J. Solove, Privacy by Design: 4 Key Points

* * * *

This post was authored by Professor Daniel J. Solove, who through TeachPrivacy develops computer-based privacy and data security training. He also posts at his blog at LinkedIn, which has more than 1 million followers.

Professor Solove is the organizer, along with Paul Schwartz, of the Privacy + Security Forum (Oct. 3-5, 2018 in Washington, DC), an annual event designed for seasoned professionals. 

NEWSLETTER: Subscribe to Professor Solove’s free newsletter
TWITTER: Follow Professor Solove on Twitter.

 

Click here for more information about our privacy awareness training for GDPR.