
Three Approaches to Privacy Law

These days, the debate about a federal comprehensive privacy law is buzzing louder than ever before. A number of bills are floating around Congress, and there are many proposals for privacy legislation by various groups, organizations, and companies.  As proposals to regulate privacy are debated, it is helpful to distinguish between three general approaches to regulating privacy:

  1. Privacy Self-Management
  2. Governance and Documentation
  3. Use Regulation

Most privacy laws rely predominantly on one of these approaches, with some laws drawing from two or even all of them.

Each approach has various strengths and weaknesses.  To be successful, a privacy law must use all three approaches. Many laws could be strengthened greatly if they used more of the third approach that I will outline below.

(1) Privacy Self-Management

The most common approach to privacy regulation is privacy self-management.  This approach provides people with various rights to help them exercise greater control over their personal data. Some of these rights include:

  • right to notice about practices regarding personal data
  • right to access personal data
  • right to correct errors in personal data
  • right to deletion of personal data
  • right to data portability
  • right to opt in (or opt out)
  • right to object to data processing (and stop it)
  • right to request information about data collection and transfer

Privacy self-management means that people manage their own privacy by reading privacy notices and finding out about the data being collected about them and how it is being used. Then, armed with this knowledge, people can choose how to control the collection and use of their personal data – they can request that processing be stopped, that data be deleted, that they be opted out of the sale of their data, and so on.

Privacy self-management, although laudable, is fraught with challenges. People often don’t know enough to make meaningful choices about privacy. People don’t understand the risks of allowing their data to be used and shared in certain ways. Moreover, privacy self-management doesn’t scale very easily. Managing privacy might work for a handful of sites, but people do business with hundreds – even thousands – of sites. People will have to spend a ton of time learning about how all these companies collect and use their data, and they will struggle to make appropriate risk decisions about how to respond to what they learn. (For a more extensive discussion and critique of privacy self-management, see Daniel J. Solove, Privacy Self-Management and the Consent Dilemma, 126 Harv. L. Rev. 1879 (2013).)

The California Consumer Privacy Act (CCPA) is a recent law that relies most squarely on self-management. The CCPA provides individuals with a series of rights to manage their privacy, such as a right to find out about data collected about them and a right to opt out of the sale of their data. These rights are laudable, but in practice, they are not very feasible to exercise. As I discuss in a forthcoming article, The Myth of the Privacy Paradox, 89 Geo. Wash. L. Rev. __ (2021):

At first glance, the [CCPA] appears to give people a lot of control over their personal data – but this control is illusory. First, many companies gather and maintain people’s personal data without people knowing. People must know about the companies gathering their data in order to request information about it and opt out. So, the CCPA helps people learn about the data collected by companies they already know about but doesn’t help them learn much about what data is being gathered by other companies that operate in a more clandestine way.

Second, the CCPA doesn’t scale well. The number of organizations gathering people’s data is in the thousands.  Are people to make 1,000 or more requests? Opt out thousands of times?  People can make a few requests for their personal data and opt out a few times, but this will just be like trying to empty the ocean by taking out a few cups of water.

Third, even when people receive the specific pieces of personal data that organizations collect about them, people will not know enough to understand the privacy risks. Journalist Kashmir Hill notes how requests for personal data from companies often involve a data dump, which has limited utility: “[M]ost of these companies are just showing you the data they used to make decisions about you, not how they analyzed that data or what their decision was.” A list of pieces of personal data mainly informs people about what data is being collected about them, but privacy risks often involve how that data will be used.

My concern about the CCPA is that although it is well-meaning, it might lull policymakers into a false belief that its privacy self-management provisions are actually effective in protecting privacy. Worse, it might greenlight extensive data selling — after all, under the CCPA, companies are allowed to sell data unless the individual opts out. Policymakers might pat themselves on the back and consider the problem of privacy to be largely solved.  Other measures to protect privacy might not be enacted.

The GDPR and most other privacy laws also contain a set of individual rights, but these rights are just one dimension of the GDPR, whereas they are much more central to the CCPA. It is hard to imagine privacy laws that don’t provide consumers with basic rights such as notice or access, so I am not arguing that these rights shouldn’t be included in privacy laws. But these rights are far from enough.

(2) Governance and Documentation

Another approach to privacy regulation is through governance and documentation.  Under this approach, the law mandates certain requirements for governance.  These include:

  • appointing a chief privacy officer or data protection officer
  • conducting routine risk audits
  • having written policies and procedures
  • training the workforce
  • documenting incidents
  • conducting privacy impact assessments
  • engaging in a privacy by design analysis
  • having contracts with vendors that receive personal data

The GDPR follows this approach. Although it has a heavy dose of privacy self-management, the real backbone of the GDPR is its strong governance and documentation approach.

The virtue of this approach is its recognition that privacy compliance isn’t self-executing. Someone needs to own the issue. Privacy laws that lack governance requirements are often ignored or not meaningfully followed. A classic example is the Family Educational Rights and Privacy Act (FERPA). FERPA doesn’t require a privacy officer and doesn’t require training. Without these requirements, most schools lack anyone who knows enough about privacy to ensure compliance. Staff in the registrar’s office will often know FERPA. But beyond the registrar’s office, few others at most schools know much about FERPA. Rarely do schools train administrators, staff, and faculty about FERPA. Without training, there is no way for these people to know what the rules are.

Rules and policies are meaningless if people don’t know about them.  You can’t follow a rule if you don’t know about it.  This is one reason why governance is so important in privacy regulation.  Without governance, a privacy law is often ineffective and empty.

However, there are shortcomings to the governance and documentation approach. As Ari Waldman notes in his provocative article, Privacy Law’s False Promise, 97 Wash. U. L. Rev. __ (forthcoming 2020): “But the law’s veneer of protection is hiding the fact that it is built on a house of cards. Privacy law is failing to deliver its promised protections in part because the corporate practice of privacy reconceptualizes adherence to privacy law as a compliance, rather than a substantive, task. Corporate privacy practices today are, to use Julie Cohen’s term, managerial.” He further writes: “The focus on documentation as an end in itself elevates a merely symbolic structure to evidence of actual compliance with the law, obscuring the substance of consumer privacy law and discouraging both users and policymakers from taking more robust actions.”

A company can look great on paper, with a robust privacy program with all the trimmings – much like a baseball team filled with all-stars, each with terrific stats, that ultimately can’t win ballgames. Organizations can go through the motions with governance and documentation but not really put their heart into it. Or, organizations could make a great effort with governance and documentation yet still have major privacy incidents due to a few poor decisions and practices. Thus, so much focus can be placed on the trees that the forest is overlooked.

Documentation, however, is not completely meaningless. Although documentation can appear to be a tedious and overly formal exercise, it isn’t just dotting i’s and crossing t’s. To use the words of a Zen master, “it is the journey, not the destination, that counts.” The process of engaging in the documentation hopefully makes organizations more thoughtful and introspective about how they use personal data. But far too often, documentation becomes hollow busywork, and thoughtfulness and self-reflection don’t occur during the process.

(3) Use Regulation

The third approach to regulating privacy is to regulate uses. This is a more substantive way to regulate. Self-management largely puts the burden on people to manage their own privacy; as long as companies provide rights to people, it’s left to people to figure out how to protect their own privacy. As I discussed above, people aren’t really capable of this task in many circumstances. Governance and documentation focuses on organizations, but it is mostly about process rather than substance. Privacy laws using a governance and documentation approach rarely tell organizations what substantive things to do. As long as organizations have a privacy officer, conduct privacy impact assessments, have policies and procedures, and so on, the law considers its job done. The problem is that process without substance is empty.

The use regulation approach focuses on substantive restrictions on use. This approach is the least frequently used in privacy law, but it is employed in a few well-known laws. The Fair Credit Reporting Act (FCRA), for example, takes a use regulation approach. The law has fairly specific rules about how credit reporting data can be used: it specifies particular permissible uses for this information, and other uses are forbidden. The law also has provisions that limit the use of certain data in credit reports, such as bankruptcies and criminal convictions that are very old.

HIPAA also takes a use regulation approach. Many uses of health data – called “protected health information” under HIPAA – are restricted unless people explicitly consent to them. And treatment can’t be conditioned on consent, so healthcare providers can’t try to coerce people into agreeing to certain uses.

Only a few privacy laws significantly restrict uses, primarily because policymakers are reluctant to regulate substance and want to avoid making the law too paternalistic. Imposing specific use restrictions is very constraining and cuts against the basic principle of the American approach to privacy: companies are generally free to use personal data as they desire, as long as they don’t break their promises about how they will use it and don’t cause harm.

But privacy law can’t ignore use regulation.  Without this dimension, privacy laws will rely too much on self-management or governance and documentation to do the work. As I have argued above, these approaches aren’t enough.  There is no escape from substance.

Although the GDPR requires justifications to use personal data, known as lawful bases, some of the recognized lawful bases are rather general, such as “legitimate interests.” The result is that companies have wide discretion about how to use personal data. The GDPR also says that companies should consider privacy by design early on when designing products and services. But it provides hardly any rules about what it means to design for privacy. What constitutes “privacy” (or “data protection,” the term used in the EU and in the GDPR) is a challenging question. Designing for privacy is only as good as one’s conception of privacy. Far too often, organizations have a narrow conception of privacy. A conception of privacy – and the design choices to protect it – are substantive issues. There’s really no escape from substance.

* * * *

To be effective, privacy law must use all the approaches I outlined above.  Two out of three is quite insufficient. The sooner this fact is reckoned with, the more effectively privacy law can develop.

_____________________________________________________

This post was authored by Professor Daniel J. Solove, who through TeachPrivacy develops computer-based privacy and data security training. He also posts at his LinkedIn blog, which has more than 1 million followers.

Professor Solove is the organizer, along with Paul Schwartz, of the annual Privacy + Security Forum events.

