
California Consumer Privacy Act of 2018

In the span of just a week, California passed a bold new privacy law — the California Consumer Privacy Act of 2018.  The law was hurried through the legislative process to head off a proposed ballot initiative with the same name.  The ballot initiative was the creation of Alastair Mactaggart, a real estate developer who spent millions to bring the initiative to the ballot.  Mactaggart indicated that he would withdraw the initiative if the legislature passed a similar law, and this prompted the rush to pass the new Act before the looming deadline to withdraw the initiative.

The text of the California Consumer Privacy Act is here.  The law becomes effective on January 1, 2020.

There are others who summarize the law extensively, so I will avoid duplicating those efforts.  Instead, I will highlight a few aspects of the law that I find to be notable:

(1) The Act creates greater transparency about the personal information businesses collect, use, and share.

(2) The Act provides consumers with a right to opt out of the sale of personal information to third parties, and it attempts to restrict penalizing people who exercise this right.  Businesses can’t deny goods or services, charge different prices (for example, by offering discounts only to those who don’t opt out), or provide a “different level or quality of goods or services to the consumer.”  However, businesses can do these things if they are “reasonably related to the value provided to the consumer by the consumer’s data.”  This is a potentially large exception, depending upon how it is interpreted.

(3) The Act allows businesses to “offer financial incentives, including payments to consumers as compensation,” for collecting and selling their personal information.  Financial incentive practices cannot be “unjust, unreasonable, coercive, or usurious in nature.”  I wonder whether this provision will undercut the restriction on offering different pricing or levels of service in exchange for people allowing the collection and sale of their information.  With some clever adjustments, businesses that were enticing consumers to allow the collection and sale of their personal data through different prices or discounts could restructure those practices as “financial incentives.”

(4) The Act defines “personal information” as “information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.”  This definition is similar to the GDPR’s definition of “personal data” in that it includes information that is identifiable — that could be linked directly or indirectly to people.  But it diverges in that it excludes “publicly available information” — “information that is lawfully made available from federal, state, or local government records.”  There is a ton of data in these records, which can readily be aggregated, analyzed, and sold.  I wrote about this issue extensively in my article, Access and Aggregation: Public Records, Privacy, and the Constitution, 86 Minnesota Law Review 1137 (2002).


(5) The Act provides for a right to deletion (often referred to as the “right to be forgotten”).  According to the Act, a “business that receives a verifiable request from a consumer to delete the consumer’s personal information pursuant to subdivision (a) of this section shall delete the consumer’s personal information from its records and direct any service providers to delete the consumer’s personal information from their records.”  There are many exceptions, including the need to complete transactions with the consumer, perform contracts with the consumer, detect security incidents, comply with laws, exercise free speech, and engage in research.  There are a pair of exceptions involving “internal use.”  The first involves engaging in “internal” uses that are “reasonably aligned with the expectations of the consumer based on the consumer’s relationship with the business.”  The second “internal use” exception is to “[o]therwise use the consumer’s personal information, internally, in a lawful manner that is compatible with the context in which the consumer provided the information.”  I’m not sure exactly what the distinction is between these two, as they seem to be getting at the same category of uses.  These “internal uses” could be interpreted as a rather broad category that could limit the right to deletion.

(6) There is a private right of action for security violations but not for privacy ones.

(7) The Act requires “reasonable security procedures and practices” and creates a private right of action in the event of an incident, defined as “an unauthorized access and exfiltration, theft, or disclosure.”  People can recover damages “in an amount not less than one hundred dollars ($100) and not greater than seven hundred and fifty ($750) per consumer per incident or actual damages, whichever is greater.”  I like the minimum damages amount, as harm is often an issue that courts struggle with and fail to recognize.  The minimum damages provision should help address the harm hurdle.  There is another potential hurdle, though, that could render this provision unusable in many cases.  The private right of action applies only when there is exfiltration — when the data is transmitted to unauthorized parties.  Detecting exfiltration can be quite challenging.

(8) A business has 30 days to “cure” the security violation.  People must provide written notice to a business “identifying the specific provisions of this title the consumer alleges have been or are being violated.”  If the business “actually cures” the violation, then “no action for individual statutory damages or class-wide statutory damages may be initiated against the business.”  The requirement to identify the “specific provisions” violated doesn’t make much sense, as the private right of action exists only for the provision involving reasonable security procedures and practices.  And if there is unauthorized “exfiltration,” I’m not sure what the “cure” is, short of apprehending the hackers or somehow ensuring that all the exfiltrated data is no longer out there.  This provision would make a lot more sense if the private right of action extended to privacy violations, which are more likely to be curable.

(9) The Attorney General (AG) has 30 days after a consumer files a lawsuit to choose to initiate an action against a business.  If the AG does so, the consumer lawsuit cannot proceed.  This is an interesting and innovative way to balance private lawsuits and state enforcement, but if the AG takes the case, harmed consumers might not be adequately made whole.  There is a “Consumer Privacy Fund” created from AG enforcement action proceeds, but this fund is to pay enforcement costs, not to compensate harmed consumers.

(10) The AG can initiate enforcement actions under the Act — these are the only vehicle for enforcing the Act’s privacy provisions.  Fines can be up to $7,500 per violation.

(11) The Act attempts to limit the ability of courts to use the standards in the Act as standards of care for common law causes of action.  The Act states: “Nothing in this act shall be interpreted to serve as the basis for a private right of action under any other law. This shall not be construed to relieve any party from any duties or obligations imposed under other law or the United States or California Constitution.”  If you’re interested in this issue, see this post where I discuss how courts can use various laws and regulations in connection with common law causes of action (such as negligence and other torts).

(12) The Act doesn’t apply to all businesses.  Instead, it applies only to a business with gross revenues that exceed $25 million, that purchases personal data on “50,000 or more consumers, households, or devices,” or that “derives 50 percent or more of its annual revenues from selling consumers’ personal information.”

For an interesting critical perspective of the Act, see Professor Eric Goldman’s analysis of the law.  I am much more supportive of stronger privacy regulation than Eric, but his concerns about this law are well worth thinking about.

* * * *

This post was authored by Professor Daniel J. Solove, who through TeachPrivacy develops computer-based privacy and data security training. He also posts at his blog at LinkedIn, which has more than 1 million followers.

Professor Solove is the organizer, along with Paul Schwartz, of the Privacy + Security Forum (Oct. 3-5, 2018 in Washington, DC), an annual event designed for seasoned professionals. 

NEWSLETTER: Subscribe to Professor Solove’s free newsletter
TWITTER: Follow Professor Solove on Twitter.

Click here for more information about our GDPR training
