For the first half of 2018, all eyes were focused eastward on the EU with the start of GDPR enforcement in May. Now, all eyes are shifting westward to a bold new law passed by California. By January 1, 2020, companies around the world will have to comply with additional regulations governing the processing of personal data of California residents. Under the California Consumer Privacy Act of 2018, companies must observe restrictions on data monetization business models; accommodate rights to access, deletion, and porting of personal data; update their privacy policies; and brace for additional penalties and statutory damages. The California Legislature passed, and the Governor signed, the bill on June 28, 2018 after an unusually rushed process. In exchange, proposed initiative measure No. 17-0039, the Consumer Right to Privacy Act of 2018 (the “Initiative”), was withdrawn from the ballot that same day, the deadline for withdrawals before the November 6, 2018 election.
In just a week, California passed a bold new privacy law — the California Consumer Privacy Act of 2018. The law was hurried through the legislative process to head off a proposed ballot initiative of the same name. The initiative was the creation of Alastair Mactaggart, a real estate developer who spent millions to bring it to the ballot. Mactaggart indicated that he would withdraw the initiative if the legislature passed a similar law, and that is what prompted the rush to pass the new Act, as the deadline to withdraw the initiative was looming.
The text of the California Consumer Privacy Act is here. The law becomes effective on January 1, 2020.
There are others who summarize the law extensively, so I will avoid duplicating those efforts. Instead, I will highlight a few aspects of the law that I find to be notable:
(1) The Act creates greater transparency about the personal information businesses collect, use, and share.
(2) The Act provides consumers with a right to opt out of the sale of personal information to third parties, and it attempts to restrict penalizing people who exercise this right. Businesses can’t deny goods or services, charge different prices or rates (including through discounts or penalties), or provide a “different level or quality of goods or services to the consumer.” However, businesses can do these things if they are “reasonably related to the value provided to the consumer by the consumer’s data.” This is a potentially large exception depending upon how it is interpreted.
(3) The Act allows businesses to “offer financial incentives, including payments to consumers as compensation,” for collecting and selling their personal information. Financial incentive practices cannot be “unjust, unreasonable, coercive, or usurious in nature.” I wonder whether this provision will undercut the restriction on offering different pricing or levels of service in exchange for people allowing the collection and sale of their information. Through some clever adjustments, businesses that were enticing consumers to allow the collection and sale of their personal data through different prices or discounts could restructure these practices as “financial incentives.”
by Daniel J. Solove
This post was co-authored by Professor Paul Schwartz, Berkeley Law School.
Education was one of the first areas where privacy was regulated by a federal statute. Passed in the early 1970s, the Family Educational Rights and Privacy Act (FERPA) was on the frontier of federal privacy regulation. But now it is old and ineffective. With the growing public concern about the privacy of student data, states are starting to rev up their engines and become more involved. The result could be game-changing legislation for the multi-billion dollar education technology industry.