Today, the California Privacy Protection Agency (CPPA) published a large advertisement in the San Francisco Chronicle encouraging people to exercise their privacy rights. “The ball is in your court,” the ad declared. (H/T Paul Schwartz)
FERPA & School Privacy
When it comes to privacy issues, schools are in the Dark Ages. I cannot think of any other industry that is so far behind.
Unfortunately, education privacy often exists below the radar, and this area hasn’t received the attention it needs.
The scope of coverage of the Family Educational Rights and Privacy Act (FERPA) is a challenging issue. It does not cover all information about students. Nor does it cover all information about people that a school maintains. In this newsletter I have gathered some resources on this topic.
Webinar – Privacy Under the Trump Administration
In this webinar, we discuss how privacy issues will fare under the upcoming Trump 2.0 Administration. What will the impact be on FTC privacy enforcement and the FTC surveillance rulemaking effort? How will HIPAA enforcement be affected? Is a federal privacy law more or less likely? What will happen to AI policy? What other privacy issues and policy developments are on the horizon?
Speakers include:
- Daniel Solove, GW Law and TeachPrivacy
- Omer Tene, Goodwin Procter
- Amie Stepanovich, FPF
The Tyranny of Algorithms
We live today increasingly under the tyranny of algorithms. They rule over us. They shape what we say and how we interact with each other. They shape behavior. They affect whether people get jobs and other essential things in life. And algorithms kill people.
Algorithms work behind the curtains, cloaked in secrecy, often unaccountable.
Algorithmic Predictions about Human Behavior
Yuki Matsumi and I wrote about the dangers of algorithmic predictions in The Prediction Society: AI and the Problems of Forecasting the Future, 2025 U. Ill. L. Rev. (2025). We argue that algorithms that attempt to forecast the future impede human autonomy in the process.
Increasingly, algorithmic predictions are used to make decisions about credit, insurance, sentencing, education, and employment. We contend that algorithmic predictions are being used “with too much confidence, and not enough accountability. Ironically, future forecasting is occurring with far too little foresight.”
We contend that algorithmic predictions “shift control over people’s future, taking it away from individuals and giving the power to entities to dictate what people’s future will be.” Algorithmic predictions do not work like a crystal ball, looking to the future. Instead, they look to the past. They analyze patterns in past data and assume that these patterns will persist into the future. Instead of predicting the future, algorithmic predictions fossilize the past. We argue: “Algorithmic predictions not only forecast the future; they also create it.”
Additionally, we contend: “Algorithms are not adept at handling unexpected human swerves. For an algorithm, such swerves are noise to be minimized. But swerves are what make humanity different from machines.”
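To make the backward-looking mechanics concrete, here is a minimal, hypothetical sketch (not drawn from the article) of this kind of prediction: a toy scorer that forecasts a new applicant's behavior solely from the historical outcomes of past people who share an attribute with them. The zip codes, records, and repayment figures are invented for illustration.

```python
# Hypothetical illustration of a backward-looking "prediction":
# score a new loan applicant using only the historical repayment rate
# of past applicants who share the same zip code.

from collections import defaultdict

# Invented historical records: (zip_code, repaid_loan)
past_records = [
    ("11111", True), ("11111", True), ("11111", True), ("11111", False),
    ("22222", False), ("22222", False), ("22222", True), ("22222", False),
]

# "Training": tally past outcomes per zip code.
totals = defaultdict(int)
repaid = defaultdict(int)
for zip_code, did_repay in past_records:
    totals[zip_code] += 1
    repaid[zip_code] += did_repay  # True counts as 1, False as 0

def predicted_repayment(zip_code: str) -> float:
    """Forecast a new applicant's repayment odds from group history alone."""
    return repaid[zip_code] / totals[zip_code]

# Two new applicants. The second may have turned her finances around
# (a "swerve"), but the score cannot see that; it only looks backward.
print(predicted_repayment("11111"))  # 0.75 -- past pattern projected forward
print(predicted_repayment("22222"))  # 0.25 -- the past fossilized into the score
```

Real systems are far more sophisticated, but the core move in this sketch is the same one the article describes: the score is a statement about how similar people behaved in the past, not about what this particular person will do.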
Cybersecurity and Privacy Resources
Here are some great cybersecurity and privacy resources.
Notable Privacy and Security Books 2024
Here are some notable books on privacy and security from 2024. For a more comprehensive list of nonfiction works about privacy and security across all years, see the resource page on Nonfiction Privacy + Security Books that Professor Paul Schwartz and I maintain.
Information Fiduciaries and Privacy
Information fiduciaries have emerged as a major part of the discussion of privacy regulation. In a nutshell, the information fiduciaries approach aims to apply aspects of fiduciary law to the companies that collect and use our personal data. As one court explained the fiduciary relationship: “A fiduciary relationship is one founded on trust or confidence reposed by one person in the integrity and fidelity of another. Out of such a relation, the law raises the rule that neither party may exert influence or pressure upon the other, take selfish advantage of his trust, or deal with the subject matter of the trust in such a way as to benefit himself or prejudice the other except in the exercise of utmost good faith.” Mobil Oil Corp. v. Rubenfeld, 339 N.Y.S.2d 623, 632 (1972).
The earliest proponent of the idea of viewing companies as information fiduciaries was the late Ian Kerr in 2001, who noted that “some service provider-user relationships display all of the constituent elements of a fiduciary relationship.” See Ian Kerr, The Legal Relationship Between Online Service Providers and Users, 35 Can. Bus. L.J. 419 (2001).
In 2004, in my book, The Digital Person: Technology and Privacy In the Information Age (NYU Press 2004) (Amazon) (free digital copy on SSRN), I argued that concepts from the law of fiduciary relationships should be applied to situations involving data privacy (pp. 101-104). I contended that “the law should hold that companies collecting and using our personal information stand in a fiduciary relationship with us.” I further argued: “If our relationships with the collectors and users of our personal data are redefined as fiduciary ones, then this would be the start of a significant shift in the way the law understands their obligations to us. The law would require them to treat us in a different way—at a minimum, with more care and respect. By redefining relationships, the law would make a significant change to the architecture of the information economy.” (p. 104).
Digital Dossiers and the Aggregation Effect
This year is the 20th anniversary of my first book, The Digital Person: Technology and Privacy In the Information Age (NYU Press 2004) (Amazon) (free digital copy on SSRN). I thought it would be a good opportunity to reflect on some of the points I discussed in the book. Apologies for the self-indulgence.
The key theme in The Digital Person is the rise of what I call “digital dossiers” – the extensive repositories of personal data about us that are collected and used by large organizations. The government and private industry propelled each other into the digital age and beyond through the collection and use of personal data. At the time I wrote, the story culminated with the rise of the internet. Since that time, new technologies have taken the spotlight – AI, Big Data, smartphones, the Internet of Things, social media, and much more. The book is so old that my publisher long ago allowed me to post the entire digital version online for free. And yet, the basic problems and ideas discussed in the book largely remain the same. There are new chapters in the story, but its direction has been quite predictable. I could practically reissue the book with a new preface that says “I told you so.”
Cartoon – Notice and Choice
Here’s a new cartoon on the notice-and-choice approach to privacy — the way that many U.S. privacy laws regulate. Sadly, most of the state laws are based on notice and choice.
Here are some of my recent writings critiquing the notice-and-choice approach:
- Kafka in the Age of AI and the Futility of Privacy as Control, 104 B.U. L. Rev. 1021 (2024) (with Woodrow Hartzog)
- ON PRIVACY AND TECHNOLOGY (Oxford University Press, Jan 2025). You can pre-order a copy here.
Do you want to use this cartoon in presentations, classes, or newsletters?
Click here to license this cartoon.
Cartoon: Personal Data
Here’s a new cartoon on the difficulties of identifying personal data.
For my thoughts on this topic, see my post: Personal and Sensitive Data.