Here’s a new cartoon on artificial intelligence, experimentation, and regulation. Creators of new technology often extol the virtues of experimentation. When it comes to policymakers experimenting with legal regulation, I often hear a different tune from those creating new technology. But they are experimenting with our lives and well-being, with society and democracy. Law, too, is an experiment. We often don’t know what works until we try it. So, while those developing new technologies experiment on society, is it so wrong for society to experiment in return?
The Prediction Society: Algorithms and the Problems of Forecasting the Future
I am excited to share my new paper draft with Hideyuki (“Yuki”) Matsumi, The Prediction Society: Algorithms and the Problems of Forecasting the Future. The paper is available for free on SSRN.
Yuki is currently pursuing a PhD at Vrije Universiteit Brussel. Yuki began his career as a technologist, then turned to law, where he has been exploring predictive technologies for more than a decade. The origins of this article trace back to 2011, when Yuki was my student and I supervised his thesis on predictive technologies. His work was way ahead of its time. I am beyond thrilled to join him now in exploring these issues. Writing this paper with Yuki has been a terrific experience, and I have learned a tremendous amount working with him.
We aim to make a unique and important contribution to the discourse about AI, algorithms, and inferences by focusing specifically on predictions about the future. We argue that the law should recognize algorithmic predictions about the future as distinct from inferences about the past and present. Such predictions present a special set of problems that the law does not address, and the law's existing tools and rights are ill-suited to them. We examine in depth the issues the law must consider when addressing these problems.
I’m really happy about how the paper turned out, and I want to note that I played but a supporting role. Yuki has been the driving force behind this paper. I joined because I find the issues to be fascinating and of the utmost importance, and I believe we have something important to add to the discussion. We welcome feedback.
LinkedIn Live Chat on AI and Privacy Harms
I chatted on LinkedIn Live with Luiza Jarovsky about AI and privacy harms, focusing on my article Privacy Harms, co-authored with Danielle Citron. Luiza has a great newsletter called The Privacy Whisperer – definitely worth subscribing to.
You can read my Privacy Harms article here.
Here is the video of our chat.
Event at Ohio State University – Protecting Privacy in the Age of AI
On Thursday, April 20, 2023, I will be speaking at an event at Ohio State University, Protecting Privacy in the Age of AI: The Need for a Radical New Direction, with Margot Kaminski as commentator. I will be discussing the need for a radical new direction to protect privacy in today's age of Big Data, algorithms, and AI, drawing from my recently published article, The Limitations of Privacy Rights, 98 Notre Dame Law Review 975 (2023).
You can find more information about the event here.
This is an in-person event with a Zoom option.
April 20, 2023, 12:15 pm – 1:15 pm
Ohio State University
Saxbe Auditorium
The Limitations of Privacy Rights
I have posted the final published version of my new article, The Limitations of Privacy Rights, 98 Notre Dame Law Review 975 (2023), on SSRN where it can be downloaded for free. The article critiques the effectiveness of individual privacy rights generally, as well as specific privacy rights such as the rights to information, access, correction, erasure, objection, and data portability, along with the right not to be subject to automated decisionmaking, among others.
Here’s the abstract:
Individual privacy rights are often at the heart of information privacy and data protection laws. The most comprehensive set of rights, from the European Union’s General Data Protection Regulation (GDPR), includes the right to access, right to rectification (correction), right to erasure, right to restriction, right to data portability, right to object, and right to not be subject to automated decisions. Privacy laws around the world include many of these rights in various forms.
In this article, I contend that although rights are an important component of privacy regulation, rights are often asked to do far more work than they are capable of doing. Rights can only give individuals a small amount of power. Ultimately, rights are at most capable of being a supporting actor, a small component of a much larger architecture. I advance three reasons why rights cannot serve as the bulwark of privacy protection. First, rights put too much onus on individuals when many privacy problems are systemic. Second, individuals lack the time and expertise to make difficult decisions about privacy, and rights cannot practically be exercised at scale with the number of organizations that process people's data. Third, privacy cannot be protected by focusing solely on the atomistic individual. The personal data of many people is interrelated, and people's decisions about their own data have implications for the privacy of other people.
The main goal of privacy rights is to provide individuals with control over their personal data. However, effective privacy protection involves not just facilitating individual control, but also bringing the collection, processing, and transfer of personal data under control. Privacy rights are not designed to achieve the latter goal, and they fail at the former.
After discussing these overarching reasons why rights are insufficient for the oversized role they currently play in privacy regulation, I discuss the common privacy rights and why each falls short of providing significant privacy protection. For each right, I propose broader structural measures that can achieve its underlying goals in a more systematic, rigorous, and less haphazard way.
Cartoon: Technology, Privacy, and Manipulation
Does European Privacy Law Need a Fix?
I recently had a terrific discussion with Prof. Nikolaus Forgó from the University of Vienna. We talked about my two recent papers, one on informed consent and one on sensitive data. You can watch the interview on YouTube above. Both articles are available for free download below.
Webinar – HIPAA and Health Privacy: New Developments
If you couldn’t make it to my recent webinar on HIPAA and health privacy developments in 2023, you can watch the replay here. I had a great discussion with Deborah Gersh, Adam Greene, and Kate Black.
Webinar – Privacy in 2023: New Developments
If you couldn’t make it to my recent webinar on privacy developments in 2023, you can watch the replay here. I had a great discussion with Omer Tene, Kenesa Ahmad, Gabriela Zanfir-Fortuna, and Rob Corbet.
Event at U. Virginia Law – Examining Citron’s Book, THE FIGHT FOR PRIVACY
On Monday, February 20, 2023, I will be speaking at an event at the University of Virginia School of Law to discuss Professor Danielle Citron’s new book, The Fight for Privacy: Protecting Dignity, Identity and Love in the Digital Age, which makes the case for understanding intimate privacy as a civil and human right. This is an in-person event. You can find more information here.
February 20, 2023, 4:00 pm – 5:30 pm
University of Virginia
Purcell Reading Room
Speakers will include:
Danielle Citron, University of Virginia
Anita L. Allen, University of Pennsylvania
Ari E. Waldman, Northeastern University
Moderated by Deborah Hellman, University of Virginia.
UPDATE: The event has concluded, and the video is available here:
ALSO OF INTEREST:
Watch the video of my recorded webinar, The Fight for Privacy in a Post-Dobbs World, where I discuss issues in Danielle’s book with Danielle Citron, Mary Anne Franks, Jolynn Dellinger, Elizabeth Joh, and Allyson Haynes Stuart.