PRIVACY + SECURITY BLOG

News, Developments, and Insights


Webinar – HIPAA and Health Privacy: New Developments

If you couldn’t make it to my recent webinar on HIPAA and health privacy developments in 2023, you can watch the replay here. I had a great discussion with Deborah Gersh, Adam Greene, and Kate Black.



Webinar – Privacy in 2023: New Developments

If you couldn’t make it to my recent webinar on privacy developments in 2023, you can watch the replay here. I had a great discussion with Omer Tene, Kenesa Ahmad, Gabriela Zanfir-Fortuna, and Rob Corbet.


Event at U. Virginia Law – Examining Citron’s Book, THE FIGHT FOR PRIVACY

On Monday, February 20, 2023, I will be speaking at an event at the University of Virginia School of Law to discuss Professor Danielle Citron’s new book, The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age, which makes the case for understanding intimate privacy as a civil and human right. This is an in-person event. You can find more information here.

February 20, 2023, 4:00 PM – 5:30 PM
University of Virginia
Purcell Reading Room

Speakers will include:

Danielle Citron, University of Virginia

Anita L. Allen, University of Pennsylvania

Ari E. Waldman, Northeastern University

Moderated by Deborah Hellman, University of Virginia.

UPDATE: The event has concluded, and you can watch the video here.

ALSO OF INTEREST:
Watch the video of my recorded webinar, The Fight for Privacy in a Post-Dobbs World, in which I discuss issues from the book with Danielle Citron, Mary Anne Franks, Jolynn Dellinger, Elizabeth Joh, and Allyson Haynes Stuart.


How to Pursue a Career as a Law Professor

At this event, GW faculty members and GW alumni who are law professors offer advice on pursuing a career in legal academia. Watch the recording here. Details about the event are below.

Thursday, March 2, 2023
4:00 PM – 5:30 PM

Faculty Conference Center, 5th Floor


Webinar – Neurotech and Privacy: The Battle for Your Brain


If you couldn’t make it to my recent webinar on Neurotech and Privacy, you can watch the replay here. I had a great discussion with Nita Farahany, Jules Polonetsky, and Ahmed Shaheed about the dangers that neurotechnology poses for privacy, fundamental human rights, and freedom of thought.


Also, make sure to order Nita Farahany’s new book, The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology (March 2023).


World Bank Data Privacy Day 2023

I organized the World Bank Data Privacy Day, which was held on Wednesday, January 25, 2023 (9:00 AM – 12:30 PM ET).

The event topics included:

  • Data Privacy: Lessons from the Frontier
  • Emerging Issues in Algorithms, AI, and Data Analytics
  • Current Privacy Issues


Murky Consent: An Approach to the Fictions of Consent in Privacy Law


I posted a draft of my new article, Murky Consent: An Approach to the Fictions of Consent in Privacy Law. It is just a draft, and I welcome feedback.

You can download it for free here.


Here’s the abstract:

Consent plays a profound role in nearly all privacy laws. As Professor Heidi Hurd aptly said, consent works “moral magic” – it transforms things that would be illegal and immoral into lawful and legitimate activities. Regarding privacy, consent authorizes and legitimizes a wide range of data collection and processing.

There are generally two approaches to consent in privacy law. In the United States, the notice-and-choice approach predominates, where organizations post a notice of their privacy practices and then people are deemed to have consented if they continue to do business with the organization or fail to opt out. In the European Union, the General Data Protection Regulation (GDPR) uses the express consent approach, where people must voluntarily and affirmatively consent.

Both approaches fail. The evidence of actual consent is nonexistent under the notice-and-choice approach. Individuals are often pressured or manipulated, undermining the validity of their consent. The express consent approach also suffers from these problems: people are ill-equipped to make decisions about their privacy, and even experts cannot fully understand what algorithms will do with personal data. Express consent is also highly impractical; it inundates individuals with consent requests from thousands of organizations. Express consent cannot scale.

In this Article, I contend that in most circumstances, privacy consent is fictitious. Privacy law should take a new approach to consent that I call “murky consent.” Traditionally, consent has been binary – an on/off switch – but murky consent exists in the shadowy middle ground between full consent and no consent. Murky consent embraces the fact that consent in privacy is largely a set of fictions and is at best highly dubious.

Abandoning consent entirely in most situations involving privacy would mean the government making most decisions regarding personal data. But this approach would be problematic: it would entail extensive government control and micromanaging, and it would curtail people’s autonomy. The law should allow space for people’s autonomy over their decisions, even when those decisions are deeply flawed. The law should thus strive to reach a middle ground, providing a sandbox for free play but with strong guardrails to protect against harms.

Because it conceptualizes consent as mostly fictional, murky consent recognizes its lack of legitimacy. To return to Hurd’s analogy, murky consent is consent without magic. Instead of providing extensive legitimacy and power, murky consent should authorize only a very restricted and weak license to use data. This would allow for a degree of individual autonomy but with powerful guardrails to limit exploitative and harmful behavior by the organizations collecting and using personal data. In the Article, I propose some key guardrails to use with murky consent.


Data Is What Data Does: Regulating Use, Harm, and Risk Instead of Sensitive Data


I posted a draft of my new article, Data Is What Data Does: Regulating Use, Harm, and Risk Instead of Sensitive Data. It is just a draft, and I welcome feedback.

You can download it for free here.

Here’s the abstract:

Heightened protection for sensitive data is becoming quite trendy in privacy laws around the world. Originating in European Union (EU) data protection law and included in the EU’s General Data Protection Regulation (GDPR), the sensitive data approach singles out certain categories of personal data for extra protection. Commonly recognized special categories of sensitive data include racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, health, sexual orientation and sex life, biometric data, and genetic data.

Although heightened protection for sensitive data appropriately recognizes that not all situations involving personal data should be protected uniformly, the sensitive data approach is a dead end. The sensitive data categories are arbitrary and lack any coherent theory for identifying them. The borderlines of many categories are so blurry that they are useless. Moreover, it is easy to use non-sensitive data as a proxy for certain types of sensitive data.

Personal data is akin to a grand tapestry, with different types of data interwoven to a degree that makes it impossible to separate out the strands. With Big Data and powerful machine learning algorithms, most non-sensitive data can give rise to inferences about sensitive data. In many privacy laws, data that can give rise to inferences about sensitive data is also protected as sensitive data. Arguably, then, nearly all personal data can be sensitive, and the sensitive data categories can swallow up everything. As a result, most organizations are currently processing a vast amount of data in violation of the laws.

This Article argues that the problems with the sensitive data approach make it unworkable and counterproductive, and that they expose a deeper flaw at the root of many privacy laws. These laws make a fundamental conceptual mistake: they embrace the idea that the nature of personal data is a sufficiently useful focal point for the law. But nothing meaningful for regulation can be determined solely by looking at the data itself. Data is what data does. Personal data is harmful when its use causes harm or creates a risk of harm. It is not harmful if it is not used in a way that causes harm or a risk of harm.

To be effective, privacy law must focus on use, harm, and risk rather than on the nature of personal data. The implications of this point extend far beyond sensitive data provisions. In many elements of privacy laws, protections should be based on the use of personal data and proportionate to the harm and risk involved with those uses.
