PRIVACY + SECURITY BLOG

News, Developments, and Insights

Privacy as Contract?

I just posted my new article draft with Professor Woodrow Hartzog (BU Law School) on SSRN (free download): Privacy as Contract?  

Are privacy notices contracts?

Here’s the abstract:

Nearly everything people buy, every service they use, every account they create, and even every website they visit involves the collection, use, and transfer of personal data—a matter that is ostensibly governed by privacy notices (also called “privacy policies”). Privacy notices are the foundation of privacy regulation; most privacy laws rely on the existence of privacy notices as a central pillar. Various statutory rights and obligations are tied to or limited by what is specified in privacy notices, including the scope and nature of the collection, use, and transfer of personal data.

On the surface, privacy notices seem like contracts. They feel promissory in nature and are central to the grand bargain between consumers and companies at the heart of surveillance capitalism. When companies break the promises they make in privacy notices, contract law (through ordinary contract claims or promissory estoppel) appears to be a tool that could empower consumers to seek redress. But this has rarely been the case. Privacy notices still exist in a weird twilight between a mere description of policy and a binding agreement. Strangely, they even remain separate documents from companies’ terms of use agreements. Although nearly forty years have passed since privacy notices emerged to become the dominant mechanism to address privacy issues, the question of how contract law applies to privacy notices has been only thinly addressed by courts.

In this Article, we argue that contract law is unsuitable for governing consumer privacy. The law of consumer contracts is too oblivious to power disparities, too focused on the individual at the expense of groups and society, and too infected with bogus conceptions of consent to serve as a viable foundation to govern privacy in consumer transactions. Applying contract law more robustly and consistently to privacy notices will not better protect consumers—in fact, it will worsen protection and exacerbate the significant power imbalance between companies and consumers.

Even with reforms, applying contract law to privacy notices will not lead to a desirable balance of power between companies and consumers. Current contract law lacks the right tools to address privacy issues; it is rooted far too deeply in an individual control model similar to the one that has failed spectacularly in privacy law.

Instead of being developed to colonize privacy, contract law should be subject to an internal revolution in how it handles transactions in the Digital Age. With the internet and digital technologies, contract doctrine has lost its way and functions mainly to enable companies to wield power unilaterally against consumers. The fundamental goals, scope, and structure of consumer contract law must be rethought to better address problems with consent, fairness, and power.

 

Continue Reading

Enforcing Privacy Law: Why Private Litigation Is Essential

I just posted my new article draft on SSRN (free download): Enforcing Privacy Law: Why Private Litigation Is Essential.  

Here’s the abstract:

Enforcement is an essential dimension for effective privacy and data protection laws—and it is probably the most important one. No matter how many privacy laws are enacted and how strong the laws are, if enforcement falls short, the laws will fail to achieve their goals. Unfortunately, the enforcement of privacy laws is often weak and inadequate, a plague that renders them infirm.

This Article addresses a gap in the academic literature about privacy law, as enforcement is undertheorized. Only a few articles have addressed enforcement in depth, and these pieces have focused on specific laws and types of enforcement; they have not explored the whole picture. Enforcement is an essential issue, as it is not possible to evaluate a privacy law meaningfully without considering how it will be enforced in practice. Policymakers, however, often neglect to reckon with the practicalities of enforcement, resulting in laws that are ineffective.

This Article makes four primary arguments. First, the effectiveness of privacy laws depends upon enforcement, and poor enforcement can neuter even a strong law. The enforcement of privacy laws is currently quite weak.

Second, government enforcement has many substantial shortcomings that undermine its effectiveness. Government enforcement requires far more resources, insulation from political interference, consistency, and potency. Even with considerable improvement, there will ultimately be a ceiling on what government enforcement can accomplish. In most cases, government enforcement will never be nearly enough.

Third, enforcement is about incentives. Far too often, the incentives are poorly aligned for organizations to follow the law. Government enforcement is rarely sufficiently dissuasive. The risk equation comes out heavily in favor of non-compliance or poor compliance because penalties are not severe or frequent enough to outweigh the benefits of noncompliance.

Fourth, enforcement from multiple enforcers and enforcement mechanisms works best, and private litigation is an essential part of the enforcement equation, adding dimensions to enforcement that government enforcement lacks. Only with an understanding of the overall landscape of enforcement can the virtues of private litigation be fully appreciated.

Continue Reading

AI Companies Should Have Information Fiduciary Duties

Nita Farahany (Duke Law) recently made a great point: “Your doctor has a fiduciary duty to you. ChatGPT doesn’t.” She discusses how people are increasingly turning to AI to serve as a kind of virtual doctor. OpenAI and Anthropic recently launched features that let their chatbots analyze a person’s medical records and provide personalized medical advice.

She argues that “we are rapidly normalizing the transfer of trust from accountable institutions to systems that explicitly refuse accountability. We need to answer sooner, rather than later, what legal obligations should apply to tools that function as health authorities while claiming they are not one, especially when tens of millions of daily users already treat that product as a health advisor.”

I wholeheartedly agree. Her entire post is great, and her Substack is essential reading.

Although AI can help with healthcare, it’s not a replacement for a doctor or therapist. We’ve already seen too many tragic suicide cases where people use chatbots as therapists and receive bad counseling, and are even encouraged to harm themselves. Doctors and therapists have years of training and experience; they also have experience being human (they’re not just simulations); they must be licensed; and they have well-established legal responsibilities, such as a fiduciary duty to act in the best interest of the patient. AI currently has none of these things.

Continue Reading