PRIVACY + SECURITY BLOG

News, Developments, and Insights


AI Companies Should Have Information Fiduciary Duties

Nita Farahany (Duke Law) recently made a great point: “Your doctor has a fiduciary duty to you. ChatGPT doesn’t.” She discusses how people are increasingly turning to AI to serve as a kind of virtual doctor. OpenAI and Anthropic recently launched features where their chatbots can analyze a person’s medical records and provide personalized medical advice.

She argues that “we are rapidly normalizing the transfer of trust from accountable institutions to systems that explicitly refuse accountability. We need to answer sooner, rather than later, what legal obligations should apply to tools that function as health authorities while claiming they are not one, especially when tens of millions of daily users already treat that product as a health advisor.”

I wholeheartedly agree. Her entire post is great, and her Substack is essential reading.

Although AI can help with healthcare, it's not a replacement for a doctor or therapist. We've already seen too many tragic suicide cases where people use chatbots as therapists and are given bad counseling or are even encouraged to harm themselves. Doctors and therapists have years of training and experience; they also have experience being human (they're not just simulations); they must be licensed; and they have well-established legal responsibilities, such as a fiduciary duty to act in the best interest of the patient. AI currently has none of these things.

Let's set aside what the right role for AI in healthcare should be. Whatever that role is, AI companies should not be able to unleash AI into it without proper legal protections and guardrails.

The current ethos of tech companies is to rush to try anything that seems cool or lucrative, be reckless, ignore all the existing protections for whatever they're doing, and then grovel to policymakers that they shouldn't be regulated because it would stifle innovation. When they disrupt existing industries and professions, they not only cause job losses but also sidestep legal protections that have developed over decades (or even centuries), placing people in great danger. They create a zone of recklessness where they conduct widescale experiments on humanity without any guardrails. Even lab rats are treated with more care.

The time is long overdue for digital tech companies to be regulated as information fiduciaries, especially when AI is used for healthcare or therapy.

Below are parts of a post I wrote about information fiduciaries that are germane to the current situation:

Information fiduciaries have emerged as a major part of the discussion of privacy regulation. In a nutshell, the information fiduciaries approach aims to apply aspects of fiduciary law to the companies that collect and use our personal data. As one court explained the fiduciary relationship: “A fiduciary relationship is one founded on trust or confidence reposed by one person in the integrity and fidelity of another. Out of such a relation, the laws raise the rule that neither party may exert influence or pressure upon the other, take selfish advantage of his trust, or deal with the subject matter of the trust in such a way as to benefit himself or prejudice the other except in the exercise of utmost good faith.” Mobil Oil Corp. v. Rubenfeld, 339 N.Y.S.2d 623, 632 (1972).

The earliest proponent of the idea of viewing companies as information fiduciaries was the late Ian Kerr in 2001, who noted that “some service provider-user relationships display all of the constituent elements of a fiduciary relationship.” See Ian Kerr, The Legal Relationship Between Online Service Providers and Users, 35 Can. Bus. L.J. 419 (2001).

In 2004, in my book, The Digital Person: Technology and Privacy in the Information Age (NYU Press 2004) (Amazon) (free digital copy on SSRN), I argued that concepts from the law of fiduciary relationships should be applied to situations involving data privacy. (pp. 101-104). I contended that “the law should hold that companies collecting and using our personal information stand in a fiduciary relationship with us.” I argued that “If our relationships with the collectors and users of our personal data are redefined as fiduciary ones, then this would be the start of a significant shift in the way the law understands their obligations to us. The law would require them to treat us in a different way—at a minimum, with more care and respect. By redefining relationships, the law would make a significant change to the architecture of the information economy.” (p. 104).


At the time I wrote, I wasn’t aware that Ian Kerr had written about information fiduciaries a few years before me. Although I knew him quite well, he was too modest to tell me about his earlier article. I was drawn to information fiduciaries more for conceptual than doctrinal reasons. I recalled the famous quote from Judge Cardozo from the corporations class I took in law school. Judge Cardozo declared: “Many forms of conduct permissible in a workaday world for those acting at arm’s length, are forbidden to those bound by fiduciary ties. A trustee is held to something stricter than the morals of the market place. Not honesty alone, but the punctilio of an honor the most sensitive, is then the standard of behavior.” Meinhard v. Salmon, 164 N.E. 545, 546 (N.Y. 1928). The concept that jumped out at me was that when unequal power relationships are involved, the law recognizes (at least sometimes) that a more robust system of morality beyond the “morals of the marketplace” should govern.

In the years after my book, many scholars began talking about information fiduciaries, starting with Jack Balkin in 2016 in his article, Information Fiduciaries and the First Amendment, 49 U.C. Davis L. Rev. 1183 (2016). Balkin embraced the idea of information fiduciaries and argued that the First Amendment provides greater regulatory leeway: “Because of their special power over others and their special relationships to others, information fiduciaries have special duties to act in ways that do not harm the interests of the people whose information they collect, analyze, use, sell, and distribute. These duties place them in a different position from other businesses and people who obtain and use digital information. And because of their different position, the First Amendment permits somewhat greater regulation of information fiduciaries than it does for other people and entities.”

Here are some of the most important points, in my opinion, about information fiduciaries:

Conceptual, Not Just Doctrinal. My intent in invoking information fiduciaries was not to merely apply existing law. Instead, it was to draw from the conceptual underpinnings of fiduciary law to develop the law to fit situations involving privacy. Fiduciary law has already embraced certain privacy issues, such as breach of confidentiality tort cases. But I think the common law can develop in even broader directions if courts use even a small amount of imagination. Beyond the common law, information fiduciary concepts can be incorporated into privacy statutes.

Beyond Companies. Information fiduciaries need not be limited to companies. Any powerful person or entity should be deemed to be an information fiduciary, including government agencies and non-profit institutions, such as schools and hospitals.

Open-Ended Recognition of Relationships. In terms of the types of relationships that are deemed to be fiduciary ones, the law is open-ended. Courts use factors to recognize many different types of relationships as being fiduciary ones. It’s not a fixed list. As Ethan Leib aptly observes: “[N]o typology of the fiduciary could be complete without recognizing a few central features: the concept is self-consciously open, flexible, and adaptable to new kinds of relationships—and those relationships trade upon high levels of trust and leave one party in a position of domination, inferiority, or vulnerability.” Ethan J. Leib, Friends as Fiduciaries, 86 Wash. U. L. Rev. 665, 672 (2009).

Open-Ended Recognition of Duties. Although there are several common fiduciary duties, courts have recognized other fiduciary duties suitable to the relationship and context. As Lauren Scholz observes: “Fiduciary duties vary, based on the nature of the relationship between the fiduciary and entrustor. They may include duty of loyalty, duty of care, duty of disclosure and honesty, duty of confidentiality, and a heightened duty of good faith.” Lauren Henry Scholz, Fiduciary Boilerplate: Locating Fiduciary Relationships in Information Age Consumer Transactions, 46 J. Corp. L. 144 (2020).

Duty of Loyalty. A key duty of information fiduciaries is a duty of loyalty – to look out for the best interests of the individuals whose data one is collecting and using. Neil Richards and Woodrow Hartzog have written extensively about this in nearly a billion articles. I’ve listed a few in the bibliography below. They argue: “[L]oyalty would manifest itself primarily as a prohibition on designing digital tools and processing data in a way that conflicts with a trusting party’s best interests. Data collectors bound by such a duty of loyalty would be obligated to act in the best interests of the people exposing their data and engaging in online experiences, but only to the extent of their exposure.” Neil Richards & Woodrow Hartzog, A Duty of Loyalty for Privacy Law, 99 Wash. U. L. Rev. 961 (2021).

Duty of Confidentiality. Another key duty for fiduciaries is confidentiality. This duty forms the basis of countless cases involving breach of confidentiality, a tort with roots in fiduciary relationships. In some of the classic cases involving doctors breaching patient confidentiality, the doctor-patient relationship is analogized to a fiduciary one. See McCormick v. England, 494 S.E.2d 431 (S.C. Ct. App. 1997). Neil Richards and I wrote about the breach of confidentiality tort in Privacy’s Other Path: Recovering the Law of Confidentiality, 96 Geo. L.J. 123 (2007).

Duty of Disclosure and Honesty. There are also duties to disclose pertinent information and to be transparent about any potential conflicts of interest. Currently, there is an enormous conflict of interest with surveillance capitalism — companies want to monetize personal data, even when particular uses are not in the best interests of the individuals. Privacy notices rarely approach the heightened disclosure demands of the fiduciary; these notices are not really forthcoming about the risks to individuals or the interests of the companies. Instead, they are vague, meaningless statements that are long on text and short on informative details.

Third-Party Liability. A key feature of fiduciary law is liability on third parties for inducing breaches of fiduciary duties. Although this facet of information fiduciaries is tremendously powerful, it remains under-examined. For example, in Hammonds v. Aetna Casualty & Surety Co., 243 F. Supp. 793 (N.D. Ohio 1965), an insurance company was held liable for inducing a doctor to breach his patient’s confidentiality. A robust law of information fiduciaries could open up liability for companies that purchase data from other companies or that scrape data online. Even further, companies that create surveillance technologies that facilitate fiduciaries breaching trust could potentially be liable. The Restatement of Torts provides: “A person who knowingly assists a fiduciary in committing a breach of trust is himself guilty of tortious conduct.” Restatement (Second) of Torts § 874 cmt. c. Data brokers that sell data to the government or to companies to improperly pry into the private lives of individuals might be subject to liability. Companies such as Clearview AI that market facial recognition systems to many law enforcement entities could potentially be liable. Companies that develop and market technologies to enable the government to analyze its repositories of personal data might be liable. If developed conceptually and doctrinally, this dimension of fiduciary law has immense potential.

Power. At its core, the law of fiduciary relationships is about power — unequal power relationships that warrant special treatment under the law. This power asymmetry is increasingly present in the relationships between individuals and the large organizations that gather and use personal data about them. It is cliché today to say that “information is power,” but if it is so cliché, then it should not be controversial to recognize that amassing vast quantities of personal data about people gives organizations enormous power over them. In my book, ON PRIVACY AND TECHNOLOGY (Oxford Univ. Press 2025), I write: “Overall, the concern animating the law of fiduciary relationships is power. Today, many corporate, governmental, and other entities have enhanced their power over us by collecting and using vast quantities of personal data. Imposing fiduciary duties on these entities would significantly help protect individuals. Fiduciary law is one of the law’s wisest creations—a recognition that with great power should come great responsibility. Policymakers should use this body of law in more relationships involving digital technologies.”

Further Reading and Viewing:

This post was initially published on my Substack newsletter.


Professor Daniel J. Solove is a law professor at George Washington University Law School. Through his company, TeachPrivacy, he has created the largest library of computer-based privacy and data security training, with more than 180 courses. 
