PRIVACY + SECURITY BLOG

News, Developments, and Insights


Social Media, CDA 230, and Algorithms

The U.S. Court of Appeals for the Third Circuit just handed down a very important decision on the Communications Decency Act (CDA) Section 230 and accountability for algorithmic decisions. In Anderson v. TikTok (3d Cir. Aug. 27, 2024), the Third Circuit held that there are limits to the broad immunity under the CDA Section 230. As I’ve long argued, going back to my book, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet (2007) (free download here), the statute has been interpreted by courts in an overzealous way, far beyond its original intent. Finally, a court is pushing back and holding social media companies accountable for the harms they create. In particular, the concurring opinion by Judge Matey is worth reading in full; it is powerful and persuasive.

The facts of the case are tragic. Videos on TikTok encouraged viewers to engage in the “Blackout Challenge” — to choke themselves with various things until they passed out. TikTok’s algorithm recommended a Blackout Challenge video to Nylah Anderson, a 10-year-old girl, on her “For You Page.” She attempted the challenge and died. Anderson’s estate sued TikTok for recommending the video to Nylah, and TikTok argued that the CDA Section 230 immunized it because the video was content from another user.

The CDA Section 230, at 47 U.S.C. § 230(c)(1), states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

The CDA Section 230 was written to protect online platforms from being held responsible as a “publisher or speaker” for content posted by users. But starting with Zeran v. AOL, 129 F.3d 327 (4th Cir. 1997), courts expanded and twisted the CDA Section 230 into a much broader immunity for nearly anything that happens on online platforms. The original goal of the law was to ensure that platforms would not be held liable as if they were the speakers of user content when they were merely acting as passive bulletin boards. But these days, online platforms are not bulletin boards. They are active manipulators of content, where algorithms act as puppeteers pulling the strings behind the scenes to influence what users see and how they interact. Platforms don’t just passively serve as places where content is posted; instead, they actively shape what viewers see with algorithms programmed to deliver content that sparks engagement. Content is curated and promoted.

The Third Circuit finally had enough of the overly expansive interpretations of the CDA 230 and held: “ICSs [interactive computer services] are immunized only if they are sued for someone else’s expressive activity or content (i.e., third-party speech), but they are not immunized if they are sued for their own expressive activity or content (i.e., first-party speech).”

The court reasoned that the U.S. Supreme Court’s decision in Moody v. NetChoice, 144 S. Ct. 2383 (2024), concluded that social media platforms are engaging in speech through their content moderation decisions. As the Third Circuit explained the Supreme Court’s holding: “The Court held that a platform’s algorithm that reflects ‘editorial judgments’ about ‘compiling the third-party speech it wants in the way it wants’ is the platform’s own ‘expressive product’ and is therefore protected by the First Amendment.”

Indeed, in recent cases, NetChoice, an organization created by social media companies to aggressively litigate against attempts to regulate them, has argued that social media companies are speaking when they are engaging in content moderation and are entitled to First Amendment protection. But then, NetChoice turns around and argues in Section 230 cases that such companies are not speaking when they are engaging in content moderation.  NetChoice wants it both ways — for social media companies to be speakers when the law protects them but not when the law holds them accountable for their speech.

The Third Circuit further stated: “Given the Supreme Court’s observations that platforms engage in protected first-party speech under the First Amendment when they curate compilations of others’ content via their expressive algorithms, it follows that doing so amounts to first-party speech under § 230, too.”  Applying this conclusion to Anderson’s argument, the court held: “Accordingly, TikTok’s algorithm, which recommended the Blackout Challenge to Nylah on her FYP, was TikTok’s own ‘expressive activity,’ and thus its first-party speech. Such first-party speech is the basis for Anderson’s claims.”

In a concurring opinion, Judge Matey indicated he would go further to push back on the overly expansive interpretations of the CDA Section 230. His opinion is powerful and eloquent, and I will provide some key quotations from it below:

Ten-year-old Nylah Anderson died after attempting to recreate the “Blackout Challenge” she watched on TikTok. The Blackout Challenge—performed in videos widely circulated on TikTok—involved individuals “chok[ing] themselves with belts, purse strings, or anything similar until passing out.” App. 31. The videos “encourage[d]” viewers to record themselves doing the same and post their videos for other TikTok users to watch. App. 31. Nylah, still in the first year of her adolescence, likely had no idea what she was doing or that following along with the images on her screen would kill her. But TikTok knew that Nylah would watch because the company’s customized algorithm placed the videos on her “For You Page” after it “determined that the Blackout Challenge was ‘tailored’ and ‘likely to be of interest’ to Nylah.” App. 31.

No one claims the videos Nylah viewed were created by TikTok; all agree they were produced and posted by other TikTok subscribers. But by the time Nylah viewed these videos, TikTok knew that: 1) “the deadly Blackout Challenge was spreading through its app,” 2) “its algorithm was specifically feeding the Blackout Challenge to children,” and 3) several children had died while attempting the Blackout Challenge after viewing videos of the Challenge on their For You Pages. App. 31–32. Yet TikTok “took no and/or completely inadequate action to extinguish and prevent the spread of the Blackout Challenge and specifically to prevent the Blackout Challenge from being shown to children on their [For You Pages].” App. 32–33. Instead, TikTok continued to recommend these videos to children like Nylah. . . .

Today, § 230 rides in to rescue corporations from virtually any claim loosely related to content posted by a third party, no matter the cause of action and whatever the provider’s actions. See, e.g., Gonzalez v. Google LLC, 2 F.4th 871, 892–98 (9th Cir. 2021), vacated, 598 U.S. 617 (2023); Force, 934 F.3d at 65–71. The result is a § 230 that immunizes platforms from the consequences of their own conduct and permits platforms to ignore the ordinary obligation that most businesses have to take reasonable steps to prevent their services from causing devastating harm.

But this conception of § 230 immunity departs from the best ordinary meaning of the text and ignores the context of congressional action. Section 230 was passed to address an old problem arising in a then-unique context, not to “create a lawless no-man’s-land” of legal liability. . . .

Properly read, § 230(c)(1) says nothing about a provider’s own conduct beyond mere hosting. A conclusion confirmed by § 230(c)(2), which enumerates acts that platforms can take without worrying about liability. . .

Judge Matey’s opinion also discusses the legislative history of Section 230 extensively, and he reaches the right conclusion about the proper interpretation and scope of Section 230.  The Third Circuit majority focuses on TikTok’s responsibility for its algorithmic decisions, and Judge Matey concurs with this holding but would go further to hold that Section 230 does not immunize TikTok from distributor liability:

§ 230(c)(1)’s preemption of traditional publisher liability precludes Anderson from holding TikTok liable for the Blackout Challenge videos’ mere presence on TikTok’s platform. A conclusion Anderson’s counsel all but concedes. But § 230(c)(1) does not preempt distributor liability, so Anderson’s claims seeking to hold TikTok liable for continuing to host the Blackout Challenge videos knowing they were causing the death of children can proceed. So too for her claims seeking to hold TikTok liable for its targeted recommendations of videos it knew were harmful.

I am glad that online platforms are finally being held accountable for their actions. For too long, they’ve wanted to have it both ways — to use algorithms to determine which content users are exposed to (and to be protected as “speakers” under the First Amendment) but then to be held immune because they are not “speakers” under the CDA Section 230. For too long, they’ve escaped accountability for their actions. Platforms are not passive conduits; they are far from bulletin boards. Their algorithms curate, promote, and downgrade content. These algorithms affect what content users see, and they also affect the content users create, because users create content in response to what the algorithms encourage and promote. It’s high time for companies to be held responsible for what they are doing.

For more on the CDA Section 230 and its interpretation, see my blog post: Restoring the CDA Section 230 to What It Actually Says.  I will also note my appreciation to Judge Matey for citing this post in his opinion.

Also see Danielle Citron, How to Fix Section 230 and Mary Anne Franks, Reforming Section 230 and Platform Liability.

H/T Bob Sullivan and Zephyr Teachout

Daniel J. Solove is the John Marshall Harlan Research Professor of Law at George Washington University Law School. He wrote about online social media, privacy, and free speech in his book, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet (Yale University Press 2007). The full book is now available as a free download on SSRN. Although the main players have changed, the book is still quite relevant today. The writing was on the wall 17 years ago.
