

FTC Zoom Case

Co-authored by Prof. Woodrow Hartzog

It was inevitable. On Monday, Zoom joined an exclusive club of tech companies – Facebook, LinkedIn, Twitter, Microsoft, Google, Uber, Snap, and more – consisting of companies that have been placed under a Federal Trade Commission (FTC) consent decree. In a weird sense, for tech companies, being the target of an FTC enforcement action for a privacy or security violation has become an initiation ritual marking entry into the pantheon of the tech company big leagues.

As is typical, the FTC announced a complaint and consent order against Zoom for violations of Section 5 of the FTC Act. More specifically, the FTC charged Zoom with unfair and deceptive data security practices related to encryption and to efforts to bypass browser security safeguards.

The Zoom case is notable for several reasons. It signals that Zoom has arrived and is in the club. It’s hard to escape Zoom these pandemic-riddled days; its platform has become the go-to for videoconferencing, and “Zoom” is becoming a verb. Although we appreciate Zoom, we long for the days when people would just ask to talk with us rather than Zoom with us.

In the end, Zoom’s story proved similar to that of the other FTC enforcement actions against tech companies – all involved some serious privacy and security problems. Having read all of the FTC cases, we find it shocking that the infractions are not ambiguous or open to much interpretation. They are rather egregious problems. The FTC rarely brings cases unless it has a slam dunk. Why doesn’t any company learn from its predecessors? Why do they all seem to pick up an FTC enforcement action along the way?

Beyond involving the new tech “prodigy” Zoom that everyone is buzzing about, this case features some new developments in FTC jurisprudence, as well as a blistering critique of the FTC by Commissioners Rebecca Kelly Slaughter and Rohit Chopra. That critique has been building through their dissents in the Facebook and Equifax cases. In the Zoom case, it has matured into a broader charge that the FTC needs to take a bolder approach to enforcement. Are they right?

We’ll explore these issues in this post. First, we’ll discuss some of the notable parts of the complaint and consent decree. Then we’ll turn to the dissents.

I. The Zoom Complaint and Consent Decree

At first blush, readers might be tempted to think that this complaint and consent order are typical of the FTC’s security enforcement efforts. In many ways they are. As in previous actions against companies like Facebook, Google, and Twitter, the FTC alleged that Zoom made deceptive statements and engaged in unfair practices around data security, and Zoom entered into a consent order requiring it to refrain from further deception on the topic, to create and execute a comprehensive security program, and to undergo regular assessments of compliance.

But if you look a little deeper, the Zoom complaint, consent order, and dissenting statements reveal some new developments in the FTC’s jurisprudence about privacy and security.

A. Claims of “Encryption” and “Security” Remain Highly Scrutinized

The bulk of this complaint relates to deceptive statements, implications, and omissions that Zoom made regarding security, specifically encryption. The agency alleges that Zoom made multiple representations touting the strength of the company’s data security practices, including that it offered “end-to-end” encryption, “256-bit encryption,” and encrypted storage.

This is probably the least controversial aspect of the action, as broken promises of security have been a staple of FTC actions for twenty years. Zoom’s own definition of end-to-end encryption was quite different from the industry-standard and commonly accepted meaning of the term, which is that “[n]o other persons [except the true sender and recipient] can decrypt the communications because they do not possess the necessary cryptographic keys to do so.” But Zoom maintained cryptographic keys that allowed it to access the contents of people’s meetings. Zoom also promised 256-bit encryption, but only offered the less protective 128-bit encryption.
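To make the distinction concrete, here is a minimal sketch in Python (using the widely available cryptography package; the names and flow are our own illustration, not Zoom’s actual implementation). What makes encryption “end-to-end” is who holds the keys, not merely that the traffic is encrypted:

```python
# Illustrative sketch only -- not Zoom's actual design. It shows why key
# custody, not just the presence of encryption, determines whether
# communications are end-to-end encrypted.
# Requires: pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Transport-style encryption with a provider-held key: the provider
# generates and keeps the key, so it can decrypt participants' traffic
# even though that traffic is "encrypted."
provider_key = AESGCM.generate_key(bit_length=128)  # 128-bit, as Zoom offered
channel = AESGCM(provider_key)
nonce = os.urandom(12)
ciphertext = channel.encrypt(nonce, b"meeting audio/video frame", None)

# Because the provider still holds provider_key, it can read the plaintext:
assert channel.decrypt(nonce, ciphertext, None) == b"meeting audio/video frame"

# End-to-end encryption: a key generated on the participants' devices and
# never shared with the provider, which then merely relays ciphertext it
# cannot decrypt.
endpoint_key = AESGCM.generate_key(bit_length=256)  # the promised 256-bit strength
```

Under the industry-standard meaning quoted above, only the second arrangement qualifies as end-to-end: the provider never possesses the necessary cryptographic keys.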

Two of the additional allegations involved deceptive implications and omissions. That’s rarer, but still consistent with the FTC’s past privacy and data security complaints. Zoom claimed that cloud recordings of meetings were stored in Zoom’s cloud “after the meeting has ended,” where they are “stored encrypted as well,” and stated elsewhere that recordings were “processed and securely stored in Zoom’s cloud once the meeting has ended.” According to the FTC, this was just ambiguous enough from the consumer’s perspective to imply that cloud recordings were encrypted immediately or shortly after they were processed on Zoom’s servers. But in fact, recordings often sat unencrypted for up to 60 days.

The other deceptive practice was actually an omission – Zoom’s failure to disclose to users that an update to its Mac application, ostensibly issued to resolve minor bugs, also deployed a “local hosted web server” that “would circumvent a Safari browser privacy and security safeguard,” namely the prompt asking users to first approve launching the Zoom app. The update also automatically opened users’ meetings with their cameras on unless they changed the default setting. The FTC concluded that this was a dangerous enough action that the failure to give users notice of it was deceptive.
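For readers wondering how a local web server can sidestep a browser prompt, here is a simplified, hypothetical sketch of the pattern described in the complaint (standard-library Python; the port, endpoint, and app name are illustrative stand-ins, not Zoom’s actual values). Safari’s safeguard triggers when a web page tries to open an external app; a pre-installed localhost server takes the browser out of the loop:

```python
# Hypothetical illustration of the "localhost web server" pattern the FTC
# described -- not Zoom's actual code. A background process installed with
# the app listens on 127.0.0.1. A web page can then trigger it with an
# ordinary HTTP request (e.g., an <img> tag pointing at localhost), and the
# server launches the native app itself, so Safari never shows its
# "Do you want to allow this page to open the app?" confirmation prompt.
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

PORT = 19421  # illustrative port number

class LaunchHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/launch"):
            # Launching from outside the browser bypasses the browser's
            # external-application confirmation dialog.
            subprocess.Popen(["open", "-a", "SomeMeetingApp"])  # hypothetical app
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"launched")
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", PORT), LaunchHandler).serve_forever()
```

The browser behaves as designed; the safeguard is defeated because the app-launching decision has been quietly moved to a process the browser cannot mediate. That is precisely the circumvention the FTC objected to.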

B. There’s a New Unfairness Injury: Limiting the Intended Benefit of a Privacy and Security Safeguard

In this complaint, the FTC carved out a new articulation of harm: “limiting the intended benefit of a privacy and security safeguard.” The FTC reasoned that the Safari prompt was a safeguard that added friction to the process of exposing one’s computer, video, and voice to others, and that it mitigated the kinds of privacy and security violations that arise from easy access to telecommunications and people’s computers. The FTC argued that “[w]ithout the circumvented Safari safeguard, one wrong click could expose consumers to remote video surveillance by strangers through their webcams.”

This is an important new kind of consumer injury under Section 5. Platforms like Apple and Google are developing increasingly robust security and privacy protocols in their browsers and operating systems to limit the security vulnerabilities inherent in the apps and websites they host. But these safeguards sometimes introduce friction into the process of using an app or a website. Third-party developers like Zoom have plenty of incentives to bypass the gatekeepers in the fight for market supremacy. The FTC’s new theory of harm takes the side of the platforms, protecting their efforts to impose privacy and security controls by viewing circumvention of these efforts as an injury. As a shorthand, we’ll call this injurious practice “circumvention.”

One of the biggest debates around the FTC’s enforcement of the rule against unfair privacy and data security practices is what counts as a “consumer injury” under Section 5 of the FTC Act. In the realm of data security, companies like Wyndham Hotels have argued that harms short of financial loss, such as exposure to risk, anxiety, and reimbursable fraudulent charges, do not constitute injury under the statute. The standard kinds of data security-related injuries from unfair trade practices have been identity theft, financial loss, and general exposure of deeply private information resulting from a breach. The FTC has pushed beyond these categories, and rightly so.

Interestingly, although the more conservative commissioners have traditionally pushed for a narrower view of harm, in this case they embraced the theory of harm from circumvention.

C. The FTC was Proactive and Didn’t Wait on a Data Breach

We have long asserted that the FTC must act proactively rather than wait for a data breach to occur. The FTC has done so in only a few cases, such as Microsoft Passport and Guess. The Zoom case joins the list.

We believe that it is important for the FTC to enforce proactively. After a data breach, so many costs and legal entanglements await a company that an FTC action is about as effective as beating a dead horse. The FTC is at its most effective when companies know it can come knocking at any time, not just after a breach. To borrow a line from Breaking Bad, the FTC needs to be the “one who knocks” if it wants to have a strong effect in changing corporate behavior.

II. A Reckoning with FTC Enforcement: Are Radical Reforms Needed?

The dissenting opinions by Commissioners Chopra and Slaughter raise bold critiques of the FTC’s approach to data privacy and security. Their critique isn’t isolated to this case. Previously, they dissented in the Facebook case, arguing that the FTC’s enforcement penalties weren’t dissuasive enough. This was a particularly bold position to take in the Facebook case, where the FTC’s fine was $5 billion, the largest fine ever imposed worldwide for privacy or security violations. The FTC imposed some new requirements in that case and seemed to be making significant steps. But Chopra and Slaughter argued that the FTC should have insisted on stronger measures and even pursued litigation rather than settle.

Their dissents in the Zoom case flesh out their critique of FTC enforcement, and we find it quite persuasive. Their position transcends the Zoom case; it is a meta view of what effective FTC enforcement ought to be.

A. Dissuasive Penalties

In his dissent, Commissioner Chopra writes:

In matters like these, investigations should seek to uncover how customers were baited by any deception, how a company gained from any misconduct, and the motivations for this behavior. This approach can help shape an effective remedy. While deciding to resolve a matter through a settlement, regulators and enforcers must seek to help victims, take away gains, and fix underlying business incentives.

Of course, all settlements involve tradeoffs, but like other FTC data protection settlements, the FTC’s proposed settlement with Zoom accomplishes none of these objectives.

Commissioner Chopra argues that the settlement does little to compensate victims for harm or to nullify contracts entered into on the basis of Zoom’s deceptive statements. Moreover, he notes that there isn’t a strong enough penalty for Zoom:

In my view, the evidence is clear that Zoom obtained substantial benefits through its alleged conduct. However, the resolution includes no monetary relief at all, despite existing FTC authority to seek it in settlements when conduct is dishonest or fraudulent. If the FTC was concerned about its ability to seek adequate monetary relief, it could have partnered with state law enforcers, many of whom can seek civil penalties for this same conduct.

We agree. As Chris Hoofnagle has noted, the “FTC’s catch and release policy appears to have lost its deterrent effect.” The FTC lacks the ability to issue fines for Section 5 violations; it can only issue fines for violations of consent decrees. The FTC has given itself sharper enforcement teeth by insisting on a very long consent decree period (20 years), but it isn’t enough. The current approach allows companies to “cash in” by breaking the law first, entrenching themselves, and paying a modest regulatory price later on. When dealing with the FTC, it’s far more advantageous to sin and ask for forgiveness than not to sin in the first place.

Reading through the FTC privacy and security cases is an eye-opening exercise. It’s hard for one’s jaw not to drop and for one not to mutter: “Seriously? The company really did that?” The violations are not minor mistakes; they are flagrant fouls. They are not muddy or gray or fuzzy; they are clear, pristine, black-and-white violations. And, despite the fact that so many prominent companies have been caught in the past, we still see more violations. All this raises the question: Is current FTC enforcement really teaching companies enough of a lesson? Or have FTC consent decrees become not much more effective than speeding tickets?

Commissioner Chopra makes a number of recommendations to strengthen FTC consent decrees. First, he argues that FTC orders themselves should be strengthened: “[T]he FTC’s status quo approach often involves requiring the company engaged in misconduct to follow the law in the future and submit periodic paperwork. In certain orders, the Commission requires the retention of a third-party assessor, which the company might already be doing.”

He recommends that consent decrees mandate that the company provide “meaningful help and assistance to affected consumers and small businesses.”

The FTC should also “order releases from any long-term contractual arrangements.” According to Chopra, “When customers are baited with deceptive claims, it would be appropriate to allow them to be released from any contract lock-in or otherwise amend contractual terms to make customers whole.”

Indeed, one problem is that the assessments required by FTC consent decrees are often quite minimal. According to Chopra, “it is unclear whether those assessments are truly effective when it comes to deterring or uncovering misconduct.”

Megan Gray and Chris Hoofnagle have engaged in extensive critiques of how thin and ineffective the FTC consent decree assessments are. Gray critiques the assessments for relying far too heavily on unexamined assertions by the companies being audited. Hoofnagle notes that the assessments are quite minimalist. The first Google privacy assessment was only 30 pages long; “three of the pages are an appeal for confidential treatment and another five are the company’s privacy policy.” That leaves just 22 pages for assessing privacy at Google!

Chopra also argues that the FTC “can do more to comprehensively use its authorities across its mission, particularly when unfair or deceptive practices can advance dominance in digital markets. When we do not, investigations may result in ineffective resolutions that fail to fix the underlying problems and may increase the likelihood of recidivism.”

Additionally, Chopra suggests that the FTC might pursue rulemaking, which would give it greater ability to issue fines. Rulemaking under Section 5 is currently quite difficult. Congress should consider giving the FTC regular rulemaking authority for privacy and security. (One of us has written about this.)

There are more recommendations in Chopra’s dissent, and it is a very thoughtful list of ways for the FTC to improve.

Chopra also recommends that the FTC engage in more litigation. On this point, we understand the merits of the argument, but we are concerned that protracted litigation against these large companies might not result in the best outcomes for consumers. The judiciary is already not particularly friendly to consumers and regulatory power. Judges will often bend into contortions to help companies; they will reject harm when consumers are clearly harmed; they will reduce damages as too excessive even when companies are raking in billions. With a judiciary filled with Trump judges, it seems unlikely the FTC will fare well in the courts with consumer protection actions. In the absence of more structural support from Congress for the FTC, private causes of action seem to be the better, more fruitful path for meaningful litigation, such as with the limited but meaningful success of private lawsuits under the Illinois Biometric Information Privacy Act (BIPA).

The FTC majority statement in the Zoom case states: “Our dissenting colleagues suggest additional areas for relief that likely would require protracted litigation to obtain.” We agree in part, as protracted litigation can be long, bruising, and uncertain. The FTC can score quick wins with settlements. But the problem is that these quick wins aren’t dissuasive enough. There are many things, though, that the FTC could do to strengthen its enforcement without litigation.

Ultimately, we recommend that the FTC, as a matter of practice, include in its orders a detailed justification of how its enforcement penalties are truly dissuasive, making clear that the penalties leave the company no better off for having engaged in violations. The message must be clear: companies will be worse off if they violate Section 5 of the FTC Act.

B. The Relationship Between Privacy and Security

In her dissent, Commissioner Slaughter writes:

The Commission’s proposed order resolving its allegations against Zoom requires the company to establish an information-security program and submit to related independent third-party assessments. These provisions strive to improve data-security practices at the company and to send a signal to others regarding the baseline for adequate data-security considerations. Nowhere, however, is consumer privacy even mentioned in these provisions. This omission reflects a failure by the majority to understand that the reason customers care about security measures in products like Zoom is that they value their privacy.

She also notes:

The Commission’s proposed order tries to solve for this problem solely as a security issue and makes it difficult for Zoom to bypass third-party security features in the future. But the order does not address the core problem: Zoom’s demonstrated inclination to prioritize some features, particularly ease of use, over privacy protections. Dumping Safari users automatically into a Zoom meeting, with their camera on, the first time they clicked on a link was not only a data-security failing—it was a privacy failing.

Commissioner Slaughter states that the FTC “ought to have required elements of both privacy and security programs.” She elaborates:

A more effective order would require Zoom to engage in a review of the risks to consumer privacy presented by its products and services, to implement procedures to routinely review such risks, and to build in privacy-risk mitigation before implementing any new or modified product, service, or practice.

Commissioner Slaughter also ends her dissent by stating that she “join[s] Commissioner Chopra’s call for the Commission to engage in critical reflection to strengthen our enforcement efforts regarding technology across the board—from investigation to resolution.”

We agree wholeheartedly with Commissioner Slaughter. The Zoom case isn’t just a security case but is also a privacy case, and privacy should have been a focus of the consent decree. Indeed, the issues that the FTC faulted Zoom for were only some of the problems reported in the news. Zoom’s story is similar to that of many young tech companies – they put so much focus on growth and their technology that they come to privacy only later on, after everything is built. It’s akin to building a skyscraper in California and only afterwards starting to address how it should be made safe from earthquakes. Privacy should be taken seriously at the start, but there are hardly any incentives for companies to do so. Growth is the main goal, and only when a company is big enough will it start to focus on privacy. The story is often the same for security.

The FTC often doesn’t step in until a company has grown quite large and prominent, so companies know that they can ignore privacy and security in the shadows until they step into the limelight.

Zoom would have been much better off had it focused on privacy and security earlier on. But the same old story with young tech companies played out once again.

* * *

Back in 2014-2015, we wrote several articles about the FTC, as well as some shorter pieces. Additionally, along with Chris Hoofnagle, we wrote a short piece last year.

Overall, we issued extensive praise for the FTC, but we had some suggestions for improvement. We still hold a positive view of the FTC, but we agree with Commissioners Chopra and Slaughter that significant changes are needed for the agency to hold powerful tech companies accountable for protecting our privacy and securing our data.

* * *

This post was co-authored by Professor Daniel J. Solove and Professor Woodrow Hartzog.

Daniel J. Solove is the John Marshall Harlan Research Professor of Law at the George Washington University Law School. He also blogs at LinkedIn, where he has more than 1 million followers.

Woodrow Hartzog is a Professor of Law and Computer Science at Northeastern University.

Their book, Breached! How Data Security Law Fails and How to Fix It, is forthcoming from Oxford University Press.
