News, Developments, and Insights


Oscar Gandy Interview

Back in 1993, Professor Oscar Gandy, Jr. wrote one of the most insightful and prescient books about privacy: The Panoptic Sort: A Political Economy of Personal Information.


Oscar Gandy is an emeritus professor with the Annenberg School for Communication at the University of Pennsylvania, having retired from active teaching in 2006. He has continued to publish in the areas of the political economy of communication and information, focusing most recently on the development and use of algorithmic technology.


The book is remarkable for its insightfulness and for how well it predicted today’s “surveillance capitalism,” to use Shoshana Zuboff’s term. Gandy called it the “panoptic sort,” after the panopticon, the famous prison design by philosopher Jeremy Bentham: an architecture of cells arrayed around a central observation tower from which everyone could be observed at all times. Michel Foucault used the panopticon in his classic work, Discipline and Punish, as a metaphor for modern society.


In The Panoptic Sort, Gandy focused on the emerging data gathering about individuals by corporations and the government — which he viewed as a significant threat to freedom and democracy.

It is amazing how much of today’s discussion about privacy is captured in Gandy’s book. Gandy makes even those who were early to the party feel like they arrived way too late. The book could have been written today rather than nearly three decades ago. It remains so relevant that Oxford University Press has released a new edition, with a new foreword and afterword by Gandy.

I recently interviewed Gandy about his reflections on the past 30 years, on the occasion of the new edition of his book.

SOLOVE: It is remarkable to me how many of your points back in 1993 are still highly relevant today. Your book could have been published this year, and nobody would be able to tell that it was written nearly 30 years ago. If you were writing your book from scratch today, what would you have written differently? What are some of the most notable things that are different today from 1993? What are some of the most notable things that have remained the same?

GANDY: When I wrote the first edition, I was attempting to call public attention to the attempts by businesses to make use of information they were gathering about members of the public to influence the kinds of economic and political choices that people would make in their lives. I focused on corporate behavior because much of the literature on privacy and surveillance at the time had been focused primarily on activities of the government. However, if I were to write a book about privacy and surveillance today, I would focus much more on the kinds of joint efforts being pursued by business and government to shape behavior through their numerous public/private partnerships.

Of course, the most notable differences between then and now are the developments in communication and information technology, and the social, cultural, and behavioral changes taking place in population segments as a function of their reliance upon social media and the platforms that influence their use. Much of the data and information previously gathered about individuals was based on the analysis of interviews, surveys, focus groups, and laboratory experiments. Now, the massive amounts of data and information being used to classify, characterize, and predict the behavioral responses of individuals are being derived from sophisticated devices connected through high-capacity digital networks, leading to their characterization as part of the Internet of Things. Of particular importance today are the advances in data science and algorithmic information processing that facilitate the generation of both inferences and manipulative strategies so increasingly complicated that the nature of their production almost defies meaningful explanation. This level of inscrutability limits the levels of transparency that make regulatory accountability meaningful.

Unfortunately, what has pretty much remained the same are the motivations and justifications for the kinds of surveillance that business and governmental agencies, and yes, even members of the public engage in today. The primary focus continues to be on the efficient and effective production of influence over other people through targeted communications. The primary difference is the extent to which those communications are being targeted to increasingly narrow population segments.

SOLOVE: How would you characterize the past nearly 30 years? Was any progress made? Are there new issues or problems today that you didn’t expect back in 1993?

GANDY: Well, of course, 30 years is a long time over which to characterize the kinds of changes that have taken place. Even if I limit my comments to those things that were hinted at in the first edition, the changes in technology, economics, and social relations have been quite dramatic. Certainly, there has been considerable progress in some areas of concern, including the kinds of adjustments that have been made in the treatment of people who have been defined in terms of race, gender, class, and sexual orientation. There has also been some progress to the extent that the kinds of information being gathered from people through the devices and services they come to depend upon are increasingly being revealed to the users as well as to the service providers. Some of those improvements have been made in response to privacy regulations, while some are newly developed features of technological systems meant to improve their appeal to consumers.

However, I couldn’t begin to imagine the kinds of developments in data processing, behavioral and biometric analyses, and manipulative strategies that algorithmic processes enhanced through artificial intelligence would make available for such widespread use. Zuboff’s characterization of this moment as being central to the development of “surveillance capitalism” invites greater concern about the number and variety of users and uses of this technology to extend the efficiency and effectiveness of segmentation and manipulative targeting.

SOLOVE: We’re currently witnessing profound legislative activity around privacy, especially the GDPR, adopted in 2016, and California’s CCPA in 2018, expanded by the CPRA in 2020. What do you think of these recent laws? Are they on the right or wrong path?

GANDY: Of course, I am pleased that considerable progress is being made in responding to the concerns being expressed about many of the kinds of privacy-related risks to which we are increasingly being subjected. While some of the criticisms of the GDPR’s early development have been well placed, it is still viewed as the strictest data protection regulation around the globe. Still, there are continuing signs that defining the nature of the relevant harms, such as identifying activities likely to “have a significant effect” on data subjects, remains a continuing challenge. My sense, however, is that this regulatory initiative is continuing to move in the right direction.

The same is true for the legislative efforts in California focused on developing regulations beyond the initial steps taken in 2018 with the passage of the California Consumer Privacy Act. Subsequent provisions have expanded on an act that had already been thought of as “the strictest set of privacy regulations” in the United States. The legislature also took an important step forward in establishing a well-funded privacy protection agency. Although the legislative documents have made an important effort to define relevant terms such as “sensitive information,” we can only hope that members of the public will continue to demand the efforts necessary to define this and other critically important terms, despite mounting opposition on many fronts.


Many of my concerns regarding these legislative efforts are associated with the extent to which individuals and members of communities of interest or groups are expected to take responsibility for requesting access to information about the kinds of data being collected, and the kinds of uses to which that analytically generated information is likely to be put. This places an unfair burden on individuals to challenge the imposition of risks that were neither initially sought nor generally understood by those being placed at risk. Of course, the weight of these burdens falls most heavily upon those least able to act in their own interest, as they lack the knowledge, communicative skills, and other resources necessary to develop an effective self-defense; this remains a common problem with these legislative efforts.

SOLOVE: What privacy issues most urgently need to be addressed today? Do you have recommendations for the direction that law and policy should take?

GANDY: Although I suspect that I am in a rather small minority of those who are less concerned about the collection of data and information than about the users, and the uses for which they are acquiring the data in the first place, I am convinced that law and policy need to turn their attention toward the purposes, goals, and methods through which strategic information use should be allowed or limited within society.

I also think that we need to establish and support national and regional agencies that would be responsible for facilitating the routine assessments of the quality of the algorithmic procedures used to analyze and apply the insights derived from these data in terms of ethical standards related to fairness, reliability, and the absence of biases that contribute to distributional harms.

And while I realize that this would be something of a stretch, certainly within the United States, where the historical focus on privacy has been centered on the individual, the regulation of privacy relevant activities needs to be expanded to include consideration of the impact of data gathering and use on communities of interest, or “groups,” however defined.

And finally, and probably least likely to meet with much agreement, is my hope that the reduction of social and economic inequality would be included among the justifications required, or at least privileged, for the utilization of data generated by the activities of human beings making use of digital technologies. Indeed, I firmly believe that among the improvements we need to achieve in the laws protecting privacy are those designed to reduce the privacy harms related to expansions in inequality within society.

SOLOVE: Thanks, Oscar, for your great insights and for writing such an important and influential work. This book belongs on the bookshelf of every privacy scholar. The Panoptic Sort is now available in a new 2021 edition from Oxford University Press.

* * * *

This post was authored by Professor Daniel J. Solove, who through TeachPrivacy develops computer-based privacy and data security training. He also posts at his blog on LinkedIn, which has more than 1 million followers.

Professor Solove is the organizer, along with Paul Schwartz, of the Privacy + Security Forum, an annual event designed for seasoned professionals.

NEWSLETTER: Subscribe to Professor Solove’s free newsletter
TWITTER: Follow Professor Solove on Twitter.

Prof. Solove’s Privacy Training: 150+ Courses


Prof. Solove’s Privacy Law Whiteboard Library
