Bruce Schneier has recently published a new book, Liars and Outliers: Enabling the Trust that Society Needs to Thrive (Wiley 2012). Bruce is a renowned security expert, having written several great and influential books including Secrets and Lies and Beyond Fear.

Liars and Outliers is a fantastic book, and a very ambitious one — an attempt to conceptualize trust and security. The book is filled with great insights, and is a true achievement. And it’s a fun read too. I recently conducted a brief interview with Bruce about the book:

Q (Solove): What is the key idea of your book?

A (Schneier): Liars and Outliers is about trust in society, and how we induce it. Society requires trust to function; without it, society collapses. In order for people to have that trust, other people must be trustworthy. Basically, they have to conform to the social norms; they have to cooperate. However, within any cooperative system there is an alternative strategy, called defection: to be a parasite and take advantage of others’ cooperation.

Too many parasites can kill the cooperative system, so it is vital for society to keep defectors down to a minimum. Society has a variety of mechanisms to do this. It all sounds theoretical, but this model applies to terrorism, the financial crisis of 2008, Internet crime, the Mafia code of silence, market regulation…everything involving people, really.

Understanding the processes by which society induces trust, and how those processes fail, is essential to solving the major social and political problems of today. And that’s what the book is about. If I could tie policymakers to a chair and make them read my book, I would.

Okay, maybe I wouldn’t.

Q: What are a few of the conclusions from Liars and Outliers that you believe are the most important and/or provocative?

A: That 100% cooperation in society is impossible; there will always be defectors. Moreover, that more security isn’t always worth it. There are diminishing returns — spending twice as much on security doesn’t halve the risk — and the more security you have, the more innocents it accidentally ensnares. Also, society needs to be able to trust those it entrusts with enforcing trust; and the more power they have, the more easily they can abuse it. No one wants to live in a totalitarian society, even if it means there is no street crime.

More importantly, defectors — those who break social norms — are not always in the wrong. Sometimes they’re morally right, only it takes a generation before people realize it. Defectors are the vanguards of social change, and a society with too much security and too much cooperation is a stagnant one.

Q: What’s wrong generally with current governmental approaches toward national security?

A: In general, democratic countries are pretty good. You can argue that the U.S. spends too much money on the military, that the U.K. relies too heavily on cameras, and that many countries are overreacting to the terrorist threat and generally engaging in too much worst-case thinking — but those are really in the margins. Society has a pretty good risk thermostat, and people naturally set that thermostat to the level they find most comfortable.

Where we systemically get risk wrong is in response to changing technology. Technology is an outside force that changes risk levels. Firearms make robbery more deadly; the Internet makes banking fraud easier. On the other side, fingerprint and DNA technology make crimes easier to solve, and magnetometers make it harder to sneak firearms onto airplanes. Social networking sites make some sorts of fraud easier, and make undercover police work harder.

What this all means is that the risks we’ve so carefully regulated get out of whack, and we have to adjust our security. We tend to be both slow and bad at it. We’re bad at it because we don’t have the data. We don’t really know how good airport security has been at deterring terrorism, and we don’t know how much a zero-tolerance drug policy has affected street crime. And we’re slow because society moves slower than the bad guys. We saw that in Internet crime. As soon as banking and commerce moved to the Internet, there emerged a new breed of Internet criminal that was agile and tech savvy. Meanwhile, the police, who were basically trained in the Agatha Christie model of police work, took seven or so years to adapt.

Getting security right in a world of rapid technological change, and a world of rapid social change due to that technological change, requires an adaptability that we’re not good at.

Q: Privacy and trust are often understood as in conflict with each other. According to some people, privacy impedes society’s ability to find out and prevent troublesome behavior, making it harder to trust people because less is known about them. Is it possible to have a lot of privacy and a lot of trust in society? How so?

A: Of course it is, and we all know how — even if we don’t think about it. When we walk into a bank and deposit our money, we trust that bank, even though we don’t know any of the bank employees. When we use an ATM — I used one in Bangkok just last week — we don’t even trust the bank; we trust the international funds transfer system, even though we don’t understand it. When we walk through town, we trust that we won’t be attacked; when we lock our doors, we trust that we won’t be robbed. In all of these cases, our trust has nothing to do with privacy. In some cases, we’re trusting a corporation instead of a person; in others, we’re trusting a socio-technical system. We trust the social and legal systems that regulate people’s behaviors, and we trust technical security measures that have nothing at all to do with privacy.

Privacy is an important social value, and one worth preserving for its own sake. While personal reputation, which requires a certain lack of privacy, is one way society prevents bad behavior — and throughout the history of our species, it has been a very important way — it’s not the only way. In Liars and Outliers, I describe morals, reputation, institutions, and security technologies as the four types of societal pressure. Many ways to facilitate trust have nothing to do with privacy: anti-counterfeiting technologies in currency, guards around military bases, metal detectors at airports, feelings of guilt from bad behavior, and so on.

Q: What is the most important thing about security that policymakers should understand?

A: That our feelings about security are different from the reality of how secure we are — and that fear gets in the way of rational decision making. I want governments to rise above fear and fear mongering, and implement policies that reflect actual risks of actual threats.

I’m not holding my breath.

Originally posted at Concurring Opinions.

* * * *

This post was authored by Professor Daniel J. Solove, who through TeachPrivacy develops computer-based privacy and data security training. He also posts on his blog at LinkedIn, which has more than 1 million followers.

Professor Solove is the organizer, along with Paul Schwartz, of the Privacy + Security Forum and International Privacy + Security Forum, annual events designed for seasoned professionals.

NEWSLETTER: Subscribe to Professor Solove’s free newsletter
TWITTER: Follow Professor Solove on Twitter.
