

Facebook and Privacy

Recently, I’ve been complaining about Facebook’s privacy mishaps. Back in 2006, Facebook sparked the ire of over 700,000 members when it launched News Feed. In 2007, it launched Beacon and Social Ads, sparking new privacy outcries. An uprising of Facebook users prompted the company to change its Beacon policies. For more about Facebook’s recent privacy issues, see my post here.

But that’s not all. Over at CNET, Chris Soghoian reports on some severe privacy concerns with Facebook applications. An application (or “app” for short) is a program created by a third party that adds features to a user’s profile. These apps have become quite popular with Facebook users, but they carry some very serious potential dangers. Soghoian writes:

[A] new study suggests there may be a bigger problem with the applications. Many are given access to far more personal data than they need to in order to run, including data on users who never even signed up for the application. Not only does Facebook enable this, but it does little to warn users that it is even happening, and of the risk that a rogue application developer can pose. . . .

In order to install an application, a Facebook user must first agree to “allow this application to…know who I am and access my information.” Users not willing to permit the application access to all kinds of data from their profile cannot install it onto their Facebook page.

What kind of information does Facebook give the application developer access to? Practically everything. . . .

The applications don’t actually run on Facebook’s servers, but on servers owned and operated by the application developers. Whenever a Facebook user’s profile is displayed, the application servers contact Facebook, request the user’s private data, process it, and send back whatever content will be displayed to the user. As part of its terms of service, Facebook makes the developers promise to throw away any data they received from Facebook after the application content has been sent back for display to the user.

So when you use a third-party application, you must essentially trust that third party to follow Facebook’s rules in good faith. In other words, Facebook users use applications at their own risk.
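
To picture what that data flow looks like in practice, here is a minimal sketch of one request cycle from the application developer’s side. It is purely illustrative and written in Python; the endpoint URL, parameters, and returned fields are hypothetical stand-ins, not Facebook’s actual Platform API.

```python
# Purely illustrative sketch of the request cycle Soghoian describes.
# The endpoint URL, parameters, and returned fields are hypothetical,
# not Facebook's actual Platform API.
import requests  # third-party HTTP library

PROFILE_ENDPOINT = "https://api.facebook.example/profile"  # hypothetical URL


def render_app_widget(user_id: str, session_token: str) -> str:
    """Simulate what a third-party application server does each time a
    profile page with the app installed is displayed."""
    # 1. The developer's server (not Facebook's) requests the user's
    #    private profile data from Facebook.
    response = requests.get(
        PROFILE_ENDPOINT,
        params={"user_id": user_id, "session": session_token},
        timeout=5,
    )
    profile = response.json()  # e.g. name, networks, friend list, birthday

    # 2. The developer's server turns that data into the content the user
    #    will see inside the application.
    widget_html = (
        f"<div>Hi {profile['name']}! You and your "
        f"{len(profile['friends'])} friends can play.</div>"
    )

    # 3. Facebook's terms of service require the developer to discard the
    #    data once the content is returned -- but that promise is
    #    contractual, not technical. Nothing prevents a rogue developer
    #    from quietly storing everything fetched in step 1.
    del profile

    return widget_html
```

The crux is step 3: the only thing standing between a user’s profile data and the developer’s database is a promise.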

But what if an application is created by some hacker in Russia? Or is designed by a creepy child molester to harvest people’s personal information? Should Facebook be doing more to protect users against the bad-apple application developers?

Soghoian notes that in many cases, applications are being given access to much more personal data than they actually need to function:

[A]s researchers from the University of Virginia have detailed in a recent report [link no longer available], Facebook provides applications with access to far more private user information than they need to function. Adrienne Felt, a student and lead researcher on the project, told me that of the top 150 applications they examined in October 2007, “8.7 percent didn’t need any information; 82 percent used public data (name, network, list of friends); and only 9.3 percent needed private information (e.g., birthday). Since all of the applications are given full access to private data, this means that 90.7 percent of applications are being given more privileges than they need.”

But that’s not the end of the problem. There’s more:

Facebook’s Web site and lengthy application terms of service curiously fail to mention something rather important. In addition to providing the application developer access to most of your private profile data, you also agree to allow the developer to see private data on all of your friends too.

Many Facebook users set their profiles to private, which stops anyone but their friends from seeing their profile details. This is a great privacy feature that can protect users from cyberstalkers and is completely gutted by the application system. To restate things–if you set your profile to private, and one of your friends adds an application, most of your profile information that is visible to your friend is also available to the application developer–even if you yourself have not installed the application.

The good news is that Facebook lets you configure the amount of your own private data that your friend’s applications can see. The bad news is that it’s hidden away, requiring several clicks through menus to find a page listing specific privacy settings (Privacy -> Applications -> Other Applications). Furthermore, the default values are extremely lax, such that a user who has yet to discover the preference page is essentially sharing her entire profile by default.

This friend data-sharing “feature,” and the ability to protect against it, isn’t mentioned anywhere else on Facebook’s site, nor are users informed about it when they install an application.
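
To make the lax-defaults problem concrete, here is a rough sketch of the access check Soghoian describes, again in Python. The setting names and default values are my own illustration, not Facebook’s actual implementation; the point is simply that unless a user finds the buried “Other Applications” page and opts out, a friend’s application can read most of her profile.

```python
# Illustrative sketch only: the setting names and defaults below are
# hypothetical, not Facebook's actual implementation.
from typing import Optional

# What a user's profile exposes to applications installed by her FRIENDS.
# Unless the user digs into Privacy -> Applications -> Other Applications
# and changes it, this permissive default applies.
DEFAULT_OTHER_APPS_SETTINGS = {
    "share_with_friends_apps": True,  # lax default
    "visible_fields": ["name", "networks", "friends", "birthday", "photos"],
}


def fields_exposed_to_friends_app(user_settings: Optional[dict]) -> list:
    """Fields an application may read about a user who never installed it,
    because one of her friends did."""
    settings = user_settings or DEFAULT_OTHER_APPS_SETTINGS
    if settings["share_with_friends_apps"]:
        return settings["visible_fields"]
    return []


# A user who has never visited the hidden settings page:
print(fields_exposed_to_friends_app(None))
# -> ['name', 'networks', 'friends', 'birthday', 'photos']
```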

Soghoian’s story hasn’t gained a lot of traction, and an outcry hasn’t yet ensued over Facebook’s policies for its applications. I was recently on a panel with Chris Kelly, Facebook’s Chief Privacy Officer, at the Advisory Committee to the Congressional Internet Caucus’s State of the Net Conference. The issue of applications didn’t come up, so unfortunately, I didn’t have the opportunity to speak with him about it. Facebook’s general position on privacy seems to be that it is transparent about the privacy risks its users face, that it offers users a choice, and that it responds when there’s an outcry over privacy. All these things are true, but there are flaws in this approach.

First, the notice about privacy risks currently isn’t effective. At the panel, I complained that privacy policies are woefully ineffective at informing consumers because nobody reads them. In a humorous moment, panelist and FTC Commissioner Jon Leibowitz, who uses Facebook, admitted that he hadn’t yet read Facebook’s privacy policy.

Second, the choice users have is often difficult to make, as Soghoian demonstrates in his article. Moreover, the choices consumers are given are often all-or-nothing, take-it-or-leave-it propositions that nudge ill-informed users into keeping the defaults or agreeing to use a feature such as an application. Many users might prefer a richer menu of choices, such as the ability to use an application without surrendering all of their personal information or that of their friends.

Third, I think the better privacy strategy is for companies to think proactively about privacy, rather than wait until people are banging on the castle doors calling for the king’s head. Older Information Age companies such as Microsoft and ChoicePoint have learned from their privacy fiascoes and are now attempting to embrace privacy rather than resist it. But newer companies such as Facebook do not seem to have learned these lessons.

 

Originally Posted at Concurring Opinions

* * * *

This post was authored by Professor Daniel J. Solove, who through TeachPrivacy develops computer-based privacy training, data security training, HIPAA training, and many other forms of awareness training on privacy and security topics. Professor Solove also posts at his blog at LinkedIn. His blog has more than 1 million followers.

Professor Solove is the organizer, along with Paul Schwartz, of the Privacy + Security Forum and International Privacy + Security Forum, annual events designed for seasoned professionals.

If you are interested in privacy and data security issues, there are many great ways Professor Solove can help you stay informed:
* LinkedIn Influencer blog
* Twitter
* Newsletter
