by Daniel J. Solove
We’re in the midst of a crisis in data protection. Billions of passwords stolen . . . Mammoth data breaches . . . Increasing threats . . . Malicious hackers . . .
by Daniel J. Solove
It happens all the time. An organization has a privacy incident or data breach. The news stories proliferate. Cries of “shame on you” reverberate across the Internet. A number of organizations have an incident response plan, but they often don’t have much of a plan for PR. Certain incidents can take on a life of their own in the media, like a sudden tornado that swoops in and leaves devastation in its path.
by Daniel J. Solove
I was corresponding with K. Royal the other day, as she was graciously providing some feedback on a training program I created, and we got to talking about sensitive data. In their privacy laws, many countries designate a special category of data called “sensitive data” that receives especially stringent protections.
The most common list of categories for sensitive data is the one in the EU Data Protection Directive, which includes data about “racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, health, and sex life.”
The US has no special category of “sensitive data,” but US privacy law does protect certain forms of data more stringently (e.g., health and financial data).
I find it interesting to see what various countries define as sensitive data, and K. Royal has created an awesome chart that she shared with me:
Chart of Sensitive Data in Various Countries
To a privacy wonk like me, a chart like this makes me giddy with excitement, and so I thought I’d share it with you (with her permission, of course).
Here’s a tally of the most commonly recognized categories of sensitive data, based on the chart of many countries’ sensitive data categories that K. Royal created.
SPECIFIC COUNTRIES’ DEFINITIONS OF SENSITIVE DATA
You can access the full Excel spreadsheet of the data here.
Note: An entry of “standard” means the standard list from the EU Data Protection Directive; the categories encompassed by “standard” run from “national, racial/ethnic origin” through “sexual preferences and practices.” More background about K. Royal’s project can be found at her blog.
If you want to see the spreadsheet data laid out in a blog post, you can see my longer post about the issue at my LinkedIn Blog.
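For readers who want to play with the chart programmatically, here is a minimal sketch of how a country-to-categories mapping like K. Royal’s spreadsheet could be tallied in Python. The country names and category lists below are illustrative placeholders, not a reproduction of her data.

```python
from collections import Counter

# Illustrative placeholder data only -- not a reproduction of K. Royal's chart.
# Each country maps to the categories its law treats as "sensitive data."
EU_STANDARD = [
    "racial or ethnic origin",
    "political opinions",
    "religious or philosophical beliefs",
    "trade-union membership",
    "health",
    "sex life",
]

sensitive_data_by_country = {
    "Example Country A": EU_STANDARD,                         # the EU "standard" list
    "Example Country B": EU_STANDARD + ["criminal records"],  # standard list plus an extra category
    "Example Country C": ["health", "financial data"],        # narrower, sector-style protections
}

# Tally how many countries treat each category as sensitive.
tally = Counter(
    category
    for categories in sensitive_data_by_country.values()
    for category in categories
)

for category, count in tally.most_common():
    print(f"{category}: {count} countries")
```

The same counting approach extends directly to the full spreadsheet once its rows are loaded with a CSV or Excel reader.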
by Daniel J. Solove
In three earlier posts, I’ve been exploring the nature of privacy and data security harms.
In the first post, Privacy and Data Security Violations: What’s The Harm?, I explored how the law often fails to recognize harm for privacy violations and data breaches.
In the second post, Why the Law Often Doesn’t Recognize Privacy and Data Security Harms, I examined why the law has struggled in recognizing harm for privacy violations and data breaches.
by Daniel J. Solove
I’ve been a teacher for the past 15 years, and I’ve taught in several formats, including live classes and computer-based e-learning. I have come to the conclusion that the most important factor in effective education and training is fostering emotional investment.
Simply put, students must care about learning the material. The more they care, the more they learn.
The notion of getting emotional investment from students might sound like simple common sense, but it is often not done . . . and often not even attempted.
by Daniel J. Solove
In two earlier posts, I’ve been exploring the nature of privacy and data security harms.
Post 1: Privacy and Data Security Violations: What’s The Harm?
Post 2: Why the Law Often Doesn’t Recognize Privacy and Data Security Harms
In this post, I want to explore two issues that frequently emerge in privacy and data security cases: (a) the future risk of harm; and (b) individual vs. social harm.
by Daniel J. Solove
In my previous post on privacy/security harms, I explained how the law is struggling to deal with privacy and data security harms. In this post, I will explore why.
The Collective Harm Problem
One of the challenges with data harms is that they are often created by the aggregated actions of many dispersed actors over a long period of time. They are akin to a form of pollution, where each particular infraction might not cause much harm in and of itself, but collectively the infractions do create harm.
If you are interested in privacy and data security issues, there are many great ways Professor Solove can help you stay informed:
Professor Solove’s LinkedIn Influencer blog
You can follow Professor Solove on his blog at LinkedIn, where he is a “LinkedIn Influencer.” He blogs about various privacy and data security issues, and his blog has more than 600,000 followers.
* * * *
Professor Solove’s Twitter Feed
Professor Solove is active on Twitter and posts links to current privacy and data security stories and new scholarship, cases, and developments of note.
* * * *
Professor Solove’s Newsletter
Sign up for our newsletter, where Professor Solove provides information about his recent writings and new training programs that he has created.
* * * *
Professor Solove’s LinkedIn Discussion Groups
Please join one or more of Professor Solove’s LinkedIn discussion groups, where you can follow new developments on privacy, data security, HIPAA, and education privacy issues. You can also participate in the discussion, share interesting news and articles, ask questions, or start new conversations:
Privacy and Data Security
HIPAA Privacy and Security
Education Privacy and Data Security
by Daniel J. Solove
This weekend, the results of an experiment conducted by researchers and Facebook were released, creating a fierce debate over the ethics of the endeavor. The experiment involved 689,003 people on Facebook whose News Feeds were adjusted to contain either more positive or more negative emotional content. The researchers wanted to see whether this had an effect on these people’s moods. It did, albeit a small one. People exposed to more positive content wrote posts that were more positive, and those exposed to more negative content wrote posts that were more negative, as measured by the types of words they used.
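As a rough illustration of that kind of word-based measurement, here is a minimal sketch that scores a post’s emotional tone by counting positive and negative words. The word lists and scoring formula are simplified assumptions for illustration, not the method the researchers actually used.

```python
# A simplified, illustrative word-count approach to scoring a post's emotional
# tone. The word lists below are placeholders, not the study's actual lexicon.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "angry"}

def emotion_score(post: str) -> float:
    """Return the (positive - negative) word share of a post; 0.0 for an empty post."""
    words = [word.strip(".,!?") for word in post.lower().split()]
    if not words:
        return 0.0
    positives = sum(word in POSITIVE_WORDS for word in words)
    negatives = sum(word in NEGATIVE_WORDS for word in words)
    return (positives - negatives) / len(words)

# Example: a positive post scores above zero, a negative one below zero.
print(emotion_score("So happy about this wonderful news!"))
print(emotion_score("This is awful and I am angry."))
```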
The experiment provoked a fierce response from critics, some of whom decried it as unethical and creepy. In my view, it isn’t productive to castigate Facebook or the researchers, as the problems here emerge from some very difficult unresolved issues that go far beyond this experiment and Facebook. I want to explore these issues, because I’m more interested in making progress than in casting stones.
by Daniel J. Solove
“It’s just a flesh wound.”
– Monty Python and the Holy Grail
Suppose your personal data is lost, stolen, improperly disclosed, or improperly used. Are you harmed?
Suppose a company violates its privacy policy and improperly shares your data with another company. Does this cause a harm?
In most cases, courts say no. This is the case even when a company is acting negligently or recklessly. No harm, no foul.