There have been quite a number of state HIPAA enforcement cases this year, and one expert sees a trend toward increasing enforcement of HIPAA by state attorneys general. An article in Data Breach Today surveys several of these cases. Here are some highlights:
Massachusetts — $75,000 settlement with McLean Hospital for a data breach involving 1,500 victims, stemming from an employee who routinely took home unencrypted backup tapes containing PHI. From the state press release:
The AG’s complaint alleges that McLean, a psychiatric hospital in Belmont, allowed an employee to regularly take home eight unencrypted back-up tapes containing clinical and demographic information from the Harvard Brain Tissue Resource Center that the hospital possessed. The tapes contained personal information such as names, social security numbers, diagnoses and family histories. When the employee was terminated from her position at McLean in May 2015, she only returned four of the tapes, and the hospital was unable to recover the others.
New Jersey — $100,000 settlement with EmblemHealth for a 2016 breach involving 81,000 victims. Details from the state’s press release:
The incident at issue took place on October 3, 2016 when EmblemHealth’s vendor sent a paper copy of EmblemHealth’s Medicare Part D Prescription Drug Plan’s Evidence of Coverage to 81,122 of its customers, including 6,443 who live in New Jersey.
The label affixed to the mailing improperly included each customer’s HICN, which incorporates the nine digits of the customer’s Social Security number, as well as an alphabetic or alphanumeric beneficiary identification code. (The number shown was identified as the “Package ID#” on the mailing label and did not include any separation between the digits.)
During its investigation, the Division found that following the departure of the EmblemHealth employee who typically prepared the Evidence of Coverage mailings, the task was assigned to a team manager of EmblemHealth’s Medicare Products Group, who received minimal training specific to the task and worked unsupervised. Before forwarding the data file to the print vendor, this team manager failed to remove the patient HICNs from the electronic data file.
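The failure here was procedural as much as technical: sensitive identifiers should be stripped from a data file before it ever reaches a print vendor, rather than left to one undertrained, unsupervised employee to remember. A rough sketch of such a scrubbing step in Python — the field names are hypothetical illustrations, not EmblemHealth's actual file layout:

```python
import csv
import io

# Columns that must never reach the print vendor. "hicn" and "ssn" are
# hypothetical field names used for illustration only.
SENSITIVE_FIELDS = {"hicn", "ssn"}

def scrub_mailing_file(src, dst):
    """Copy a CSV of mailing records, dropping any sensitive columns."""
    reader = csv.DictReader(src)
    kept = [f for f in reader.fieldnames if f.lower() not in SENSITIVE_FIELDS]
    writer = csv.DictWriter(dst, fieldnames=kept)
    writer.writeheader()
    for row in reader:
        writer.writerow({f: row[f] for f in kept})
    return kept

# Example: a two-column record survives; the HICN column is removed.
raw = io.StringIO(
    "name,address,hicn\n"
    "Jane Doe,123 Main St,123456789A\n"
)
out = io.StringIO()
scrub_mailing_file(raw, out)
print(out.getvalue())
```

The point is not this particular code but the control: an automated, mandatory scrub of sensitive fields does not depend on whether the person preparing the mailing happened to be trained or supervised.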
It is sad to say goodbye to Concurring Opinions, a law professor blog I co-founded in 2005. The blog began when a group of us (Dave Hoffman, Kaimi Wenger, Nate Oman, and me) who were blogging at PrawfsBlawg decided we wanted more autonomy in blog governance, so we founded Concurring Opinions. Over the years, we added many great permabloggers: Danielle Citron, Deven Desai, Frank Pasquale, Gerard Magliocca, Ronald K.L. Collins, Larry Cunningham, Naomi Cahn, Sarah Waldeck, Solangel Maldonado, Corey Yung, Jaya Ramji-Nogales, and others.
I have a few final thoughts about Concurring Opinions below, as well as a small piece of good news — I’ve archived most of my posts here on this special archive page. More on the archive later.
12/13/18 Update: Here is the video from the session described below.
On Wednesday, December 12, 2018, I’ll be speaking at the Data Security hearing, part of the FTC Hearings on Competition and Consumer Protection in the 21st Century. My panel begins at 1:00 PM:
The U.S. Approach to Consumer Data Security
Wednesday, December 12, 2018 from 1:00 PM to 2:30 PM
Center for Democracy & Technology
Daniel J. Solove, George Washington University Law School
University of Pittsburgh
Perkins Coie LLP
Lisa J. Sotto, Hunton Andrews Kurth LLP
Moderator: James Cooper, Federal Trade Commission, Bureau of Consumer Protection
I previously spoke at an earlier hearing in this series back in September on a panel about consumer privacy protection (video / transcript). The upcoming hearing focuses on data security.
In what must rank among the most ridiculous data security incidents, a law firm employee sent a client file through the mail on an unencrypted thumb drive. The file contained Social Security numbers and other financial data.
The envelope arrived without the USB drive. The firm contacted the post office.
What happened next is most bizarre. Here’s an excerpt from the law firm’s letter notifying the state attorney general:
Cybersecurity litigation is currently at a crossroads. Courts have struggled with these cases, reaching wildly inconsistent conclusions about whether a data breach causes harm. Although the litigation landscape is uncertain, some things about cybersecurity are near certainties: there will be many data breaches, and they will be terrible and costly. We have thus seen the rise of cybersecurity insurance to address this emerging and troublesome risk.
I am delighted to be interviewing Kimberly Horn, who is the Global Focus Group Leader for Cyber Claims at Beazley. Kim has significant experience in data privacy and cyber security matters, including guiding insureds through immediate and comprehensive responses to data breaches and network intrusions. She also has extensive experience managing class action litigation, regulatory investigations, and PCI negotiations arising out of privacy breaches.
In the span of just a week, California passed a bold new privacy law — the California Consumer Privacy Act of 2018. The law was hurried through the legislative process to head off a proposed ballot initiative of the same name. The ballot initiative was the creation of Alastair Mactaggart, a real estate developer who spent millions to bring the initiative to the ballot. Mactaggart indicated that he would withdraw the initiative if the legislature passed a similar law, and this is what prompted the rush to pass the new Act, as the deadline to withdraw the initiative was looming.
The text of the California Consumer Privacy Act is here. The law becomes effective on January 1, 2020.
There are others who summarize the law extensively, so I will avoid duplicating those efforts. Instead, I will highlight a few aspects of the law that I find to be notable:
(1) The Act creates greater transparency about the personal information businesses collect, use, and share.
(2) The Act provides consumers with a right to opt out of the sale of personal information to third parties, and it attempts to restrict penalizing people who exercise this right. Businesses can’t deny goods or services, charge different prices (including through discounts to those who don’t opt out), or provide a “different level or quality of goods or services to the consumer.” However, businesses can do these things if they are “reasonably related to the value provided to the consumer by the consumer’s data.” This is a potentially large exception depending upon how it is interpreted.
(3) The Act allows businesses to “offer financial incentives, including payments to consumers as compensation,” for collecting and selling their personal information. Financial incentive practices cannot be “unjust, unreasonable, coercive, or usurious in nature.” I wonder whether this provision will undercut the restriction on offering different pricing or levels of service in exchange for allowing the collection and sale of one’s information. With some clever adjustments, businesses that were enticing consumers to permit the collection and sale of their personal data through different prices or discounts could restructure these practices as “financial incentives.”
Co-Authored by Prof. Woodrow Hartzog
On Wednesday, the U.S. Court of Appeals for the 11th Circuit issued its long-awaited decision in LabMD’s challenge to an FTC enforcement action: LabMD, Inc. v. Federal Trade Commission (11th Cir. June 6, 2018). While there is some concern that the opinion will undermine the FTC’s power to enforce Section 5 for privacy and security issues, the opinion actually is quite narrow and is far from crippling.
While the LabMD opinion likely does have important implications for how the FTC will go about enforcing reasonable data security requirements, we think the opinion still allows the FTC to continue to build upon a coherent body of privacy and security complaints in an incremental way similar to how the common law develops. See Solove and Hartzog, The FTC and the New Common Law of Privacy, 114 Columbia Law Review 584 (2014).
I hope you enjoy my latest cartoon about data security — a twist on the angel on one shoulder and devil on the other. Humans are the weakest link for data security. Attempts to control people with surveillance or lots of technological restrictions often backfire. I believe that the most effective solution is to train people. It’s not perfect, but if training is done right, it can make a meaningful difference.
Recently published by Cambridge University Press, Re-Engineering Humanity explores how artificial intelligence, automated decision-making, and the increasing use of Big Data are shaping the future of humanity. This excellent interdisciplinary book is co-authored by Professors Evan Selinger and Brett Frischmann, and it critically examines three interrelated questions. Under what circumstances can using technology make us more like simple machines than actualized human beings? Why does the diminution of our human potential matter? What will it take to build a high-tech future in which human beings can flourish? This is a book that will make you think about technology in a new and provocative way.
I hope you enjoy my latest cartoon about passwords on the Dark Web. These days, it seems, login credentials and other personal data are routinely stocking the shelves of the Dark Web. Last year, a hacker was peddling 117 million LinkedIn users’ email addresses and passwords. And, late last year, researchers found a file with 1.4 billion passwords for sale on the Dark Web. Hackers will have happy shopping for a long time to come.
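One reason these credential dumps are so damaging is that passwords stored in plaintext, or with fast unsalted hashes, can be cracked in bulk once the file leaks. The standard mitigation is salted, deliberately slow hashing. A minimal sketch using only Python’s standard library — the parameter choices are illustrative, not a recommendation:

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # deliberately slow to hinder bulk cracking; tune for your hardware

def hash_password(password, salt=None):
    """Return (salt, digest). Store both; never store the raw password."""
    salt = salt if salt is not None else os.urandom(16)  # unique salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("letmein", salt, digest))  # False
```

Because each user gets a unique random salt, two identical passwords produce different digests, so a stolen file can’t be attacked with a single precomputed table.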