Cybersecurity litigation is currently at a crossroads. Courts have struggled with these cases, reaching wildly inconsistent conclusions about whether a data breach causes harm. Although the litigation landscape is uncertain, there are some near certainties about cybersecurity generally: There will be many data breaches, and they will be terrible and costly. We have thus seen the rise of cybersecurity insurance to address this emergent and troublesome risk.
I am delighted to be interviewing Kimberly Horn, who is the Global Focus Group Leader for Cyber Claims at Beazley. Kim has significant experience in data privacy and cybersecurity matters, including guiding insureds through immediate and comprehensive responses to data breaches and network intrusions. She also has extensive experience managing class action litigation, regulatory investigations, and PCI negotiations arising out of privacy breaches.
One of the biggest challenges for organizations is locating all the personal data they have. This task must be done, however, to comply with the General Data Protection Regulation (GDPR) and other privacy laws. Moreover, the GDPR and the new California Consumer Privacy Act provide that individuals have rights regarding their data. These rights often require that organizations must keep records of individual privacy preferences regarding their data.
I had the opportunity to interview Dimitri Sirota about these issues. Dimitri is the CEO and co-founder of one of the first enterprise privacy management platforms, BigID, and a privacy and identity expert.
The U.S. Supreme Court has been notoriously slow to tackle new technology. In 2002, BlackBerry launched its first smart phone. On June 29, 2007, Steve Jobs announced the launch of the original Apple iPhone. But it took the Supreme Court until 2014 to decide a case involving the Fourth Amendment and smart phones – Riley v. California, 134 S.Ct. 2473 (2014). This past summer, the Supreme Court issued another opinion involving smart phones – Carpenter v. United States, 138 S.Ct. 2206 (2018).
I am thrilled to have had the opportunity to interview Bart Huffman, a partner in Reed Smith’s global IP, Tech & Data Group, about the Supreme Court’s recent foray into smart phones.
This HIPAA cartoon involves confidentiality. There are countless cases in which PHI is misdirected, emailed or faxed to the wrong people.
I recently created a new short course on HIPAA Confidentiality. You can learn more about it here.
In recent years, there have been tremendous advances in artificial intelligence (AI). These rapid technological advances are raising a myriad of ethical issues, and much work remains to be done in thinking them through.
I am delighted to be interviewing Kurt Long about the topic of AI. Long is the creator and CEO of FairWarning, a cloud-based security provider that provides data protection and governance for electronic health records, Salesforce, Office 365, and many other cloud applications. Long has extensive experience with AI and has thought a lot about its ethical ramifications.
Blockchain is taking the world by storm. I am delighted to have the opportunity to interview Steve Shillingford, Founder and CEO of Anonyome Labs, a consumer privacy software company.
Steve was previously at Oracle and Novell, then was President of Solera Networks before founding Anonyome. Steve speaks and writes extensively on identity management, cybersecurity, privacy, and Big Data.
I’ll be speaking at the FTC Hearings on Competition and Consumer Protection in the 21st Century on a panel about consumer data on Thursday, September 13, 2018 at 3:15 PM. My panel information is here:
The Regulation of Consumer Data
- Maureen K. Ohlhausen, Federal Trade Commission
- George Washington University School of Business
- George Washington University Law School
- Georgetown University Law Center
- Moderator: James Cooper, Federal Trade Commission, Bureau of Consumer Protection
More information about the day’s schedule is here.
This cartoon is about consent under the GDPR. Under the GDPR Article 6, consent is one of the six lawful bases to process personal data. Article 7 provides further guidance about consent, including the data subject’s right to withdraw consent. The meaning of what “consent” requires is most thoroughly stated in Recital 32:
Consent should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject’s agreement to the processing of personal data relating to him or her, such as by a written statement, including by electronic means, or an oral statement. This could include ticking a box when visiting an internet website, choosing technical settings for information society services or another statement or conduct which clearly indicates in this context the data subject’s acceptance of the proposed processing of his or her personal data. Silence, pre-ticked boxes or inactivity should not therefore constitute consent. Consent should cover all processing activities carried out for the same purpose or purposes. When the processing has multiple purposes, consent should be given for all of them. If the data subject’s consent is to be given following a request by electronic means, the request must be clear, concise and not unnecessarily disruptive to the use of the service for which it is provided.
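The operative rules in Recital 32 — consent requires an affirmative act (never silence or pre-ticked boxes), is given per purpose, and under Article 7(3) can be withdrawn at any time — have a natural shape as a consent-recording data structure. Here is a minimal, hypothetical sketch of such a ledger; the class and field names are my own illustration, not any standard API:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    subject_id: str
    purpose: str          # consent is tracked per processing purpose
    given_at: datetime
    withdrawn_at: Optional[datetime] = None

    def is_active(self) -> bool:
        return self.withdrawn_at is None


class ConsentLedger:
    def __init__(self) -> None:
        self._records: list[ConsentRecord] = []

    def record_consent(self, subject_id: str, purpose: str,
                       affirmative_act: bool) -> ConsentRecord:
        # Silence, pre-ticked boxes, or inactivity do not constitute
        # consent: a record is created only on an explicit affirmative act.
        if not affirmative_act:
            raise ValueError("Consent requires a clear affirmative act")
        rec = ConsentRecord(subject_id, purpose, datetime.now(timezone.utc))
        self._records.append(rec)
        return rec

    def withdraw(self, subject_id: str, purpose: str) -> None:
        # Article 7(3): the data subject may withdraw consent at any time.
        for rec in self._records:
            if (rec.subject_id == subject_id and rec.purpose == purpose
                    and rec.is_active()):
                rec.withdrawn_at = datetime.now(timezone.utc)

    def has_consent(self, subject_id: str, purpose: str) -> bool:
        # Consent covers only the purpose(s) it was given for; processing
        # for a different purpose needs its own record.
        return any(r.subject_id == subject_id and r.purpose == purpose
                   and r.is_active() for r in self._records)
```

A design point worth noting: withdrawal sets a timestamp rather than deleting the record, so the organization retains evidence of when consent existed — the kind of record-keeping of individual privacy preferences discussed above.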
Privacy by design — or “Data Protection by Design” as it is referred to in the General Data Protection Regulation (GDPR) — is essential to meaningful privacy protection. Yet in practice it is often quite thin and incomplete. As I wrote a few years ago about privacy by design, “The ‘privacy’ the designers have in mind might be so focused on one particular dimension of privacy that it might overlook many other dimensions.”
Here’s a new HIPAA cartoon. This cartoon is about protected health information (PHI). In the HIPAA regulations, the definition of PHI is quite complicated, as it is splintered into at least three separate parts that appear in HIPAA’s definitions section. Pursuant to HIPAA, 45 CFR 160.103:
Health information means any information, including genetic information, whether oral or recorded in any form or medium, that:
(1) Is created or received by a health care provider, health plan, public health authority, employer, life insurer, school or university, or health care clearinghouse; and
(2) Relates to the past, present, or future physical or mental health or condition of an individual; the provision of health care to an individual; or the past, present, or future payment for the provision of health care to an individual.
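The definition above is a conjunctive two-prong test: information qualifies only if it is created or received by one of the listed entities *and* relates to health, care, or payment for care. As a toy illustration of that structure (the category strings are my own simplification, not regulatory terms), assuming we reduce each prong to a set-membership check:

```python
# Prong (1): entities listed in 45 CFR 160.103 that create or receive
# the information.
COVERED_SOURCES = {
    "health care provider", "health plan", "public health authority",
    "employer", "life insurer", "school or university",
    "health care clearinghouse",
}

# Prong (2): what the information must relate to (past, present, or future).
HEALTH_RELATIONS = {
    "physical or mental health or condition",
    "provision of health care",
    "payment for health care",
}


def is_health_information(source: str, relates_to: str) -> bool:
    # Note the "and" joining the two numbered prongs in the regulation:
    # both must be satisfied for information to be health information.
    return source in COVERED_SOURCES and relates_to in HEALTH_RELATIONS
```

This sketch captures only the quoted definition of “health information”; as noted above, PHI itself is built up from further definitional parts not reproduced here.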