Cybersecurity litigation is currently at a crossroads. Courts have struggled in these cases, reaching wildly inconsistent conclusions about whether a data breach causes harm. Although the litigation landscape is uncertain, there are some near certainties about cybersecurity generally: There will be many data breaches, and they will be terrible and costly. We have thus seen the rise of cybersecurity insurance to address this emerging and troublesome risk.
I am delighted to be interviewing Kimberly Horn, who is the Global Focus Group Leader for Cyber Claims at Beazley. Kim has significant experience in data privacy and cybersecurity matters, including guiding insureds through immediate and comprehensive responses to data breaches and network intrusions. She also has extensive experience managing class action litigation, regulatory investigations, and PCI negotiations arising out of privacy breaches.
One of the biggest challenges for organizations is locating all the personal data they have. This task must be done, however, to comply with the General Data Protection Regulation (GDPR) and other privacy laws. Moreover, the GDPR and the new California Consumer Privacy Act provide that individuals have rights regarding their data. These rights often require organizations to keep records of individual privacy preferences regarding their data.
I had the opportunity to interview Dimitri Sirota about these issues. Dimitri is the CEO and co-founder of one of the first enterprise privacy management platforms, BigID, and a privacy and identity expert.
The U.S. Supreme Court has been notoriously slow to tackle new technology. In 2002, BlackBerry launched its first smart phone. On June 29, 2007, Steve Jobs announced the launch of the original Apple iPhone. But it took the Supreme Court until 2014 to decide a case involving the Fourth Amendment and smart phones – Riley v. California, 134 S.Ct. 2473 (2014). This past summer, the Supreme Court issued another opinion involving smart phones – Carpenter v. United States, 138 S.Ct. 2206 (2018).
I am thrilled to have had the opportunity to interview Bart Huffman, a partner in Reed Smith’s global IP, Tech & Data Group, about the Supreme Court’s recent foray into smart phones.
In recent years, there have been tremendous advances in artificial intelligence (AI). These rapid technological advances are raising a myriad of ethical issues, and much work remains to be done in thinking them all through.
I am delighted to be interviewing Kurt Long about the topic of AI. Long is the creator and CEO of FairWarning, a cloud-based security provider that provides data protection and governance for electronic health records, Salesforce, Office 365, and many other cloud applications. Long has extensive experience with AI and has thought a lot about its ethical ramifications.
Blockchain is taking the world by storm. I am delighted to have the opportunity to interview Steve Shillingford, Founder and CEO of Anonyome Labs, a consumer privacy software company.
Steve was previously at Oracle and Novell, then was President of Solera Networks before founding Anonyome. Steve speaks and writes extensively on identity management, cybersecurity, privacy, and Big Data.
The privacy world has been abuzz with the passage of the California Consumer Privacy Act of 2018. In June 2018, within just a week, California passed this strict new privacy law. Some commentators have compared it to the GDPR, but it is a much narrower law and a far cry from the GDPR. Nevertheless, it is a significant entry in California’s considerable canon of privacy laws.
For more on California privacy laws, see this collection compiled by the California Attorney General.
For the first half of 2018, all eyes were focused eastward on the EU with the start of GDPR enforcement this May. Now, all eyes are shifting westward based on a bold new law passed by California. By January 1, 2020, companies around the world will have to comply with additional regulations related to the processing of personal data of California residents. Pursuant to the California Consumer Privacy Act of 2018, companies must observe restrictions on data monetization business models; accommodate rights to access, deletion, and porting of personal data; update their privacy policies; and brace for additional penalties and statutory damages. The California Legislature adopted and the Governor signed the bill on June 28, 2018, after an unusually rushed process, in exchange for the withdrawal of proposed initiative measure No. 17-0039, the Consumer Right to Privacy Act of 2018 (the “Initiative”), from the ballot that same day – the deadline for such withdrawals prior to the November 6, 2018 election.
Below is an interview with Lothar Determann, a leading expert on California privacy law. He has a treatise on the topic: California Privacy Law (3rd Edition, IAPP 2018).
Recently published by Cambridge University Press, Re-Engineering Humanity explores how artificial intelligence, automated decisionmaking, and the increasing use of Big Data are shaping the future of humanity. This excellent interdisciplinary book is co-authored by Professors Evan Selinger and Brett Frischmann, and it critically examines three interrelated questions. Under what circumstances can using technology make us more like simple machines than actualized human beings? Why does the diminution of our human potential matter? What will it take to build a high-tech future in which human beings can flourish? This is a book that will make you think about technology in a new and provocative way.
Hot off the press is Professor Woodrow Hartzog’s new book, Privacy’s Blueprint: The Battle to Control the Design of New Technologies (Harvard Univ. Press 2018). This is a fascinating and engaging book about a very important and controversial topic: Should privacy law regulate technological design?
Countless women have been coming forward to say #MeToo and share their traumatic stories of sexual harassment and assault. But there are many stories we’re not hearing. These stories are being silenced by extremely broad nondisclosure agreements (NDAs), some made at the outset of employment and others when settling litigation over sexual harassment. They stop victims from talking. They also silence other employees who witness sexual harassment of co-workers. NDAs were a powerful device used by Harvey Weinstein to hush up what he was doing.
In her new book, You Don’t Own Me: How Mattel v. MGA Entertainment Exposed Barbie’s Dark Side, Professor Orly Lobel tells a fascinating story about the Barbie versus Bratz litigation, which went on for about a decade. Her book is a page turner — told as a story that could readily be a movie. The book succeeds brilliantly as a gripping tale. But it goes beyond great storytelling to explore many important issues related to business, employment, and intellectual property: the enormous power of corporate employers, the weaponized use of intellectual property to stifle innovation, the dismal failure of business ethics, the troubling use of nondisclosure agreements (NDAs) to maintain dominance and power, and the punishing litigation process.