According to a recent Ponemon Institute study, the odds of an organization having a data breach are 1 in 4. The study also found that the average cost of a data breach was $3.62 million in 2017. That figure is down 10% from the previous year, but the average size of data breaches has increased.
The Human Problem
The vast majority of information security incidents and data breaches occur because of human mistakes. Information security is only in small part a technology problem; it is largely a human problem. The biggest risks to security are human errors — people putting data where it doesn’t belong, people not following policies, people losing portable electronic devices with data on them, people falling for phishing and social engineering schemes.
Having a robust technical cybersecurity infrastructure is very important, but it alone isn’t enough. A recent Harvard Business Review article by Dante Disparte and Chris Furlow reinforces this point quite well. “Firms can be lulled into a dangerous state of complacency by their defensive technologies, firewalls, and assurances of perfect cyber hygiene. The danger is in thinking that these risks can be perfectly ‘managed’ through some sort of comprehensive defense system. It’s better to assume your defenses will be breached and to train your people in what to do when that happens.”
The Human Answer
In addition to technology, effectively preventing and dealing with data breaches involves humans. The problem is the humans, but so is the answer.
According to the Ponemon study, organizations saw significant reductions in data breach costs when they had an incident response team, used encryption extensively, and engaged in workforce training.
Last year, major incidents involving law firm data breaches brought attention to the weaknesses in law firm data security and the need for more effective plans and preparation. An American Bar Association (ABA) survey reveals that 26% of firms with more than 500 attorneys experienced some sort of data breach in 2016, up from 23% in 2015.
A while ago, I wrote about a case involving a member of the St. Louis Cardinals baseball team staff who improperly accessed a database of the Houston Astros. There is now an epilogue to report in the case. The individual who engaged in the illegal access — a scouting director named Chris Correa — was fired by the Cardinals, imprisoned for 46 months, and banned permanently from baseball. The Cardinals were fined $2 million by Major League Baseball Commissioner Rob Manfred, and they must forfeit their first two picks in the draft to the Houston Astros.
According to an article about the incident in the St. Louis Post-Dispatch: “As outlined in court documents, the U.S. attorney illustrated how Correa hacked Houston’s internal database, ‘Ground Control,’ 48 times during a 2½-year period. He viewed scouting reports, private medical reviews and other proprietary information. The government argued that Correa may have sought to determine if Houston borrowed the Cardinals’ data or approach, but the information he accessed was ‘keenly focused on information that coincided with the work he was doing for the Cardinals.'”
As I wrote in my piece about the case, there are several lessons to be learned. One lesson is that it is a myth that hacking and computer crime must be high-tech. Here, Correa's hacking was nothing sophisticated — he simply used another person's password. That person had previously worked for the Cardinals, and when he went to the Astros, he kept using the same password. In my piece, I discussed other lessons from this incident, such as the importance of teaching people good password practices, as well as teaching people that merely having access to information doesn't make it legal to view that information. The Cardinals organization appears to have learned from the incident: its "employee manual has been updated to illustrate what is illegal activity online," and the organization is using two-factor authentication to protect its own sensitive data. The article doesn't say whether the Astros also stepped up their security awareness training by teaching employees not to reuse their old passwords from another team.
I have good news and bad news about ransomware. First, the good news — here's a cartoon I created. I hope you enjoy it, because that's the only good news I have. Now, for the bad news . . .
The Bad News: Be Afraid, Very Afraid
Everyone seems to be afraid of ransomware these days, but is the fear justified? Is ransomware more about hype than harm? Unfortunately, a recent study of international companies conducted by Malwarebytes provides some startling statistics to back up the fears. According to the study, 40% of companies worldwide and more than 50% of the US companies surveyed experienced a ransomware incident in the last year.
The stakes are very high — 3.5% of companies surveyed even indicated that lives were at stake, a danger exemplified by a recent attack in Marin, California, where doctors lost access to patient records for more than 10 days.
As ransomware escalates and poses serious security risks for healthcare institutions, many privacy experts and legislators have called for more specific guidance from the U.S. Department of Health and Human Services (HHS).
A few weeks ago, HHS responded to these calls with a detailed fact sheet explaining ransomware and providing advice. Although most of the document outlines what should be obvious to an organization that already has a solid data security plan (including reliable back-ups, workforce training, and contingency plans), the major headline is HHS's verdict on whether a ransomware attack qualifies as a data breach under HIPAA.