HIPAA, Technology, and You Part 1: Encryption

Over the past eight months, I have evaluated, demoed, and tested a wide array of apps, hardware, services, and programs designed to assist companies with HIPAA compliance.  This post will be the first in a series evaluating these different tools and my experiences with them.

In light of the HIPAA breaches at Anthem (~80m) and Aspire Indiana (~45k), there has been a lot of discussion, information, and debate over what encryption is and who has to encrypt data. While nothing official has been published, lay reviews of the OCR data indicate that more than half of all HIPAA breaches are attributable to unencrypted data. To begin, this post will attempt to explain in plain language what encryption is, how it works, and how it fails.  It is not meant to cover every corner of encryption practice, but rather to provide an essential understanding of a very complex topic.  It will then turn to the question at hand: what duty, if any, does a covered entity have to encrypt its data?

Basics of Encryption:

At its most basic level, think of a server as a house. The house contains all the data within its walls; the front door is the only proper entry to the house and the house key grants access to everything within.

Taken a step further, visualize the alternate entrances to a home: back door, windows, and chimney.  Each of these other entrances serves as a good example of how a potential thief might target a different way into a home, circumventing the front door and bypassing the need for a key. Sometimes these alternate entrances are wide open, easy to attack, or even secured with keys unknown to or uncontrolled by the homeowner. These alternate entrances are good metaphors for the security vulnerabilities of a server.  Every potential route into a server acts as a path to the data contained within.

With that understanding, there are two key terms to grasp: “in motion” and “at rest.” Data in motion is just that, data being sent from one point to another (e.g., e-mails, file downloads, and opening a client record remotely). Data at rest is, as its name implies, data that is sitting on a server.

Data in Motion:  Unencrypted data in motion is akin to a letter written in plain English, sealed in a paper envelope, and put in the mail.  If someone is able to intercept and open the envelope, they can read the contents without any further effort. The Jeb Bush email dump is a good example of unencrypted data containing names, addresses, and Social Security numbers being available for anyone to find.  Encrypted data in motion uses the same mail-carrier theory, but this time the piece of paper inside the envelope is written in code and completely illegible unless you have the right key to unscramble the letter.
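In practice, encrypting data in motion usually means TLS, the protocol behind HTTPS and secure mail transport. Here is a minimal Python sketch of sending bytes over an encrypted connection; the host, port, and helper name are illustrative, not drawn from any system discussed in this article:

```python
import socket
import ssl

# A default client context turns on certificate verification and hostname
# checking -- the "sealed, coded envelope" for data in motion.
context = ssl.create_default_context()

def send_encrypted(host: str, port: int, payload: bytes) -> None:
    """Send payload over a TLS-wrapped socket. An eavesdropper on the wire
    sees only ciphertext, never the plain-text 'letter'."""
    with socket.create_connection((host, port)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            tls_sock.sendall(payload)

# With verification on, an interceptor presenting a forged certificate
# causes the connection to fail loudly instead of leaking data.
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname is True
```

The point of the sketch is that the protection happens at the transport layer: the application hands over plain bytes, and the TLS wrapper does the scrambling before anything touches the network.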

Data at Rest:  Unencrypted data at rest means that once the mail enters the home, regardless of how it was sent, it is stored in clear, plain text. In this scenario, a thief only needs to gain access to the home to read anything contained within. A piece of mail sitting on the counter after being delivered is a good analogy. Data encrypted at rest means that each individual piece of mail is encrypted when it enters the house. Once you are in the house, all the contents of the house are present, but they are scrambled and randomized. You might see a fragment of a couch or a picture, but nothing is recognizable or makes any sense unless you have a key to unscramble and assemble the couch and pictures.
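A toy sketch, using only the Python standard library, makes the at-rest idea concrete. The XOR one-time-pad cipher below is purely illustrative; real at-rest encryption uses vetted ciphers such as AES, typically via full-disk encryption or a database's own encryption features:

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each byte with a random key of equal length (a one-time pad).
    Without the key, the stored bytes are unrecognizable 'couch fragments'."""
    assert len(key) >= len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

record = b"Bob Smith, DOB 01/02/1970"     # hypothetical patient record
key = secrets.token_bytes(len(record))    # kept separately from the data

stored = encrypt(record, key)             # what actually sits on the disk
assert stored != record                   # at rest: scrambled
assert decrypt(stored, key) == record     # with the key: readable again
```

Note that the security rests entirely on the key being stored apart from the data: a thief who steals only the disk gets ciphertext, while a thief who steals disk and key together gets everything.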

Putting these pieces together, we end up with four different transmission/storage scenarios:

  1. Unencrypted in motion/unencrypted at rest
    • This is the least secure scenario. Here, the covered entity neither encrypts its own data in motion nor requires the entities it works with to do so, and its data sits unencrypted at rest.  The Aspire Indiana breach involved unencrypted data at rest: the stolen laptops held patient data in clear text.
  2. Unencrypted in motion/encrypted at rest
    • In this scenario, data is sent and received unencrypted, but once it is in the house, it is encrypted.  This is perhaps a very common occurrence: companies encrypt the data on their servers but pay little or no attention to how data is transmitted (e.g., e-mail, listservs).
  3. Encrypted in motion/unencrypted at rest
    • This is similar to the Anthem situation where a company encrypts its data in motion, but once it enters the house, the data is unencrypted and requires no further key to unscramble and read.
  4. Encrypted in motion/encrypted at rest
    • This presents the most secure scenario. Every piece of data is not only encrypted when in motion, but also when at rest within the house. This means that a thief would need to not only break through the house’s defenses, but through the individual defenses of the data to yield usable information.

There are a variety of pros and cons to each approach, but at a bare minimum, encrypting PHI in motion is something every company should do. Scenarios 1 and 2 should never occur.  Otherwise, every time PHI is in motion it is completely unprotected, out in the wild where anyone can capture and read it. The horror story of a hacker sitting by the door to your house, reading everything that comes in and goes out, is real; it's called packet sniffing.

Scenario 3 – Encrypted in motion/unencrypted at rest

The real issue here, as evidenced by Anthem's choice not to encrypt its data, is whether to encrypt data at rest. This scenario is the process Anthem implemented.  All of its data in motion was encrypted, but as soon as it passed through the front door, it was decrypted and put into its appropriate place. The touted benefit of this approach is that the data within the home is very easy to work with, manipulate, and leverage in applications.  The downside is evidenced by the breach disclosed on February 5, 2015: once hackers gained access to Anthem's house, they had free rein and access to everything inside.

Scenario 4 – Encrypted in motion/encrypted at rest

When a company chooses to implement this security plan, a piece of data is never left unencrypted. In the house and out of the house, the data is encrypted and requires a key to unscramble.  For example, when Dr. Jones needs to view Bob Smith's record, the doctor types in Bob's name and searches for him in his office's EHR. The server receives the request and, with its key, finds the appropriate data. The data then passes out of the house, retaining its encryption, and remains encrypted until it displays on the doctor's computer. After any changes are made, the process reverses, with the encrypted data going back to the house for safekeeping. While an argument can be made that unencrypted data is easier to work with, these companies are not keeping records for a lemonade stand. This is PHI, perhaps some of the most private and important information about a person.  For modern computers, the performance difference between working with encrypted and unencrypted PHI is negligible.
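The Dr. Jones lookup above can be sketched end to end. Everything here is a toy under stated assumptions: the XOR cipher stands in for a real one, the name-hash index is one hypothetical way a server can locate a record without decrypting it, and the shared key stands in for real key management:

```python
import hashlib
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # Toy cipher for illustration only; a real system would use AES and TLS.
    return bytes(d ^ k for d, k in zip(data, key))

shared_key = secrets.token_bytes(64)  # known to the server and the client

# -- server side: records are stored encrypted, indexed by a hash of the
#    patient name, so a lookup never requires decrypting the record itself
def index(name: str) -> str:
    return hashlib.sha256(name.encode()).hexdigest()

ehr = {index("Bob Smith"): xor(b"Bob Smith: penicillin allergy", shared_key)}

# -- doctor's client: the blob leaves the 'house' still scrambled and is
#    only unscrambled at the point of display
blob = ehr[index("Bob Smith")]        # retrieval: still ciphertext
record = xor(blob, shared_key)        # decrypt on the doctor's machine
assert record == b"Bob Smith: penicillin allergy"
```

The design point is where decryption happens: not on the server, not on the wire, but only at the screen where an authorized person actually reads the record.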

How Encryption Fails

Sadly, no encryption program is perfect, although many come close. The complexity of these systems, coupled with human nature, often leads to inadvertent gaps, holes, or forgotten maintenance access doors. Those tasked with system design and maintenance can also simply forget, refuse, or, for various business and technical reasons, be unable to update the server's defenses against new threats.  Hackers frequently exploit known security vulnerabilities to invade a server. While the numbers are not completely clear, several industry analysts believe that more than half of all server breaches are due to unpatched server software. The excuses and reasons not to update are common, but if you have an unpatched server, it needs to be fixed, isolated, or transitioned out. Leaving it as-is is a recipe for disaster.

Duty of a Covered Entity to Encrypt

Encryption and decryption are addressed primarily under HIPAA in 45 CFR § 164.312 (Access Controls and Transmission Security). As a quick HIPAA primer, there are two types of safeguard specifications within the regulation: those that are required and those that are addressable.  A covered entity must implement policies and procedures that satisfy a required specification. An addressable specification is not optional; rather, it requires the entity to analyze the specification. Here is what OCR says about addressable safeguards:

“If an implementation specification is addressable, then the covered entity must assess whether it is a reasonable and appropriate safeguard in the entity’s environment. This involves analyzing the specification in reference to the likelihood of protecting the entity’s EPHI from reasonably anticipated threats and hazards. If the covered entity chooses not to implement an addressable specification based on its assessment, it must document the reason and, if reasonable and appropriate, implement an equivalent alternative measure.”

With this in mind, Anthem may have determined that it needed to encrypt its data only in motion, not at rest.  Assuming it has the proper documentation in place and can justify the decision, it is possible Anthem is not in violation of this HIPAA safeguard. However, in 2009 Anthem had over 1 million individual records stolen by hackers from unencrypted hard drives. You read that right: Anthem had a breach six years ago in which unencrypted data was hacked and stolen.  It is going to take an amazing amount of legal gymnastics to demonstrate why encrypting its data was not reasonable and appropriate in light of the 2009 breach.


Anthem did not have a specific requirement under the law to encrypt its data at rest. That said, it did have a duty to implement an encryption and decryption plan that would address reasonably anticipated threats and hazards. So the question becomes whether Anthem could have reasonably anticipated its unencrypted data being hacked.  After the 2009 breach, Anthem issued the following statement:

“This was an isolated occurrence,” said Cindy Wakefield, spokesperson on behalf of Anthem, in a written statement to Healthcare IT News. “Appropriate corrective actions have been implemented, and process improvements for posting provider data online have been reviewed with the team.”

That statement was apparently premature and incorrect. A short six years later, Anthem's unencrypted data was yet again hacked and stolen.  The decision of whether to encrypt is rapidly becoming moot. The difficulty of working with encrypted data no longer justifies maintaining client records in clear text (i.e., unencrypted) once they are in the house.  In the next round of HIPAA rulemaking, it would not be a surprise to see a requirement that all PHI be encrypted both in motion and at rest.

If you do not know which scenario your company fits into, or you know you have unencrypted data moving or resting, reach out to your CIO and hammer out a game plan. HIPAA compliance crosses departmental boundaries and may involve, at a minimum, legal, HR, and IT.  Get the right people to the table now instead of sweeping this topic under the rug.

/s/ HH @legallevity

Anthem HIPAA Breach

Unless you have been hiding from the television, internet and/or smoke signals, you have heard something about the recent data breach at Anthem. On February 5, 2015, initial reports surfaced indicating a massive consumer data breach. The first few articles placed the number of affected individuals just north of 40 million people; more recent reports are doubling that number.

So far, Anthem has identified the following information as compromised: Name, Date of Birth, Social Security Number, Medical ID, Home Addresses, E-mail Addresses, Employment Information and Income Data. In a remarkably quick response, Anthem posted this letter to its members (past and present) at anthemfacts.com:

“To Our Members, Safeguarding your personal, financial and medical information is one of our top priorities, and because of that, we have state-of-the-art information security systems to protect your data. However, despite our efforts, Anthem was the target of a very sophisticated external cyber attack. These attackers gained unauthorized access to Anthem’s IT system and have obtained personal information from our current and former members such as their names, birthdays, medical IDs/social security numbers, street addresses, email addresses and employment information, including income data. Based on what we know now, there is no evidence that credit card or medical information, such as claims, test results or diagnostic codes were targeted or compromised…”

For those affected directly by the breach, many received personal notices from Anthem and/or their employer. Here is one of those letters:

“As you may have heard in the news, Anthem, was recently the target of a cyber attack.

 While we don’t yet know if our employees’ personal information was involved, we do know that the attackers obtained information from as many as 80 million of Anthem’s current and former members, including:

  • Names
  • Birthdays
  • Medical IDs/Social Security numbers
  • Street addresses
  • Email addresses
  • Employment information, including income data

According to Anthem there is no evidence, at this time, that credit card or medical information was compromised.

  • If your information was accessed, Anthem will individually notify you via mail or email (if possible).
  • Anthem will provide credit monitoring and identity protection services free of charge.
  • You can access information as it becomes available on this website: www.AnthemFacts.com or via this toll-free number: 1-877-263-7995.

 I’m passing along information from the Better Business Bureau web site that addresses data breach situations.  The BBB offer some great suggestions and advice – many of these items were discussed during the seminar today. 

 http://www.bbb.org/council/news-events/consumer-tips/2015/02/bbb-advice-on-what-to-do-after-a-data-breach-compromises-your-identity/ “

Somewhat amazingly, not only was client information exposed, but Anthem employee data was stolen as well. It raises the question: why were client and employee data on the same system, or even on mutually accessible systems? Further, why weren't these databases encrypted? Encryption would have done wonders for Anthem in this breach. While encryption is not perfect, it ensures, at a very minimum, that data is not presented in an easily accessible format.

Even more confounding than the facts leading up to the breach is the misinformation being pushed out by the media. Several articles have stated that the leaked information is not PHI. The logic seems to be that since the data itself is not medical information, it is not PHI. This is incorrect.

According to HHS:

“Protected health information is information, including demographic information, which relates to:

  • the individual’s past, present, or future physical or mental health or condition,
  • the provision of health care to the individual, or
  • the past, present, or future payment for the provision of health care to the individual, and that identifies the individual or for which there is a reasonable basis to believe can be used to identify the individual.
  • Protected health information includes many common identifiers (e.g., name, address, birth date, Social Security Number) when they can be associated with the health information listed above.”

The key phrase in the first sentence is “demographic information.” Contrary to common opinion (including the media’s), demographic information held by a covered entity is PHI. This is further supported by the final bullet point above, which states that protected health information “includes many common identifiers (e.g., name, address, birth date, Social Security Number)” when they can be associated with health information.

Despite the media misconceptions, the data breached by Anthem is PHI and subject to the fines and sanctions imposed by HIPAA. Further, the protections of HIPAA apply to PHI held by covered entities and their business associates. HIPAA defines a covered entity as 1) a health care provider that conducts certain standard administrative and financial transactions in electronic form; 2) a health care clearinghouse; or 3) a health plan.

Under HIPAA, fines range from $100 to $50,000 per violation. The level of the fine depends on whether the entity knew of the security gap, acted on it, and whether the decision not to address it was willful. Fines for each statutory violation are capped annually. Assuming OCR finds that 1) Anthem failed to safeguard the data and 2) there was an impermissible use or disclosure, the statutory fines would cap out at $1.5 million for each violation, for a total of $3 million.

While this fine represents the maximum statutory penalty, it does not include the costs of credit monitoring and notification. These generally run about $100 per affected person, which at roughly 80 million people totals $8 billion. Even conservatively, the potential HIPAA-related fines and administrative costs for Anthem could exceed $8 billion.  Add the class action lawsuits for negligence and state-mandated penalties and fines, and Anthem is staring down the barrel of a very serious problem.
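The back-of-envelope arithmetic above can be laid out explicitly. All figures are this article's estimates, not official numbers:

```python
affected = 80_000_000           # reported number of individuals affected
per_violation_cap = 1_500_000   # statutory annual cap per violation category
violations = 2                  # failure to safeguard + impermissible disclosure
cost_per_person = 100           # typical notification/credit-monitoring cost

statutory_fines = per_violation_cap * violations   # capped regulatory exposure
admin_costs = affected * cost_per_person           # uncapped administrative costs

assert statutory_fines == 3_000_000
assert admin_costs == 8_000_000_000
```

The striking part is the ratio: the capped statutory fines are a rounding error next to the uncapped per-person notification and monitoring costs.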

Let’s take a minute to appreciate the magnitude of those potential costs. Anthem brought in $74 billion in revenue last year. Of that amount, $69 billion went to overhead, leaving an annual profit of roughly $5 billion for 2014. Even under a conservative estimate, the administrative costs and fines could far surpass Anthem’s annual profit. That does not take into account investor losses, client flight, or lost credibility. This is shaping up to be not only the single largest HIPAA breach on record, but also a gargantuan financial cost. It also bears mentioning that this is not Anthem’s first large-scale HIPAA breach. In 2009–2010, over 230,000 individuals had their PHI compromised due to a known security flaw that was not patched; that breach resulted in a $1.7 million fine.

Takeaways: While we are still in the early days of this breach, the early information looks startlingly poor. Anthem, one of the top three health insurers in the country, appears to have failed yet again to secure its data. Given the severity of the consequences and its prior breach, it is a wonder we are here yet again. Prior planning coupled with up-to-date security measures is only the beginning of securing electronic PHI. Companies must adopt data security practices such as data segmentation, device hardening, and robust password management if they want to prevent the kinds of breaches we hear about in the news.

/s/ HH @legallevity