
November 21, 2011

Privacy and Security: Joke or No Joke?

The Wall of Shame welcomes Sutter Health. Another computer with unencrypted protected health information on over 4 million patients -- gone. Now, those guys are pretty smart, so why don't they encrypt all computers with PHI? One of life's persistent questions. I mean, I can accept the fact that a health plan operator like Cignet Health might have issues with getting a grip on HIPAA compliance, but Sutter Health? What were they thinking? Can't happen here? Encryption is a drag? It's an easy way to avoid major egg-on-face, and to avoid spending significant coin on PR, credit reporting services, and potentially on court judgments -- all in addition to significant administrative fines payable to HHS and state regulators.

So the federales are piloting the HIPAA audit program. I know it's required by the HITECH Act, but who believes that it will motivate behavior change?  Anyone?  Sutter Health was clearly not motivated to seek a safe harbor that would have made the loss of 4 million patient records a non-event.  I know encryption can be a drag, but I'm not a techie. If you are, I invite you to educate me (and the other non-techies out there) on the question of how miserable it really is to have to deal with encrypted data; if you're really a techie, write a program to enable light-touch encryption that doesn't interfere with use of data.

Whether or not encryption is miserable, we should be asking: Why is this data on a barely secured computer (password-protected desktop) in the first place? Shouldn't it be stored on a server that stays in a secure facility, or in a secure private cloud?

Furthermore, as data loss incidents like this keep happening -- even among other industry leaders (see, e.g., Mass General) -- perhaps we need a new framework for thinking about access to health information. If we knew for sure that employment and insurance decisions would not be affected by the availability of otherwise private health record information, perhaps we would be more sanguine about their release. Perhaps government resources would be better spent on beefing up education and enforcement in those arenas (vs. auditing and enforcing compliance with privacy and security standards).

David Harlow 
The Harlow Group LLC
Health Care Law and Consulting
 


Comments


Perhaps it's a cost-benefit decision: [($$$ saved by not implementing encryption) minus ($$$ spent on the potential consequences of unencrypted data)]. If that difference is positive, then it's a win for Sutter Health. How much money did Merck make on Vioxx versus how much it spent on lawsuits over it? HIPAA compliance? Get real. For things to change, incentives and consequences have to change. Corporations are in the Gilded Age. They follow one rule, the Golden one: whoever has the gold makes the rules.

David, most of us in the US talk about security, but as we used to say at Cybercash, the first online credit card processor: "Security is perception." Even banks still don't get it; they send us mail with all sorts of personal information printed on their correspondence.

I wonder how many health care facilities have personnel with any experience in security. Most of the issues we see in the news every day could be solved with simple policies and procedures, but it takes someone with authority to execute them. I am a contributing author of an mHealth book to be published by HIMSS in February, where I go into detail on these issues.

I agree that education is the key to limiting these breaches.

Jeff Brandt

I guess I don't get the comment that "encryption is a drag" -- it should be transparent to the end user if properly implemented. I think architecting and planning for an appropriate security posture is key (budget, staff, consultants); trying to reverse engineer encrypted data management onto existing solutions is time- and cost-prohibitive. Many orgs may be caught in that trap (trying to meet HITECH requirements on existing systems). I also see orgs try to do it in house, or with consultants who lack the appropriate expertise, and that adds to the pain and suffering. However, if planned for appropriately and implemented by qualified teams, it should have little to no impact on data sharing or end users.

"trying to reverse engineer encrypted data management solutions is time and cost prohibitive"

That's right -- and most health care providers are not starting from scratch with up-to-date systems. Thus, kludgy non-solutions ensue, and folks often choose to avoid the "PIA" as you put it in your blog post.

Thanks for your comments via Twitter and your blog -- No Laughing Matter: Healthcare IT Security is PIA - Life in Caps Lock: cyberslate's posterous -- http://vsb.li/1SHOGd

Comment posted by "Ben" on this post on another site:

The database encryption question is somewhat complicated, but there are a number of major practical problems with encrypting data at the database level, and some critiques of the premise. The first is performance: any time data is written to or read from the database, it needs to undergo an encryption or decryption process. For a single short piece of information this isn't a big deal, but when you're pulling up a bunch of records and displaying them, it can put a significant drag on performance. Second, encryption can interfere with data indexing. Large databases are generally indexed to increase search-query performance (imagine trying to find a specific word in a novel vs. in a dictionary). If a database is encrypted, the index is as well, and this can make the index useless (particularly for full-text indexes that use spacing and punctuation to parse data for indexing). This can have dramatic negative effects on the speed of medium-to-large applications.

Encryption can also break the ability to run certain types of "select" queries against the database. Imagine you're trying to view all transactions that occurred yesterday. A standard query would look like "select [data to retrieve] from [table with data] where DATE > [2 days ago] and DATE < [today]" -- easy and fast. However, if the whole table were encrypted, you couldn't do this: all the dates are seemingly random strings, and you can't use the "greater than" or "less than" operators on them. Your only choice would be to retrieve all the data, decrypt it, and then run through each row to see if it matches your criteria. This is a huge drag on performance and requires a lot more programming.
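The range-query problem described above is easy to demonstrate. The sketch below uses Python's built-in sqlite3, and uses an HMAC digest as a stand-in for an opaque, deterministic ciphertext -- it is not real encryption, but it has the one property that matters here: the output is unreadable and does not preserve the ordering of the input, so SQL comparison operators become meaningless.

```python
# Why range predicates fail on encrypted columns: a sketch.
# hmac/sha256 stands in for an opaque ciphertext (NOT real encryption).
import hashlib
import hmac
import sqlite3

KEY = b"demo-key"  # hypothetical key for the illustration

def pseudo_encrypt(value: str) -> str:
    """Deterministic, opaque, order-destroying transform."""
    return hmac.new(KEY, value.encode(), hashlib.sha256).hexdigest()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tx (visit_date TEXT, visit_date_enc TEXT)")
for d in ["2011-11-18", "2011-11-19", "2011-11-20", "2011-11-21"]:
    conn.execute("INSERT INTO tx VALUES (?, ?)", (d, pseudo_encrypt(d)))

# Plaintext range query: works as expected.
plain = conn.execute(
    "SELECT visit_date FROM tx WHERE visit_date > '2011-11-18' "
    "AND visit_date < '2011-11-21' ORDER BY visit_date").fetchall()
print(plain)  # [('2011-11-19',), ('2011-11-20',)]

# The same predicate against the ciphertext column is meaningless:
# hex digests sort in an order unrelated to the underlying dates,
# so the only safe approach is to fetch every row, decrypt
# client-side, and filter in application code.
enc = conn.execute(
    "SELECT visit_date FROM tx WHERE visit_date_enc > ? "
    "AND visit_date_enc < ?",
    (pseudo_encrypt("2011-11-18"), pseudo_encrypt("2011-11-21"))).fetchall()
```

The same ordering problem is what breaks indexes on encrypted columns: a B-tree over the ciphertext orders the digests, not the dates.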

There are also some questions as to how important database encryption really is. The most important thing is making sure that no one has access to the database in the first place, because once they have it -- encrypted or not -- there are almost always ways to extract the information. One worry is that anyone who has broken in deeply enough to get the database has also broken in deeply enough to get the decryption keys (this is especially the case for scripted languages). Even without the key itself, once they have the encrypted database, they have all the time in the world to run software that will crack the encryption. In the event that your data is taken, it's much better that it be encrypted, but encryption is no guarantee that they won't get your data.

There are also troubleshooting and data-recovery concerns around encrypted databases. If something isn't working at the application level, techs are usually able to log into the database directly and see the current state of the data. Logs are helpful to a point, but the data itself is often the key to solving a bug, and if the data is encrypted you cannot easily see the faulty records. And lastly, it is often even worse to lose data than to have unauthorized people see it (though not always). If an encryption key is lost -- which can and does happen in organizations -- you have effectively lost all the data with it. To protect the data, the keys themselves are often encrypted (to guard against the problem mentioned above), but if that key file gets corrupted or goes missing, it could be the end of the data as well.

This isn't meant as a condemnation of encryption, which is appropriate in many cases (especially when the database is stored on someone's local computer, as it was in this instance), but it isn't always safer and is almost always a pain (sometimes involving crippling performance issues). Generally speaking, the best route is a middle path: one-way encryption for passwords and verifying information, two-way encryption on fields that hold particularly sensitive data, and no encryption on everything else. Certain key fields (e.g., name, address, date of birth, SSN, phone, credit card number, and other identifying info that can be encrypted without ruining key indexes) should be encrypted, which will make it much harder for hackers to reconstruct an accurate picture of a specific person even if they can read a lot of the patient's medical data. It's not foolproof if a hacker is dedicated (and users will inevitably put identifying information into unencrypted fields some of the time), but then there are generally no workable solutions that can stop a dedicated and skilled hacker.
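The "middle path" described above can be sketched as follows. The one-way piece uses Python's real hashlib.pbkdf2_hmac; the xor_cipher function is a toy reversible transform standing in for proper field-level encryption (a real system would use a vetted library, such as Fernet from the `cryptography` package, never a homemade cipher), and the field names and key are hypothetical.

```python
# Sketch: selective field protection -- one-way hash for credentials,
# reversible (toy) encryption for identifying fields, clinical data
# left in the clear, per the tradeoff described in the comment above.
import hashlib
import os

def hash_password(password: str, salt: bytes) -> bytes:
    # One-way: usable for verification only; cannot be decrypted.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # TOY reversible transform (XOR keystream), for illustration only.
    # Applying it twice with the same key restores the original bytes.
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

KEY = b"field-encryption-key"   # hypothetical key, stored outside the DB
salt = os.urandom(16)

record = {
    # identifying fields: reversibly encrypted
    "name": xor_cipher(KEY, b"Jane Doe"),
    "ssn": xor_cipher(KEY, b"000-00-0000"),
    # credential: one-way hashed with a per-user salt
    "password_hash": hash_password("s3cret", salt),
    # clinical payload: unencrypted, so it stays indexable and queryable
    "diagnosis": "hypertension",
}

# Round trip works for the two-way fields...
assert xor_cipher(KEY, record["name"]) == b"Jane Doe"
# ...while the password can only be re-verified, never recovered.
assert record["password_hash"] == hash_password("s3cret", salt)
```

The design point is that the reversible fields are exactly the identifying ones, so a stolen copy of the table yields clinical data that is hard to tie back to a named person.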

An example of PHI that needs to be stored on a laptop is Home Health. I used to support a Home Health department, and they were often at remote patient homes with no wireless access, so they would download all patient data they needed for the day onto the client machine then sync with the server at night.

The software vendor did not encrypt, so we put password protection on every hard drive in addition to encrypting the Windows files containing the data.
