The causes of digital patient privacy loss in EHRs and other health IT systems

January 26, 2014

This past Friday I was invited by the Patient Privacy Rights (PPR) Foundation to lead a discussion about privacy and EHRs. The discussion, entitled “Fact vs. Fiction: Best Privacy Practices for EHRs in the Cloud,” addressed patient privacy concerns and potential solutions for doctors working with EHRs.

While we are all somewhat disturbed by the slow erosion of privacy in all aspects of our digital lives, the rapid loss of patient privacy around health data is especially unnerving because healthcare is so near and dear to us all. To make sure we provided some actionable intelligence during the PPR discussion, I started the talk by giving some of the reasons why we're losing patient privacy, in the hope of spurring innovators to think about ways of slowing the inevitable losses.

Here are some of the causes I mentioned on Friday, not in any particular order:

  • Most patients, even technically astute ones, don’t really understand the concept of digital privacy. The digital “cyber world” is hard to picture, so patients believe their data and privacy are protected when they may not be. I usually explain patient privacy in the digital world to non-techies using the analogy of curtains, doors, and windows. Today’s digital health IT world is like a hospital in which all patients share one large room with no curtains, no walls, and no doors (not even for bathrooms or showers!). In this imaginary world, every private conversation can be overheard and every procedure is performed in front of others, without the patient’s consent and over their objections. If patients can imagine that scenario, they will have a good idea of how digital privacy is conducted today: a big shared room where everyone sees and hears everything, even over patients’ objections.
  • It’s faster and easier to create non-privacy-aware IT solutions than privacy-aware ones. Having built dozens of HIPAA-compliant and highly secure enterprise health IT systems over decades, my anecdotal experience is that when it comes to features and functions vs. privacy, features win. Product designers, architects, and engineers talk the talk, but given the difficulties of creating viable systems in a coordinated, integrated digital ecosystem, it’s really hard to walk the privacy walk. Because digital privacy is so hard to describe even in simple single-enterprise systems, the difficulty of describing and defining it across multiple integrated systems is often the reason modern systems have poor privacy features.
  • It’s less expensive to create non-privacy-aware IT solutions. Because designing privacy into the software from the beginning is hard and requires expensive security resources to do so, we often see developers wait until the end of the process to consider privacy. Privacy can no more be added on top of an existing system than security can — either it’s built into the functionality or it’s just going to be missing. Because it’s cheaper to leave it out, it’s often left out.
  • The government is incentivizing and certifying functionality over privacy and security. All of the meaningful use certification and testing steps focus too much on prescribed functionality and not enough on data-centric privacy capabilities such as notifications, disclosure tracking, and compartmentalization. If privacy were a priority in EHRs, the NIST test plans would cover it. Privacy is difficult to define and even more difficult to implement, so the testing process doesn’t focus on it at this time.
  • Business models that favor privacy loss tend to be more profitable. Data aggregation and homogenization, resale, secondary use, and related business models tend to be quite profitable. The only way they remain profitable is to have easy, unfettered (low-friction) ways of sharing and aggregating data. Because enhanced privacy through opt-in processes, disclosures, and notifications would reduce data sharing and potentially reduce revenues and profits, privacy loss is likely to accompany the inevitable rise of EHRs.
  • Patients don’t really demand privacy from their providers or IT solutions in the same way they demand other things. We like to think that all patients demand digital privacy for their data. However, it’s rare for patients to choose physicians, health systems, or other care providers based on their privacy views. Even when privacy violations are found and punished, it’s uncommon for patients to switch to other providers.
  • Regulations like HIPAA have made it easy for privacy loss to occur. HIPAA has probably done more to harm privacy over the past decade than any other government regulation. More on this in a later post.
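To make the "data-centric privacy capabilities" bullet above more concrete, here is a minimal sketch (entirely illustrative, not from any real EHR product) of one such capability: disclosure tracking, i.e. keeping an append-only record every time patient data leaves the system so that patients can later see who received it and why. The `Disclosure` and `DisclosureLog` names and fields are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Disclosure:
    """One record of patient data leaving the system (hypothetical schema)."""
    patient_id: str
    recipient: str   # who received the data
    purpose: str     # why it was shared
    timestamp: str   # when, in UTC

@dataclass
class DisclosureLog:
    entries: list = field(default_factory=list)

    def record(self, patient_id: str, recipient: str, purpose: str) -> Disclosure:
        # Append an entry every time protected data is disclosed.
        d = Disclosure(patient_id, recipient, purpose,
                       datetime.now(timezone.utc).isoformat())
        self.entries.append(d)
        return d

    def for_patient(self, patient_id: str) -> list:
        # Let a patient see everyone their data was disclosed to.
        return [d for d in self.entries if d.patient_id == patient_id]

log = DisclosureLog()
log.record("pt-001", "Acme Labs", "referral")
log.record("pt-001", "State Registry", "mandatory reporting")
log.record("pt-002", "Acme Labs", "billing")
print(len(log.for_patient("pt-001")))  # 2
```

The point of the sketch is the design constraint, not the code: a capability like this only works if it is wired into every data-sharing path from the start, which is exactly why bolting privacy on at the end fails.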

The only way to improve privacy across the digital spectrum is to recognize that health providers must conduct business in a tricky, intermediary-driven health system with sometimes conflicting goals, such as reducing medical errors or lowering costs (which can only come with more data sharing, not less). Digital patient privacy is important, but there are many valid reasons why it is hard or impossible to achieve in today’s environment. Unless we intelligently and honestly understand why we lose patient privacy, we can’t create novel solutions to help curb the loss.

What do you think? What other causes of digital patient privacy loss would you add to my list above?

  • Blaine Warkentine MD

    Great post. Couldn’t agree more. Glad you brought this up. Important to take the privacy issues and keep them in check with patients’ needs and desires. E.g., startups today offer the chance to share your STDs, of all things: https://itunes.apple.com/us/app/hula-find-clinic-or-doctor/id681370869?mt=8 I think this proves that the era of privacy as it pertains to healthcare has now been consumerized. Of course if you are a payer or provider you want to retain the “stigma” of privacy as a necessary element as it keeps your customer beholden. cheers. blaine

  • http://www.ShahidShah.com Shahid N. Shah

    Thanks, Blaine — consumerization of health IT will indeed be very interesting. Scary in some limited cases like sharing STD information, but interesting :-)

  • Robbieboy

    I completely disagree. Patients want privacy for their conditions. The thought that they do not is ludicrous. There will be lawsuits on breaches of patient health data that will make current breach settlements look tiny by comparison. This factor could render EHR completely dead from a liability point of view. Patient-identifiable information must be kept private for EHR to succeed.

  • Morgan Stuart

    In this rapidly changing health care environment, it’s often a matter of simply not knowing about, or not having access to, the best technology available. Axesson provides secure data exchange bridging EHRs / HIE / hospitals / providers & ancillary services, enabling collaboration across the care continuum. Successfully used by the nation’s leading & longest-established HIE (Santa Cruz HIE), the value of complete, timely health data exchange is proven. Secure messaging & data exchange is vital, and the technology is available to do so with maximized privacy.

  • Paulo Machado

    Excellent points! Consumers need to demand control over their privacy. We need a tech solution that empowers consumers to set their personal privacy settings once, after which they can be accessed by anyone in healthcare who may need their data. Full visibility into who has used their data should be available (think credit report), so that consumers can develop trust in the users of their info & make sure that it is used according to their wishes.

  • Pingback: Making the Case for Affordable, Integrated Healthcare Data Repositories and PHRs | Parity Research

  • Kamal Govindaswamy

    I would be interested in seeing an elaboration of your last point: how do regulations such as HIPAA make it easy for privacy loss to occur? The point is counterintuitive and contrary to common knowledge, hence my curiosity.

    As for your points #2 and #3 about technology solutions, I think the problem is pervasive and will continue until the “Privacy by Design” concept is embraced wholeheartedly by solution developers. I strongly recommend reading this book:
    http://www.amazon.com/gp/aw/d/1430263555

    The Kindle version is available free of cost.

  • http://www.shahidshah.com Shahid N. Shah

    Great points, Kamal. The real problem with HIPAA is not its intentions — the gov’t had its heart in the right place when it tried to protect citizens. The problem is that the law creates paperwork and compliance requirements that have yielded bad software design practices that help privacy in name only.

    For example, it talks about disclosure and consent requirements, but without specificity or standards for the different workflows that a massive healthcare bureaucracy with thousands of organizations must support. We have yet to reach a conclusion in the tech world about how to create digital disclosures and consent e-signatures, so HIPAA is mostly useless for true security and privacy in a world moving from paper to EHRs. HIPAA causes data-sharing headaches, so instead of finding a good way to share data that would increase patient safety, we in the engineering community often get beaten down by lawyers who are trying to protect individual companies legally.

    None of this is HIPAA’s fault per se (some law is necessary), but as written and as audited, you get HIPAA compliance but not real data security.
