Archive for the ‘EMR’ Category

Selling Anonymized Health Data

Sunday, October 18th, 2009

The New York Times article When 2+2 Equals a Privacy Question raises some serious medical data privacy concerns.

But by 2020, when a vast majority of American health providers are expected to have electronic health systems, the data mining component alone could generate sales of up to $5 billion...

The magnitude of data needed to generate that kind of revenue is significant, and the likelihood that someone's "de-identified" health information will be re-identified is very high.  "Anonymized" Data Really Isn't points out the same thing that the NYT article does:

Computer scientists have shown over the past fifteen years that it is quite straightforward to extract personal information by analyzing seemingly unrelated, “anonymized” data sets.
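The kind of linkage attack the quote alludes to is easy to sketch. Everything below is toy data I made up; ZIP code, birth date, and sex are the classic quasi-identifiers that survive "de-identification" and can be joined against public records:

```python
# Hypothetical "anonymized" discharge data: names stripped, but
# quasi-identifiers left in place.
anonymized_records = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "dob": "1962-01-15", "sex": "M", "diagnosis": "diabetes"},
]

# Hypothetical public data (e.g., a voter roll) with the same fields plus names.
public_voter_roll = [
    {"name": "Jane Smith", "zip": "02138", "dob": "1945-07-31", "sex": "F"},
    {"name": "John Jones", "zip": "02139", "dob": "1980-03-02", "sex": "M"},
]

def reidentify(anonymized, public, keys=("zip", "dob", "sex")):
    """Link 'anonymized' rows back to named rows via quasi-identifiers."""
    hits = []
    for row in anonymized:
        matches = [p for p in public if all(p[k] == row[k] for k in keys)]
        if len(matches) == 1:  # a unique match means the row is re-identified
            hits.append((matches[0]["name"], row["diagnosis"]))
    return hits

print(reidentify(anonymized_records, public_voter_roll))
# → [('Jane Smith', 'hypertension')]
```

No cleverness required: a unique match on three mundane fields is enough to tie a name to a diagnosis, which is exactly the point both articles make.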

The demand for the secondary use of health data (and here) is high because it is believed it will

Significantly improve the quality of patient care and offers the promise of even greater benefits in the future.

Many feel that use of secondary health information should be regulated by the government.

Here's a good overview that covers many health data secondary use issues: Toward a National Framework for the Secondary Use of Health Data: An American Medical Informatics Association White Paper.

UPDATE (10/20/09): Also see the Wired article Medical Records: Stored in the Cloud, Sold on the Open Market.

Medical Data in the Cloud

Saturday, September 26th, 2009

I just ran across a three-part series of articles by Practice Fusion:

Medical Data in the Internet “cloud”

Being an EMR in the cloud requires Practice Fusion to address these issues in depth. If you're thinking about putting health information in the cloud like I've previously discussed, these articles are worth a careful read.

Hat Tip:  Healthcare IT News

UPDATE (10/1/2009):

Some commentary on the Practice Fusion EHR:  Cloud based EHRs – a response to PracticeFusion.

UPDATE (10/5/2009): EMR Vendor Practice Fusion’s CEO Interview

Access to Medical Data: Are PC Standards and PHRs (You) the Answer?

Tuesday, September 22nd, 2009

Dana Blankenhorn's article Give medicine access to PC standards makes some good points about the medical device industry but (IMHO) misses the mark when trying to use PC standards and PHRs as models for working towards a solution.

I'll get back to his central points in a minute. One thing I find fascinating is the knee-jerk reaction in the comments to even a hint of government control.  How on earth can someone jump from "industry standard" to a "march towards socialism"? We saw the same thing at this summer's town hall meetings and in Washington a couple of weeks ago.  The whole health care debate is just mind boggling!

Anyway, let's focus on the major points of the article. First:

Every industry, as its use of computing matures, eventually moves toward industry standards. It happened in law, it happened in manufacturing, it happened in publishing.

It has not happened, yet, in medicine.

Very true.  In the medical device world, connectivity and interoperability are hot topics. A couple of recent posts -- Plug-and-Play Medicine and Medical Device Software on Shared Computers -- point out the significant challenges in this area.  In particular, the development and adoption of standards is a very intensive and political process. But where's the incentive for the industry to go through this? Dana's comment addresses this (my emphasis):

The role I like best for government is in directing market incentives toward solutions, and not just to monopolies or bigger problems.

The reason health care costs jump every year is because market incentives cause them to. Those incentives must be changed, but the market won't by itself because the market profits from them.

Only government can transform incentives. ...

Like it or not, this may be the only way to push the medical industry to do the right thing.  But those other industries didn't need government intervention in order to create their standards.  Using PC (or other industry) standards as a model for facilitating medical data access just doesn't work.  The health industry will have to be dragged to the table kicking and screaming, and the carrot (or stick) will have to be large in order for them to come to a consensus.

Second, I don't see the relationship between the use of PHRs and the promotion of standards.

By supporting PHRs, you support your right to your own data. You support liberating data from proprietary systems and placing it under industry standards.  You support integrating your health with the world of the Web, and the benefits such industry standards can deliver to you.

Taking responsibility for your own health data is great, but both Microsoft HealthVault and Google Health are proprietary systems.  Just because your data is on the Web doesn't make it any more accessible.  And even if one of these PHRs did become an industry standard, it would have very little impact on how EMRs communicate with each other or with medical devices in general.

There are no easy answers.

Liberate the Data!

Friday, April 3rd, 2009

Peter Neupert's post Tear Down the Walls and Liberate the Data is worth reading. There are some Microsoft-centric comments, but a number of the linked articles are good and the overall message is correct (IMO anyway).

I might have tried to find a better analogy than 'tear down this wall', but that's because I was never a Ronald Reagan fan.  Nevertheless, this gets across the primary point:

What’s of paramount importance is liberating the data and making it available for re-use in different contexts.

Two major 'walls' stand in the way of this:

  1. “it’s-my-data”
  2. “waiting-for-the-right-standards-set-by-government”

Both exist because of the perceived competitive advantages they provide to organizations and vendors.

Interoperability of data, or enabling data to become "liquid," would allow it to flow easily from system to system. These challenges are the same ones addressed by Adam Bosworth that I discussed in Dreaming of Flexible, Simple, Sloppy, Tolerant in Healthcare IT.

The technical issues are complicated, but I also believe they are not the primary reason health IT systems fail to interoperate.  As Peter suggests, it would be good for HITECH dollars to be used to break down some of the more difficult barriers that prevent data liquidity.

The "proven model[s] for extracting and transforming data" do exist and there is no excuse not to use them.

After thinking about it some more, a more cautionary analogy may be The Exodus -- Moses leading the Israelites out of the Land of Egypt ("let my data go!").  1) It took an act of God to part the Red Sea, and 2) after their dramatic escape they roamed the desert for 40 years. Let's hope that health IT interoperability does not need divine intervention or suffer the same fate.

Contradictory Observations and Electronic Medical Records

Tuesday, March 3rd, 2009

Martin Fowler has an interesting discussion in his ContradictoryObservations post.  This little slice of medically related software design insight is particularly relevant because it highlights (at least for me) the complexity of the use of electronic medical records and their interoperability.

In a broader sense I suppose it also shows some of the underlying difficulties that face the Obama administration's new EMR adoption push.  But I'm not going there.

The concepts of observations, rejection, and evidence are good, but they're just the tip of the iceberg.

Even after you've modeled the data interactions, how do you effectively communicate these concepts to the user?  Or to another EMR that doesn't know about your model or how it's used?
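For concreteness, here's a minimal sketch of what such a model might look like. The class and field names are my own invention, not Fowler's actual design:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Observation:
    # A recorded clinical statement; never deleted, only marked rejected.
    statement: str
    rejected: bool = False
    evidence: List["Observation"] = field(default_factory=list)

def contradict(earlier: Observation, later: Observation) -> None:
    """Keep the earlier observation in the record, but mark it rejected
    and attach the contradicting observation as evidence."""
    earlier.rejected = True
    earlier.evidence.append(later)

first = Observation("patient is diabetic")
second = Observation("blood work rules out diabetes")
contradict(first, second)
# 'first' survives in the record, flagged rejected, with 'second' as evidence.
```

Even this tiny model raises the questions above: a receiving EMR that has no notion of a "rejected" flag simply cannot represent the distinction, no matter how cleanly the sending system modeled it.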

Martin's view is that:

Most of the time, of course, we don't use complicated schemes like this. We mostly program in a world that we assume is consistent.

Unfortunately, many of the issues facing electronic medical records do require complex solutions. And even when the world is consistent, how you implement a solution may be (actually, will probably be) very different than how I implement it.  Either way, interoperability will always be a challenge.

We're going to need lots of good software design tools to solve these problems.

Dreaming of Flexible, Simple, Sloppy, Tolerant in Healthcare IT

Saturday, January 3rd, 2009

I was recently browsing in the computer (nerd) section of the bookstore and ran across an old Joel Spolsky edited book: The Best Software Writing I.  Even though it's been about four years, good writing is good writing, and there is still a lot of insightful material there.

One of the pieces that struck a chord for me was Adam Bosworth's ISCOC04 Talk (fortunately posted on his Weblog).  He was promoting the use of simple user and programmer models (KISS -- simple and sloppy for him) over complex ones for Internet development.  I think his jeremiad is just as relevant to the current state of EMR and interoperability.  Please read the whole thing, but for me the statement that gets to the point is this:

That software which is flexible, simple, sloppy, tolerant, and altogether forgiving of human foibles and weaknesses turns out to be actually the most steel cored, able to survive and grow while that software which is demanding, abstract, rich but systematized, turns out to collapse in on itself in a slow and grim implosion.

Why is it that when I read "demanding, abstract, rich but systematized" the first thing that comes to mind is HL7?  I'm not suggesting that some sort of open ad hoc system is the solution to The EMR-Medical Devices Mess.  But it's painfully obvious that what has been built so far closely resembles "great creaking rotten oak trees".

The challenge for the future of Healthcare interoperability is really no different than that of the Internet as a whole (emphasis mine):

It is in the content and the software's ability to find and filter content and in the software's ability to enable people to collaborate and communicate about content (and each other).

I would contend that the same is true for medical device interoperability. Rigid (and oftentimes proprietary) systems are what keep devices from being able to communicate with one another.  IHE has created a process to try to bridge this gap, but the complexity of becoming a member, creating an IHE profile, and having it certified is also a significant barrier.

Understanding how and why some software systems have grown and succeeded while others have failed may give us some insights. Flexible, Simple, Sloppy, Tolerant may be a dream, but it also might not be a bad place to start looking for future innovations.

Adam also had this vision while he was at Google: Thoughts on health care, continued (see the speech pdf):

... we have heard people say that it is too hard to build consistent standards and to define interoperable ways to move the information. It is not! ... When we all make this vision real for health care, suddenly everyone will figure out how to deliver the information about medicines and prescriptions, about labs, about EKGs and CAT scans, and about diagnoses in ways that are standard enough to work.

Also see the Bosworth AMIA May07 Speech (pdf) for how this vision evolved, at least for Google's PHR.

UPDATE (2/9/09): Here's a  related article: The Truth About Health IT Standards – There’s No Good Reason to Delay Data Liquidity and Information Sharing that furthers this vision:

We don’t have to wait for new standards to make data accessible—we can do a ton now without standards.  What we need more than anything else is for people to demand that their personal health data are separated from the software applications that are used to collect and store the data.

UPDATE (4/17/09): John Zaleski’s Medical Device Open Source Frameworks post is also related.

Use of an open-source framework approach is probably as good as any. As a management model, I don’t see it as being that much different from the way traditional standards have been developed. Open-source just provides a more ad-hoc method for building consensus. Less bureaucracy is a good thing though. It may also allow for the introduction and sharing of more innovative solutions. In any case, I like visions.

USB plug-n-play (plug-n-pray to some) may be a reasonable connectivity goal, but it does not deal at all with system interoperability. Sure, you can connect a device to one or more monolithic (and stable) operating systems, but what about the plethora of applications software and other devices?  This just emphasizes the need to get out of the “data port” (and even “device driver”) mind-set when envisioning communication requirements and solutions.

EMR at Kaiser Permanente

Monday, September 1st, 2008

Yesterday's San Diego Union-Tribune had an article entitled: Digital divide. The subtitle is: Some doctors and hospitals are embracing electronic records for patients, but most are not making the technological leap from paper due to the cost.

It discusses Kaiser Permanente's $4 billion rollout of their custom EMR system, called KP HealthConnect (the primary vendor is Epic Systems), to 8.7 million members in 9 states.

It also talks about the difficulties of EMR adoption in general -- cost being a major barrier. An interesting read.

UPDATE (9/10/08): Today's HIStalk Readers Write has a very good writeup about the rollout of this system in Northern California:  A Physician’s Experience with Kaiser’s Epic/HealthConnect Rollout.

CCHIT, The 800-Pound Gorilla

Friday, August 29th, 2008

There's a lively discussion on HIStalk (Readers Write 8/27/08) regarding the merits of CCHIT.  The Jim Tate piece isn't long, or even that informative. It simply touts CCHIT as an

... organization that is helping level the playing field and make it safer for clinicians to take the plunge into electronic records.

This seems innocuous enough.  Judging by the negative responses though, some people have real problems with CCHIT. In particular, the Al Borges, MD Response to "CCHIT: the 800-Pound Gorilla" is a detailed point-by-point rebuttal ("CCHIT is simply not necessary.").

I guess it's not surprising that the biggest issue seems to revolve around money. The cost of obtaining CCHIT certification has stratified EMR companies (big vs. small) and made it even more difficult for practices to see an ROI.

Also, the discussion of interoperability standards (or lack thereof) is one of my hot buttons. As Mahoghany Rush (comment #9) says:

Anyone here who talks about being HL7 compliant - and thinks it really solves a problem - has never personally written an interface.

So true.

In this regard Inga asks (in News 8/29/08):

Are lab standards an issue one of the various work groups is addressing? Are the labs on board?

When you say "lab" what you're really talking about are the large number of medical devices commonly found in both hospitals and private practice offices. As you note, the need for interfaces to these devices is so that the data they generate can be associated with the proper patient record in the EMR. This not only allows a physician to have a more complete picture of the patient's status, but the efficiency of the entire clinical staff is vastly improved when they don't have to gather all of this information from multiple sources.

The answer to your second question is yes, many "labs" -- medical device companies -- are actively involved in the development of interoperability standards.  The EMR companies are also major participants.

There are two fundamental problems with "standards" though:

  1. A standard is always a compromise.
  2. A standard is always evolving.

By their very design, the use of a standard will require the implementer to jump through at least a few hoops (some of which may be on fire).  Also, a device-to-EMR interface you complete today will probably not work for the same device and EMR a year from now -- one or both will be implementing the next-generation standard by then.
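The fragility is easy to demonstrate. The toy segment below is loosely modeled on HL7 v2's pipe-delimited style, but the layout and field positions are invented for illustration; this is nothing like a conformant parser:

```python
# A simplified pipe-delimited result segment. Field positions here are
# my own invention, in the general style of HL7 v2 -- not the real spec.

def read_value_and_units(segment: str):
    fields = segment.split("|")
    return fields[5], fields[6]   # positional access: value, then units

# Today's interface works fine:
today = "OBX|1|NM|8867-4^HeartRate||72|bpm"
assert read_value_and_units(today) == ("72", "bpm")

# A hypothetical next-generation revision inserts a sub-ID field before
# the value. The same positional code now returns the wrong fields --
# and it fails silently, with no parse error to flag the problem.
tomorrow = "OBX|1|NM|8867-4^HeartRate||1|72|bpm"
assert read_value_and_units(tomorrow) == ("1", "72")
```

The interface doesn't crash when the standard evolves; it quietly hands back the wrong data, which in a clinical setting is far worse.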

Nobody dislikes standards. Interoperability is usually good for business. There are two primary reasons why a company might not embrace communications standards:

  1. The compromise may be too costly, either from a performance or resources point of view, so a company will just do it their own way.
  2. You build a proprietary system in order to explicitly lock out other players. This is a tactic used by large companies that provide end-to-end systems.

The "standards" problem is not just a Healthcare interoperability issue. The IT within every industry struggles with this.  The complexity of Healthcare IT and its multi-faceted evolutionary path has just exacerbated the situation.

So, the answer is that everyone is working very hard to resolve these tough interoperability issues. Unfortunately, the nature of the beast is such that it's going to take a long time for the solutions to become satisfactory.

UPDATE (9/3/08): The response to Inga was published here: Readers Write 9/3/08. Thanks Inga!

Digital Devices and EHRs: the ROI

Monday, July 28th, 2008

Here's a piece that provides a quick analysis of the ROI of having vital signs and ECG devices connected to an EMR:

Digital Devices and EHRs -- Perfect Together

The right way to do it:

Electronic transfer of patient information

Electronic transfer of BP and heart rate

Electronic recording of test

MD views results from anywhere

DONE -- half the steps, half the time

Ah, if it were only that easy.

Interoperability: Google Protocol Buffers vs. XML

Monday, July 14th, 2008

Google recently open sourced Protocol Buffers: Google's Data Interchange Format (documentation, code download). What are Protocol Buffers?

Protocol buffers are a flexible, efficient, automated mechanism for serializing structured data – think XML, but smaller, faster, and simpler.

The documentation is complete and worth a quick read through. A complete analysis of PB vs. XML can be found here:  So You Say You Want to Kill XML.....

As discussed, one of the biggest drawbacks for us .NET developers is that there is no support for the  .NET platform. That aside, all of the issues examined are at the crux of why interoperability is so difficult. Here are some key points from the Neward post:

  1. The advantage to the XML approach, of course, is that it provides a degree of flexibility; the advantage of the Protocol Buffer approach is that the code to produce and consume the elements can be much simpler, and therefore, faster.
  2. The Protocol Buffer scheme assumes working with a stream-based (which usually means file-based) storage style for when Protocol Buffers are used as a storage mechanism. ... This gets us into the long and involved discussion around object databases.
  3. Anything that relies on a shared definition file that is used for code-generation purposes, what I often call The Myth of the One True Schema. Assuming a developer creates a working .proto/.idl/.wsdl definition, and two companies agree on it, what happens when one side wants to evolve or change that definition? Who gets to decide the evolutionary progress of that file?
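To make the "smaller, faster, and simpler" claim concrete, here's a hand-rolled sketch of the Protocol Buffer wire format (field-number keys plus base-128 varints, per the published encoding) next to the equivalent XML. The record itself and the field numbers are my own toy example:

```python
import xml.etree.ElementTree as ET

def varint(n: int) -> bytes:
    """Protocol Buffers base-128 varint: 7 bits per byte, high bit = 'more'."""
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)
        else:
            out.append(b)
            return bytes(out)

# XML: self-describing -- every element spells out its name twice.
person = ET.Element("Person")
ET.SubElement(person, "name").text = "John Doe"
ET.SubElement(person, "id").text = "1234"
xml_bytes = ET.tostring(person)

# PB wire format for the same two fields:
# key byte = (field_number << 3) | wire_type
name = b"John Doe"
pb_bytes = (
    bytes([(1 << 3) | 2]) + varint(len(name)) + name  # field 1: length-delimited
    + bytes([(2 << 3) | 0]) + varint(1234)            # field 2: varint
)

print(len(xml_bytes), len(pb_bytes))  # the binary form is a fraction of the size
```

The PB message carries only field numbers, so it's compact and trivial to scan; but those numbers mean nothing without the shared .proto definition, which is exactly the One True Schema problem from point 3 above.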

Anyone who has considered using a "standard" has had to grapple with these types of issues. All standards gain their generality by trading something off (speed, size, etc.). This is why most developers choose to build proprietary systems that meet their specific internal needs. For internal purposes, there's generally not a need to compromise. PB is a good example of this.

This also seems to be true in the medical device industry.  Within our product architectures we build components to best meet our customer requirements without regard for the outside world. Interfacing with others (interoperability) is generally a completely separate task, if not a product unto itself.

Interoperability is about creating standards which means having to compromise and make trade-offs.  It would be nice if Healthcare interoperability could be just a technical discussion like the PB vs. XML debate. This would allow better integration of standards directly into products so that there would be less of the current split-personality (internal vs. external  needs) development mentality.

Another thing I noticed about the PB announcement -- how quickly it was held up to XML as a competing standard. With Google's clout, simply giving it away creates a de facto standard. Within the medical connectivity world though, there is no Google.

I've talked about this before, but I'm going to say it again anyway. From my medical device perspective, with so many confusing standards and competing implementations the decision on what to use ends up not being based on technical issues at all. It's all about picking the right N partners for your market of interest, which translates into N (or more) interface implementations. This isn't just wasteful, it's also wrong. Unfortunately, I don't see a solution to this situation coming in the near future.