Archive for the ‘Interoperability’ Category

CCHIT, The 800-Pound Gorilla

Friday, August 29th, 2008

There's a lively discussion on HIStalk (Readers Write 8/27/08) regarding the merits of CCHIT.  The Jim Tate piece isn't long, or even that informative. It simply touts CCHIT as an

.. organization that is helping level the playing field and make it safer for clinicians to take the plunge into electronic records.

This seems innocuous enough.  Judging by the negative responses though, some people have real problems with CCHIT. In particular, the Al Borges, MD Response to "CCHIT: the 800-Pound Gorilla" is a detailed point-by-point rebuttal ("CCHIT is simply not necessary.").

I guess it's not surprising that the biggest issue seems to revolve around money. The cost of obtaining CCHIT certification has stratified EMR companies (big vs. small) and made it even more difficult for practices to see an ROI.

Also, the discussion of interoperability standards (or lack thereof) is one of my hot buttons. As Mahoghany Rush (comment #9) says:

Anyone here who talks about being HL7 compliant - and thinks it really solves a problem - has never personally written an interface.

So true.

In this regard Inga asks (in News 8/29/08):

Are lab standards an issue one of the various work groups is addressing? Are the labs on board?

When you say "lab," what you're really talking about is the large number of medical devices commonly found in both hospitals and private practice offices. As you note, interfaces to these devices are needed so that the data they generate can be associated with the proper patient record in the EMR. This not only gives a physician a more complete picture of the patient's status, but also vastly improves the efficiency of the entire clinical staff, who no longer have to gather all of this information from multiple sources.

The answer to your second question is yes, many "labs" -- that is, medical device companies -- are actively involved in the development of interoperability standards. The EMR companies are also major participants.

There are two fundamental problems with "standards" though:

  1. A standard is always a compromise.
  2. A standard is always evolving.

By their very design, the use of a standard will require the implementer to jump through at least a few hoops (some of which may be on fire).  Also, a device-to-EMR interface you complete today will probably not work for the same device and EMR a year from now -- one or both will be implementing the next-generation standard by then.
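HL7 illustrates this moving target at the message level: an HL7 v2 message carries its own version ID in field MSH-12, and an interface frequently has to branch on it as the systems around it upgrade. A minimal sketch (the message content here is made up for illustration):

```python
def hl7_version(msg: str) -> str:
    """Return the version ID (MSH-12) of an HL7 v2 message."""
    msh_fields = msg.split("\r")[0].split("|")
    return msh_fields[11]  # index 11 == MSH-12 (MSH-1 is the '|' itself)

# Hypothetical message header; real feeds vary by site and by version.
msg = "MSH|^~\\&|AcmeHIS|StJohn|ADT|StJohn|20060307110111||ADT^A04|MSG001|P|2.4"
print(hl7_version(msg))  # 2.4
```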

Nobody dislikes standards. Interoperability is usually good for business. There are two primary reasons why a company might not embrace communications standards:

  1. The compromise may be too costly, either from a performance or a resources point of view, so the company just does it its own way.
  2. The company builds a proprietary system explicitly to lock out other players. This is a tactic used by large companies that provide end-to-end systems.

The "standards" problem is not just a Healthcare interoperability issue; IT in every industry struggles with it. The complexity of Healthcare IT and its multi-faceted evolutionary path has just exacerbated the situation.

So, the answer is that everyone is working very hard to resolve these tough interoperability issues. Unfortunately, the nature of the beast is such that it's going to take a long time for the solutions to become satisfactory.

UPDATE (9/3/08): The response to Inga was published here: Readers Write 9/3/08. Thanks Inga!

Interoperability: Google Protocol Buffers vs. XML

Monday, July 14th, 2008

Google recently open sourced Protocol Buffers: Google's Data Interchange Format (documentation, code download). What are Protocol Buffers?

Protocol buffers are a flexible, efficient, automated mechanism for serializing structured data – think XML, but smaller, faster, and simpler.

The documentation is complete and worth a quick read-through. A thorough analysis of PB vs. XML can be found here: So You Say You Want to Kill XML...

As discussed, one of the biggest drawbacks for us .NET developers is that there is no support for the  .NET platform. That aside, all of the issues examined are at the crux of why interoperability is so difficult. Here are some key points from the Neward post:

  1. The advantage to the XML approach, of course, is that it provides a degree of flexibility; the advantage of the Protocol Buffer approach is that the code to produce and consume the elements can be much simpler, and therefore, faster.
  2. The Protocol Buffer scheme assumes working with a stream-based (which usually means file-based) storage style for when Protocol Buffers are used as a storage mechanism. ... This gets us into the long and involved discussion around object databases.
  3. Anything that relies on a shared definition file that is used for code-generation purposes, what I often call The Myth of the One True Schema. Assuming a developer creates a working .proto/.idl/.wsdl definition, and two companies agree on it, what happens when one side wants to evolve or change that definition? Who gets to decide the evolutionary progress of that file?

Anyone who has considered using a "standard" has had to grapple with these types of issues. All standards gain their generality by trading something off (speed, size, etc.). This is why most developers choose to build proprietary systems that meet their specific internal needs; for internal purposes, there's generally no need to compromise. PB is a good example of this.
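The size/flexibility trade-off Neward describes can be seen with nothing but the standard library. This is only a toy illustration -- the record and field layout are made up, and `struct.pack` stands in for PB's real varint wire encoding -- but the principle is the same: a shared schema buys compactness at the cost of flexibility.

```python
import struct
import xml.etree.ElementTree as ET

# A toy record -- illustrative only, not a real Protocol Buffers schema.
record = {"id": 12001, "weight": 72.5}

# XML: self-describing and flexible, but verbose.
elem = ET.Element("patient", id=str(record["id"]))
ET.SubElement(elem, "weight").text = str(record["weight"])
xml_bytes = ET.tostring(elem)

# Schema-based binary (what PB does in spirit): compact and fast to
# parse, but both sides must share the struct layout to decode it.
bin_bytes = struct.pack("<if", record["id"], record["weight"])

print(len(xml_bytes), len(bin_bytes))  # the binary form is a fraction of the size
```

Change the XML element names and nothing breaks; change the struct layout and every reader must be updated in lockstep -- which is exactly the schema-evolution question Neward raises.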

This also seems to be true in the medical device industry.  Within our product architectures we build components to best meet our customer requirements without regard for the outside world. Interfacing with others (interoperability) is generally a completely separate task, if not a product unto itself.

Interoperability is about creating standards, which means having to compromise and make trade-offs. It would be nice if Healthcare interoperability could be just a technical discussion like the PB vs. XML debate. That would allow standards to be integrated directly into products, reducing the current split-personality (internal vs. external needs) development mentality.

Another thing I noticed about the PB announcement -- how quickly it was held up to XML as a competing standard. With Google's clout, simply giving it away creates a de facto standard. Within the medical connectivity world though, there is no Google.

I've talked about this before, but I'm going to say it again anyway. From my medical device perspective, with so many confusing standards and competing implementations the decision on what to use ends up not being based on technical issues at all. It's all about picking the right N partners for your market of interest, which translates into N (or more) interface implementations. This isn't just wasteful, it's also wrong. Unfortunately, I don't see a solution to this situation coming in the near future.

Microsoft Interoperability Principles

Saturday, February 23rd, 2008

As with many Microsoft press offerings, Microsoft Makes Strategic Changes in Technology and Business Practices to Expand Interoperability has created quite a stir. When you think about interoperability, Microsoft isn't the first thing that pops into your head.

The talk in the open source community regarding this announcement is understandably negative (read through the Slashdot commentary). According to the Interoperability Principles, Microsoft is opening up their protocols and APIs and embracing industry standards. That's good. But look at the Open Source Compatibility (I.5, my highlight):

Microsoft will covenant not to sue open source developers for development and non-commercial distribution of implementations of these Open Protocols.

This plus the licensing of patented software "at low royalty rates" have made the Microsoft naysayers say "nay" (yet again) and that this is all just another marketing ploy.

Maybe so. This philosophical change is not meant to make Microsoft a FOSS (free open source software) company. Their goal is still to make money, which is exactly what their shareholders demand.

I think providing an improved forum for open source interoperability, and presumably better support moving forward, is a good thing. As long as Microsoft continues to own the desktop, the easier it is to inter-operate with their “high-volume products”, the better. This will certainly be true for the healthcare industries, which like most others, depend on Microsoft products.

UPDATE (3/4/08):

Check out Microsoft's Interoperability Principles and IE8 (and here). This is good news for the Web development community. Doing the right thing must be good for business.

EHR System Modeling

Tuesday, January 1st, 2008

I'm a regular Slashdot reader and it's rare to come across a health care related post. So Arguing For Open Electronic Health Records of course caught my eye. I'm sure it was the open standards aspect that attracted them, but I also wanted to point out why the use of software modeling is so important to the development of EHRs.

The Tim Cook post is interesting in several respects.

The first is the reiteration of the importance of the "lack of true interoperability standards" and its effect on EMR adoption. I've talked about this numerous times.

Another important point is that even though open source licensing may be free, the real costs of implementing any EHR system (i.e. going paperless) are significant.

The importance of understanding and communicating the "semantic context" of patient data is also a key concept.

The goal of the openEHR open-source project is to provide a model and specifications that capture patient data without loss of semantic context. A "two-level modeling" approach is used (from here):

Two-Level Software Engineering

Within this process, IT developers concentrate on generic components such as data management and interoperability, while groups of domain experts work outside the software development process, generating definitions that are used by systems at runtime.
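The idea can be sketched in a few lines. Everything below is hypothetical -- real openEHR archetypes are far richer -- but it shows the split: the engine code is generic and knows nothing about medicine, while the clinical knowledge lives in a definition that domain experts author and the system interprets at runtime.

```python
# Hypothetical archetype, authored by domain experts outside the
# software development process (real openEHR archetypes are far richer).
BLOOD_PRESSURE = {
    "name": "blood_pressure",
    "fields": {"systolic": (0, 300), "diastolic": (0, 200)},  # valid ranges
}

def validate(archetype, data):
    """Generic engine code: knows nothing about blood pressure."""
    for field, (lo, hi) in archetype["fields"].items():
        value = data.get(field)
        if not isinstance(value, (int, float)) or not lo <= value <= hi:
            raise ValueError(f"{archetype['name']}: bad {field}: {value!r}")
    return data

validate(BLOOD_PRESSURE, {"systolic": 120, "diastolic": 80})  # passes
```

A new clinical concept means a new archetype, not a new release of the engine -- which is precisely the separation the two-level model is after.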

If you've done any work with Microsoft's WPF, this model should look familiar. Separation of responsibilities (designer vs. developer) is one of the fundamental shifts in GUI development that XAML provides. Separating the domain experts from the developers when building a health care IT system is also clearly beneficial.

No matter how good the openEHR model is, it unfortunately has the same adoption problems as many other health care interoperability systems: competing "standards". For example, HL7 V3 Reference Information Model (RIM) and CEN 13606 have the same goals as openEHR.

Developing software systems based on conceptual models of the real world is not new. For example, the OMG Model-driven Architecture (MDA, also see here):

These platform-independent models document the business functionality and behavior of an application separate from the technology-specific code that implements it, insulating the core of the application from technology and its relentless churn cycle while enabling interoperability both within and across platform boundaries.

These types of systems not only provide separation of responsibilities but are also designed to provide cross-platform interoperability and to minimize the cost of technology changes.

The future of inter-operable EHR systems will depend on choosing a common information/behavior model so that the benefits of these development technologies can be realized. The challenge is to make the use of a framework that implements that model something that all stakeholders find advantageous.

HL7 Interfacing: The last mile is the longest.

Saturday, December 15th, 2007

Tim Gee mentions the Mirth Project as a cost effective solution for RHIOs (regional health information organizations). In particular, he notes that the WebReach appliance is "ready to go" hardware and software.

I've recently started looking at HL7 interface engines for providing our ICG electronic records to customer EMR systems. I've mainly been evaluating Mirth and NeoIntegrate from Neotool.

One of the Neotool bullet points about HL7 V2 states:

Not “Plug and Play” - it provides 80 percent of the interface and a framework to negotiate the remaining 20 percent on an interface-by-interface basis

Since HL7 V2 is the most widely adopted interface in the US, that last 20% can be a significant challenge. This is one of the primary purposes for HL7 integration tools like Mirth and NeoIntegrate -- to make it as easy as possible to complete that last mile.
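What does that remaining 20% look like in practice? Often it's small site-specific fix-ups layered on an otherwise standard message -- the kind of transformation an interface engine channel encodes. A hypothetical example (the field position and code mapping are invented for illustration):

```python
# Hypothetical: "Site A" sends nonstandard gender codes in PID-8, so the
# channel remaps them before the message reaches the EMR.
GENDER_MAP = {"1": "M", "2": "F", "9": "U"}

def site_a_fixup(hl7_message: str) -> str:
    segments = hl7_message.split("\r")
    for i, seg in enumerate(segments):
        fields = seg.split("|")
        if fields[0] == "PID" and len(fields) > 8:
            fields[8] = GENDER_MAP.get(fields[8], fields[8])
            segments[i] = "|".join(fields)
    return "\r".join(segments)

msg = "MSH|^~\\&|SiteA\rPID|||12001||Jones^John||19670824|1|"
print(site_a_fixup(msg))  # PID-8 becomes "M"
```

Multiply this by every nonstandard field, every site, and every device, and the appeal of a tool that lets you configure such transformations instead of hand-coding them becomes obvious.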

If you look closely at the Mirth appliance literature you'll see this in the Support section:

For customers requiring assistance with channel development, WebReach consulting and engineering services are available, and any custom work performed by WebReach can be added to your annual support agreement.

They're providing a turn-key hardware and integration engine system, but you either have to create the custom interfaces yourself or hire them (or someone else) to do it for you.

<AnalogyAlert>
This means that you have bought the hammer and identified the nail(s) to pound in. All you need to do now is find and hire a carpenter to complete the job.
</AnalogyAlert>

This really shouldn't be that surprising though. Custom engineering and support is the business model for the WebReach Mirth project and I'm sure a large revenue generator for Neotool.

There is certainly great value in being able to purchase a preconfigured and supported HL7 interface appliance. Just be aware that it's not quite ready to go.

Update 17-Dec-07:

If anyone has experience using HL7 integration engines that they'd like to share, I'd love to hear from you (preferably through the comments so they're shared, but private mail is also fine). In particular, I know there are a number of offerings competing with the ones mentioned in this post, and it would be good to know whether they are worth evaluating. Thanks!

Personal Health Record Challenges

Tuesday, December 11th, 2007

The Journal of the American Medical Informatics Association has an article entitled The Early Experiences with Personal Health Records [Volume 15, Issue 1, January-February 2008, Pages 1-7] (found via Constructive Medicine 2.0).

I've discussed PHR a number of times in the past. This article details three PHR implementations and discusses past and future challenges. Many of these are access policy issues -- privacy, security, data stewardship, and personal control. There are also many technical challenges. For example:

Thus, we will need to modify our existing PHR systems to support a service oriented architecture that permits multiple applications to retrieve our institutional data with patient control and consent. Providing such an architecture will require the nation to create and adopt national standards for clinical data content transmission, terminology and security to ensure interoperability.
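The "patient control and consent" requirement can be made concrete with a small sketch. This is entirely hypothetical -- a real service-oriented PHR would involve authentication, audit trails, and standardized data content -- but it shows the core idea: every retrieval is gated by consent the patient has explicitly granted.

```python
from typing import Dict, List, Set

class PHRService:
    """Hypothetical consent-gated record service (illustration only)."""

    def __init__(self) -> None:
        self.records: Dict[str, List[str]] = {}
        self.consents: Dict[str, Set[str]] = {}  # patient -> allowed requesters

    def grant(self, patient: str, requester: str) -> None:
        """Patient grants a requester access to their records."""
        self.consents.setdefault(patient, set()).add(requester)

    def fetch(self, patient: str, requester: str) -> List[str]:
        """Retrieval fails unless the patient has consented."""
        if requester not in self.consents.get(patient, set()):
            raise PermissionError(f"{patient} has not consented to {requester}")
        return self.records.get(patient, [])

svc = PHRService()
svc.records["pat1"] = ["2008-01-02: HbA1c 6.1%"]
svc.grant("pat1", "clinic_a")
print(svc.fetch("pat1", "clinic_a"))  # allowed; "clinic_b" would be refused
```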

I've also previously discussed interoperability in some detail. An SOA is a great idea, but with so many standards and proprietary implementations out there, it seems like the only way a national standard will ever be adopted is if the federal government steps in. This would require not only defining what the standards are, but also specifying which types of PHR/EMR systems are affected, along with a timeline for mandatory implementation.

All of that seems like a tall order. The technical aspects alone would be tough, but getting it done would also require a political process that would likely drag on for years. You really have to wonder whether a national standard for interoperability is even possible.

Their conclusion regarding the importance of the P in PHR is a good one:

By placing the patient at the center of healthcare data exchange and empowering the patient to become the steward of their own data, protecting patient confidentiality becomes the personal responsibility of every participating patient.


On a semi-related note (i.e., it isn't worth a separate post, so I'll park it here), I finally saw Michael Moore's movie SiCKO the other day. It achieved a certain level of shock value regarding how people in the US are treated by insurance companies and HMOs, but it was a disappointment overall. It made no effort to further the understanding of why the health care system in this country is so messed up in the first place, or to suggest ways to improve it. Instead it just showed the US problems and quickly turned to how great the socialized systems in the UK, France, and Cuba are (gosh, you mean you don't have to pay any money at all!). Oh well, that's Hollywood for you.

Healthcare IT Interoperability Defined

Tuesday, November 20th, 2007

I guess I've been obsessed with interoperability (or lack thereof) lately.

Definitions:

interoperability


Main Entry: in·ter·op·er·a·bil·i·ty
Pronunciation: \ˌin-tər-ˌä-p(ə-)rə-ˈbi-lə-tē\
Function: noun
Date: 1977
: ability of a system (as a weapons system) to work with or use the parts or equipment of another system

Interoperability:

The ability of two or more systems or components to exchange information and to use the information that has been exchanged.

From the National Alliance for Health Information Technology (NAHIT) definition:

In healthcare, interoperability is the ability of different information technology systems and software applications to communicate, to exchange data accurately, effectively, and consistently, and to use the information that has been exchanged.

The four NAHIT levels are:

  1. Non-electronic data. Examples include paper, mail, and phone call.
  2. Machine transportable data. Examples include fax, email, and un-indexed documents.
  3. Machine organizable data (structured messages, unstructured content). Examples include HL7 messages and indexed (labeled) documents, images, and objects.
  4. Machine interpretable data (structured messages, standardized content). Examples include the automated transfer from an external lab of coded results into a provider’s EHR. Data can be transmitted (or accessed without transmission) by HIT systems without need for further semantic interpretation or translation.

From IEEE 1073 (Point of Care Medical Device Communications) and IEEE-USA Interoperability for the National Health Information Network (NHIN) -- original definitions are from IEEE Standard Computer Dictionary: Compilation of IEEE Standard Computer Glossaries, IEEE, 1990:

Functional: The capability to reliably exchange information without error.

  • Shared architectures (conceptual design)
  • Shared methods (processes and procedures)
  • Shared frameworks (goals and strategies)

An architecture is the conceptual design of the system. Systems inter-operate if their architectures are similar enough that functions that execute on one system execute identically (or nearly identically) on another system.

Shared methods refer to the processes and procedures that a system performs. To ensure interoperability, these operations must be capable of being performed identically at any point in the network, regardless of implementation.

A shared framework is a shared set of goals and strategies. Stakeholders must agree on a shared set of goals and approaches to implementation.

Semantic: The ability to interpret, and, therefore, to make effective use of the information so exchanged.

  • Shared data types (types of data exchanged)
  • Shared terminologies (common vocabulary)
  • Shared codings (standard encodings)

Shared data types refer to the types of data exchanged by systems. Interoperability requires that systems share data types on many different levels, including messaging formats (e.g. XML, ASCII), and programming languages (e.g. integer, string).

Shared terminologies refer to establishing a common vocabulary for the interchange of information. Standardized terminology is a critical requirement for healthcare applications to ensure accurate diagnosis and treatment, and has led to developing standards such as SNOMED-CT.

Shared codings refer to establishing standard encodings to be shared among systems. Codings refer not only to encoding software functions, but also to encoding medical diagnoses and procedures for claims processing purposes, research, and statistics gathering (e.g. ICD9/10, CPT).

At Healthcare Informatics Technology Standards Panel (HITSP) I came across a maturity model of interoperability types from the Mayo Clinic. Even though this table is taken out of context, the Technical-Semantic-Process model shows yet another view of interoperability.

Mayo Clinic Interop Types

There are also a number of Models of Interoperability that describe abstract interoperability. For example, a finer grained layered model is called Conceptual Interoperability. This model encompasses the previous healthcare IT definitions:

Levels of Conceptual Interoperability Model (LCIM)

Besides these definitions there are many articles (and books), especially as it relates to healthcare and EMR/EHR, that espouse the benefits of interoperability.

From a broader software-industry point of view, you can imagine the number and variety of issues that a company like Microsoft has to deal with. They claim Interoperability by Design. Of course, Microsoft has gotten a lot of attention for their push to get Office Open XML (OOXML) approved as a standard by ECMA -- some of it quite negative:

NoOOXML

Even though I don't believe the healthcare industry has the equivalent of 'Da Bill' (or maybe it does?), this points out one of the necessary components for the implementation of interoperability: standards.

I was on an ASTM standards committee a number of years ago (E1467-94, now superseded), so I have some understanding of how difficult it is to get consensus on these types of issues. The process is long (years) and can sometimes be contentious. Even after a standard has been balloted and approved, there's no guarantee that it will be widely adopted.

Summary:

In my previous post on this subject I pointed out the plethora of healthcare IT standards and their confusing inter-relationships. The definitions of interoperability can be just as confusing. Each of the different models has a slightly different way of looking at the same thing. This is certainly one reason why there are so many overlapping standards.

Conclusion:

Interoperability in healthcare IT is multi-faceted and complex. This makes it difficult to agree upon exactly what it is (the definition) and even harder to develop the standards on which to base implementations.

Healthcare Un-Interoperability

Wednesday, November 7th, 2007

Or maybe that should be "non-interoperability"? Anyway, I have ranted in the past about the state of the EMR industry. I thought I'd add a little meat to the bone so you could better appreciate the hurdles facing device interoperability in healthcare today.

Here's a list of the standards and organizations that make up the many components of health information systems. I'm sure that I've missed a few, but these are the major ones:

Medical Coding

  • SNOMED (Systematized Nomenclature of Medicine)
  • LOINC (Logical Observation Identifiers Names and Codes)
  • ICD9/10 (The International Classification of Diseases)
  • CPT (Current Procedural Terminology)

Organizations

  • FDA CDRH (Food and Drug Administration Center for Devices and Radiological Health)
  • NHIN (Nationwide Health Information Network)
  • HIMSS (Healthcare Information and Management Systems Society)
  • CCHIT (Certification Commission for Healthcare Information Technology)
  • PHIN (Public Health Information Network)
  • VISTA (Veterans Health Information Systems and Technology Architecture)

Standards

  • HL7 (Health Level Seven: v2 and v3)
  • HIPAA (The Health Insurance Portability and Accountability Act of 1996)
  • 21 CFR Part 11 (FDA/HHS Electronic Records and Signatures)
  • IEEE-1073 (Point of Care Medical Device Communications)
  • IHE (Integrating the Healthcare Enterprise)
  • DICOM (Digital Imaging and Communications in Medicine)
  • HITSP (Healthcare Information Technology Standards Panel)
  • EHRVA (HIMSS Electronic Health Record Vendors' Association)
  • NCPDP (National Council for Prescription Drug Programs)
  • openEHR (International Foundation that promotes Electronic Health Records)
  • CEN (European Committee for Standardization)
  • CCR (Continuity of Care Record)
  • ANSI X12 (Electronic Data Interchange)
  • MLLP (Minimal Lower Layer Protocol)
  • ebXML (Electronic Business using eXtensible Markup Language)

This list does not include any of the underlying transport or security protocols. The standards above are either data formats (many based on XML) or specialized messaging systems.

The diagram below gives an overview of how many of these standards are related (from an IEEE-USA purchased e-book -- copying granted for non-commercial purposes):

Taxonomy of Core Standards for the NHIN

I don't know about you, but trying to make sense of all these standards and protocols is not an easy task. A discussion of next generation PHRs summarizes the situation well:

... not only is information scattered, but standards for defining and sharing the data are still evolving; where standards exist, many of them predate the Internet. Standards about how to define consistently the information (clinical standards) and to transmit and exchange the information (technical standards) are not yet formalized and agreed upon.

The point about predating the Internet is an important one. It particularly pertains to HL7 v2.x, which still uses ASCII delimited messages designed for transmission over serial lines. For all you 21st-century programmers who may never have seen one before, here's what an HL7 v2.x message looks like:

MSH|^~\&|AcmeHIS|StJohn|ADT|StJohn|20060307110111||ADT^A04|MSGID20060307110111|P|2.4
EVN|A04
PID|||12001||Jones^John||19670824|M|||123 West St.^^Denver^CO^80020^USA
PV1||O|OP^PAREG^||||2342^Jones^Bob|||OP|||||||||2|||||||||||||||||||||||||20060307110111|
AL1|1||3123^Penicillin||Produces hives~Rash~Loss of appetite
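Parsing such a message is mostly string splitting: segments are separated by carriage returns, and fields by the '|' delimiter declared in MSH. A minimal sketch using a fragment of the sample above (real-world parsing must also honor the encoding characters, repetitions, and escape sequences):

```python
# Segments are '\r'-separated; fields within a segment are '|'-separated.
msg = (
    "MSH|^~\\&|AcmeHIS|StJohn|ADT|StJohn|20060307110111||ADT^A04|MSGID20060307110111|P|2.4\r"
    "EVN|A04\r"
    "PID|||12001||Jones^John||19670824|M|||123 West St.^^Denver^CO^80020^USA"
)

segments = {line.split("|", 1)[0]: line.split("|") for line in msg.split("\r")}

pid = segments["PID"]
patient_id = pid[3]              # PID-3: patient identifier
last, first = pid[5].split("^")  # PID-5: patient name, '^'-delimited components
print(patient_id, first, last)   # 12001 John Jones
```

Twenty lines for the happy path; the "last mile" is all the site-specific deviations from this layout that an interface engine has to absorb.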

HL7 v3 uses XML for its message format, but it has not yet been widely adopted. A good history of HL7 v2 and v3, and an explanation of their differences, can be found here (pdf).

HL7 v2 is commonly used in hospitals to communicate between medical devices and EMR/HIS systems. Even though the communications framework is provided by HL7, new interfaces must still be negotiated, developed, and tested on a case-by-case basis.

Most of the large EMR companies provide HL7 interfaces, but many of the smaller ones do not. Hospitals are not their primary market, so they generally don't need device interfaces. These EMRs are essentially clinical document management, patient workflow, and billing systems. The only external data they may deal with are scanned paper documents that can be attached to a patient's record. The likelihood that they conform to any of the standards listed above is low.

I'm not sure things will improve much with the recent PHR offerings from Microsoft (HealthVault) and Google (Google Health -- not yet launched). Microsoft appears to be embracing some of these standards as discussed in Designing HealthVault’s Data Model, but there are a couple of telling comments:

Some of the data types we needed in order to support our partners’ applications were not readily available in the standards community.

Our types also allow each vendor to add “extensions” of their own making to item data – so to the extent that we are missing certain fields, they can be added – and the industry can rally around those extensions if it makes sense.

Microsoft says they are not the "domain experts", so they're leaving it to the industry to sort it all out. Great! This is probably the same attitude that got us to where we are today.

Hopefully you can now see why I've used the words "mess" and "chaos" to describe the current situation. The challenges facing interoperability in healthcare are immense.

The EMR-Medical Devices Mess

Saturday, September 22nd, 2007

Tim Gee's EMR Connectivity for Medical Devices Is a Mess post is right on target. I've also expressed my opinion ("total chaos") on this, but as you'll see I'm coming from a different perspective.

The HealthLeaders Technology article primarily focuses on the challenges of interfacing medical devices in the hospital environment. I think the situation in a physician's private practice or small group office is even worse.

The needs for medical device connectivity in both environments are essentially the same:

  • Reduction in medical errors.
  • Decrease in paperwork.
  • Increased staff productivity.
  • Timely delivery of clinical results.

Hospitals have the advantage of working with big EMR vendors who in turn can provide connectivity solutions for a larger variety of medical devices. Also, large hospital chains have enough clout to be able to mandate performance criteria from their vendors that include interoperability.

The typical physician's office uses a smaller EMR provider (or even worse, a 'home brew' system) where, in many cases, connectivity with external devices is an afterthought, if it exists at all. Even when you work with mid-sized EMR companies, each has its own proprietary external data interface. Very few of these smaller stand-alone EMR management systems provide standard interfaces (e.g. HL7) for external device data capture.

The interoperability problem is not a technical one. The issue is the time and resources it takes to implement and validate a given medical device interface. The real hope of a MD PnP-like solution is that the cost of that interface can be significantly reduced.

So, here's my perspective. As a medical device manufacturer, when we take our device to a private-practice physician who has an existing (or planned) EMR system, the first requirement is pretty much always the same: the diagnostic results from our device should automatically appear in the patient's record in their EMR system. To make this happen, they have to choose among three possibilities:

  1. Pay the medical device manufacturer (us) to interface with their EMR system.
  2. Pay their EMR vendor to do the interface.
  3. Find a third party vendor or contractor that will provide or build the custom interface.

The problem with #1 is that we don't have the resources to build each unique interface required to satisfy all of our customers. Besides, our business is building medical devices, not EMR solutions.

#2 might not work out because, unless the EMR company has a lot of customers using our devices, it will either charge a large custom engineering fee or simply refuse to do it at all. The third option is doable, but is also potentially costly.

Notice that all of the choices require additional investment by the physician. I wonder if these types of issues may be one of the contributing factors for the low adoption rate of EMR for office-based physicians.

Not being able to provide cost-effective EMR integration is bad for everyone involved. It's bad for a medical device manufacturer (like us) because it makes it that much harder to sell systems. It's bad for the physician's office because without EMR integration they'll end up with a less effective paper-based solution. It's also bad for the EMR companies because they won't be able to take advantage of the future opportunities that a fully integrated medical office would provide.

The reasons may be different, but my conclusion about EMR connectivity with medical devices is the same as Tim's: “It’s a mess.”

UPDATE (7/21/08): Ran across this MIT Technology Review post about the MD PnP program:
"Plug and Play" Hospitals (Medical devices that exchange data could make hospitals safer).

Google, Microsoft, and Health

Wednesday, August 15th, 2007

I think the recent New York Times article entitled Google and Microsoft Look to Change Health Care missed the bigger picture. The article talks about other Internet companies (like WebMD), but it makes no mention of the Federal Government's involvement in this arena.

In particular is the Nationwide Health Information Network (NHIN) which was initiated by an executive order in April 2004:

The Nationwide Health Information Network (NHIN) is the critical portion of the health IT agenda intended to provide a secure, nationwide, interoperable health information infrastructure that will connect providers, consumers, and others involved in supporting health and healthcare. The NHIN will enable health information to follow the consumer, be available for clinical decision making, and support appropriate use of healthcare information beyond direct patient care so as to improve health.

At the end of May NHIN published four prototype architectures. The proposals are standards-based, use decentralized databases and services ('network of networks'), and try to incorporate existing healthcare information systems. The companies involved were Accenture, CSC/Connecting for Health, IBM, and Northrop Grumman.

It seems to me that Google and Microsoft are using their proprietary technologies to try to achieve the same goals as NHIN. One of the major differences of course is transparency. Everything that NHIN does is open to public scrutiny whereas GOOG/MSFT have their own market research programs and keep their strategies (for making money) close to the vest.

Besides ensuring privacy, I would argue that one of the key components for creating a successful NHIN is interoperability. Even with "standards" like HL7 and DICOM being available, IMHO the current state of the Electronic Health/Medical Records industry is total chaos. Just like GOOG/MSFT are creating their own islands of knowledge, there are a lot of other vendors (84 listed on Yahoo! Directory) doing the same. As a medical device developer trying to interface with customer EMR systems, we're faced with having to provide essentially unique solutions to (what seems like) just about every customer. If that's the reality down here in the trenches, a NHIN is most likely a very long way off.

In a related item, there are some screenshots from the future Google Health service (codenamed "Weaver") here.

Update: Dr. Bill Crounse at the HealthBlog also has some thoughts about the NYT article: Doctor Google and Doctor Microsoft; if not them, who?