Here in Orlando at HIMSS'08 Google is barely noticeable and Microsoft is making a huge splash. Even so, Scott Shreeve unabashedly declares Google the PHR winner in his Getting Giga Over Google (Again) post:
My prediction: Google by a long shot. A really long, interconnected, collaborative, collective intelligence, networked kind of aggregated intelligence kind of a shot.
Whether you agree with Scott or not, it's a good read.
My only caution: It's never a good idea to underestimate Microsoft. They may come to the game late, but when they put their minds (and resources) to it, they often win.
The talk in the open source community regarding this announcement is understandably negative (read through the Slashdot commentary). According to the Interoperability Principles, Microsoft is opening up their protocols and APIs and embracing industry standards. That's good. But look at the Open Source Compatibility (I.5, my highlight):
Microsoft will covenant not to sue open source developers for development and non-commercial distribution of implementations of these Open Protocols.
This, plus the licensing of patented software "at low royalty rates," has made the Microsoft naysayers say "nay" (yet again) and declare that this is all just another marketing ploy.
Maybe so. This philosophical change is not meant to make Microsoft a FOSS (free open source software) company. Their goal is still to make money, which is exactly what their shareholders demand.
I think providing an improved forum for open source interoperability, and presumably better support moving forward, is a good thing. As long as Microsoft continues to own the desktop, the easier it is to interoperate with their "high-volume products", the better. This will certainly be true for the healthcare industry, which, like most others, depends on Microsoft products.
Interpreting motion and/or motor signals (facial expressions) is one thing. I'd love to know what type of EEG processing will be used to detect conscious thoughts and non-conscious emotions. At the very least, this type of quantitative EEG analysis has to begin with high quality EEG signals. I don't know how this can be done from an unprepared scalp and electrodes that are applied without gel.
UPDATE (3/2/08):
Via Slashdot, here (and here) is another game controller from OCZ (the Neural Impulse Actuator (NIA) is not currently a listed product). The NIA works by "... reading biopotentials. These include activities of the brain, the autonomous nervous system and muscle." You can't help but be skeptical about the value of this technology for these purposes.
Many of these are encompassed by the iterative Agile software development methodologies. Collectively they are sometimes referred to as the XDD acronyms. As you might expect, these along with all of the other competing, contrasting, and overlapping development philosophies can cause a software developer much consternation. Confessions of a Software Developer* is a good example of the overload that can occur.
My reason for bringing up driven methodologies is not to complain about being overwhelmed by them (which, like most others, I am). It's simply to point out the contradiction of X-Driven with the Merriam-Webster definition. I think this will help us better understand what should really be driving us.
Look closely at definition #2. Propelled or motivated by something ... results-driven. What is that something? Ah ha!
The fundamental motivation for all of these development approaches is to:
Improve productivity and quality.
This is the result, the goal. Behavior, Model, Test, etc. are all just the means by which we are trying to achieve this desired result. It's the result that we're driven towards, not the methods and techniques we use to get there.
So, in order to make this distinction clear and to eliminate confusion in the future, I propose that all these methodologies be renamed from Driven to Guided. Think of them like you would a GPS system in your car, except these will allow you to find software Nirvana. TDD is now TGD, and the whole lot is known as XGD.
The point here is that you should not let any particular development philosophy blind you to what the real purpose of using it is in the first place. Being guided by a methodology helps me remember that better than when I'm driven by it. Also, the whole concept of being driven seems exclusionary to me. You shouldn't hesitate to use the pieces and parts of any combination of these techniques that best suits your needs.
As reported here and here, the FDA is proposing to reclassify an MDDS from a Class III to a Class I medical device. On the surface this might seem like a big deal. If you read the fine print though (my highlight):
FDA has already recognized that the class III requirements are not necessary for ensuring the safety and effectiveness of MDDS devices and has been exercising enforcement discretion with MDDS device manufacturers. These firms have not been required to submit PMAs or meet other requirements typically required of manufacturers of class III devices, but the agency believes that all or nearly all firms in this industry have in place good business practices, including quality systems.
They're just formalizing already established practices.
I'm not a computer scientist. I'm also not one of the many über programmers that create and analyze software frameworks and techniques. I simply design and develop software that attempts to meet my customer's needs. To that end I'm always looking for the best tools available to get the job done.
Jeremy Miller states the importance of design patterns well:
I know many people blow off design patterns as ivory tower twaddle and silly jargon, but I think they're very important in regards to designing user interface code. Design patterns give us a common vocabulary that we can use in design discussions. A study of patterns opens up the accumulated wisdom of developers who have come before us.
My opinion: You don't need to be a rocket scientist to understand design patterns. Most are just common sense. Complex patterns are designed to solve complex problems. Design patterns should be thought of as a tool that you use just like any other. Don't let the 'ivory tower twaddle' scare you away.
I think most people would agree that one of the key components to creating a successful software product is quality. I've developed .NET applications in the past and have experienced the difficulty of testing and maintaining the functionality of Winform forms and components when they are created with the default Visual Studio tools. If you're not careful, here's what you end up with:
I should note here that the development of software for medical devices already has rigorous verification and validation processes to ensure quality. See FDA Good Manufacturing Practice (GMP - Quality System Regulation) subpart C–Design Control (§ 820.30 sections f & g). However, these requirements do not preclude the need for development techniques that make the software more robust and maintainable. On the contrary, the higher the software quality, the easier it is to meet these standards.
I've recently spent some time trying to select a GUI architecture that will allow us to create a more robust unit testing environment. This is why I started looking at Model-View-Controller (MVC) and Model-View-Presenter (MVP) design patterns. The need for these types of design patterns is twofold: to isolate application logic from the UI so it can be unit tested, and to make the resulting code more maintainable.
There are many articles and blog posts that describe MVC, MVP, and their numerous variations. These techniques have been around for many years, but the current cornerstone comes from these Martin Fowler articles:
Once you understand these concepts you can start to grasp the trade-offs of all of the available MVC/MVP flavors. If you're like me, one of the problems you'll run into is that there are so many different approaches (and opinions) that you'll be left wondering which is best to implement. The only advice you'll get is that it's a matter of choice. Great, thanks! From the article above, Josh puts it best:
If you put ten software architects into a room and have them discuss what the Model-View-Controller pattern is, you will end up with twelve different opinions.
This is when you turn from the theory and start looking for concrete implementations that might be suitable for your situation. Microsoft has released an ASP.NET MVC Framework as part of VS2008, but all of the Winform code samples I found were part of either blog posts or articles.
As you look at the different implementations (and relevant snippets), you quickly realize that following these design patterns requires significantly more work than just adding your application's logic directly to the IDE generated delegates. The additional work is expected and is the trade-off for improved testability.
That's fine, and worth it, but it's still time and money. We do not have the resources, or experience, to undertake a full Test-Driven Development (TDD) effort. We will implement MVC/MVP on only the displays that we feel are the most vulnerable.
I'm not going to list all of the candidate examples I looked at. I will mention that Jeremy's series of articles (here) dig deep into these issues and have lots of good code examples. Each approach has their pros and cons, just like the one I'll present here. We'll try to use it, but may end up with something else in the end. As we become more experienced, I suspect we'll evolve into a customized solution that best meets our needs.
This hybrid approach appealed to me for a couple of reasons. The first is that I spent several years doing Swing development, which uses an MVC architecture that also allows multiple simultaneous views of a single model. I also like the event-driven approach, which is not only heavily used in Java, but is also well supported in .NET. In any case, the View is passive and all of the important functional logic is centralized in the Controller class, which can be easily tested with real or mock Model and View objects.
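To make the division of responsibilities concrete, here is a minimal, language-neutral sketch of the passive-View idea (written in Python rather than C# just for brevity). All of the names here (VitalsModel, VitalsController, MockView, the heart-rate logic) are invented for illustration; the real implementation is the .NET code from the referenced article.

```python
class VitalsModel:
    """Holds state and notifies subscribers when it changes (event driven)."""
    def __init__(self):
        self._listeners = []
        self.heart_rate = 0

    def subscribe(self, listener):
        self._listeners.append(listener)

    def set_heart_rate(self, bpm):
        self.heart_rate = bpm
        for listener in self._listeners:
            listener(bpm)


class VitalsController:
    """All functional logic lives here; the View stays passive."""
    def __init__(self, model, view):
        self.model = model
        self.view = view
        model.subscribe(self.on_heart_rate_changed)

    def on_heart_rate_changed(self, bpm):
        # The controller decides what the view displays,
        # so this logic is testable without any GUI toolkit.
        self.view.show_heart_rate(bpm)
        self.view.show_alarm(bpm > 150 or bpm < 40)


class MockView:
    """Stands in for the real form in unit tests."""
    def __init__(self):
        self.displayed = None
        self.alarm = False

    def show_heart_rate(self, bpm):
        self.displayed = bpm

    def show_alarm(self, on):
        self.alarm = on
```

Because the Controller only talks to the View through a narrow interface, a unit test can instantiate the Controller with a MockView, push a value through the Model, and assert on what the "display" received, with no form ever being created.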
Matthew has done a good job of providing supporting generic classes that make implementation somewhat less cumbersome. The MvcControlBase class provides generic Control-View wiring, while ChangeRequestEvents manages all events in a single class.
The project download provided by the article is a VS2008 solution. We're still using VS2005, but I was able to back-port the project to VS2005 with only minor modifications that had no effect on functionality. The VS2005 project is available for download here:
I see adoption of MVC/MVP methodology for GUI development as a critical component for improvement in software reliability, quality, and long-term maintainability. Also, structuring the application side with MVC/MVP is only half the battle. Developing an effective testing strategy must go along with it in order to achieve these objectives. Until Microsoft provides an integrated Winforms MVC solution like they did for ASP.NET, we'll just have to continue to roll our own.
I'd like to hear about your experiences, suggestions, and recommendations on any of these topics.
There was a link in Joel's recent Five whys post that I found to be a fascinating read: A New Yorker article called The Checklist.
This got me thinking about how (and if) technology could facilitate the Checklist. The requirements aren’t really very high-tech:
The checklists provided two main benefits, Pronovost observed. First, they helped with memory recall, especially with mundane matters that are easily overlooked in patients undergoing more drastic events. … A second effect was to make explicit the minimum, expected steps in complex processes.
Dr. Pronovost’s efforts were successful because he was able to identify a single well focused process that ended up having a significant ROI. Within the complex workings of an ICU, the checklist ensured minimal compliance.
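The data model really is that simple. As a toy sketch (in Python), a checklist is little more than a fixed list of required steps plus a compliance calculation; the line items below paraphrase the central-line checklist described in the article, and the function name is my own invention:

```python
# The five central-line steps, paraphrased from the New Yorker article.
CENTRAL_LINE_CHECKLIST = [
    "wash hands with soap",
    "clean patient's skin with chlorhexidine",
    "cover patient with sterile drapes",
    "wear sterile mask, hat, gown, and gloves",
    "put sterile dressing over the insertion site",
]

def compliance(completed_steps):
    """Return the fraction of required steps performed and any missed steps."""
    missed = [s for s in CENTRAL_LINE_CHECKLIST if s not in completed_steps]
    done = len(CENTRAL_LINE_CHECKLIST) - len(missed)
    return done / len(CENTRAL_LINE_CHECKLIST), missed
```

The hard part, as the article makes clear, is not the software; it's getting busy clinicians to record the steps at all.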
Point of care computing is ready-made for this type of task, and has many obvious advantages over a paper-based system. As the article points out, even with its dramatic results and minimal cost, adoption is slow. One reason may be:
…where I.C.U. nurses and doctors are in short supply, pressed for time, overwhelmed with patients, and hardly receptive to the idea of filling out yet another piece of paper?
In reality though, the problem isn’t the piece of paper, and technology will also not improve the situation much. I think Dr. Pronovost hits the nail on the head with this statement (my highlight):
The fundamental problem with the quality of American medicine is that we’ve failed to view delivery of health care as a science. The tasks of medical science fall into three buckets. One is understanding disease biology. One is finding effective therapies. And one is insuring those therapies are delivered effectively. That third bucket has been almost totally ignored by research funders, government, and academia.
Dr. Pronovost's reaction to the HHS Office for Human Research Protections deciding that the "list method" was unethical and that the program had to stop: "Shocked." No duh! Also, the discussion regarding the increased error rates associated with computerized physician order entry (CPOE) is great -- yet another call for standardization.
UPDATE (2/29/08):
Just ran across this: The Checklist Saga: Victory! It seems that OHRP had a change of heart after all. It's great news when common sense (a.k.a. sanity) prevails!
Free Medical Software appears to be a comprehensive list that's currently being kept up-to-date (hat tip: LinuxMedNews). The project attributes and annotations make the list quite useful. Nice job Holger!
I read with interest "Art Vandelay's" (an anonymous Seinfeld fan and HIStalk contributor) comments regarding Patient Command Centers on HIStalk Update 1/21/08. Here's what the PCC is supposed to do:
It will manage service desks for IT, facilities, clinical engineering, and equipment, as well as clinical alerts and data from medical devices and the computerized patient record for singular issues and trended problems.
Wow! That's a tall order. Even with advanced tracking systems and AI software trying to make sense of all the activity (events), it seems to me that all these management tasks would require intensive human interaction.
"Art" also raised a question regarding device communications:
SNMP isn’t that complex. What are the chances of getting the medical device vendors to add this to their devices?
IMHO, probably not real good. SNMP was primarily used for the management of network devices (routers, hubs, etc.) and protocols. On the surface, its ability to manage large numbers of enterprise networked devices seems like it would be a good fit for hospitals. Unfortunately, the development and adoption of a unique object identifier (OID) for each medical device seems unlikely. Also, a new OID would require the SNMP manager software to be updated in order to recognize it and properly handle the new data.
As medical devices have become networked, the trend has been to embed an HTTP (Web) server in them. This allows secure remote access and control via any Web browser. Many other commercial networked devices have also taken this route.
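The embedded-server approach is straightforward to sketch with nothing but a standard library. The example below uses Python's built-in http.server purely for illustration; the endpoint path and status fields are made up, and a real device would of course serve this over HTTPS with authentication:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical device state; a real device would read this from hardware.
DEVICE_STATUS = {"model": "ExampleMonitor", "battery_pct": 87, "alarms": []}

class DeviceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/status":
            body = json.dumps(DEVICE_STATUS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, fmt, *args):
        pass  # keep the embedded server quiet

# On the device, this would run forever:
# HTTPServer(("", 8080), DeviceHandler).serve_forever()
```

Any browser (or monitoring script) pointed at the device's /status URL then gets machine-readable state, without needing device-specific management software on the client side.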
Even with my pessimistic view on a couple of the details, I don't think the vision for a PCC is any worse off than it was before. The concept of a PCC is broad and ambitious. As such, its value to an institution and implementation details need to be continually explored and refined.
...in healthcare these same types of solutions will save lives...as well as reduce stress levels...don't fear the technology that could some day save your life or the lives of others...
Yes, Software as a Service (SaaS) is a real technology that has a number of pros and cons. All major EMR/EHR software vendors effectively use server-based systems in order to protect data and provide disaster recovery.
The referenced video is essentially a Microsoft advertisement for their Windows Mobile and Surface technologies. This is cool stuff, but I don't see how it's going to save lives. I also doubt it will reduce the stress levels of clinicians.
If you're interested in SaaS topics, here are some good resources: