I guess I'm a sucker for EEG-related technology (see all my HCI posts). So when I run across an article like A baseball cap that reads your mind, I can't help but comment on it.
Unlike other "mind reading" systems that make unrealistic claims, this research and its wireless technology could lead to something quite useful. The ability to discern closed eyes and drowsiness from the presence of alpha waves (8-12 Hz) in human EEG is well known.
Developing an affordable product that provides a timely audible alert to a driver who's about to fall asleep could have a huge impact. From Fatigue and Automobile Accident Risk (beware, this is a PowerPoint presentation):
The US Department of Transportation estimates that 100,000 reported accidents are due to drowsiness and/or fatigue. These crashes result in 1,550 deaths annually (4% of traffic fatalities) and $12.5 billion in monetary losses.
Even the annoyance of false alerts would be a small price to pay for the lives saved. And of course it's convenient that a lot of truck drivers already wear baseball caps.
Last year I looked at the MSCUI. Along with the v1.3 release comes a glimpse into the future. The user experience is greatly enhanced by the use of Silverlight 2.0. Check out the Patient Journey Demonstrator (you'll need to install Silverlight first):
The UI is busy and complex, but there seems to be an intuitive consistency to the madness. Besides the slick look, one of the great things about Silverlight (and WPF) is the transition animations that make using the controls a feel-good experience.
Most of the Silverlight/WPF demos I've seen are photo/video mashup types of applications. The MSCUI clinical workflow and patient record demos give you a much better appreciation for how this UI technology can be put to effective use.
I still think the MSCUI Design Guidance documents are the key to success. Everybody goes gaga over the cool graphics and animations, but it's these design requirements that need to drive the use of the technology.
If you follow Microsoft comings and goings, one of the more interesting developments (at least to me) over the last 8 months has been the formation of a community that calls itself ALT.NET:
like-minded individuals within the larger world of the Microsoft® .NET Framework who felt a growing frustration that Microsoft tooling, guidance, and .NET culture at large did not reflect or support an important set of core values.
The name is misleading because even though most members are from the .NET community, the group's purpose is to promote a set of core values that are platform/language independent. To summarize from Jeremy's article:
Keeping an eye out for a better way.
Adopting the best of any community.
Not being content with the status quo -- experimenting with techniques.
Remembering that it's the principles and knowledge that really matter.
The members of the ALT.NET group are distinguished technologists, and many are productive bloggers, e.g. codebetter.com and Ayende@Rahien. The altdotnet discussion group is very active (over 6200 posts since the beginning of the year) and lively, and there are periodic group meetings (see the ALT.NET site for links) that use Open Space Technology (OST) to organize conference agendas. Check out the interesting videos (by David Laribee) from the recent conference in Seattle.
So why are ALT.NETters not like the rest of us? We're experienced developers who use modern tools and techniques, but we:
Know about Agile and all the different "driven" software development approaches, but have never had the opportunity to fully embrace any of them.
Have heard about Boo, Spec#, and F#, but have never used them.
This list could go on and on. Many of us have never used an ORM or the MVC design pattern either. The point isn't what we know versus what they know. I've talked about Stereotyping Programmers before and how it's just plain bad. I think the ALT.NET community has made a conscious effort to improve their inclusiveness.
The ALT.NET group is certainly on the cutting edge of useful and innovative software technologies and techniques. We may not understand everything they're talking about, but the conversation is well worth listening to. Someday you may be faced with a challenge that will need just the type of solutions they've been discussing.
The New York Times ran a couple of articles over the last few days dealing with short-term memory loss.
David Brooks' 11-Apr-08 commentary called The Great Forgetting summarizes the Bad Memory Century best with:
In the era of an aging population, memory is the new sex.
In addition to taxes and death, aging is something that none of us can avoid. I was born in the latter part of the baby boom, so over the last few years I have become acutely aware of these short-term memory challenges. Remembering to write down thoughts and lists has become essential. And I'm still young, relatively speaking anyway. Like the other inevitables, you never think it's going to happen to you. But it does!
The 13-Apr-08 Sunday Magazine Idea Lab column Total Recall speculates about embedding a computer chip in the brain in order to improve short-term memory. I know that short-term memory loss is a problem, but who would have guessed that "sky divers have been known to forget to pull their ripcords — accounting, by one estimate, for approximately 6 percent of sky-diving fatalities"!
Interestingly, the brain is a particularly effective associative memory system:
..., studies suggest that if you learn a word while you happen to be slouching, you’ll be better able to remember that word at a later time if you are slouching than if you happen to be standing upright.
Neural prosthetics are a long way off (see here), but the concept of embedding a Google search engine in your brain is certainly intriguing -- and scary to most.
This presentation (pdf) was given by Matthew Holt at the MS-HUG Tech Forum, Feb. 24, at HIMSS08 in Orlando (all presentations are here).
There are many informative slides that cover EMR adoption, consumer health demographics, the progression of IT, Web 2.0, and a definition of Health 2.0:
It's too bad there's not a video available of the presentation. I'd love to hear a full explanation of the Mongolian Marxists and their relationship to healthcare IT.
There are five slides on PatientsLikeMe. This site and many of the related issues are well covered in the New York Times Magazine article last month called Practicing Patients.
If you've never run across it before, the TIOBE Index is an interesting measure of the popularity of current programming languages.
The definition of how the index is calculated is here. The rankings are based on the relative number of "hits of the most popular search engines." As noted in the FAQ, even these relative numbers can be affected by changes in the methodology used by a search engine.
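As a back-of-the-envelope illustration of that kind of hit-based ranking (my own toy version with invented numbers, not TIOBE's actual multi-engine weighting):

```python
# Invented hit counts for a query like +"<language> programming";
# the real index aggregates several search engines with weights.
hits = {"Java": 9_000_000, "C": 7_500_000, "C++": 5_000_000, "Python": 2_000_000}

total = sum(hits.values())
for lang, n in sorted(hits.items(), key=lambda kv: -kv[1]):
    print(f"{lang:8s} {100 * n / total:5.1f}%")
```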
I've looked at Google Trends in the past, so I thought I'd do a quick check on some of the highly ranked non-grouped languages (here):
This result is surprising in that C seems to have more hits than either Java or C++. Maybe it's because GT doesn't include Google Blogs (or does it?), MSN, Yahoo!, and YouTube. With Google having ~70% of the search market share (see here), though, the addition of the others shouldn't make that much of a difference. It would be nice to understand the discrepancy.
I can't say I have a lot of faith in Google Trends results in general though. Their definition:
Google Trends analyzes a portion of Google web searches to compute how many searches have been done for the terms you enter, relative to the total number of searches done on Google over time. We then show you a graph with the results -- our search-volume graph -- plotted on a linear scale.
This explains the general downward trend -- there are a lot more searches being done overall relative to these terms. Actually, just about any search term produces a similar-looking plot. Only those that have specific events (like this), an annual activity (this), or seasonal fluctuations (this) show anything of interest. Trying to use GT for anything else seems fruitless.
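The arithmetic behind that downward slope is simple: if a term's absolute search count stays flat while total searches grow, its relative share has to fall. A toy example (numbers invented):

```python
# A term searched a constant 1,000 times/month while total searches
# grow 5% per month: its relative share falls every month.
term, total = 1_000, 1_000_000
for month in range(1, 13):
    print(f"month {month:2d}: relative share = {term / total:.6f}")
    total *= 1.05
```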
Anyway, as to the usefulness of the TIOBE Index:
The index can be used to check whether your programming skills are still up to date or to make a strategic decision about what programming language should be adopted when starting to build a new software system.
Some people think The TIOBE index is meaningless. There's also some good analysis and discussion in Programming Language Trends. I don't think the index is meaningless, but using it for either of the stated purposes should be put into perspective.
The first thing to note is that, in terms of the mainstream (A) languages, not a lot has changed in the last five years. Because these trends move so slowly, an A language you pick today will probably still be an A language four or five years from now.
The trends for lesser-used languages (B and below) are of academic interest only, and the position jockeying ('Python beats Perl') is meaningless without knowing the reason why. If you want to know what the next A-list language is going to be, try Top 10 programming languages of the future.
More important, though, is that your language choice(s) should be based on the best practices of the industry in which you work. The selection of a language or tool is based on many criteria, and the TIOBE (or any other) list position is most likely going to be a small part of that decision process.
UPDATE (4/12/09): Just ran into this site - Programming Language Popularity which appears to use actual search results (instead of hits) for its rankings.
Creating a corporate culture that can carry out a quality system plan is a big challenge, especially for new medical device companies. As this figure aptly shows (hacked from page 42), aligning the organization with a formal project Mission Statement, Scope, Goals, and Communication Plan is a key component for implementing an effective CAPA system.
The survey statistics on the use of risk management tools and CAPA effectiveness are interesting. Of particular interest, of course, is the number of firms cited by the FDA for CAPA deficiencies. The slides detailing CAPA implementation and advice are also quite informative.
I stared at the "Big C" (page 23) for a while. I know it's trying to show the relationship between CAPA (820.100, "little c"?) and the rest of the quality system. I thought I figured it out a couple of times, but in the end I was not able to decipher its true meaning. I could buy the audio CD to find out or maybe Jan can just clarify it for us.
The human-computer interface (HCI) will continue to be a major challenge for the future. The iPoint Presenter is an approach that makes a lot of sense. It's been depicted as the future of computer interaction in movies like Minority Report and could easily be imagined as the next generation Wii.
Unlike the EEG-based "mind reading" devices that I've discussed before, this technology could be made affordable and reliable, so it holds much more promise. Plus, it's very cool.
UPDATE (3/5/08):
This is HCI-related anyway: University of Bremen’s Brain-Computer Interface: The future world is here. This is an interesting approach for helping the disabled. LEDs are flashed at specific frequencies, which causes the visual cortex to respond in a corresponding manner. When the person looks at one LED or another, the EEG response is detected and initiates the desired activity or makes the associated selection (e.g. letters or numbers). The communication rate is slow, but this is a realistic technique nevertheless.
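For the curious, the detection step boils down to comparing EEG power at each LED's flicker frequency (the general technique is known as SSVEP detection) and picking the strongest. Here's my rough sketch; the sample rate, flicker frequencies, and simple spectral comparison are assumptions, and real systems use harmonics and much more robust statistics:

```python
import numpy as np
from scipy.signal import welch

FS = 256                            # assumed sample rate (Hz)
LED_FREQS = [7.0, 9.0, 11.0, 13.0]  # hypothetical flicker rates, one per choice

def detect_selection(eeg, fs=FS):
    """Index of the LED whose flicker frequency dominates the EEG spectrum."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 4)  # 0.25 Hz resolution
    powers = [psd[np.argmin(np.abs(freqs - f))] for f in LED_FREQS]
    return int(np.argmax(powers))

# Simulate 10 s of noisy EEG with a 9 Hz evoked component mixed in.
t = np.arange(FS * 10) / FS
eeg = np.sin(2 * np.pi * 9.0 * t) + np.random.randn(t.size)
print("selected LED:", detect_selection(eeg))  # expect 1 (the 9 Hz LED)
```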