Archive for the ‘HCI’ Category

BCI: Brain Computer Interface

Friday, December 21st, 2007

I ran across Commercial brain computer systems are coming today (the original article is here). This is a cool graphic:

[Figure: BCI]

I've talked previously about Human-Computer Interface (HCI) work.

Invasive BCI technologies have incredible potential. In addition to the unique device-human interface challenges (primarily sensors), realizing that potential will also require significant advances in our basic understanding of the underlying electrophysiology (cortical and motor). As reported, EEG-based processing will continue to be a cornerstone for this work, just as it is for non-invasive systems.

This slide from the report (p. 170) summarizes the uses and importance of this type of work:

Figure C.9. Possible applications of BCIs.

Here's a shorter overview of BMI (brain-machine interfaces): Mind Controlled Bionic Limbs

Mind Reading Software

Wednesday, September 5th, 2007

I spent quite a few years developing diagnostic electroencephalography (EEG) systems and software. I always get a kick out of articles with titles like this one: Microsoft Working On Mind-Reading Software. It's the mind-reading part that gets me: the first impression is that Microsoft is developing technology that will somehow detect what you're thinking. This, of course, will not be happening in the near or foreseeable future.

The work that Microsoft Research is doing in this area (see here) is fundamental research on the Human-Computer Interface (HCI). The Using a Low-Cost EEG for Task Classification in HCI Research article uses standard frequency-domain EEG features (delta, theta, alpha, etc.) as inputs to a Bayesian network classifier for differentiating three mental tasks. What was interesting to me was that they recognized the limitations of using EEG technology alone as a human-computer interface. Understanding and using other physiological data (e.g., motor activity) along with EEG will have to be explored as a way to improve task detection.
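To make that concrete, here's a rough sketch of what such a pipeline looks like: per-channel band-power features fed to a simple classifier. This is not the paper's implementation; I'm using a naive Bayes stand-in for their Bayesian network, and the sampling rate, band edges, and synthetic data are all my own assumptions.

# Sketch: frequency-domain EEG features -> simple Bayesian task classifier.
# The sampling rate, band edges, and synthetic data are illustrative
# assumptions, and GaussianNB stands in for the paper's Bayesian network.
import numpy as np
from scipy.signal import welch
from sklearn.naive_bayes import GaussianNB

FS = 256  # sampling rate in Hz (assumed)
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(epoch):
    # Mean power per classical EEG band for one (channels x samples) epoch.
    freqs, psd = welch(epoch, fs=FS, nperseg=FS)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.extend(psd[:, mask].mean(axis=1))  # one value per channel/band
    return np.array(feats)

# Toy data: 60 two-second, 4-channel epochs for 3 mental tasks (labels 0..2).
rng = np.random.default_rng(0)
epochs = rng.standard_normal((60, 4, 2 * FS))
labels = np.repeat([0, 1, 2], 20)

X = np.array([band_power_features(e) for e in epochs])
clf = GaussianNB().fit(X, labels)
print("training accuracy:", clf.score(X, labels))

The classifier is the easy part; the real problem is that these band-power features drift with fatigue, electrode impedance, and muscle artifact, which is exactly why the authors point toward fusing in other physiological signals.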

Not only is this type of work important for meeting the needs of the physically disabled; as the Wii and Surface have shown, innovative HCI systems can also have a dramatic effect on how we all interact with computers.

'Thought-reading' system controls wheelchair and synthesizes speech is another one. The system processes larynx nerve signals for speech synthesis and wheelchair control. The technology looks very cool and has the potential to improve the lives of handicapped individuals. I suppose you could consider motor neuron activity as the output of thought, but 'thought-reading' just feels like a misnomer. Maybe it's just me.

Another 'mind-reading' technique is the use of Evoked Potentials (EP). One that got a lot of press a few years back was Brain Fingerprinting (also see here). I'm sure there's still ongoing research in the P300 area, but nothing has grabbed much attention since.
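For anyone who hasn't worked with EPs: a P300 is far too small to see in a single trial, so it's recovered by averaging many stimulus-locked epochs, which cancels the background EEG while the time-locked response survives. A minimal sketch on simulated data (the timing, amplitudes, and sampling rate are all made up):

# Sketch: classic evoked-potential extraction by stimulus-locked averaging.
# The sampling rate, epoch window, and simulated data are assumptions.
import numpy as np

FS = 256                       # sampling rate in Hz (assumed)
N_TRIALS = 200
t = np.arange(0, 0.8, 1 / FS)  # 800 ms epoch following each stimulus

rng = np.random.default_rng(1)
# Simulated single trials: a P300-like bump at ~300 ms buried in noise.
p300 = 2.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
trials = p300 + 10.0 * rng.standard_normal((N_TRIALS, t.size))

# Averaging N time-locked trials shrinks the noise by ~sqrt(N), which is
# why the bump emerges despite a terrible single-trial SNR.
avg = trials.mean(axis=0)
print("peak latency: %.0f ms" % (t[np.argmax(avg)] * 1000))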

Also, check out Computers can read your mind. Amazing!

UPDATE:

I found a couple of companies that appear to be trying to use EEG processing algorithms for HCI. Both are focused on the gaming industry. They provide no details on how their products work, so it's hard not to be skeptical about their functionality claims.

NeuroSky

Emotiv

Also, Smart BrainGames provides more classical biofeedback development systems. All are mentioned here.

UPDATE-2:

Here's another interesting technology: Functional near-infrared spectroscopy (fNIRS) is an emerging non-invasive, lightweight imaging tool that can measure blood oxygenation levels in the brain. Check out the article here.
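The math behind this is the modified Beer-Lambert law: changes in optical density at two (or more) wavelengths are solved for changes in oxy- and deoxy-hemoglobin concentration. Here's a toy sketch; the extinction coefficients, path length, and differential path-length factor are purely illustrative numbers, not calibrated values.

# Sketch: modified Beer-Lambert law for fNIRS. The coefficients below are
# illustrative placeholders, not calibrated values.
import numpy as np

# Extinction coefficients [HbO, HbR] at two wavelengths (illustrative):
E = np.array([[0.35, 2.10],   # ~690 nm: deoxy-Hb dominates absorption
              [1.00, 0.80]])  # ~830 nm: oxy-Hb dominates
L, DPF = 3.0, 6.0             # source-detector distance (cm), path factor

def hb_changes(intensity, baseline):
    # Returns (delta HbO, delta HbR) from light intensities at the two
    # wavelengths, relative to a baseline measurement.
    delta_od = -np.log(intensity / baseline)       # optical density change
    return np.linalg.solve(E * L * DPF, delta_od)  # invert Beer-Lambert

print(hb_changes(np.array([0.98, 0.95]), np.array([1.0, 1.0])))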

UPDATE-3 (15-Oct-2007):

Here's the Microsoft patent application: Using electroencephalograph signals for task classification and activity recognition (via here).

UPDATE-4 (13-Nov-2007):

Check out the Brain2Robot Project, which uses EEG signal processing (my highlighting):

Highly efficient algorithms analyze these signals using a self-learning technique. The software is capable of detecting changes in brain activity that take place even before a movement is carried out. It can recognize and distinguish between the patterns of signals that correspond to an intention to raise the left or right hand, and extract them from the pulses being fired by millions of other neurons in the brain. These neural signal patterns are then converted into control instructions for the computer.

If they can do this reliably, that's quite an accomplishment.
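For context, the standard recipe behind this kind of movement-intention detection (Fraunhofer doesn't publish their exact algorithm, so this is the textbook approach, not necessarily theirs) is to band-pass the EEG around the motor rhythms, learn Common Spatial Patterns (CSP) that separate left- from right-hand imagery, and feed log-variance features to a linear classifier. A sketch on simulated data, where every shape and number is an assumption on my part:

# Sketch: CSP + linear classifier for left-vs-right motor imagery.
# Simulated data; a real pipeline would band-pass to the mu/beta band first.
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def csp_filters(epochs_a, epochs_b, n_pairs=2):
    # Spatial filters that maximize variance for one class vs. the other.
    cov_a = np.mean([e @ e.T / np.trace(e @ e.T) for e in epochs_a], axis=0)
    cov_b = np.mean([e @ e.T / np.trace(e @ e.T) for e in epochs_b], axis=0)
    vals, vecs = eigh(cov_a, cov_a + cov_b)  # generalized eigendecomposition
    order = np.argsort(vals)
    keep = np.r_[order[:n_pairs], order[-n_pairs:]]  # both extremes help
    return vecs[:, keep].T

def log_var_features(epochs, W):
    # Log-variance of the spatially filtered signals -- the usual CSP feature.
    return np.array([np.log(np.var(W @ e, axis=1)) for e in epochs])

# Toy epochs: 40 per class, 8 channels x 2 s at 128 Hz (simulated).
rng = np.random.default_rng(2)
left = rng.standard_normal((40, 8, 256))
right = rng.standard_normal((40, 8, 256))
right[:, 0] *= 1.5  # pretend one channel desynchronizes differently per class

W = csp_filters(left, right)
X = np.vstack([log_var_features(left, W), log_var_features(right, W)])
y = np.r_[np.zeros(40), np.ones(40)]
clf = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", clf.score(X, y))

Getting this to work reliably across sessions and subjects is the hard part, which is presumably what the 'self-learning' (adaptive) piece they mention is for.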