As announced at a recent MIT workshop: The BCI X PRIZE: This Time It’s Inner Space:
The Brain-Computer Interface (BCI) X PRIZE will reward nothing less than a team that provides vision to the blind, new bodies to disabled people, and perhaps even a geographical “sixth sense” akin to a GPS iPhone app in the brain.
As I’ve discussed many times (e.g. BCI: Brain Computer Interface), “mind reading” with EEG is a huge challenge. Another hurdle they have to overcome:
The foundation must court donors to make the $10 million+ prize a reality. Once funding is secured,…
That will be the easy part.
The problem with the X Prize incentive approach is one of expectations. If people believe that Avatar-like advances (“new bodies”) are a realistic result, they will be sorely disappointed.
Even though I’m a certified “mind reading” skeptic, I think great BCI strides will inevitably be made. The good news is that these innovations will provide numerous benefits for handicapped individuals.
UPDATE (2/5/10): Here’s a great example: Technology Behind Second Sight Retinal Prosthesis
I ran across Commercial brain computer systems are coming today (original article is here). This is a cool graphic:
I’ve talked previously about Human-Computer interface (HCI) work.
Invasive BCI technologies have incredible potential. In addition to the unique device-human interface challenges (primarily sensors), they will also require significant advances in the basic understanding of the underlying electrophysiology (cortical and motor). As reported, EEG-based processing will continue to be a cornerstone for this work, just as it is for non-invasive systems.
This slide from the report (p. 170) summarizes the uses and importance of this type of work:
Here’s a shorter overview of BMI: Mind Controlled Bionic Limbs
That’s right! It is wallace-shawn-inconceivable that the Indiegogo No More Woof campaign raised over $22,000 from 231 contributors. The project has been around since late 2013, but this is the first time I’ve run across it (via the recent IEEE article below). I just couldn’t resist posting the picture.
It goes without saying that the Scandinavian-based company NSID (currently “hibernating”) failed to deliver on its promise. This is well chronicled by IEEE: The Cautionary Tale of “No More Woof,” a Crowdfunded Gadget to Read Your Dog’s Thoughts.
The article even mentions Melon, a human EEG headband Kickstarter that I was involved with. I feel fortunate that I actually received a working device.
BCI is very difficult even under the best of circumstances with humans. I think the correct thought sequence for working with any EEG-based device is:
- “I’m excited”
- “I’m curious who that is?”
- “I’m tired”
The latest incarnation of EEG-based devices comes from Muse – The Brainwave Sensing Headband.
Like other BCI claims, How Mind-Controlled Games Work – And Why It’s Way, Way Bigger Than That pitches a new approach to consumer brain monitoring applications. From the Muse site (my highlighting):
Our early apps will be focused on building the core of your mind to improve intellectual skills such as memory and concentration, or emotional skills like maintaining composure in high stress situations. Other Muse apps would be just plain fun stuff so you could paint or compose music with your mind or play video games using your mind as the game controller.
The FAQ assures you they’re not mind reading and that it’s not a mind control device.
Taking the “brain health” approach (see CES 2013: InteraXon debuts Muse along with Brain Health System application) is an interesting twist. I’m a big fan of EEG-based technology. The research efforts and advancements in the BCI field have the potential to improve many lives.
InteraXon is probably doing great things (e.g. the headband is very clever) and they appear to be active in the BCI community. My only issue is with the marketing claims being made. Just like the mind control game controllers that have come before (see Turning the Mind Into a Joystick), the reality of the current technology is still not able to live up to most people’s expectations. This seems especially true when it comes to something as subjective as concentration or stress. Also, painting with your mind — really?
InteraXon raised over $287,000 through crowdfunding at Indiegogo: MUSE: The Brain-Sensing Headband that lets you control things with your mind. Many of the contribution levels included receiving a device and the brain fitness app. They also expect to provide developers with an SDK by mid-year. That might be fun to play with.
BCI research is important work (see here). The availability of reasonably priced hardware and general purpose APIs has made it easy to investigate many aspects of how EEG processing can be used to control the external environment.
The extrapolation of this work into the concept of mind reading software appears to be inevitable, but even after all these years, is still annoying. The latest incarnation of this is based on reputable work at Universities of Oxford and Geneva, and the University of California, Berkeley: Hackers backdoor the human brain, successfully extract sensitive data.
To start with, finding a correlation between P300 responses and a person’s image recognition — an improvement of 15%–40% over random guessing — isn’t exactly earth shattering. Also, note that P300 is an average of multiple evoked responses. Many repetitions of the stimulus (16 in this study) are required to reduce the noise enough to see the signal at all. As a practical matter, this is a really long way from brain malware.
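To get a feel for why so many repetitions are needed, here’s a minimal sketch in Python — with made-up amplitudes and noise levels, not real EEG data — of how averaging epochs pulls a small P300-like deflection out of much larger background noise:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                          # sample rate in Hz (assumed)
t = np.arange(0, 0.8, 1 / fs)     # one 800 ms epoch

# Synthetic "P300": a small positive bump peaking near 400 ms,
# buried in background noise several times larger than the signal.
p300 = 2.0 * np.exp(-((t - 0.4) ** 2) / (2 * 0.05**2))   # ~2 uV peak
noise_sd = 10.0                                          # ~10 uV noise

def epoch():
    """One stimulus presentation: signal plus fresh noise."""
    return p300 + rng.normal(0, noise_sd, t.size)

single = epoch()
averaged = np.mean([epoch() for _ in range(16)], axis=0)

# Averaging N epochs shrinks the noise by sqrt(N); with N = 16 the
# residual noise drops roughly 4x, which is what finally makes the
# deflection visible at all.
print("residual noise, 1 epoch  :", np.std(single - p300).round(1), "uV")
print("residual noise, 16 epochs:", np.std(averaged - p300).round(1), "uV")
```

The square-root scaling is also why going from “detectable in the lab” to “single-trial brain malware” is such a stretch — halving the noise again requires four times as many flashes.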
Check out Computers can read your mind. Still amazing!
Here’s the graphic from Hacking the Human Brain? Not As Impossible As You Think about the same research:
I didn’t realize there was a whole website for “NEWS ABOUT BRAIN-COMPUTER INTERFACES (BCI), MIND-CONTROLLED GADGETS & BIOFEEDBACK” — interesting stuff.
Microsoft has applied for a patent for using finger flexing to control a computer.
I’ve talked many times in the past about the use of EEG technology for computer control (brain-computer interface, BCI).
As discussed in the RWW article, there are many challenges to making this work. Just like with EEG, calibration of the EMG sensors and training will require innovative solutions.
It seems to me that this type of gesture-based control has quite a bit more potential than what can be obtained through the interpretation of EEG signals. In either case, the big benefit of advancements in these human-computer interface (HCI) technologies is that they could ultimately improve communications capabilities for the disabled.
There have been some interesting EEG related stories lately:
I’ve followed BCI: Brain Computer Interface and EEG work for a long time. There is still a long way to go on the “mind reading” front, but these types of developments are all encouraging.
The IntendiX (by g.tec) is a BCI device that uses visual evoked potentials to “type” messages on a keyboard.
The system is based on visually evoked EEG potentials (VEP/P300) and enables the user to sequentially select characters from a keyboard-like matrix on the screen just by paying attention to the target for several seconds.
P300 refers to the event-related averaged potential deflection that occurs between 300 and 600 ms after a stimulus. This is a BCI research platform that has been made into a commercial reality. The system includes useful real-life features:
Besides writing a text the patient can also use the system to trigger an alarm, let the computer speak the written text, print out or copy the text into an e-mail or to send commands to external devices.
I’m usually skeptical of “mind reading” device claims (e.g. here), but P300-based technology has many years of solid research behind it. It may be pricey ($12,250) and typing 5 to 10 characters per minute may not sound very exciting, but this device would be a huge leap for disabled patients who have the cognitive ability but no other way of communicating.
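As a rough illustration of how a P300 speller picks a character, here’s a toy simulation (hypothetical amplitudes and noise — nothing taken from the actual IntendiX system): each row and column of the matrix flashes repeatedly, and the selected cell is the intersection of the row and column with the largest averaged response.

```python
import numpy as np

rng = np.random.default_rng(1)
ROWS = COLS = 6        # classic 6x6 P300 speller matrix
target = (2, 4)        # cell the user is attending (hypothetical)
n_flashes = 16         # repetitions per row/column

def flash_response(contains_target):
    # Simulated peak EEG amplitude 300-600 ms after a flash: somewhat
    # larger (+3 uV, an assumed value) when the flashed row or column
    # contains the attended character, plus noise either way.
    return (3.0 if contains_target else 0.0) + rng.normal(0, 2.0)

row_avg = [np.mean([flash_response(r == target[0]) for _ in range(n_flashes)])
           for r in range(ROWS)]
col_avg = [np.mean([flash_response(c == target[1]) for _ in range(n_flashes)])
           for c in range(COLS)]

# The typed character sits at the intersection of the row and column
# whose averaged responses are largest.
decoded = (int(np.argmax(row_avg)), int(np.argmax(col_avg)))
print("decoded cell:", decoded)
```

The need for 16 flashes per row and column is also why throughput tops out at a handful of characters per minute.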
(hat tip: medGadget)
UPDATE (3/24/10): Mind Speller Lets Users Communicate with Thought
I guess I’m a sucker for EEG related technology (see all my HCI posts). So when I run across an article like A baseball cap that reads your mind I can’t help but comment on it.
Unlike other “mind reading” systems that make unrealistic claims, I can see this research and wireless technology leading to something quite useful. The ability to discern closed eyes and drowsiness by the presence of alpha waves (8-12 Hz) in human EEG is well known.
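A minimal sketch of that idea — using entirely synthetic signals with illustrative amplitudes, not real EEG — is to measure the fraction of spectral power in the 8–12 Hz band and compare eyes-open vs. eyes-closed windows:

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 128                        # sample rate in Hz (assumed)
t = np.arange(0, 4, 1 / fs)     # one 4-second EEG window

def alpha_ratio(signal):
    """Fraction of 1-30 Hz spectral power in the 8-12 Hz alpha band."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    alpha = (freqs >= 8) & (freqs <= 12)
    broad = (freqs >= 1) & (freqs <= 30)
    return power[alpha].sum() / power[broad].sum()

# Eyes open: broadband noise only.  Eyes closed: the same noise plus
# a 10 Hz alpha rhythm (amplitudes illustrative, not calibrated).
eyes_open = rng.normal(0, 1.0, t.size)
eyes_closed = eyes_open + 2.0 * np.sin(2 * np.pi * 10 * t)

print("alpha ratio, eyes open  :", round(alpha_ratio(eyes_open), 2))
print("alpha ratio, eyes closed:", round(alpha_ratio(eyes_closed), 2))
# A cap-mounted alerter could sound when this ratio stays above some
# per-driver threshold for more than a second or two.
```

Real devices have to cope with motion artifacts and per-person variation, but the core discriminator really is this simple.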
Developing an affordable product that provides a timely audible alert to a driver that’s about to fall asleep could have a huge impact. From (beware, this is a PowerPoint presentation) Fatigue and Automobile Accident Risk:
The US Department of Transportation estimates that 100,000 accidents reported are due to drowsiness and/or fatigue. These crashes result in 1550 deaths annually (4% of traffic fatalities) and $12.5 billion in monetary losses.
Even the annoyance of false alerts would be worth the lives saved. And of course it’s convenient that a lot of truck drivers already wear baseball caps.
The New York Times had a couple of articles over the last few days that deal with short-term memory loss.
David Brooks’ 11-Apr-08 commentary called The Great Forgetting summarizes the Bad Memory Century best with:
In the era of an aging population, memory is the new sex.
In addition to taxes and death, aging is something that none of us can avoid. I was born in the latter part of the baby boom, so over the last few years I have become acutely aware of these short-term memory challenges. Remembering to write down thoughts and lists has become essential. And I’m still young, relatively speaking anyway. Like the other inevitables, you never think it’s going to happen to you. But it does!
The 13-Apr-08 Sunday Magazine’s Idea Lab Total Recall speculates about embedding a computer chip in the brain in order to improve short-term memory. I know that short-term memory loss is a problem, but who would have guessed that “sky divers have been known to forget to pull their ripcords — accounting, by one estimate, for approximately 6 percent of sky-diving fatalities.” !!
Interestingly, the brain is a particularly effective associative memory system:
…, studies suggest that if you learn a word while you happen to be slouching, you’ll be better able to remember that word at a later time if you are slouching than if you happen to be standing upright.
Neural prosthetics are a long way off (see here), but the concept of embedding a Google search engine in your brain is certainly intriguing — and scary to most.
In the meantime, you can follow the suggestions in How to Cope With Short Term Memory Problems.