I’ve been tracking EEG-related stories for many years. This perfect Valentine’s Day technology story (‘EEG Dating’ matches people based on their brainwave data) is certainly worth adding to the catalog. The end goal:
Many dating services ask countless questions. With EEG matching, there should be no need for the questions that most people shade the truth with.
I have no idea what this ‘Color Spectrum Analysis of EEG Data’ (from Biometric Dating) is, but it’s sure pretty:
Granted, they are in the process of testing their theory by using data from long-term married couples. I sure hope they’re using happily married couples, otherwise the consequences could be disastrous!
Oh, and don’t forget to try: Computers can read your mind! (still amazing!).
The IntendiX (by g.tec) is a BCI device that uses visual evoked potentials to “type” messages on a keyboard.
The system is based on visually evoked EEG potentials (VEP/P300) and enables the user to sequentially select characters from a keyboard-like matrix on the screen just by paying attention to the target for several seconds.
P300 refers to the event-related potential deflection, obtained by averaging, that occurs between 300 and 600 ms after a stimulus. This is a BCI research platform that has been made into a commercial reality. The system includes useful real-life features:
Besides writing a text the patient can also use the system to trigger an alarm, let the computer speak the written text, print out or copy the text into an e-mail or to send commands to external devices.
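As a rough illustration of how a P300 speller makes a selection, here is a toy sketch with synthetic data. The sampling rate, epoch length, and idealized P300 shape are all assumptions, not g.tec’s actual algorithm: average the epochs time-locked to each row/column flash and pick the candidate with the largest deflection in the 300–600 ms window.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                                 # sampling rate in Hz (assumed)
epoch = np.arange(int(0.8 * fs)) / fs    # 0-800 ms after each flash

def p300_template(t):
    # idealized positive deflection peaking ~400 ms after the stimulus
    return 4.0 * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))

def simulate_epochs(is_target, n=16, noise=5.0):
    """n flashes of one row/column; only the attended target carries a P300."""
    sig = p300_template(epoch) if is_target else np.zeros_like(epoch)
    return sig + noise * rng.standard_normal((n, epoch.size))

# 6 candidate columns; the user is attending to column 2
epochs = [simulate_epochs(is_target=(c == 2)) for c in range(6)]

# Average the repetitions, then score each column by its mean
# amplitude inside the 300-600 ms P300 window.
window = (epoch >= 0.3) & (epoch <= 0.6)
scores = [ep.mean(axis=0)[window].mean() for ep in epochs]
selected = int(np.argmax(scores))
print("selected column:", selected)
```

With the averaging in place, the attended column stands out clearly even though any single epoch is buried in noise.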
I’m usually skeptical of “mind reading” device claims (e.g. here), but P300-based technology has many years of solid research behind it. It may be pricey ($12,250) and typing 5 to 10 characters per minute may not sound very exciting, but this device would be a huge leap for disabled patients who have the cognitive ability but no other way of communicating.
(hat tip: medGadget)
UPDATE (3/24/10): Mind Speller Lets Users Communicate with Thought
The Technology Review article Better Brain-Wave Analysis looks at start-up ElMindA that is trying to find new quantitative methods for broadening the clinical use of EEG.
The company has developed a novel system that calculates a number of different parameters from EEG data, such as the frequency and amplitude of electrical activity in particular brain areas, the origin of specific signals, and the synchronicity in activity in two different brain areas as patients perform specific tests on a computer.
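The parameters described (band-specific amplitude and synchrony between two areas) can be sketched in a few lines. This is a toy version with synthetic signals; ElMindA’s actual methods are proprietary and certainly more sophisticated:

```python
import numpy as np

rng = np.random.default_rng(1)
fs, secs = 256, 10
t = np.arange(fs * secs) / fs

# Two synthetic "channels": both carry a shared 10 Hz alpha rhythm,
# so they should show high synchrony in the alpha band.
alpha = np.sin(2 * np.pi * 10 * t)
ch1 = alpha + 0.5 * rng.standard_normal(t.size)
ch2 = 0.8 * alpha + 0.5 * rng.standard_normal(t.size)

def band_power(x, lo, hi):
    """Mean spectral power of x between lo and hi Hz (via FFT)."""
    freqs = np.fft.rfftfreq(x.size, d=1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()

alpha_p = band_power(ch1, 8, 12)     # alpha band (8-12 Hz)
beta_p = band_power(ch1, 13, 30)     # beta band (13-30 Hz)
sync = np.corrcoef(ch1, ch2)[0, 1]   # crude synchrony: correlation

print("alpha power:", alpha_p)
print("beta power:", beta_p)
print("channel synchrony:", sync)
```

Real systems use better spectral estimators (e.g. Welch’s method) and phase-based synchrony measures, but the quantities being computed are the same in spirit.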
This description doesn’t sound very novel, but I’ve always felt that EEG analysis has tremendous clinical potential. This is particularly true for rehabilitation purposes (like the stroke example) and EEG-based communications for paralysis patients.
I am skeptical of “objective diagnosis” claims for things like attention-deficit hyperactivity disorder (ADHD) though. In the 1980s, EEG Topography was thought to be able to distinguish some psychiatric disorders. These claims were never proven to be true.
I’m not saying that new quantitative techniques like those being developed by ElMindA are comparable to the old EEG “brain mapping”, but significant validation will be required before they can be used clinically.
UPDATE (6/3/09): Another related article (also from Technology Review) Reading the Surface of the Brain (and cool picture):
UPDATE (6/10/09): The BrainGate project at Massachusetts General Hospital has recently started clinical trials that may help paralyzed patients. The on-going MGH project:
… the ultimate goals of which include “turning thought into action”: developing point and click capabilities on a computer screen, controlling a prosthetic limb and a robotic arm, controlling functional electrical stimulation (FES) of nerves disconnected from the brain due to paralysis, and further expanding the neuroscience underlying the field of intracortical neurotechnology.
UPDATE (7/5/09): This is more related to Brain Control Headsets, but if you’re interested in developing your own EEG-based controller you should check out An SDK for your brain. The free NeuroSky MindSet Development Tools along with a $200 headset will get you started developing your own “mind-controlled” game. Good luck with that!
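For a flavor of what such a “mind-controlled” game hook might look like: the NeuroSky tools report an eSense “attention” value from 0 to 100, but everything below is a hypothetical stand-in, not the actual SDK API.

```python
from collections import deque

class AttentionController:
    """Hypothetical sketch: fire a game action when the smoothed
    attention value crosses a threshold. Not the NeuroSky API."""

    def __init__(self, threshold=70, window=5):
        self.threshold = threshold
        self.history = deque(maxlen=window)

    def update(self, attention):
        # Smooth over the last few readings to ride out EEG noise.
        self.history.append(attention)
        avg = sum(self.history) / len(self.history)
        return avg >= self.threshold   # True -> trigger the action

ctrl = AttentionController()
readings = [30, 45, 80, 85, 90, 95, 88]   # simulated attention stream
actions = [ctrl.update(r) for r in readings]
print(actions)
```

The smoothing window is the interesting design choice: raw attention values bounce around a lot, so a single high reading shouldn’t count as intent.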
India’s Novel Use of Brain Scans in Courts Is Debated details how the Brain Electrical Oscillations Signature (BEOS) test was used to convict a 24-year-old woman of murder. It was reported in July (This brain test maps the truth) that the BEOS test was found admissible in court in two murder cases.
I’ve discussed Mind Reading Software a number of times in the past, including Brain Fingerprinting (also see here). A more thorough analysis of this type of EEG technology, BEOS, and fMRI can be found here: Is Guilt Written in the Brain? There are several good links to scientific papers on the subject, and it also hits the nail on the head with this conclusion about the murder conviction:
At this point this is the equivalent of using pseudoscience in the courtroom. This is as irresponsible as basing a verdict on the ramblings of a psychic – except that it comes with the trappings of science and legitimacy.
(Hat tip: Medgadget)
This is related, but not worth another post: The Army’s Totally Serious Mind-Control Project (Hat tip: Slashdot). The goal is to “lead to direct mental control of military systems by thought alone.” That’s pretty ambitious. They reference the Emotiv headset, but the whole concept of using EEG-based systems for any type of control purposes is still a stretch. Fortunately, the investment is small — the Army probably spends more than $4 million a day on toilet paper.
That’s right! It is wallace-shawn-inconceivable that the Indiegogo No More Woof campaign raised over $22,000 from 231 contributors. The project has been around since late 2013, but this is the first time I’ve run across it (via the recent IEEE article below). I just couldn’t resist posting the picture.
It goes without saying that the Scandinavian-based company NSID (currently “hibernating”) failed to deliver on its promise. This is well chronicled by IEEE: The Cautionary Tale of “No More Woof,” a Crowdfunded Gadget to Read Your Dog’s Thoughts.
The article even mentions Melon, a human EEG headband Kickstarter that I was involved with. I feel fortunate that I actually received a working device.
BCI is very difficult even under the best of circumstances with humans. I think the correct thought sequence for working with any EEG-based device is:
- “I’m excited”
- “I’m curious who that is?”
- “I’m tired”
The latest incarnation of EEG-based devices comes from Muse – The Brainwave Sensing Headband.
Just like other BCI claims before it (see How Mind-Controlled Games Work – And Why It’s Way, Way Bigger Than That), Muse promises a new approach to consumer brain monitoring applications. From the Muse site (my highlighting):
Our early apps will be focused on building the core of your mind to improve intellectual skills such as memory and concentration, or emotional skills like maintaining composure in high stress situations. Other Muse apps would be just plain fun stuff so you could paint or compose music with your mind or play video games using your mind as the game controller.
The FAQ assures you they’re not mind reading and that it’s not a mind control device.
Taking the “brain health” approach (see CES 2013: InteraXon debuts Muse along with Brain Health System application) is an interesting twist. I’m a big fan of EEG-based technology. The research efforts and advancements in the BCI field have the potential to improve many lives.
InteraXon is probably doing great things (e.g. the headband is very clever) and they appear to be active in the BCI community. My only issue is with the marketing claims being made. Just like the mind control game controllers that have come before (see Turning the Mind Into a Joystick), the reality of the current technology is still not able to live up to most people’s expectations. This seems especially true when it comes to something as subjective as concentration or stress. Also, painting with your mind — really?
InteraXon raised over $287,000 through Crowdfunding at Indiegogo: MUSE: The Brain-Sensing Headband that lets you control things with your mind. Many of the contribution levels included receiving a device and the brain fitness app. They also expect to provide developers with an SDK by mid-year. That might be fun to play with.
BCI research is important work (see here). The availability of reasonably priced hardware and general purpose APIs has made it easy to investigate many aspects of how EEG processing can be used to control the external environment.
The extrapolation of this work into the concept of mind reading software appears to be inevitable, but even after all these years it is still annoying. The latest incarnation of this is based on reputable work at the Universities of Oxford and Geneva and the University of California, Berkeley: Hackers backdoor the human brain, successfully extract sensitive data.
To start with, finding a correlation between P300 responses and a person’s image recognition — an improvement of 15% to 40% over random guessing — isn’t exactly earth shattering. Also, note that P300 is an average of multiple evoked responses. This requires many repetitions of the stimulus (16 times in this study) to reduce the noise enough to see the signal at all. As a practical matter, this is a really long way from brain malware.
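The averaging point is worth making concrete: averaging N repetitions reduces uncorrelated noise by a factor of sqrt(N), so 16 trials buy roughly a 4x improvement. A toy demonstration with synthetic data (the bump shape and noise level are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples = 200
signal = np.zeros(n_samples)
signal[75:90] = 2.0            # stand-in "P300" bump, amplitude 2
noise_sd = 10.0                # single-trial noise swamps the bump

def avg_trials(n):
    """Average n noisy trials of the same underlying signal."""
    trials = signal + noise_sd * rng.standard_normal((n, n_samples))
    return trials.mean(axis=0)

# Residual noise falls as 1/sqrt(n): 16 trials -> ~4x reduction.
single = avg_trials(1)
averaged = avg_trials(16)
print("single-trial residual noise:", np.std(single - signal))     # ~10
print("16-trial residual noise:", np.std(averaged - signal))       # ~2.5
```

This is exactly why an attacker (or a speller) needs many repeated stimulus presentations: one trial tells you almost nothing.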
Check out Computers can read your mind. Still amazing!
Here’s the graphic from Hacking the Human Brain? Not As Impossible As You Think about the same research:
I didn’t realize there was a whole website for “NEWS ABOUT BRAIN-COMPUTER INTERFACES (BCI), MIND-CONTROLLED GADGETS & BIOFEEDBACK” — interesting stuff.
Microsoft has applied for a patent for using finger flexing to control a computer.
I’ve talked many times in the past about the use of EEG technology for computer control (brain-computer interface, BCI).
As discussed in the RWW article, there are many challenges to making this work. Just like with EEG, calibration of the EMG sensors and training will require innovative solutions.
It seems to me that this type of gesture-based control has quite a bit more potential than what can be obtained through the interpretation of EEG signals. In either case, the big benefit of advancements in these human-computer interface (HCI) technologies is that they could ultimately improve communications capabilities for the disabled.
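As a rough sketch of what EMG-based gesture detection involves (synthetic signal; the threshold is an assumed calibration value, not anything from Microsoft’s patent): rectify the signal, smooth it into an amplitude envelope, then threshold to detect the flex.

```python
import numpy as np

rng = np.random.default_rng(7)
fs = 1000                              # EMG sampling rate in Hz (assumed)
t = np.arange(2 * fs) / fs

# Synthetic surface-EMG: baseline noise, with a burst of muscle
# activity (a finger flex) between 0.8 s and 1.2 s.
emg = 0.05 * rng.standard_normal(t.size)
burst = (t >= 0.8) & (t < 1.2)
emg[burst] += 0.5 * rng.standard_normal(burst.sum())

# Classic envelope detection: rectify, then smooth with a moving average.
rectified = np.abs(emg)
win = np.ones(100) / 100               # 100 ms smoothing window
envelope = np.convolve(rectified, win, mode="same")

flexing = envelope > 0.15              # assumed calibrated threshold
onset = t[np.argmax(flexing)]          # time of first above-threshold sample
print(f"flex detected at ~{onset:.2f} s")
```

The calibration problem the RWW article alludes to lives in that threshold: it varies by person, electrode placement, and fatigue, which is why both EMG and EEG systems need per-user training.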
The recent Time Magazine article Thought Control (subscription required) describes what is essentially another brain-computer interface. What’s novel about this device is that the EEG signal is monitored from dry electrodes on the arm or leg. The BodyWave® Brain Wave Monitoring (pdf) system developed by Freer Logic claims to allow measurement of brain wave activity away from the head:
BodyWave simply views brain energy as a field, collects the field energy as if the brain were a radio tower broadcasting from the brain and through the body.
For the purposes of teaching “stress control, increase attention, and facilitate peak mental performance”, this may well be an adequate method. Not having to wear the more traditional EEG head gear is certainly an advantage. Providing reliable control of computer interaction tasks via either “mind reading” method is not likely to happen any time soon (see Turning the Mind into a Joystick).
More “mind reading” hyperbole in today’s New York Times Magazine: The Cyborg in Us All.
I’ve talked about EEG-related technology many times in the past. Here are some quotes from the article:
This creates a pulse in his brain that travels through the wires into a computer. Thus, a thought becomes a software command.
We’re close to being able to reconstruct the actual music heard in the brain and play it.
… a “telepathy helmet” that would allow soldiers to beam thoughts to one another.
The NeuralPhone was meant to demonstrate that one day we might mind-control the contact lists on our phones.
The general public has two reactions when the lay press publishes this kind of stuff:
- I always knew this would come true. I.e. perpetuation of scientific fantasies.
- This is really scary stuff. I don’t want anybody reading my mind — or worse, controlling it.
If you know anything about the underlying techniques and algorithms, you also know that “mind reading” and useful brain-controlled interfaces are a long way off. Because the article fails to provide any sort of time-frame perspective, many readers will assume these capabilities exist now.
The real problem I have with these kinds of articles is that this is important work that could potentially improve the quality of life for many disabled individuals. Hyping it up to be something it’s not doesn’t help anyone.
One more quote:
“This is freaky.” And it was.
Huh? … I think the NYT needs to improve their editorial oversight.