The Thinking Cap

Scientific American has an article on how the mind-machine interface is about to go commercial with a wearable EEG game controller that reads your mind (Sergo, Peter, “Head Games: Video Controller Taps into Brain Waves,” 14 April 2008). How’d they do it? Exactly the way the people at Wired would imagine. Rather than developing a series of hard-won, determinate correlations between identified brain waves and intentions, they just brute-forced it. They recorded a gigantic quantity of sample data and churned through it on a bank of powerful computers to find the patterns:

Emotiv solved this brain-computer interface problem with the help of a multidisciplinary team that included neuroscientists, who understood the brain at a systems level (rather than individual cells), and computer engineers with a knack for machine learning and pattern recognition. Over the last four years, the company has conducted thousands of EEG recordings on hundreds of volunteers — not all gamers — as they experienced virtual scenarios that elicited various emotions, facial expressions and cognitive demands. The aim was to find a revealing brain activity that many people shared — a needle in a haystack of frenzied signals. Now, the EPOC allows users to fine-tune settings that allow it to pick up on even the subtlest of smirks.

When building these algorithms commenced two years ago, it had taken up to 72 hours for a bank of powerful computers to run through a mere 10 seconds of individual brain data and extract important features. Sorting through a seemingly endless stream of recordings eventually led Emotiv to find consistent signal patterns that revealed specific mental experiences. “Through a large enough sample size,” Le says, “we were able to get some consistency around the population to attain a high degree of confidence that it accurately measures an emotional state.”
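Just to make the recipe concrete, here is a minimal sketch of that kind of brute-force pattern hunt in Python, assuming NumPy and scikit-learn; the sampling rate, channel count, band-power features, and data are all invented for illustration, and none of this is Emotiv’s actual pipeline:

    # A minimal sketch of the brute-force recipe, assuming NumPy and
    # scikit-learn; the features, shapes, and data are all invented.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    FS = 128  # hypothetical sampling rate in Hz

    def band_power_features(window):
        # Crude per-channel power in theta/alpha/beta/gamma bands via the FFT.
        spectrum = np.abs(np.fft.rfft(window, axis=-1)) ** 2
        freqs = np.fft.rfftfreq(window.shape[-1], d=1.0 / FS)
        bands = [(4, 8), (8, 13), (13, 30), (30, 45)]
        return np.concatenate(
            [spectrum[:, (freqs >= lo) & (freqs < hi)].mean(axis=1)
             for lo, hi in bands])

    # Stand-in for thousands of labeled one-second EEG windows:
    # shape (n_windows, n_channels, n_samples), labels = elicited state.
    rng = np.random.default_rng(0)
    X_raw = rng.standard_normal((2000, 14, FS))
    y = rng.integers(0, 4, size=2000)

    X = np.array([band_power_features(w) for w in X_raw])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print(cross_val_score(clf, X, y, cv=5).mean())  # ~chance on random data

Note that nothing in this loop cares what the features mean. Given enough labeled windows, whatever correlates with the elicited state gets used.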

And in dispensing with theoretical purity and just going with base correlation, the engineers at Emotiv didn’t even have to concern themselves with the signal-to-noise ratio of the data:

Buch also suspects that the facial expressions that the EPOC detects are based more on the electrical activity of facial and scalp muscles than the brain per se. Although the electrical activity of muscles, he explained, is normally considered as artifact noise that needs to be filtered out to attain clean EEG signals that are of interest, they are still informative about how facial muscles move, such as during a wink. Tan agrees, saying that in their classification strategy some of the EPOC’s detections are based on muscle movements.

It’s all just correlation, and if the noise helps identify the correlation, then it’s just as good as signal. In the petabyte age there is no distinction between the phenomenon under consideration and everything that isn’t. Any possible interference will be defeated by the size of the data set.
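The same point in toy form, under wholly invented assumptions: suppose a wink shows up as a 60 Hz muscle-artifact burst on two frontal channels. A purist would filter that band out before doing any EEG analysis; a pure-correlation classifier keeps it and rides it:

    # Invented toy: a wink adds a 60 Hz muscle-artifact burst on two frontal
    # channels. Nothing here filters it out; the classifier just uses it.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    FS = 128
    rng = np.random.default_rng(1)

    def simulate_window(wink):
        eeg = rng.standard_normal((14, FS))  # background "brain" activity
        if wink:
            t = np.arange(FS) / FS
            phase = rng.uniform(0, 2 * np.pi)
            eeg[0:2] += 2.0 * np.sin(2 * np.pi * 60 * t + phase)  # EMG burst
        return eeg

    def high_band_power(window):
        # Mean power above 45 Hz per channel -- the EMG-dominated band an
        # EEG purist would throw away as artifact.
        spectrum = np.abs(np.fft.rfft(window, axis=-1)) ** 2
        freqs = np.fft.rfftfreq(window.shape[-1], d=1.0 / FS)
        return spectrum[:, freqs >= 45].mean(axis=1)

    winks = rng.integers(0, 2, size=1000)
    X = np.array([high_band_power(simulate_window(w)) for w in winks])
    clf = LogisticRegression(max_iter=1000)
    print(cross_val_score(clf, X, winks, cv=5).mean())  # near-perfect

The “artifact” is the single most discriminative feature in the set, which is exactly Buch’s point: once you classify by correlation alone, the line between signal and noise is a theoretical nicety.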

Now if they would just make a model that looks like this:

[Image: Rick Hunter wearing the thinking cap, from Robotech]

And maybe control an F-14 that transforms into a 50-foot-tall robot instead of stupid games.