When it comes to controlling our computers, the last five years have seen incredible improvements in user interfaces, including responsive touch screens and much more natural voice recognition. Now a Toronto company wants to take the UI to the next level — by going directly to the brain. You think it, and the Muse headband will make it happen, under very limited circumstances.
InteraXon, the maker of the Muse headband (seen above), has listed its device on Indiegogo in hopes of raising $150,000 to build a mass-produced headband that translates your mental commands into computer actions. The example the company shows on the site is a game played on an iPad, in which a wooden block rotates when the user focuses on it. The user tilts the iPad to change the angle of the rotation.
The ideas behind the Muse echo a project Chaotic Moon Studios released earlier this year called the Board of Imagination, in which a user controls a skateboard connected to an iPad and a brainwave reader made by a different company, Emotiv. In that use case, the user's focus is what makes the skateboard move forward.
The idea of a real brain-to-computer interface is cool and has been around for years. There are wonderful examples of people using their minds to control wheelchairs, or even of prosthetic limbs hooked into a person's nervous system that the wearer then learns to control with thought. There are also research efforts combining brainwaves with vision tracking that could make an even more effective UI. But to turn our thoughts into something computers can understand — and perhaps create the most efficient UI of all — we need several things to happen:
1) Comfortable implementation — Today it's a headband or a helmet that reads brainwaves through external EEG sensors, but to capture the subtleties a true user interface would require, we'd need to put sensors inside the head or add more inputs, such as the vision tracking mentioned above. And if we want to rely on the brain alone, we need better electronics that can be implanted in a person's body, which requires new coatings and ongoing research into chips and sensors. We also need to learn more about the brain itself.
2) A compelling use case for the UI to drive usage and adoption — As William Hurley of Chaotic Moon told me when I asked him if I could ride the Board of Imagination, focusing is essential. You don't already know how to work these interfaces; you have to learn to focus in a way the EEG readers can understand. The Muse may help in this regard, because by letting users play simple games, it can train their brains to focus in a way the EEG monitors can detect. Some of the use cases even give people a brain score that shows how well they “focus.” Games and personal-improvement apps seem like a good reason for people to adopt the technology and thus get it into the mainstream.
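To make the "focus score" idea concrete, here is a minimal sketch of one common way such a number can be derived from an EEG signal: the ratio of beta-band (13–30 Hz) to alpha-band (8–13 Hz) power, a rough proxy for concentration. This is an illustration of the general technique, not InteraXon's or Emotiv's actual algorithm; the function names, the threshold, and the simulated signals are all invented for the example.

```python
import math

def band_power(signal, fs, low, high):
    """Total power in the [low, high) Hz band, via a naive discrete
    Fourier transform (O(n^2), fine for short windows like these)."""
    n = len(signal)
    power = 0.0
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if low <= freq < high:
            re = sum(signal[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
            im = sum(signal[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
            power += re * re + im * im
    return power

def focus_score(signal, fs):
    """Hypothetical 'focus' metric: beta power over alpha power.
    Higher values suggest concentration rather than relaxation."""
    beta = band_power(signal, fs, 13, 30)
    alpha = band_power(signal, fs, 8, 13)
    return beta / (alpha + 1e-12)

# Simulated one-second EEG traces at 256 Hz: a "relaxed" trace dominated
# by a 10 Hz alpha wave, and a "focused" trace dominated by 20 Hz beta.
fs = 256
relaxed = [math.sin(2 * math.pi * 10 * i / fs) + 0.2 * math.sin(2 * math.pi * 20 * i / fs)
           for i in range(fs)]
focused = [0.2 * math.sin(2 * math.pi * 10 * i / fs) + math.sin(2 * math.pi * 20 * i / fs)
           for i in range(fs)]

# A game like the block-rotation demo could map the score to an action;
# the threshold here is arbitrary and a real app would calibrate per user.
FOCUS_THRESHOLD = 1.0
print("rotate block" if focus_score(focused, fs) > FOCUS_THRESHOLD else "hold still")
```

The point of the training games described above is exactly to teach users to push a metric like this reliably above the threshold.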
3) A set of standards for the hardware and a shared database of brainwave meanings — A good UI platform should work like the keyboard or speech recognition: there needs to be a consistent meaning for each thought across all platforms. So if there are three different brainwave-reading helmets on the market, you should be able to control objects the same way with all of them. That would speed adoption and the development of applications.
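In software terms, such a standard would look like a common interface that every headset vendor implements, so applications are written once against the shared abstraction. The sketch below is purely illustrative: the class names, SDK scales, and threshold are assumptions, not real Muse or Emotiv APIs.

```python
from abc import ABC, abstractmethod

class BrainwaveReader(ABC):
    """Hypothetical shared standard: every headset reports focus on a 0-1 scale."""
    @abstractmethod
    def read_focus(self) -> float:
        ...

class MuseLikeAdapter(BrainwaveReader):
    """Imaginary driver for a headset whose SDK reports focus as 0-100."""
    def __init__(self, raw_source):
        self.raw_source = raw_source  # callable returning the vendor's raw value
    def read_focus(self) -> float:
        return self.raw_source() / 100.0

class EmotivLikeAdapter(BrainwaveReader):
    """Imaginary driver for a headset whose SDK reports focus as 0-5."""
    def __init__(self, raw_source):
        self.raw_source = raw_source
    def read_focus(self) -> float:
        return self.raw_source() / 5.0

def rotate_block_if_focused(reader: BrainwaveReader, threshold: float = 0.6) -> bool:
    """An app written once against the interface works with any compliant headset."""
    return reader.read_focus() > threshold

# The same game logic runs unchanged on either device:
print(rotate_block_if_focused(MuseLikeAdapter(lambda: 80)))    # raw 80/100 = 0.8
print(rotate_block_if_focused(EmotivLikeAdapter(lambda: 2)))   # raw 2/5 = 0.4
```

This is the sense in which a standard speeds application development: game makers target `BrainwaveReader`, and each hardware maker ships one adapter.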
So, while the Muse is a cool project and may indeed usher in a new user interface, there's plenty of other work going on in this realm, and a lot more needs to happen. But when we consider the massive amounts of digital information we're going to be negotiating in real time, the idea of some kind of computer-oriented telekinesis is pretty compelling.