Imagine being able to make a machine do your bidding with your thoughts alone: no button pressing, typing, screen tapping or fumbling with remote controls, just brain power. Well, this sci-fi scenario could be closer to reality than you think.
Bill Kochevar’s life was changed, seemingly irrevocably, when he was paralysed from the shoulders down following a cycling accident nearly a decade ago.
His future looked bleak.
But last year he was fitted with a brain-computer interface, or BCI, that enabled him to move his arm and hand for the first time in eight years.
Sensors were implanted in his brain, then over a four-month period Mr Kochevar trained the system by thinking about specific movements, such as turning his wrist or gripping something.
The sensors effectively learned which bits of the brain fired up – and in what sequence – for each movement.
Then, when 36 muscle-stimulating electrodes were implanted into his arm and hand, he was able to control its movements simply by thinking about what he wanted to do. Within weeks, he could feed himself again.
“This research has enhanced my ability to be able to do things,” he told the BBC last year.
Research teams are also working on mind-controlled wheelchairs, and on using sensors to allow people who are completely paralysed to give yes-or-no answers through the power of thought.
But this technology doesn’t just have health-related applications. Many tech companies are exploring brain control as a user interface.
Recently, for example, car maker Nissan unveiled a “brain-to-vehicle” headset that monitors a driver’s brainwaves to work out what they’re about to do – before they do it.
The aim of the system is to allow the vehicle to respond that split-second more quickly than a driver’s natural reaction time.
On a mountain road with lots of hairpin bends for example, brain-to-vehicle technology should make it easier to keep the car under control, says Nissan. In tests, even very experienced drivers have performed noticeably better using the system, the firm claims.
Meanwhile, virtual reality (VR) firm Neurable has developed a mind-controlled computer game that it says should be hitting the arcades later this year. Players wearing a sensor-equipped VR headset simply need to focus their thoughts on an object to manipulate it: there’s no hand controller at all.
At the more light-hearted end of the scale, EmojiMe has built a pair of brainwave-reading headphones that display the wearer’s emotional state in the form of animated emojis. It was originally invented as a joke, its creators say.
And there are plenty of other mind-controlled devices in the pipeline.
The driving force behind this research has come from the world of medicine, where great strides are being made in the use of BCIs. Implanting sensors directly on the surface of the brain gives far greater sensitivity than wearing them outside the skull.
If all this sounds awfully like magic, here’s how these devices work.
BCIs measure brain activity through electroencephalography (EEG), using sensors almost as sophisticated as those found in hospitals. These pick up the tiny electrical signals produced when neurons in the brain communicate with each other.
These signals include alpha, beta, delta, theta and gamma waves, as well as various types of signal triggered by visual cues. Certain patterns of activity can be associated with particular thoughts, allowing the system to make predictions about the user’s wishes.
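In practice, sorting brainwaves into those named bands comes down to frequency analysis. The sketch below is purely illustrative – real BCIs use far more elaborate filtering and machine learning – but it shows the basic idea of classifying an EEG trace by its dominant frequency band, using the conventional band boundaries in hertz:

```python
import numpy as np

# Conventional EEG frequency bands, in Hz (boundaries vary slightly
# between sources; these are typical textbook values).
BANDS = {
    "delta": (0.5, 4),
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta": (13, 30),
    "gamma": (30, 100),
}

def dominant_band(signal, sample_rate):
    """Return the name of the frequency band carrying the most power."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    powers = {
        name: spectrum[(freqs >= lo) & (freqs < hi)].sum()
        for name, (lo, hi) in BANDS.items()
    }
    return max(powers, key=powers.get)

# Synthetic example: a pure 10 Hz oscillation, sampled at 256 Hz for
# two seconds, should register as alpha activity (8-13 Hz).
t = np.arange(0, 2, 1 / 256)
eeg = np.sin(2 * np.pi * 10 * t)
print(dominant_band(eeg, 256))   # alpha
```

A real system would work on noisy multi-channel recordings and associate learned patterns across bands with particular intentions, rather than picking a single dominant band.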
In the case of the Nissan brain-to-vehicle system, for example, this means monitoring the signals associated with what’s known as motion-related preparatory brain activity. This data is then correlated with information gathered by the vehicle itself.
“The headset would read this preparatory activity, and would pair that with the information the vehicle has from sensors and maps – for example, ‘there is a turn coming up in 200 metres’,” says spokesman Nick Maxfield.
“The AI uses this combination of the brainwaves and the sensor data to work out what to do – for example, ‘there’s a turn coming up, and she’s started to think about turning – at this rate, she’ll go into the turn a bit late, so let’s start the turn now’.”
For this reason, Mr Maxfield insists, there’s no chance of causing an accident simply by thinking about steering or braking.
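The decision logic Mr Maxfield describes might look something like the toy sketch below. Every name and threshold here is invented for illustration – Nissan has not published its algorithm – but it captures the two safeguards in the quotes above: the system only acts when the driver’s preparatory brain activity and the vehicle’s own sensor and map data both point the same way, and only when the driver would otherwise react too late.

```python
# Illustrative sketch only; all parameter names and values are assumptions.

def should_start_turn(prep_signal_detected: bool,
                      distance_to_turn_m: float,
                      speed_mps: float,
                      driver_lag_s: float = 0.3) -> bool:
    """Begin steering early only if the driver is already thinking about
    the turn AND, at the current speed, would otherwise enter it late."""
    if not prep_signal_detected:
        # Vehicle data alone never triggers an action.
        return False
    time_to_turn = distance_to_turn_m / speed_mps
    # Merely thinking about a turn 200 m away does nothing: the sensor
    # and map data must also show the turn is imminent.
    return time_to_turn < driver_lag_s * 2

# 'There is a turn coming up in 200 metres' at 30 m/s: far too early.
print(should_start_turn(True, 200, 30))   # False
# Driver is preparing to steer and the apex is only 15 m away:
print(should_start_turn(True, 15, 30))    # True
```

The double condition is why simply thinking about steering cannot, on this model, cause an accident: the thought is only ever combined with corroborating vehicle data.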
Neurable, developer of the world’s first mind-controlled arcade game, claims that its system is the fastest non-invasive BCI and the most accurate at determining what the user wants to do, due to its machine learning system.
This interprets a set of brainwave patterns, known as event-related potentials (ERPs), to establish when a game player wants to act.
“ERPs are a brain response that occurs when a user is interested in a target selection – for example, a button, object, et cetera – and that target changes,” says founder and chief executive Ramses Alcaide, adding that this response applies to any change at all, whether the object be moving, flashing or making a sound.
“We then leverage this response to give users control that can very much be likened to using a mental computer mouse.”
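One common way to turn ERPs into a “mental mouse” – and this is a generic textbook scheme, not necessarily Neurable’s proprietary method – is to flash or change each candidate target at a different moment, then match each detected ERP back to the stimulus that preceded it by the expected latency (roughly 300 ms for the well-known P300 response):

```python
# Hedged sketch of ERP-based target selection. The latency, tolerance
# and timings below are invented for illustration.

ERP_LATENCY_S = 0.3   # assumed typical delay between stimulus and ERP
TOLERANCE_S = 0.1

def selected_target(flash_times, erp_times):
    """Vote for the target whose flashes best line up with the ERPs."""
    votes = {}
    for erp in erp_times:
        for target, t in flash_times:
            if abs((erp - t) - ERP_LATENCY_S) < TOLERANCE_S:
                votes[target] = votes.get(target, 0) + 1
    return max(votes, key=votes.get) if votes else None

# Three on-screen objects flash in turn, twice each (target, time in s):
flashes = [("door", 0.0), ("key", 0.5), ("lamp", 1.0),
           ("door", 1.5), ("key", 2.0), ("lamp", 2.5)]
# ERPs detected ~300 ms after both 'key' flashes:
erps = [0.81, 2.31]
print(selected_target(flashes, erps))   # key
```

Because the player’s brain only produces the response when the attended object changes, repeating the flashes and tallying votes lets the system pick out the intended target without any hand controller.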
The company has put its system to use in Awakening, a VR game developed in partnership with eStudiofuture. The game, a little like the Netflix series Stranger Things, involves children with telekinetic powers who escape by manipulating objects and battling enemies with thought alone.
Neurable has released a kit allowing games developers to use its technology, and says it expects brain sensors eventually to become a standard feature of VR headsets.