MindAffect wants to let us control devices with our minds


MindAffect, a team that presented in today’s TechCrunch Disrupt Startup Battlefield, wants to explore what’s possible when we can control the devices around us with our minds — and to let others explore the possibilities, too.

MindAffect was selected as a wildcard entry into the Startup Battlefield from the companies exhibiting at the show.

One early area of research for the team has been helping those who are unable to move anything but their eyes (due to neurological disorders such as ALS or stroke) communicate by typing.

Through this research, the team has designed a brain-computer interface that uses existing electroencephalogram (or EEG) hardware and unique flashing patterns to allow a user to control a device using only their eyes and the signals generated by their brain. Now they want to let others build with this interface (whether for medical use cases like those they’ve explored so far, or for things like gaming and entertainment), with plans to launch a development kit next month at CES.

Here’s MindAffect’s system wired up to control an Apple TV:

While there are existing solutions for tracking eye movements to control computers, this approach effectively flips the concept on its head.

Whereas eye tracking solutions mostly use cameras and infrared reflections bounced off the eye to determine where a user is looking, MindAffect’s approach analyzes signals from the brain to determine what a user is looking at.

To accomplish this, MindAffect flashes each button on an interface (such as every key on an onscreen keyboard) at a different frequency. As the user shifts their gaze from button to button, the company says, the unique frequency the user sees causes their brain’s visual cortex to generate similarly unique signals. A non-invasive EEG headset detects and amplifies these signals, and MindAffect’s algorithms work backwards to match the signal to the desired action or input. MindAffect says its current algorithms require little to no training to function accurately.
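The decoding idea described above can be sketched in a few lines: each button flickers at its own rate, and the decoder looks for which of those rates dominates the EEG spectrum. This is a deliberately simplified toy version (MindAffect has not published its actual algorithms); the sampling rate, button frequencies, and the simulated signal below are all illustrative assumptions.

```python
import numpy as np

FS = 250  # assumed sampling rate in Hz, typical for consumer EEG headsets
# Hypothetical per-button flicker frequencies (Hz)
BUTTON_FREQS = {"A": 8.0, "B": 10.0, "C": 12.0}

def decode_gaze(eeg: np.ndarray, fs: int = FS) -> str:
    """Pick the button whose flicker frequency carries the most
    spectral power in the EEG window (a toy SSVEP-style decoder)."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg)) ** 2
    # Score each button by power at the FFT bin nearest its flicker rate
    scores = {label: power[np.argmin(np.abs(freqs - f))]
              for label, f in BUTTON_FREQS.items()}
    return max(scores, key=scores.get)

# Simulate two seconds of EEG while the user looks at button "B" (10 Hz):
# a 10 Hz oscillation from the visual cortex buried in noise.
t = np.arange(0, 2.0, 1.0 / FS)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.standard_normal(len(t))
print(decode_gaze(eeg))  # → B
```

A real decoder would work on multi-channel EEG, filter out artifacts, and calibrate per user; the point here is only the core trick of matching brain signals back to a uniquely tagged stimulus.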

With those differences in mind, what are the advantages of this approach over camera-based eye tracking? In a chat with me backstage shortly before his pitch, MindAffect CEO Ivo de la Rive Box was quick to note that they’re still trying to figure that out. He mentions, as an example, environments where lighting conditions might interfere with eye trackers.

MindAffect is on the hunt for use cases where this tech could prove particularly advantageous – something that opening up a dev kit to others could help with.

Founded in September of 2017, the company has raised $1M to date.
