by Arielle Pardes
from Wired Website
 
NextMind wants to make brain-computer interfaces accessible without needing surgery. Just strap on the device and think...
			 
 
No, Kouider was merely joining the long line of entrepreneurs (Elon Musk and Mark Zuckerberg among them) who believe that we will one day manage our machines with our thoughts.
 Today, BCIs can regulate tremors from Parkinson's disease and restore some basic movement in people with paralysis. But they are still surgically implanted, and still quite experimental. 
 
Even so, the likes of Musk already envision a future where we'll all have chips in our brains, replacing our need for keyboards, mice, touchscreens, joysticks, steering wheels, and more.
 
The mysteries of the mind remain vast, and implanting hardware in healthy brains? Well, forget about that, at least until the FDA deems it safe (light-years away). In the meantime, a wave of companies is betting on bringing Mind Control Lite to the masses with a neural interface that requires no surgery at all.
 His startup, NextMind, makes a noninvasive neural interface that sits on the back of one's head and translates brain waves into data that can be used to control compatible software. 
 Kouider's vision begins with simple tasks (sending text messages with a thought; calling up a specific photo in your camera roll with passing thoughts) and ends somewhere close to science fiction (controlling every device in our world, like the sorcerer in Fantasia). 
 Going the nonsurgical route comes with some trade-offs, namely all that skin and bone between your soggy brain and any device that's trying to read the neural signals it emits. 
 On the other hand, it's cheaper, it's safer, and it's much easier to iterate or push software updates when you don't need to open someone's head. And for all the promise of BCIs, people first need to see that this stuff can be useful at all. 
 
			For that, devices like 
			NextMind's do the trick... 
 
It weighs 60 grams, about as much as a kiwi fruit, and bears a passing resemblance to a flattened TIE fighter.
 His lab, in Paris, specialized in studies of consciousness. 
In a hospital setting, EEGs often require the use of gel and some skin preparation, but recently researchers have developed functional dry electrodes that only require contact with the scalp.
The NextMind device uses these, along with a proprietary material. (He wouldn't tell me what, exactly, the material is.)
 There, the device's electrodes are well positioned to record activity from the visual cortex, a small area in the rear of the brain. 
 
Then it converts the signals into digital data, processes them on a computer, deciphers them with a machine-learning algorithm, and translates the result into commands.
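The pipeline described above can be sketched in a few lines of code. This is NOT NextMind's actual algorithm (which is proprietary); it is a generic, dependency-light illustration of the same steps on synthetic data: digitize an EEG epoch, extract band-power features, classify with a simple learned model, and map the prediction to a command. All names, bands, and the command mapping are illustrative assumptions.

```python
# Hypothetical sketch of an EEG-decoding pipeline: digitize -> features ->
# classifier -> command. Synthetic data stands in for real brain signals.
import numpy as np

rng = np.random.default_rng(0)

def band_power(signal, sfreq, low, high):
    """Mean spectral power of `signal` in the [low, high] Hz band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sfreq)
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

def extract_features(epoch, sfreq=250):
    # Band-power features per channel: alpha (8-12 Hz) and beta (13-30 Hz).
    return np.array([band_power(ch, sfreq, lo, hi)
                     for ch in epoch
                     for lo, hi in [(8, 12), (13, 30)]])

def make_epoch(label, sfreq=250):
    # Synthetic 1-second epoch, 4 channels; class 1 gets extra 10 Hz power.
    t = np.arange(sfreq) / sfreq
    epoch = rng.normal(0, 1, (4, sfreq))
    if label == 1:
        epoch[0] += 3 * np.sin(2 * np.pi * 10 * t)
    return epoch

# "Train" a tiny nearest-centroid classifier on labeled synthetic epochs.
X = np.array([extract_features(make_epoch(lbl)) for lbl in [0, 1] * 50])
y = np.array([0, 1] * 50)
centroids = {c: X[y == c].mean(axis=0) for c in (0, 1)}

def decode(epoch):
    f = extract_features(epoch)
    label = min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))
    commands = {0: "noop", 1: "select"}   # hypothetical command mapping
    return commands[label]

print(decode(make_epoch(1)))
```

A real system would replace the synthetic epochs with amplified electrode readings and the nearest-centroid step with a trained model, but the shape of the loop, raw signal in, discrete command out, is the same.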
 The NextMind device is designed to work on anyone, but it works faster when someone has had practice. 
Kouider says it's a matter of a neural feedback loop. With my neural profile generated, I was ready to play some games.
 NextMind will announce its developer kit at CES in January. In an effort to court developers, the company has designed a few demos to show off what its device can do. I tried one that's a riff on Nintendo's Duck Hunt, which Kouider played as a kid. 
 As ducks danced across the screen, Kouider leaned over. 
 I focused my gaze on the ducks and, in less than a second, they exploded. 
This little magic trick was repeated through a series of demos. I changed the channel on a mock TV set by glancing at one corner of the screen. I cracked a digital vault by concentrating on the right numbers of a PIN code.
 
			I changed the colors on a 
			set of smart lightbulbs that Kouider had set up for me. It's hard to 
			say why you'd need to do these things with your mind, but when you 
			do, you really feel like a Jedi. 
 Another startup, CTRL-Labs, released a developer kit last year for a similar noninvasive neural interface. It also uses dry electrodes, but this device is an armband and captures signals from nerves. 
 
Facebook acquired the company for close to $1 billion in September 2019.
The demo was designed to show off the company's vision. I strapped the device to my arm and played some games.
 One involved a dinosaur jumping over a series of obstacles. I thought jump and, with just a twitch of my arm, the dinosaur jumped. At one point, Patrick Kaifosh (then CTRL-Labs' CTO, now Facebook Reality Labs' research manager) entered the credentials to unlock his laptop by simply staring at it. 
 
"Neuro-authentication," he called it...
 Most of the clinical work around BCI also involves the motor cortex, in part because so much of the research has focused on movement disorders: 
 But Kouider thinks the visual cortex offers a richer set of neural signals for people trying to control their personal devices. 
When I asked him why so much of the work was being done in the motor cortex, he paused before answering.
 Because the NextMind device utilizes signals associated with sight, the technology can feel a little like gussied-up eye-tracking. 
People have been doing that for years. (After the demo, Kouider claimed his BCI could work even if I closed my eyes.)
 
			Right now, you control 
			things with your gaze. Soon, Kouider believes, the device will be 
			able to tap into our imagination, turning visual thoughts into 
			actions. 
 InteraXon, a Canadian startup, used to make a head-worn device that could control lights with the power of thought but eventually gave it up. 
 
While, arguably, there would be accessibility use cases for this technology, InteraXon pivoted to make Muse, a meditation headband.
At this early stage, though, BCI is more like the virtual-reality headset than the Next Great Interface...
 
 