Technology to type directly from your brain at 100 words per minute
A plan to develop a non-invasive brain-computer interface that will let you type at 100 words per minute by decoding neural activity devoted to speech was revealed by Regina Dugan, PhD, Facebook VP of Engineering, Building8, on April 19 at Facebook's F8 2017 conference.
She explained in a Facebook post that over the next two years, her team will be building systems that demonstrate “a non-invasive system that could one day become a speech prosthetic for people with communication disorders or a new means for input to AR (augmented reality)”.
Dugan previously headed Google’s Advanced Technology and Projects Group, and before that, was Director of the Defense Advanced Research Projects Agency (DARPA).
Dugan said that “even something as simple as a ‘yes/no’ brain click … would be transformative”. That simple level has been achieved by using functional near-infrared spectroscopy (fNIRS) to measure changes in blood oxygen levels in the frontal lobes of the brain, as KurzweilAI recently reported. (Near-infrared light can penetrate the skull and partially into the brain.)
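To make the measurement principle concrete, here is a minimal sketch of how fNIRS readings are conventionally turned into blood-oxygen changes via the modified Beer-Lambert law. This is standard fNIRS textbook math, not Building8's method; the extinction coefficients, path length, and differential pathlength factor below are illustrative assumed values, not figures from the article.

```python
import numpy as np

# Modified Beer-Lambert law:
#   dOD(wavelength) = (eps_HbO * dHbO + eps_HbR * dHbR) * path * DPF
# Measuring optical-density changes at two near-infrared wavelengths
# yields a 2x2 linear system solvable for the oxy-/deoxy-hemoglobin
# concentration changes (dHbO, dHbR).

# Assumed extinction coefficients [HbO, HbR] at roughly 760 nm and 850 nm
# (order-of-magnitude illustrative values only).
EPSILON = np.array([[0.1465, 0.3846],   # ~760 nm row
                    [0.2526, 0.1798]])  # ~850 nm row

def hemoglobin_changes(delta_od, path_cm=3.0, dpf=6.0):
    """Convert optical-density changes at two wavelengths into
    (dHbO, dHbR) concentration changes."""
    forward = EPSILON * path_cm * dpf    # forward model matrix
    return np.linalg.solve(forward, np.asarray(delta_od, dtype=float))

# Round-trip check: simulate an oxygenation rise, then recover it.
true_change = np.array([1.0e-6, -0.3e-6])   # dHbO up, dHbR down
od = (EPSILON * 3.0 * 6.0) @ true_change
recovered = hemoglobin_changes(od)
print(recovered)
```

Because the system is linear and well-conditioned at these two wavelengths, the recovered concentration changes match the simulated ones, which is why two-wavelength measurement is the usual minimum for fNIRS.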
Dugan agrees that optical imaging is the best place to start, but her Building8 team plans to go well beyond that research — sampling hundreds of times per second with millimeter precision. Her team began working on the brain-typing project six months ago and now numbers more than 60 researchers, specialising in optical neural-imaging systems that push the limits of spatial resolution and in machine-learning methods for decoding speech and language.
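For readers unfamiliar with what "decoding" means here, the sketch below shows the simplest possible version, at the level of the "yes/no brain click" Dugan mentioned: a nearest-centroid classifier applied to simulated oxygenation features. This is a toy illustration under assumed synthetic data, not Facebook's system, whose methods were not disclosed in detail.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated fNIRS-style features: mean oxygenation change over a short
# window, in 4 channels. "Yes" trials show a stronger rise (assumption).
yes_trials = rng.normal(loc=1.0, scale=0.3, size=(50, 4))
no_trials = rng.normal(loc=0.2, scale=0.3, size=(50, 4))

# Nearest-centroid decoder: learn one average template per class.
yes_template = yes_trials.mean(axis=0)
no_template = no_trials.mean(axis=0)

def decode(trial):
    """Return 'yes' or 'no' depending on which class template is closer."""
    d_yes = np.linalg.norm(trial - yes_template)
    d_no = np.linalg.norm(trial - no_template)
    return "yes" if d_yes < d_no else "no"

# Decode an unseen trial drawn from the "yes" distribution.
new_trial = rng.normal(loc=1.0, scale=0.3, size=4)
print(decode(new_trial))
```

Real speech decoding replaces the two templates with models over hundreds of channels and time steps, but the core idea — mapping measured neural signals to a discrete output — is the same.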
The research is headed by Mark Chevillet, previously an adjunct professor of neuroscience at Johns Hopkins University.
Besides replacing smartphones, the system would be a powerful speech prosthetic, she noted — allowing paralysed patients to “speak” at normal speed.
Dugan revealed one specific method the researchers are currently working on to achieve that: a ballistic filter for creating quasi-ballistic photons (avoiding diffusion) — creating a narrow beam for precise targeting — combined with a new method of detecting blood-oxygen levels.
Dugan also described a system that may one day allow hearing-impaired people to hear directly via vibrotactile sensors embedded in the skin. “In the 19th century, Braille taught us that we could interpret small bumps on a surface as language,” she said. “Since then, many techniques have emerged that illustrate our brain’s ability to reconstruct language from components.” Today, she demonstrated “an artificial cochlea of sorts and the beginnings of a new ‘haptic vocabulary’.” (Extracted from www.kurzweilai.net)