Commanding Electronic Devices from Discrete Neural Gestures

Overview

We have developed a brain-machine interface for control of computers and other electronic devices through natural but virtual (imagined) gestures. A multistate discrete decoder detects the neural activity evoked by natural hand gestures and maps the decoded gestures to commands for computer interaction.

Market Opportunity

Emerging brain-computer interface (BCI) technology offers the possibility of building new artificial links between the nervous system and the external world for people living with neuromotor disease. Neural activity related to upper limb motion has been widely adopted as a control signal for BCI systems. While the majority of this work has focused on 2D or 3D control engaging relatively few distal degrees of freedom, recent studies have highlighted the rich informational content of signals related to coordinated digit movement. Understanding the cortical activity patterns driving dexterous upper limb motion can further expand the capabilities of BCI systems.

Innovation and Meaningful Advantages

We have developed novel methods for decoding neural activity evoked by a set of virtual (imagined) hand gestures, mapping these decoded gestures to computer actions, and delivering the commands to the computer. We use a multistate decoder to detect the activity of ensembles of human motor cortical neurons across different natural hand gestures such as swipes, grasps, and finger movements. Currently, 40 to 50 finger, hand, and wrist actions can be distinguished reliably in real time.
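
As a concrete illustration only: the Python sketch below classifies one time bin of simulated ensemble firing rates into one of many discrete gesture states using a nearest-class-mean rule. The bin features, the classifier, and every number here are assumptions made for illustration; this summary does not disclose the actual decoder architecture.

    # Hypothetical sketch of multistate discrete decoding (not the inventors'
    # actual method): binned firing rates from a 96-channel ensemble are
    # classified into one of 40 gesture states by nearest class mean.
    import numpy as np

    rng = np.random.default_rng(0)
    n_channels, n_gestures, n_trials = 96, 40, 25

    # Simulated training data: one mean firing-rate pattern per gesture, plus noise.
    templates = rng.gamma(2.0, 5.0, size=(n_gestures, n_channels))
    X = np.repeat(templates, n_trials, axis=0) \
        + rng.normal(0.0, 2.0, (n_gestures * n_trials, n_channels))
    y = np.repeat(np.arange(n_gestures), n_trials)

    # "Train" the discrete decoder: store the mean feature vector per gesture state.
    class_means = np.stack([X[y == g].mean(axis=0) for g in range(n_gestures)])

    def decode(features: np.ndarray) -> int:
        """Map one bin of ensemble activity to the nearest gesture state."""
        return int(np.argmin(np.linalg.norm(class_means - features, axis=1)))

    # Decode a held-out activity pattern evoked by (imagined) gesture 7.
    test = templates[7] + rng.normal(0.0, 2.0, n_channels)
    print("decoded gesture id:", decode(test))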

With our method, the device can receive and interpret gesture commands that it does not natively understand. For example, a hand-wave gesture detected in the neural signals can be decoded and mapped to a text-to-speech function that generates a “Hello” voice output from the computer. The gesture need not be a single movement (e.g., a swipe left) but can be a dynamic gesture (e.g., a hand wave that oscillates briefly). While discrete state decoding has been demonstrated in brain-computer interfaces before, ours is the first to decode a large set of natural but virtual hand gestures, creating a brain-computer interaction that enables a virtual “touch” interface.
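
The mapping from decoded gesture to device command can be pictured as a simple dispatch table. In the sketch below, the gesture labels, the speak() stand-in for text-to-speech, and the command set are all hypothetical; the actual mapping and delivery mechanism are not specified in this summary.

    # Hypothetical gesture-to-command mapping layer; labels and actions are
    # illustrative only.
    from typing import Callable

    def speak(text: str) -> None:
        # Stand-in for a text-to-speech call on the target computer.
        print(f"[TTS] {text}")

    # Decoded discrete gesture states mapped onto actions the device understands.
    COMMAND_MAP: dict[str, Callable[[], None]] = {
        "hand_wave":  lambda: speak("Hello"),          # dynamic gesture -> voice output
        "swipe_left": lambda: print("previous page"),  # single movement -> navigation
        "pinch":      lambda: print("select item"),
    }

    def dispatch(gesture: str) -> None:
        """Deliver the command associated with a decoded gesture, if any."""
        action = COMMAND_MAP.get(gesture)
        if action is not None:
            action()

    dispatch("hand_wave")  # prints: [TTS] Hello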

Collaboration Opportunity

We are interested in exploring 1) research collaborations with leading neuroengineering companies; and 2) licensing opportunities with neurotechnology companies.

Principal Investigators

John D. Simeral, PhD

Assistant Professor of Engineering

Brown University

john_simeral@brown.edu

https://vivo.brown.edu/display/jsimeral

Carlos Vargas-Irwin, PhD

Assistant Professor of Neuroscience

Brown University

carlos_vargas_irwin@brown.edu

https://vivo.brown.edu/display/cvargas

IP Information

Provisional Application Filed 

 

Contact

Melissa Simon, PhD

Director of Business Development, Life Sciences

melissa_j_simon@brown.edu

Brown Tech ID 3179

Patent Information:
Category(s):
Neurotechnology
For Information, Contact:
Brown Technology Innovations
350 Eddy Street - Box 1949
Providence, RI 02903
tech-innovations@brown.edu
401-863-7499
Inventors:
John Simeral
Thomas Hosman
Carlos Vargas-Irwin
Daniel Thengone
Leigh Hochberg