Abstract: Neural interface technologies provide new opportunities to assist and augment human behavior. For instance, muscle activity can be transformed into commands for an assistive device for people with disabilities, or can provide richer computer control than conventional interfaces like mice and keyboards. Connecting signals from the nervous system to an external device in this way presents users with a new, potentially unintuitive, mapping between their movements and those of the device. Users often change their behavior as they learn to control neural interfaces, and many neural interfaces leverage machine learning to let the device adapt to the user. This co-learning creates complex, high-throughput interactions between algorithms and the nervous system. In my talk, I'll present recent research in my lab demonstrating that the algorithms we use in neural interfaces influence neural computations and user learning. I will then present new computational frameworks we've developed to predict and shape user-algorithm interactions. These discoveries open possibilities to build neural interfaces that intelligently interact with the nervous system to assist and rehabilitate motor function across diverse users and applications.
Bio: Dr. Orsborn is the Cherng Jia and Elizabeth Yun Hwang Associate Professor in the departments of Electrical & Computer Engineering and Bioengineering at the University of Washington. Her research explores sensorimotor plasticity in brain-computer interfaces and how that plasticity is influenced by the algorithms these interfaces use. She completed her Ph.D. in the UC Berkeley/UCSF Joint Graduate Program in Bioengineering and her postdoctoral training at NYU's Center for Neural Science. She recently received the NSF CAREER Award and a Sloan Fellowship, and was named an Emerging Leader by the American Institute for Medical and Biological Engineering.
