Using our current understanding of how hand movements are represented in motor, premotor, and parietal brain areas, we are developing brain-machine interfaces that read out such movement intentions to control robotic devices in real time. In this context, our main research question is: **What are the neural coding and plasticity principles that determine the execution of complex movements, such as those required for detailed object manipulation?** To explore this, we employ permanently implanted electrode arrays that read out cortical signals simultaneously from about 200 channels or more. Using dedicated analysis software, these signals are decoded in real time to predict upcoming grasping actions; the predictions are then fed back to the subject and used to control robotic arms and hands. Such insight could be useful for future applications aiming to restore hand function in paralyzed patients.
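To make this closed-loop pipeline concrete, below is a minimal sketch of one acquire-decode-actuate cycle in Python. The channel count, bin width, linear readout, and the helper functions `read_spike_counts` and `send_to_robot` are illustrative assumptions, not our actual implementation.

```python
import numpy as np

# Hypothetical parameters, chosen to roughly match the ~200-channel
# arrays described above; all values are assumptions for illustration.
N_CHANNELS = 192          # recording channels on the implanted arrays
N_OUTPUTS = 2             # e.g. grip type and hand aperture (illustrative)

rng = np.random.default_rng(0)

# A pretrained linear readout (weights would normally come from a
# calibration session; random here purely for illustration).
W = rng.normal(0, 0.1, size=(N_OUTPUTS, N_CHANNELS))
b = np.zeros(N_OUTPUTS)

def read_spike_counts():
    """Stand-in for the acquisition system: spike counts per channel
    for the last time bin. Replaced with Poisson noise here."""
    return rng.poisson(lam=2.0, size=N_CHANNELS)

def decode(counts):
    """Map a population activity vector to a movement command."""
    return W @ counts + b

def send_to_robot(command):
    """Stand-in for the robotic-hand interface."""
    print(f"command: {np.round(command, 2)}")

# Closed-loop cycle: acquire -> decode -> actuate, once per bin.
for _ in range(5):
    counts = read_spike_counts()
    send_to_robot(decode(counts))
```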
To answer this main research question, we pursue the following subprojects and have developed the following techniques:
Decoding of hand movements for brain-computer interfaces
How does the brain produce movement? How do billions of brain cells harmonize the activity of hundreds of muscles to produce the movements basic to our daily life, such as grasping? This question has fascinated scientists and engineers for millennia, but beyond that it has become crucial for medical advances: in cases of paralysis, and in extreme cases such as locked-in syndrome, a better understanding of how the brain produces motion would allow engineers to develop devices that recover some mobility and restore a patient's social connection. We aim to understand how the brain produces the signals necessary to control arm movements. Specifically, we investigate the crucial question: how can multiple parallel signals, such as the ones required for arm movement, be decoded from the cerebral cortex to control an artificial device? To this end, we perform experiments in primates while they perform real and virtual grasps via a brain-computer interface, in order to understand the neural population activity patterns that coordinate the sequential actions required for grasping.
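As an illustration of what decoding several parallel signals from one population can look like, the sketch below fits a ridge-regression readout mapping binned population activity to multiple kinematic variables at once. All data here are synthetic and the dimensions are assumptions; this is not our analysis code.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: T time bins of population activity (X)
# recorded while the hand traces known kinematics (Y). Dimensions
# are illustrative.
T, n_units, n_kin = 2000, 192, 3        # bins, neurons, kinematic variables
ground_truth = rng.normal(size=(n_kin, n_units))
X = rng.poisson(3.0, size=(T, n_units)).astype(float)
Y = X @ ground_truth.T + rng.normal(0, 1.0, size=(T, n_kin))

# Ridge regression: one common way to read several parallel kinematic
# signals out of the same neural population simultaneously.
lam = 10.0
XtX = X.T @ X + lam * np.eye(n_units)
W = np.linalg.solve(XtX, X.T @ Y)       # (n_units, n_kin) readout weights

Y_hat = X @ W
r = [np.corrcoef(Y[:, k], Y_hat[:, k])[0, 1] for k in range(n_kin)]
print("decoding accuracy (r) per kinematic variable:", np.round(r, 3))
```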
Hand robotics
Neural interfaces for grasping need robots to execute the decoded hand actions. While virtual environments are a viable solution at an initial stage, robotic hands provide far superior feedback and interaction with the real world. Robotic hand control integrates command signals from the neural interface with sensory information from the robot to generate grasping actions. Furthermore, proprioceptive (sensory) information can be sent to the brain through an additional sensory interface to provide robotic feedback independent of vision. Such bi-directional neural interfaces could therefore lead to considerably improved neuroprosthetic systems.
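The sketch below illustrates, under assumed interfaces, how such a bidirectional control loop could combine decoded commands with the robot's proprioceptive state and return that state to the brain; `decoded_velocity`, `read_joint_angles`, and `stimulate` are hypothetical stand-ins, not real APIs.

```python
import numpy as np

rng = np.random.default_rng(2)
joint_angles = np.zeros(4)            # simplified 4-DoF robotic hand
DT = 0.02                             # 50 Hz control cycle (assumed)

def decoded_velocity():
    """Stand-in for the neural decoder's velocity command."""
    return rng.normal(0, 0.5, size=4)

def read_joint_angles(state):
    """Stand-in for proprioceptive sensors: noisy encoder readout."""
    return state + rng.normal(0, 0.01, size=4)

def stimulate(feedback):
    """Stand-in: encode joint state into sensory stimulation patterns."""
    pass

for _ in range(100):
    cmd = decoded_velocity()
    sensed = read_joint_angles(joint_angles)
    # Blend the neural command with the sensed state so the hand
    # cannot be driven past its joint limits.
    joint_angles = np.clip(sensed + cmd * DT, -1.0, 1.0)
    stimulate(joint_angles)           # proprioceptive feedback to the brain
```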
Investigation of the primate hand grasping network with pathway-specific neuro-optogenetics
Drawing not only correlative conclusions but causal links between the specific neural components of the hand grasping network involved in the planning and execution of grasping movements is essential for the development of neuroprosthetic devices. Accordingly, we are currently applying the emerging approach of optogenetics to the grasping network of behaving monkeys. Neuro-optogenetic tools have recently been established in primates and allow the precise manipulation of neuronal activity with unprecedented temporal, spatial, and cell-type specificity. We combine several behavioral and neural recording methods, including neuro-optogenetics, intracortical electrophysiological recordings, and hand kinematic tracking, in macaque monkeys performing a delayed grasping task. We examine the effects of optogenetic stimulation on local and remote, but directly connected, neuronal activity, as well as on grasping behavior. Results of this study will likely provide significant new insights into the functional contributions of the fronto-parietal hand grasping network and its causal interconnections. Ultimately, results from these studies could contribute to the development of improved neuroprostheses for impaired patients and the translation of optogenetic methods into the realm of clinical trials in humans.
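As a toy illustration of this kind of analysis, the sketch below compares peri-stimulus time histograms (PSTHs) of one unit's firing rate on stimulation versus control trials, aligned to stimulation onset. The spike trains and the assumed rate increase are simulated for illustration only; they are not experimental results.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic spike times (s) for one unit across trials; stimulation
# onset at t = 0. Rates are invented for illustration.
n_trials, window = 40, (-0.5, 0.5)

def simulate_trial(stim):
    rate_pre, rate_post = 10.0, (25.0 if stim else 10.0)   # spikes/s
    pre = rng.uniform(window[0], 0, rng.poisson(rate_pre * 0.5))
    post = rng.uniform(0, window[1], rng.poisson(rate_post * 0.5))
    return np.sort(np.concatenate([pre, post]))

def psth(trials, n_bins=20):
    """Average firing rate per time bin across trials (spikes/s)."""
    edges = np.linspace(window[0], window[1], n_bins + 1)
    bin_s = edges[1] - edges[0]
    counts = sum(np.histogram(tr, edges)[0] for tr in trials)
    return edges[:-1], counts / (len(trials) * bin_s)

stim = [simulate_trial(True) for _ in range(n_trials)]
ctrl = [simulate_trial(False) for _ in range(n_trials)]

t, r_stim = psth(stim)
_, r_ctrl = psth(ctrl)
print("mean post-onset rate: stim %.1f Hz vs control %.1f Hz"
      % (r_stim[t >= 0].mean(), r_ctrl[t >= 0].mean()))
```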
This project will continue to be funded within the Else Kröner Fresenius Center for Optogenetic Therapies: https://www.ekfs.de/en/scientific-funding/center/else-kroener-fresenius-center-for-optogenetic-therapies
Development of a communication platform in the body to control prostheses
The collaborative project B-CRATOS ("Wireless Brain-Connect inteRfAce TO machineS") aims to control prostheses or "smart" devices by the power of thought. For this purpose, a battery-free, wireless high-speed communication platform is to be integrated into the body to connect the nervous system with signaling systems and thus control various functions of, for example, prostheses. The project, coordinated by Sweden's Uppsala University, involves, alongside five European partners including universities, companies, and institutes, the German Primate Center (DPZ), which tests the new technology in non-human primates. The highly ambitious project, which combines expertise and cutting-edge technologies from the fields of novel wireless communication, neuroscience, bionics, artificial intelligence (AI), and sensor technology, will receive 4.5 million euros in EU funding over the next four years. More information: www.b-cratos.eu