[Figure: F5 cell]

Cortical representation of grasping actions

Primate hand movements are complex and highly cognitive forms of behavior. Planning a hand movement requires integrating sensory information with internal (volitional and memory) signals in order to generate appropriate hand actions. We investigate how hand movements are generated in the primate brain, specifically in the higher-order brain areas of the parietal cortex (anterior intraparietal area, AIP) and the premotor cortex (ventral premotor area, area F5). We perform dual-area recordings of spiking activity and local field potentials (LFPs) in behaving animals in order to characterize the role of these areas in sensorimotor transformation and decision making related to hand grasping. This work will shed new light on how hand movements are generated in the brain.
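As a toy illustration of these two signal types, the Python sketch below separates a simulated broadband trace into a low-frequency LFP component and high-frequency threshold-crossing spike events. The sampling rate, filter bands, and threshold rule are generic textbook choices for illustration, not our actual acquisition pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 30_000                          # broadband sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)

# Simulated extracellular trace: slow 10 Hz oscillation, noise, spike deflections
raw = 50e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * rng.standard_normal(t.size)
raw[rng.choice(t.size, 30, replace=False)] -= 60e-6

# LFP: low-pass below ~300 Hz
b_lfp, a_lfp = butter(3, 300 / (fs / 2), btype="low")
lfp = filtfilt(b_lfp, a_lfp, raw)

# Spiking: band-pass 300-6000 Hz, then negative threshold crossings
b_hp, a_hp = butter(3, [300 / (fs / 2), 6000 / (fs / 2)], btype="band")
hf = filtfilt(b_hp, a_hp, raw)
threshold = -4 * np.median(np.abs(hf)) / 0.6745   # robust noise estimate
spikes = np.flatnonzero((hf[1:] < threshold) & (hf[:-1] >= threshold))
print(f"{spikes.size} threshold crossings detected")
```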

[Figure: Decoding scheme]

Real-time decoding of hand movements

Building on our current understanding of how hand movements are represented in motor, premotor, and parietal brain areas, we are developing brain-machine interfaces that read out such movement intentions to control robotic devices in real time. For this, we employ permanently implanted electrode arrays that record cortical signals simultaneously from about 100 channels or more. Using dedicated analysis software, these signals are decoded in real time to predict upcoming grasping actions, which are then fed back to the subject and used to control robotic hands. Such systems could be useful for future applications aiming to restore hand function in paralyzed patients.
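To make the pipeline concrete, a minimal decoding loop might look like the following sketch: a classifier trained offline on labeled firing-rate vectors, then applied bin by bin to incoming spike counts. The `acquire_spike_counts` and `send_to_robot` functions are hypothetical placeholders, the data are synthetic, and the linear discriminant classifier is one possible choice rather than necessarily the one used in our system.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# --- Training phase (offline): firing-rate vectors labeled by grasp type ---
rng = np.random.default_rng(1)
n_channels, n_trials = 96, 200             # ~100 channels, as in the text
grasp_types = ["power", "precision"]
X_train = rng.poisson(5.0, (n_trials, n_channels)).astype(float)
y_train = rng.choice(grasp_types, n_trials)
X_train[y_train == "power", :10] += 3.0    # toy class-dependent modulation

decoder = LinearDiscriminantAnalysis().fit(X_train, y_train)

# --- Online phase: bin incoming spike counts, predict, command the robot ---
def acquire_spike_counts():
    """Hypothetical stand-in for one bin of multichannel spike counts."""
    return rng.poisson(5.0, (1, n_channels)).astype(float)

def send_to_robot(grasp):
    """Hypothetical robot command interface."""
    print(f"commanding robot hand: {grasp} grip")

for _ in range(5):                         # five decoding cycles
    counts = acquire_spike_counts()
    send_to_robot(decoder.predict(counts)[0])
```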

Optimal decoding from neural ensembles

The brain represents information in a distributed fashion. Neural interfaces that attempt to read out this information therefore have to take the multivariate nature of the recorded signals into account. Using neural data recorded from multiple cortical areas during decoding experiments, we investigate how prediction algorithms can be improved by exploiting timing information, the multivariate structure of the signals, and different classifier methods. Such improved decoding methods will increase the efficiency of neural interfaces.
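The following sketch illustrates the point with synthetic data: a weak class signal spread across many channels is recovered far better by classifiers operating on the full ensemble than by any single channel. The data and the particular classifiers compared here are illustrative assumptions, not our experimental results.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(2)
n_trials, n_channels = 240, 96
X = rng.normal(size=(n_trials, n_channels))
y = rng.integers(0, 2, n_trials)
X[y == 1] += 0.3                           # weak, distributed class signal

# Full multivariate ensemble vs. a single channel: the distributed code
# only becomes readable when channels are combined.
classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "linear SVM": SVC(kernel="linear"),
    "naive Bayes": GaussianNB(),
}
for name, clf in classifiers.items():
    acc_full = cross_val_score(clf, X, y, cv=5).mean()
    acc_single = cross_val_score(clf, X[:, :1], y, cv=5).mean()
    print(f"{name:12s} all channels: {acc_full:.2f}  one channel: {acc_single:.2f}")
```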

[Figure: Monkey power grasp]

Grasp kinematics

An important aspect of the neural activity related to hand grasping is the representation of hand kinematics. To characterize this representation, neural activity has to be correlated with measured hand kinematics. We are therefore developing a hand-tracking system that can monitor three-dimensional hand and finger movements in small primates during object grasping; such a system is essential for understanding the neural representation of hand and finger movements.
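Once tracked kinematics are available, relating them to neural activity can be as simple as a regularized linear regression from binned firing rates to joint angles. The sketch below uses synthetic data in place of real recordings and tracking output; the dimensions and the ridge model are illustrative choices, not our analysis pipeline.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_bins, n_channels, n_joints = 1000, 96, 20   # e.g., 20 tracked joint angles

# Toy linear relationship between firing rates and joint angles plus noise;
# in practice the angles would come from the 3D hand-tracking system.
rates = rng.poisson(5.0, (n_bins, n_channels)).astype(float)
true_map = rng.normal(scale=0.1, size=(n_channels, n_joints))
angles = rates @ true_map + rng.normal(scale=0.5, size=(n_bins, n_joints))

R_train, R_test, A_train, A_test = train_test_split(rates, angles, random_state=0)
model = Ridge(alpha=1.0).fit(R_train, A_train)
print(f"held-out R^2 across joints: {model.score(R_test, A_test):.2f}")
```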

Hand robotics

Neural interfaces for grasping need robots to execute the decoded hand actions. While virtual-environment solutions are possible at an initial stage, robotic hands provide far superior feedback and interaction with the real world. Robotic hand control integrates command signals from the neural interface with sensory information from the robot to generate grasping actions. Furthermore, proprioceptive (sensory) information can be sent back to the brain through an additional sensory interface, providing robotic feedback independent of vision. Such bi-directional neural interfaces could therefore lead to considerably improved neuroprosthetic systems.
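Schematically, a bi-directional interface couples an efferent path (decoded grasp commands sent to the robot) with an afferent path (robot sensor readings encoded as stimulation). The sketch below is purely illustrative: the `RoboticHand` class, the linear force-to-pulse-rate mapping, and the `stimulate` stand-in are all hypothetical.

```python
import numpy as np

class RoboticHand:
    """Hypothetical robot hand: executes grips and reports fingertip forces."""

    def execute(self, grasp):
        print(f"executing {grasp} grip")

    def read_force_sensors(self):
        # One contact force per finger, in newtons (simulated)
        return np.random.default_rng().uniform(0.0, 2.0, size=5)

def encode_feedback(forces, max_rate=300.0):
    """Map fingertip forces to stimulation pulse rates (Hz).
    The linear mapping and rate ceiling are illustrative assumptions."""
    peak = max(forces.max(), 1e-9)           # avoid division by zero
    return forces / peak * max_rate

def stimulate(pulse_rates):
    """Hypothetical stand-in for the sensory (stimulation) interface."""
    print("stimulation pulse rates (Hz):", np.round(pulse_rates, 1))

hand = RoboticHand()
decoded_grasp = "precision"                  # would come from the neural decoder
hand.execute(decoded_grasp)                  # efferent path: brain -> robot
forces = hand.read_force_sensors()           # afferent path: robot -> brain
stimulate(encode_feedback(forces))
```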