Neural decoding of full reaching and grasping movements. Single-unit activity in motor cortical area M1 during naturalistic reaching and grasping movements. Raster plot showing the action potentials fired by 25 individual neurons over a span of 5.5 s. Rasters are colored according to set-specific relationships with kinematics (significant semipartial correlations > 0.05): blue, hand only; green, wrist only; red, arm only; orange, hand+wrist; magenta, hand+arm; black, hand+arm+wrist. Of the 30 neurons recorded in M1, only the 25 shown had set-specific relationships with at least one set of kinematic parameters. During the time span shown, the monkey performed three separate reach-to-grasp movements targeting a small ball. Grip aperture is overlaid on the raster plot to highlight the moments when prehension occurs (troughs). Top: the full posture of the arm is shown at four time points during the first movement.
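The set-specific coloring described in the caption is based on semipartial correlations between each neuron's firing rate and sets of kinematic parameters. The original analysis pipeline is not given here, but a semipartial correlation (the correlation between the firing rate and the part of one kinematic variable not explained by the other sets) can be sketched as follows; the function name and the least-squares approach are illustrative assumptions, not the authors' code:

```python
import numpy as np

def semipartial_corr(rate, x_target, x_other):
    """Semipartial correlation between a neuron's firing rate and one
    kinematic variable (x_target), with the other kinematic sets
    (x_other) partialled out of x_target only, not out of the rate."""
    # Regress the target kinematic variable on the other kinematic sets
    X = np.column_stack([np.ones(len(x_other)), x_other])
    beta, *_ = np.linalg.lstsq(X, x_target, rcond=None)
    resid = x_target - X @ beta  # unique part of the target variable
    # Correlate the firing rate with that unique component
    return np.corrcoef(rate, resid)[0, 1]
```

A neuron would then be assigned to the set (or combination of sets) for which its semipartial correlation exceeds the significance threshold.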

In this project we ask how objects are represented for grasping, how this representation evolves over time, and how it is affected by context. This work addresses questions in basic neuroscience but also has implications for machine vision, robotics, and brain-computer interfaces. Our work focuses on naturalistic grasping in a range of contexts, from free reach-to-grasp scenarios to instructed-delay tasks. We use motion capture to record behavior during complex grasping tasks, with the aim of elucidating the neural code underlying natural grasping.

Previous work has shown that ventral premotor cortex (PMv) is a key node in the parieto-frontal network that transforms visual information about object shape into the hand postures best suited for grasping. These studies, however, used task conditions that did not separate two main factors: the visual features of the object and the motor strategy chosen to grasp it. It therefore remains unclear whether PMv operates in a primarily visual or motor regime, or whether it transitions from one to the other at different times.

To address this question, we developed an instructed-delay task with objects that can be grasped using two different types of grip. All the objects share one common grip (power grip), but each can also be grasped with a second strategy unique to that object. We recorded PMv ensemble activity with chronically implanted microelectrode arrays in monkeys trained to perform different grips on cue.

Grip selectivity has been described as a feature of PMv, but comparison of the same grips across tasks with different temporal presentation of cues has not been pursued. We examined differences in neural activity between movements cued after an instructed delay (sequential condition) and movements cued at the moment of motor execution (simultaneous condition). The sequential condition allows planning in advance of the movement, whereas in the simultaneous condition planning and execution must occur at the same time.

Our results suggest that grip selectivity in PMv may be influenced by the prior conditions that set up the task, including the ability to plan upcoming actions. Further, they suggest that neural population activity during a movement is affected by the timing of the cues leading up to the action. Finally, we find support for the hypothesis that objects may be characterized by their affordances: that is, their neural "representation" may "describe" how they can be grasped.

Neural activity in ventral premotor cortex (PMv) has been associated with the process of matching perceived objects with the motor commands needed to grasp them. It remains unclear how PMv networks can flexibly link percepts of objects affording multiple grasp options into a final desired hand action. Here, we use a relational encoding approach to track the functional state of PMv neuronal ensembles in macaque monkeys through the process of passive viewing, grip planning, and grasping movement execution. We used objects affording multiple possible grip strategies. The task included separate instructed delay periods for object presentation and grip instruction. This approach allowed us to distinguish responses elicited by the visual presentation of the objects from those associated with selecting a given motor plan for grasping. We show that PMv continuously incorporates information related to object shape and grip strategy as it becomes available, revealing a transition from a set of ensemble states initially most closely related to objects, to a new set of ensemble patterns reflecting unique object-grip combinations. These results suggest that PMv dynamically combines percepts, gradually navigating toward activity patterns associated with specific volitional actions, rather than directly mapping perceptual object properties onto categorical grip representations. Our results support the idea that PMv is part of a network that dynamically computes motor plans from perceptual information.
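One simple way to operationalize the ensemble-state comparison described above is to correlate trial-averaged population activity vectors across task epochs and conditions: if early-epoch states cluster by object while late-epoch states separate by object-grip combination, the similarity structure reflects the reported transition. The sketch below is an illustrative assumption (array layout, function name, and the use of Pearson correlation), not the authors' relational-encoding method:

```python
import numpy as np

def ensemble_state_similarity(rates):
    """rates: array of shape (epochs, conditions, neurons) holding
    trial-averaged firing rates. Returns an (epochs*conditions) x
    (epochs*conditions) matrix of pairwise Pearson correlations
    between population activity vectors."""
    E, C, N = rates.shape
    # One population vector (length N) per epoch-condition pair
    vecs = rates.reshape(E * C, N)
    # Pairwise correlation of those vectors across neurons
    return np.corrcoef(vecs)
```

Inspecting the block structure of this matrix (within-epoch vs. across-epoch, same-object vs. different-object) is one way to visualize how ensemble states regroup as object and grip information become available.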
Significance Statement: The present work demonstrates that the activity of groups of neurons in primate ventral premotor cortex reflects information related to visually presented objects, as well as the motor strategy used to grasp them, linking individual objects to multiple possible grips. PMv could provide useful control signals for neuroprosthetic assistive devices designed to interact with objects in a flexible way.
