Representation of Grasp Parameters in the Inferior Frontal Gyrus in a Human Tetraplegic Participant

Event Date:
June 14th, 9:00 AM - 10:00 AM

NEC Seminar, Friday, June 14, 9:00 AM


Speaker: Emily Conlan

Advisor: Prof. B. Ajiboye


Title: Representation of Grasp Parameters in the Inferior Frontal Gyrus in a Human Tetraplegic Participant

Abstract: Persons with tetraplegia report that restored hand function is among their top priorities, with object interaction as the main goal. Restoring hand grasp and object interaction would improve quality of life and allow participants to perform activities of daily living such as eating, drinking, and self-care. Brain-machine interfaces (BMIs) offer an opportunity to record grasp- and object-related signals from the brain during motor visualization tasks. These signals predominantly reside in the human homologue of the non-human primate grasp network, comprising the anterior intraparietal area (AIP), the inferior frontal gyrus (IFG), and primary motor cortex (M1). In non-human primates, IFG has been found to encode both visual aspects of the object to be grasped and the grip type used at various points during movement execution. To date, no work in humans has elucidated the individual contributions of grasp and object to neural modulation in IFG, which could affect the ability of future BMI-FES systems to decode movement intent. To decouple the contributions of grasp and object, we recorded neural data from microelectrode arrays implanted in study participant RP1 (motor complete, C3/C4 AIS B SCI). Cortical activity was recorded while the participant was instructed to attempt a power, pinch, or lateral grasp around a ball, cube, or rod, for a total of nine randomized grasp-object pairings presented in a motor imagery visualization task. A one-way ANOVA revealed that 53 of the 64 recorded IFG channels showed significant modulation relative to baseline (blank-screen) activity (p < 0.05). A subsequent two-way ANOVA with post-hoc multiple comparisons revealed that, across these 53 channels, grasp type was the largest contributor to the variance in the data, followed by the interaction between grasp and object. Population-level analysis found that grasp alone could be decoded above chance, whereas object and grasp-object pair could not. Principal component analysis revealed that each grasp type had a preferred object pairing that maximized its firing rate, and this preference varied by epoch. To our knowledge, these results are the first to indicate that not only the grasp, but also the object presented in conjunction with the grasp, modulates neural activity in IFG. Future work will focus on decoding grasp state from IFG for real-time grasp control in a BMI-functional electrical stimulation (FES) system.
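
For readers unfamiliar with this style of analysis, the sketch below illustrates, in simplified form, the two steps named in the abstract: a channel-wise significance test against baseline activity and a population-level grasp classifier compared against chance. It is not the speaker's code; the array names (rates, baseline, grasp_labels), the synthetic Poisson data, and the choice of a linear discriminant classifier are illustrative assumptions.

```python
# Illustrative sketch only (not the study's analysis code).
# Assumes hypothetical arrays: `rates` (n_trials x n_channels) of firing rates
# during the task epoch, `baseline` (n_trials x n_channels) during screen
# blanking, and `grasp_labels` (n_trials,) coding power/pinch/lateral.
import numpy as np
from scipy.stats import f_oneway
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels = 90, 64  # e.g., 9 grasp-object pairs x 10 repetitions (hypothetical)
rates = rng.poisson(10, (n_trials, n_channels)).astype(float)      # synthetic task-epoch rates
baseline = rng.poisson(8, (n_trials, n_channels)).astype(float)    # synthetic baseline rates
grasp_labels = np.repeat(["power", "pinch", "lateral"], n_trials // 3)

# Step 1: one-way ANOVA per channel, task epoch vs. baseline (p < 0.05).
p_vals = np.array([f_oneway(rates[:, ch], baseline[:, ch]).pvalue
                   for ch in range(n_channels)])
modulated = np.flatnonzero(p_vals < 0.05)
print(f"{modulated.size}/{n_channels} channels significantly modulated")

# Step 2: population-level decoding of grasp type on modulated channels,
# evaluated with cross-validation and compared against chance (1/3 for 3 grasps).
clf = LinearDiscriminantAnalysis()
acc = cross_val_score(clf, rates[:, modulated], grasp_labels, cv=5).mean()
print(f"grasp decoding accuracy: {acc:.2f} (chance = 0.33)")
```

With real recordings in place of the synthetic arrays, the same structure would report how many channels pass the baseline comparison and whether grasp (or, analogously, object or grasp-object pair) can be decoded above chance.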