Our PhD student Michelangelo Tani presented a poster entitled “Target updating during reach-to-grasp action in a virtual environment: behavioural correlates & pilot fMRI data” at CSA 2024 in Brixen, Italy.

Investigating online sensorimotor interactions with neuroimaging techniques is highly challenging, both theoretically and methodologically. Little is known about how potential actions are selected and reprogrammed during ongoing movements, and how learning effects are triggered when internal models are updated. We first conducted a behavioural study that adapted to a virtual environment a reach-to-grasp task requiring online adjustments to a changing target, while recording kinematics and psychophysiological measures (heart rate variability, HRV). We then piloted an fMRI study leveraging a novel system we developed, Motion Online Tracking Under MRI (MOTUM), which combines virtual reality (VR) with a non-magnetic motion-tracking setup to stream kinematics in real time into an MRI-compatible VR headset. Behavioural performance (accuracy, time to contact) generally improved in trials following an online adjustment. Univariate analyses of the preliminary fMRI data showed that online motor-program adjustments were supported by a parieto-frontal network involving the supramarginal gyrus (SMG) and dorsal premotor cortex (PMd). This pilot study offers the first working example of online hand tracking with VR under fMRI and suggests that the SMG triggers a cascade of events crucial for learning and for signalling the urgency of rapid action reprogramming, which is gated by premotor areas.
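The abstract does not describe how MOTUM forwards tracker data to the headset, but as a purely illustrative sketch, the core idea of streaming kinematic samples into a VR scene in real time could look like the snippet below. The UDP port, packet layout, sampling rate, and function names are all assumptions for illustration, not the actual MOTUM implementation.

```python
# Hypothetical sketch of a real-time kinematic streaming loop (not the actual
# MOTUM code): a motion tracker is assumed to broadcast marker positions as
# UDP packets of three little-endian floats (x, y, z), and the VR scene is
# updated with each new sample as it arrives.
import socket
import struct

TRACKER_PORT = 9001      # assumed port; the real setup may differ
PACKET_FORMAT = "<3f"    # assumed payload: x, y, z coordinates in metres
PACKET_SIZE = struct.calcsize(PACKET_FORMAT)


def update_virtual_hand(x: float, y: float, z: float) -> None:
    """Placeholder for the VR-side update (e.g. moving a rendered hand model)."""
    print(f"hand position: x={x:.3f} y={y:.3f} z={z:.3f}")


def stream_kinematics() -> None:
    """Receive tracker samples and forward them to the VR scene as they arrive."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("0.0.0.0", TRACKER_PORT))
        while True:
            data, _addr = sock.recvfrom(PACKET_SIZE)
            if len(data) != PACKET_SIZE:
                continue  # skip malformed packets
            x, y, z = struct.unpack(PACKET_FORMAT, data)
            update_virtual_hand(x, y, z)


if __name__ == "__main__":
    stream_kinematics()
```

In an actual MRI-compatible setup, `update_virtual_hand` would drive the headset's rendering engine rather than printing to the console, and the tracker interface would depend on the specific non-magnetic hardware used.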

Read the abstract on the publisher’s website