Encoding of the hand form in the resting somatomotor region
In the case of the hand, use and visibility often co-occur, with beneficial effects on behavioral performance: hand visibility improves the accuracy of volitional movements 24, reduces the perception of pain 25, enhances tactile perception 26, and enables the motor-visual regularities that support the sense of agency 27. During interactions with external objects, we also rely on an intrinsic model of the body structure to mediate an understanding of position 28,29. Other evidence suggests that the brain employs a standard posture, or a Bayesian prior, to guide body-space perception and action 30. This is interesting because spontaneous activity maintains statistical regularities (priors) to anticipate and even predict environmental demands 15. This hypothesis has been tested using natural visual stimuli and cognitive tasks 11,19,31–33. More specifically, the idea is that during offline periods, the brain forms generic priors or low-dimensional representations, such as categories or synergies, rather than individual instances or movements, that summarise the relative abundance of visual stimuli, objects, or motor patterns in the natural environment (statistical regularities). Interestingly, this reduced subspace of summary representations is formed along a hierarchy with the somatomotor cortices at the lower level 10. Consistently, our results show task-rest pattern similarity in the cortical hand region.
The rest-task multivoxel analysis tests for the occurrence of similar temporal fluctuations in a given region during visual stimulation and the resting state. Correlation, which does not imply causation, tests the relationship between intrinsic and task-evoked activity. In this context, the results show that at rest the hand somatomotor region maintains a multivoxel pattern of activity that resembles that evoked by the presentation of natural hands more closely than that evoked by control stimuli (e.g., food). We used two methods to correlate our data: the Kolmogorov-Smirnov test evaluates two distributions as a whole to determine goodness of fit, whereas the multivoxel similarity analysis correlates task-related patterns with those extracted from each time point of the resting state to generate cumulative distribution functions, using the upper 90th percentile as a measure of the strength of coherence between stimulus-group and rest patterns for each ROI and subject. This allows us to highlight differences in the tails of the distributions. As shown in Fig. 2B, while the mean of the distribution of task-rest similarity values is consistently around zero across all tasks, the tail of the distribution for the hand category is more positive (a higher number of values > 90th percentile) compared to the other conditions 11,19,20. Therefore, the somatomotor regions, acting as a central node for the processing of afferent and efferent inputs that are fundamental for active tactile feedback and proprioception, may retain low-dimensional representations (e.g., the body form) during offline periods, instrumental to the interaction with the environment. We often rely on the physical properties of our body (especially the hand) to grasp and manipulate objects, and the co-occurrence of sight and use contributes to generating priors tied to actual experience.
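The similarity procedure described above can be illustrated with a minimal sketch in Python. All array names, shapes, and values below are hypothetical placeholders, not the actual study data or pipeline; the sketch only shows the logical steps of correlating a task pattern with each resting-state time point, taking the 90th percentile of the resulting distribution, and comparing two such distributions with a two-sample Kolmogorov-Smirnov test.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

# Hypothetical data for one ROI of one subject:
# task_pattern: mean task-evoked multivoxel pattern for one stimulus category
# rest_data:    resting-state time series (time points x voxels)
n_voxels, n_timepoints = 200, 300
task_pattern = rng.standard_normal(n_voxels)
rest_data = rng.standard_normal((n_timepoints, n_voxels))

# Correlate the task-evoked pattern with each resting-state time point,
# yielding a distribution of task-rest similarity values
similarities = np.array([
    np.corrcoef(task_pattern, rest_data[t])[0, 1]
    for t in range(n_timepoints)
])

# Strength of task-rest coherence: the upper 90th percentile of the
# similarity distribution, i.e., its positive tail
strength = np.percentile(similarities, 90)

# Compare the whole similarity distributions of two stimulus categories
# (e.g., hands vs. a control category) with a two-sample KS test
other_similarities = rng.standard_normal(n_timepoints) * 0.05
ks_stat, p_value = ks_2samp(similarities, other_similarities)
```

In this scheme, a category whose task-evoked pattern recurs in the ongoing activity produces a heavier positive tail, which is what the 90th-percentile summary and the KS comparison are designed to capture.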
According to the idea that things occurring nearly coincidentally in time, i.e., cutaneous, proprioceptive, and visual signals, are represented together in the cortex 34, the co-occurrence statistics of usage and visibility may be represented in the somatomotor regions. Despite diverse spatial and temporal resolutions, previous findings in humans have demonstrated the existence of preferred tuning of single neurons in the posterior parietal cortex to visually cued non-grasp-related hand shapes 35. Moreover, in monkeys, a substantial number of neurons in the arm/hand region of the postcentral gyrus are activated by both somatosensory and visual stimulation 36. Here, for the first time, we found that rest-task similarity in the somatomotor cortex is driven by the hand form, and that we can access it through a visually cued paradigm without explicit motor processing.
Our results align with the multimodal role of M1/S1, which embodies different body/motor-related representations, including the one mediated by the hands. Linguistic studies show correlates of action words in the somatotopic activation of the motor and premotor cortex (e.g., 37,38). Similarly, embodied cognition theories suggest that understanding action verbs relies on the involvement of action-related areas, and this representation is found to be body-specific. For example, right- and left-handers perform actions differently and use different brain regions for semantic representation 39. In summary, the stability of spontaneous activity suggests that this set of neural signals is a possible candidate for preserving long-term models and priors of common behaviors and natural stimuli 9,10. These prior representations are the result of statistical learning mechanisms that store the co-occurrence statistics of hand visibility and usage, instrumental to the exploration and manipulation of the surroundings.
From birth, humans learn to use their hands in an increasingly refined and precise fashion to interact with external objects. Spontaneous activity has been hypothesized to reflect the recapitulation of previous experiences or expectations of highly probable sensory events. More precisely, the ongoing activity could be related to the statistics of habitual cortical activations during real life, both in humans 19,33,40 and in animals 21,41. For example, a higher similarity between spontaneous patterns and those in motor 11 and association cortex 20 has been shown for natural hand sequences than for novel sequences. Based on these findings, our results can be interpreted as evidence that rest-task similarity reflects natural stimuli, or more specifically, hand-like objects as compared to artificial ones. More recent studies have found that this effect is higher in stimulus-selective regions 19.
Beyond being natural, hands have both sensory and motor attributes. From a visual point of view, gloves and robotic hands share with hands both size and shape, but they are non-living items with either synthetic motor properties, such as the robot, or no independent motor attributes, such as the glove. While the visual attributes may explain the similarity between rest and task activity induced by the natural hand compared to control objects (i.e., food) (Fig. 3A/C), the inferred action/use can bias such similarities along a continuum where hands rank highest, as they are tied to natural movements, followed by robotic hands, which perform similar yet unnatural movements, and gloves, which lack autonomous motor attributes (Fig. 4B). The interplay of these factors offers an interpretation of the rankings obtained: at the top of the continuum, the natural hand, the most abundant in the environment, has the necessary visual features and motor attributes; the robot hand, though not as environmentally abundant, has the same visual features and synthetic motor attributes; the glove retains only the visual features but cannot act on its own; and finally, the food objects share neither the visual nor the motor attributes.
Studies in the visual cortex have demonstrated that long-term natural experience shapes the response profile. The high-level cortical representations of these regions capture the statistics with which visual stimuli occur 42,43. Furthermore, animal studies demonstrated that when visual stimuli are natural scenes, the reliability of visual neurons' responses increases and persists in the subsequent spontaneous activity; these effects are not observed with noise or flashed-bar stimuli 41. In the ferret’s visual cortex, the tuning function of neurons “learns” the statistics of natural but not artificial stimuli as the animal grows 21. Here, for the first time, we found evidence of visual representations encoded at rest in non-visual regions that are nonetheless specific to our hand stimuli (the hand-knob area). Thus, we believe that the cumulative impact of the statistics with which natural stimuli occur during development and experience shapes the ongoing activity of the whole brain, not limited to the visual cortex. Our results are both corroborated and challenged by Stringer and coworkers 44, who show representations of natural motor sequences at rest in the mouse visual cortex and also across the forebrain. Similar to our results, they confirm that natural sequences are coded in the resting state and shape the activity of the whole brain, but conversely, they find motor sequence representations in the visual system. The discrepancy could be explained by the fact that our stimuli did not represent any actions (i.e., strictly open hands), and by the fact that spontaneous activity in mice is recorded differently than in humans (mice running in darkness vs. humans staring at a cross with eyes open). However, their results similarly show evidence of generic multimodal representations encompassing both motor and visual attributes.
Our results could alternatively be interpreted as a result of motor imagery. Early work found that motor imagery (i.e., imagining a movement without executing it) shares overlapping networks with motor performance 45. However, we can exclude the possibility that our participants were engaged in hand motor imagery during the resting-state scan, since it was acquired before the presentation of the visual stimulation task and they were naive to the aim of the study.
Our study shows that the multivoxel activity evoked by hands in the somatomotor area is the most represented in resting-state activity. This effect was lateralized to the left, not the right, somatomotor area (Fig. 4B). From a theoretical point of view, the lateralization result aligns well with the existing literature: compared to other body parts and objects, static pictures of hands and tools produce overlapping activations in the LOTC that are then selectively connected to the left intraparietal sulcus and left premotor cortex 46–48. Moreover, a body of literature shows that movements with the left non-dominant hand produce bilateral motor cortical activations, whereas movements with the dominant right hand induce only contralateral (left) activations 49. Specifically, visuospatial orienting attention, measured with eye movements, activates a network of premotor and parietal areas in the right hemisphere, while motor attention and selection, measured as the attention needed to redirect a hand movement, activate the left hemisphere 50,51. TMS and lesion studies further support this left lateralization for motor selection, motor attention, and motor learning 49,51. Effector-independent activation in the left hemisphere is also found in kinesthetic motor imagery, which activates circuits common to movement in the premotor, posterior parietal, and cerebellar regions 50. More recently, Karolis and colleagues 52, using fMRI, built a functional taxonomy of lateralization along four axes representing symbolic communication, perception/action, emotion, and decision-making. Along the perception/action axis, the categories movement, finger tapping, motor observation, and touch were all found to selectively activate the sensorimotor areas of the left hemisphere. Interestingly, all these categories had the terms hand or finger among the principal components with the highest loadings.
In summary, we provide the first evidence that the ongoing activity in the left somatomotor regions maintains a long-term representation of the hand shape in the absence of any motor task or sensory stimulation. Furthermore, this result may lend support to the representation of visually related information in M1/S1, reinforcing the multimodal role of these areas.
The most significant limitation of our study is the use of highly controlled still images. Stimuli were presented in black and white with superimposed noise to control for low- and mid-level features. This is necessary as a first step, because we are looking for generic representations in resting-state activity and are using early visual regions as a control. According to current theoretical accounts, resting-state patterns represent the statistical regularities of the environment. Therefore, future studies may employ naturalistic stimuli (for example, videos).
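A stimulus-control step of this kind, converting images to grayscale and superimposing noise, can be sketched as follows. This is an illustrative example only, with hypothetical image data and an illustrative `noise_level` parameter; it is not the actual stimulus-preparation pipeline used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical RGB image (height x width x 3), intensities in [0, 1]
image = rng.random((256, 256, 3))

# Convert to grayscale using standard luminance weights
gray = image @ np.array([0.2989, 0.5870, 0.1140])

# Superimpose Gaussian noise to mask low- and mid-level features;
# noise_level is an illustrative mixing parameter
noise_level = 0.3
noisy = (1 - noise_level) * gray + noise_level * rng.standard_normal(gray.shape)

# Clip back to the valid intensity range
stimulus = np.clip(noisy, 0.0, 1.0)
```

The mixing weight trades off image visibility against the masking of luminance and contrast cues, which is the rationale for equating low- and mid-level features across stimulus categories.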
Our sample included only right-handed participants, similar to previous neuroimaging studies testing the relationship between spontaneous and task-evoked activity that employed motor paradigms 11,17,20, visual stimulation 19,32,53, or cognitive tasks 54,55, to give some examples. Moreover, it is noteworthy that the sample size employed here (n = 19) is in line with, and even higher than, those of these previous studies.
Another limitation is the lack of hand motion recordings. However, recording hand movements was beyond the aim of our study, i.e., testing the relationship between resting-state and visually evoked activity patterns; therefore, even minimal movements would not interfere with our results.
On a methodological note, our ROIs encompass the pre- and postcentral gyri. Even though the searchlight analysis shows that the spatial specificity of the task-rest association is localized on the postcentral gyrus, further studies should disentangle the contribution of the hand region alone. Moreover, since kinesthetic motor imagery activates the left premotor, posterior parietal, and cerebellar regions and is effector-independent 50, such studies would also help rule out mental imagery as a possible explanation of our results.