Psychophysical experiments have demonstrated large and highly systematic perceptual distortions of tactile space. Tactile space refers to our experience, at the representational level, of the spatial organisation of objects through touch, by analogy with the familiar concept of visual space. We investigated the neural basis of tactile space by analysing activity patterns induced by tactile stimulation of nine points on a 3 × 3 square grid on the hand dorsum using functional magnetic resonance imaging. We used a searchlight approach within pre-defined regions of interest to compute the pairwise Euclidean distances between the activity patterns elicited by tactile stimulation. We then used multidimensional scaling to reconstruct tactile space at the neural level and compared it with skin space at the perceptual level. Our reconstructions of the shape of skin space in contralateral primary somatosensory and motor cortices reveal that it is distorted in a way that matches the perceptual shape of skin space. This suggests that early sensorimotor areas critically contribute to the distorted internal representation of tactile space on the hand dorsum.
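The core analysis step described above, recovering a spatial configuration from pairwise Euclidean distances via multidimensional scaling, can be sketched as follows. This is a minimal illustration using classical (Torgerson) MDS on a hypothetical 3 × 3 grid of points, not the authors' analysis code; the grid spacing and coordinates are invented for the example.

```python
import numpy as np

def classical_mds(D, n_dims=2):
    """Classical (Torgerson) MDS: recover point coordinates from an
    n x n matrix of pairwise Euclidean distances D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n           # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                   # double-centred Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)
    order = np.argsort(eigvals)[::-1]             # largest eigenvalues first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    # Coordinates along the top n_dims eigenvectors, scaled by sqrt(eigenvalue).
    return eigvecs[:, :n_dims] * np.sqrt(np.maximum(eigvals[:n_dims], 0.0))

# Hypothetical example: nine stimulation points on a 3 x 3 grid, unit spacing.
grid = np.array([(x, y) for y in range(3) for x in range(3)], dtype=float)
D = np.linalg.norm(grid[:, None, :] - grid[None, :, :], axis=-1)

coords = classical_mds(D, n_dims=2)

# The recovered configuration matches the grid up to rotation, reflection and
# translation, so its pairwise distances reproduce the originals.
D_rec = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
print(np.allclose(D, D_rec))
```

In the study itself, D would instead hold dissimilarities between fMRI activity patterns, so the recovered configuration is a neural, rather than physical, map of the stimulated skin sites.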
Plasticity of body representation fundamentally underpins human tool use. Recent studies have demonstrated remarkably complex plasticity of body representation in humans, showing that such plasticity (1) occurs flexibly across multiple time scales and (2) involves multiple body representations responding differently to tool use. These findings reveal the sophistication of body plasticity in humans, suggesting that Vaesen may overestimate the similarity of such mechanisms between humans and non-human primates.
Schilbach et al. contrast second-person and third-person approaches to social neuroscience. We discuss relations between second-person and first-person approaches, arguing that they cannot be studied in isolation. Contingency is central to converging first- and second-person approaches. Studies of embodiment show how contingencies scaffold the first-person perspective and how the transition from a third- to a second-person perspective fundamentally involves first-person contributions.
The empirical support for the shared circuits model (SCM) is mixed. We review recent results from our own lab and others supporting a central claim of SCM: that mirroring occurs at multiple levels of representation. By contrast, the model is silent as to why human infants are capable of showing imitative behaviours mediated by a mirror system. This limitation illustrates a general problem with formal models that directly address neither the neural correlates nor the behavioural evidence.
Hands are commonly held up as an exemplar of well-known, familiar objects. However, conceptual knowledge of the hand has been found to show highly stereotyped distortions. Specifically, people judge their knuckles as farther forward in the hand than they actually are. The cause of this distal bias remains unclear. In Experiment 1, we tested whether both visual and tactile information contribute to the distortion. Participants judged the location of their knuckles by pointing to the location on their palm directly opposite each knuckle with: 1) a metal baton (using vision and touch), 2) a metal baton while blindfolded (using touch), or 3) a laser pointer (using vision). In Experiment 2, we investigated whether judgments are influenced by visual landmarks such as the creases at the base of each finger on the palm. Participants localized their knuckles on either a photograph or a silhouette of their hand. In both experiments, clear distortions were found across conditions, of generally similar magnitude. These results show that the distal bias is resistant to changes in stimulus information and does not rely on any specific stimulus cue or single sensory modality, suggesting that such mislocalisations reflect a conceptual misrepresentation of hand structure.
We question the generalizability of Glover's model because it fails to distinguish between different forms of planning. The highly controlled experimental situations on which this model is based do not reflect some important factors that contribute to planning. We discuss several classes of action that seem to imply distinct planning mechanisms, questioning Glover's postulation of a single “planning system.”
Tactile distance perception is believed to require that immediate afferent signals be referenced to a stored representation of body size and shape (the body model). Recent studies have reported that the stored body representations involved are highly distorted, at least in the case of the hand, with the hand dorsum represented as wider and squatter than it actually is. Here, we aimed to define the neural basis of this phenomenon. In a behavioural experiment, participants estimated the distance between touches on two points by adjusting the length of a visually presented line on a screen. Multidimensional scaling (MDS) was used to reconstruct a perceptual map of tactile space. Analysis of spatial distortion using Procrustes alignment showed that the maps were stretched along the mediolateral axis of the hand. To determine the neural correlates of these body distortions, we performed an fMRI study. For each participant, we used a searchlight pattern classifier with a Euclidean distance measure on pre-defined regions of interest (ROIs). To relate the representations of the different points to one another and to computational models, we compared response-pattern dissimilarity matrices in these ROIs. As in the behavioural experiment, we used MDS to reconstruct maps of the neural representation of tactile space from the values in the dissimilarity matrices. We were able to reconstruct the perceptual map of tactile space in the contralateral primary somatosensory and motor cortices. This suggests that these areas are critical for generating the tactile representation of the hand dorsum.
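The Procrustes step described above, aligning a reconstructed perceptual map with the veridical layout of the stimulated points so that only shape differences remain, can be sketched as follows. This is an illustrative example, not the authors' analysis code: the grid coordinates and the 1.4× mediolateral stretch factor are hypothetical stand-ins for a distorted perceptual map.

```python
import numpy as np
from scipy.spatial import procrustes

# Veridical 3 x 3 grid of stimulated points (hypothetical coordinates;
# x = mediolateral axis, y = proximodistal axis of the hand).
veridical = np.array([(x, y) for y in range(3) for x in range(3)], dtype=float)

# A perceptual map stretched along the mediolateral axis, mimicking the
# reported widening of the represented hand dorsum (factor is illustrative).
perceptual = veridical * np.array([1.4, 1.0])

# Procrustes alignment removes differences in translation, uniform scale and
# rotation; `disparity` is the residual sum of squared differences, i.e. the
# pure shape distortion left after alignment.
m1, m2, disparity = procrustes(veridical, perceptual)
print(disparity)
```

A disparity of zero would mean the perceptual map is an undistorted copy of the veridical grid; here the anisotropic stretch cannot be absorbed by uniform scaling, so a positive residual remains.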
We examined the neural basis of tactile distance perception by analyzing activity patterns induced by tactile stimulation of nine points on a 3 × 3 square grid on the hand dorsum using functional magnetic resonance imaging (fMRI). We used a searchlight approach within pre-defined regions of interest (ROIs) to compute the pairwise Euclidean distances between the activity patterns elicited by tactile stimulation. We then used multidimensional scaling (MDS) to reconstruct skin space at the neural level and compared it with skin space at the perceptual level. Our reconstructions of the shape of skin space in the contralateral primary somatosensory (SI) and motor (M1) cortices reveal that it is distorted in a way that matches the perceptual shape of skin space. This suggests that early sensorimotor areas are critical for tactile distance perception.