Self-frontier and impact prediction
The construction of a coherent representation of our body and the mapping of the space immediately surrounding it are essential for confronting our environment as effectively as possible. In recent decades, numerous studies have investigated the peripersonal space (PPS), the space surrounding us within which we can interact with the environment. Many different experiments and designs have been carried out to understand how this space is constructed, encoded and modulated. The majority of studies focused on only one part of the body: the hand, the face or the trunk. In the present review, we sum up the latest advances in this research, which open new perspectives for future investigations on this topic. We describe the recent methods used to estimate PPS boundaries by means of dynamic stimuli. We then highlight how impact prediction and approaching stimuli modulate this space through social, emotional and action components, principally involving a parieto-frontal network. Last, we review evidence that there is not a unique representation of peripersonal space but at least three sub-sections (hand, face and trunk PPS), and how bodily self-consciousness (BSC) is linked with this space. The parieto-frontal networks involved in these two processes are strongly connected but do not involve exactly the same areas, suggesting that PPS areas underlie a multisensory-motor interface for body-object interaction, whereas BSC areas are involved in bodily awareness and self-consciousness. To conclude, this review shows that our PPS representation is continuously updated with sensory information, experience and feeling, so as to adapt as well as possible to our environment.
1 PERIPERSONAL SPACE
In our world, we are solicited by a multitude of things. The space around us is often filled with conspecifics, other animals and objects. Most of the time we interact with them, but not always in the same way: our behaviour depends on the context and on the nature of these objects. It is therefore crucial to predict the probable consequences of contact with these objects, animals or people, in order to avoid the object or to prepare an appropriate response (to escape or to protect oneself). This requires the construction of a coherent representation of our body and the mapping of the space immediately surrounding the body. To do this, the brain has a specific system and mechanisms to process information occurring in the space that directly surrounds us and in which we can interact, that is, to represent the so-called peripersonal space (PPS).
Research in non-human primates has shown that multisensory cues, in particular those involving the body, are processed and integrated by a specialized neural system mapping the peripersonal space. Specific populations of multisensory neurons integrate tactile information on the body (arm, face or trunk) with visual or auditory stimuli occurring in peripersonal space, i.e. close to the body. These multisensory neurons have been described in a fronto-parietal network of the macaque brain involving the ventral premotor cortex (vPM; F4, Rizzolatti et al., 1981a, 1981b; or polysensory zone PZ, Graziano et al., 1994, 1997, 1999; Fogassi et al., 1996; Graziano and Cooke, 2006), the ventral intraparietal area on the fundus of the intraparietal sulcus (VIP, Hyvärinen and Poranen, 1974; Duhamel et al., 1997, 1998; Avillac et al., 2005; Schlack et al., 2005; Graziano and Cooke, 2006), parietal area 7b and the putamen (Graziano and Gross, 1993). The response properties of these neurons are independent of eye position but depend on the position of the different body parts in space. This suggests that they encode a body-part-centered multisensory representation of PPS (Duhamel et al., 1997; Avillac et al., 2005; Graziano and Cooke, 2006; Graziano et al., 2000).
Extinction is a phenomenon whereby patients fail to detect contralesional stimuli only under the condition of double (ipsilesional and contralesional) simultaneous stimulation (Bender, 1952; Mattingley et al., 1997; Làdavas and Serino, 2008). This phenomenon can emerge when concurrent stimuli are presented in the same modality (unimodal extinction) or in different modalities (cross-modal extinction). In right brain-damaged patients with tactile extinction, visual or auditory stimulation on the ipsilesional side exacerbates contralesional tactile extinction. Conversely, if the visual (or auditory) and tactile stimuli are both on the contralesional side, the deficit can be reduced (Làdavas et al., 1998a). Therefore, cross-modal extinction can be modulated as a function of the spatial arrangement of the stimuli with respect to the patient's body (Farnè et al., 2005a, 2005b; for review, see Làdavas, 2002). Moreover, this modulation is most consistently evident when the visuo-tactile or auditory-tactile interaction occurs in the space close to the patient's body, compared to the far space (di Pellegrino et al., 1997; Làdavas et al., 1998a, 1998b, 2000). This finding provided evidence, in humans, of the existence of a peripersonal space, with an integrated visuo-tactile system coding the space close to the body, in a similar way to that described in monkeys (Làdavas, 2002). Visuo-tactile and auditory-tactile information are also integrated, similarly, in other regions of space surrounding different body parts, such as around the face (Farnè et al., 2005a; Farnè and Làdavas, 2002; Làdavas et al., 1998b).
This first evidence for the existence of a PPS system in the human brain was corroborated by behavioral studies in healthy participants (Spence et al., 2004; Macaluso and Maravita, 2010; Occelli et al., 2011). These studies showed that tactile perception is more strongly modulated by visual or auditory stimuli when these are presented close to, as compared to far from, the body. Neuroimaging studies using EEG (Sambo and Forster, 2008), TMS (Serino et al., 2011) and fMRI (Bremmer et al., 2001; Makin et al., 2007; Gentile et al., 2011; Brozzoli et al., 2011, 2013) demonstrated that the multisensory representation of PPS occurs in both parietal and prefrontal areas homologous to the macaque regions where PPS neurons have been identified (for reviews, see Cléry et al., 2015b; di Pellegrino and Làdavas, 2015).
There is no physical separation between the PPS (near space) and the extrapersonal space (far space) in the real world; however, the brain does represent a boundary between these two spaces, that is to say, between what is close to our bodies, which can potentially impact, interact with or attack us, and what is further away, at a distance at which we cannot act upon it. More importantly, this boundary is not fixed: it can vary within individuals across contexts and situations, and also between individuals (Cléry et al., 2015b; de Vignemont and Iannetti, 2015; Farnè et al., 2005a, 2005b; Maravita and Iriki, 2004).
Objects approaching us, or a predator, may pose a threat and signal the need to initiate defensive behavior. Such potential impacts signal an intrusion into our peripersonal space, which enhances the system so that it becomes more efficient and can face the potential threat (Canzoneri et al., 2012; Cléry et al., 2015a; De Paepe et al., 2016; Kandula et al., 2015). Moreover, the peripersonal space is the space closest to us, and thus the closest to the self. Several studies and reviews have shown the link between peripersonal space and bodily self-consciousness, in particular a recent paper by Grivaz et al. (2017), a meta-analysis of human studies comparing these two components.
The following sections will review the methods used to measure PPS (section 2), the role of impact prediction (section 3), the modulations of peripersonal space (section 4), the different representations of body-related PPS (section 5) and the link between peripersonal space and bodily self-consciousness (section 6).
2 MEASUREMENTS OF PERIPERSONAL SPACE
Interestingly, neural systems representing PPS both in humans and in monkeys show a response preference for moving over static stimuli. Indeed, neurophysiological studies in monkeys showed that bimodal and trimodal neurons are more effectively activated by three-dimensional objects approaching toward or receding from the animal's body than by static stimuli, both in the ventral intraparietal area (Colby et al., 1993a; Duhamel et al., 1997) and in the premotor cortex (Fogassi et al., 1996; Graziano et al., 1994, 1997, 1999). Some of these neurons also show direction-selective and velocity-dependent response patterns, as firing rates in certain cells increase as a function of the velocity of approaching stimuli (Fogassi et al., 1996). In humans, Bremmer and colleagues (2001) demonstrated increased neural activity in the depth of the intraparietal sulcus and in the ventral premotor cortex evoked by approaching visual, auditory and tactile stimuli.
Based on these findings, Serino's group developed a method to estimate the boundary of the PPS using dynamic rather than static stimuli, which provide a more ecological context because our environment is constantly in motion. Moreover, this approach is closer to the experimental conditions used in monkey neurophysiology, thus allowing a more direct comparison across species (Canzoneri et al., 2012).
This paradigm consists in measuring behavioral responses in humans that reflect the concept and the properties of the receptive fields (RFs) of primate PPS neurons, during a dynamic audio-tactile or visuo-tactile interaction task, in order to assess the extent of PPS in a more functionally and ecologically valid condition. Participants are requested to respond as fast as possible to tactile stimulation administered on a part of their body, while task-irrelevant external cues (auditory or visual stimuli) approach or recede from the stimulated body part (Canzoneri et al., 2012, 2013a, 2013b, 2016; Galli et al., 2015; Noel et al., 2015a, 2015b; Teneggi et al., 2013). Reaction times (RTs) to the tactile stimulus are measured at a given body part (the hand, the face or the trunk). On each trial, the tactile stimulation is delivered at a different temporal delay from the onset of the sound/visual stimulus, such that it occurs when the sound/visual source is perceived at varying distances from the participant's body.
RTs to tactile stimuli progressively decrease as the sound/visual source is perceived to approach and, conversely, increase as it is perceived to recede. The function describing the relationship between tactile processing and the position of the sounds or visual stimuli in space is then used to estimate the critical distance at which an external stimulus affects tactile processing. This critical distance, along a spatial continuum between far and near space, can be considered as the boundary of the PPS representation in humans.
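In practice, this boundary is typically estimated by fitting a sigmoidal function to the RTs measured at each perceived stimulus distance and taking the central point of the sigmoid as the PPS boundary. The sketch below illustrates the idea with made-up numbers; the RT values, distances and the simple grid-search fit are illustrative assumptions, not data or methods from the studies cited:

```python
import numpy as np

def sigmoid(d, ymin, ymax, d_c, slope):
    # RT modeled as a sigmoidal function of perceived stimulus distance d;
    # the central point d_c is taken as the PPS boundary.
    return ymin + (ymax - ymin) / (1.0 + np.exp(-slope * (d - d_c)))

# Hypothetical mean RTs (ms) at six perceived sound distances (cm):
# fast responses when the sound is near the body, slow when it is far.
distances = np.array([10.0, 30.0, 50.0, 70.0, 90.0, 110.0])
rts = np.array([355.0, 358.0, 370.0, 392.0, 398.0, 400.0])

# Coarse grid search for the central point (a full nonlinear fit would
# also estimate ymin, ymax and slope; this is enough to show the idea).
candidates = np.arange(10.0, 111.0, 1.0)
errors = [np.sum((sigmoid(distances, rts.min(), rts.max(), d_c, 0.1) - rts) ** 2)
          for d_c in candidates]
boundary = candidates[int(np.argmin(errors))]
print(f"estimated PPS boundary ~ {boundary:.0f} cm")
```

With these illustrative data the RTs transition midway between roughly 50 and 70 cm, so the fitted central point lands in that region.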
This paradigm was first developed and used with a dynamic audio-tactile interaction task with tactile stimulation delivered on the hand (Canzoneri et al., 2012, 2013a, 2013b). It was then applied, still with a dynamic audio-tactile task but with tactile stimulation delivered on the face, to study the social effects on this boundary (Teneggi et al., 2013). Recently, the same group has also extended this paradigm to a dynamic visuo-tactile interaction task in a full body illusion (Noel et al., 2015a, 2015b; Serino et al., 2015b).
This method developed by Serino's group opens new perspectives for studying the peripersonal space and how it can be modulated by context, experience and action.
3 LOOMING STIMULI AND IMPACT PREDICTION
The ecological significance of stable stimuli close to our body (e.g. a wall, a desk) differs from that of dynamic stimuli looming towards us (e.g. a mosquito, a ball). Looming stimuli are potentially more dangerous than other visual stimuli, including dynamic stimuli with no predicted impact to the body. A predator, a dominant conspecific, or a mere branch coming up at high speed are dangerous if one does not detect them fast enough to produce the appropriate escape motor repertoire. Indeed, such looming stimuli are known to trigger stereotyped defense responses (in monkeys: Schiff et al., 1962; in human infants: Ball and Tronick, 1971). Interestingly, threatening looming stimuli are perceived as having a shorter time-to-impact latency than non-threatening objects moving at the same objective speed (Vagnoni et al., 2012).
In a visuo-auditory context, looming visual stimuli have been shown to trigger more pronounced orienting behavior toward simultaneous congruent auditory cues than receding stimuli, both in non-human primates (Maier et al., 2004) and in 5-month-old human infants (Walker-Andrews and Lennon, 1985). Looming structured sounds can specifically enhance visual orientation sensitivity (Romei et al., 2009; Leo et al., 2011). In a recent study (Cléry et al., 2015a), we showed that tactile detection is enhanced by looming rather than receding visual stimuli. Therefore, looming stimuli are more relevant to the body than receding stimuli. Indeed, while both size and depth cues most probably contribute to the modulation of tactile sensitivity on the face, we propose that the movement vector cue (away from or toward the subject) is actually the dominant cue affecting tactile detection. Slower looming stimuli result in a delayed predicted time of impact on the face, and hence a delayed time at which tactile sensitivity is maximally enhanced (Cléry et al., 2015a). Thus, the spatial, temporal and dynamic predictive cues are fully accounted for by the trajectory and speed of the looming visual stimuli.
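The dependence of the predicted impact time on stimulus speed reduces to simple kinematics: under a constant-speed, straight-trajectory assumption, the expected time-to-impact is the remaining distance divided by the approach speed. The sketch below is a hypothetical illustration of this relationship, not the actual stimulus parameters used in the studies above:

```python
def time_to_impact(distance_cm: float, speed_cm_per_s: float) -> float:
    """Predicted time (s) until a looming stimulus reaches the face,
    assuming a straight trajectory at constant approach speed."""
    return distance_cm / speed_cm_per_s

# Same starting distance, two approach speeds: halving the speed
# doubles the predicted impact time, and hence delays the moment of
# maximal tactile enhancement.
fast = time_to_impact(60.0, 30.0)  # -> 2.0 s
slow = time_to_impact(60.0, 15.0)  # -> 4.0 s
```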
Other auditory-tactile or visuo-tactile integration studies (Canzoneri et al., 2012; Kandula et al., 2015) have shown that reaction times are shorter when the tactile stimulus is delivered at the predicted impact time of the looming stimulus, suggesting that looming stimuli predictively speed up tactile processing. Moreover, we found in our psychophysics study that tactile sensitivity is also enhanced at the predicted location and predicted time of impact of a looming visual stimulus to the face (Cléry et al., 2015a), reflecting the expected subjective consequence of the visual stimulus onto the tactile modality. Interestingly, the physiological and perceptual binding of two stimuli into the representation of a single external source is subject to some degree of temporal tolerance, which has led to the description of a multisensory temporal binding window (for review, see Wallace and Stevenson, 2014).
On top of a baseline multisensory enhancement, tactile sensitivity is further enhanced by the predictive components of the heteromodal visual or auditory stimuli. Anticipating a potential impact onto the body, and especially onto the face, can be of vital importance. This process involves cross-modal influences, and it has been suggested that the cortical regions responsible for this multisensory impact prediction highly overlap with the multisensory convergence and integration functional network. Several findings are in accordance with this hypothesis. Firstly, the visual responses occasionally observed in parietal tactile neurons (and more generally in bimodal visuo-tactile neurons) were initially interpreted as an "anticipatory activation", predictive of touch on the corresponding skin (Hyvärinen and Poranen, 1974). Secondly, some neurons in the ventral intraparietal area (VIP) integrate vestibular and proprioceptive self-motion cues with visual motion cues to encode relative self-motion with respect to the environment (Bremmer et al., 1997, 2000, 2002a, 2002b; Duhamel et al., 1997). These neurons have been shown to respond to both visual and tactile stimuli (Duhamel et al., 1997; Guipponi et al., 2013, 2015) and to perform nonlinear sub-additive, super-additive or additive multisensory integration operations (Avillac et al., 2004, 2007). Recently, an fMRI study in non-human primates confirmed that area VIP is involved in impact prediction to the face in a visuo-tactile context (Cléry et al., 2015b, Chapter 2). Thirdly, area F4 is also robustly and bilaterally activated by impact prediction. Most importantly, these activations are systematically and significantly higher when the looming stimulus is predictive of the tactile stimulus than when the two stimuli are presented simultaneously (Cléry et al., 2015b, Chapter 2).
As seen in section 1, areas VIP and F4 play a key role in the definition of peripersonal space. A recent fMRI study performed in monkeys to assess the neural bases of near- and far-space encoding with naturalistic 3D moving objects showed the involvement of both VIP and F4 in peripersonal space encoding (Chapter 3). This confirms what had been found in electrophysiological studies in monkeys (Colby et al., 1993; Bremmer et al., 2002a, 2002b; Rizzolatti et al., 1981; Graziano et al., 1997), and notably the strong link between multisensory integration and peripersonal space representation.
In monkeys, electrical microstimulation of neurons in these two regions induces a behavioral defense and avoidance repertoire of whole-body movements, suggesting their involvement in the coding of a defensive peripersonal space (Cooke and Graziano, 2004; Graziano et al., 2002; Graziano and Cooke, 2006). Moreover, a visual stimulus intruding into the peripersonal space close to one's cheek has a stronger impact prediction effect on that cheek than a visual stimulus predicting an impact to the other cheek (Cléry et al., 2015b, Chapter 2). We are therefore alert to a potentially harmful impact to the face. Similarly, another study (Canzoneri et al., 2012) demonstrated that tactile processing on the hand is speeded by the presence of a looming sound predicting an impact on the hand or within a well-defined distance from the hand. This suggests the existence of a security margin around the face and the body subserved by this parieto-frontal network (for reviews, see Cléry et al., 2015b; di Pellegrino and Làdavas, 2015).
Some studies performed in humans focused not just on tactile stimuli but more precisely on nociceptive stimuli. Two studies (De Paepe et al., 2014, 2015) used temporal order judgment tasks to assess whether the spatial perception of nociceptive stimuli is coordinated with that of proximal visual stimuli into an integrated representation of peripersonal space. Participants judged which of two nociceptive stimuli, one presented to each hand, had been presented first. Each pair of nociceptive stimuli was preceded by lateralized visual cues presented either unilaterally or bilaterally, and either close to, or far from, the participant's body. In the second study, the participants' hands were either uncrossed or crossed over the body midline during the task. The authors found that a unilateral visual cue prioritized the perception of nociceptive stimuli delivered on the adjacent hand. This effect increased when the cue was presented close to the participant's hand (De Paepe et al., 2014), irrespective of posture. This demonstrated that the perceived temporal order of nociceptive stimuli was influenced not only by the position of the stimuli on the body, but mainly by the position of the stimulated hand itself in external space (De Paepe et al., 2015). In a third study (De Paepe et al., 2016), participants were asked to indicate as fast as possible on which side they perceived a nociceptive stimulus on their hand, while a visual stimulus presented at different temporal delays was either approaching or receding from the participant's left or right hand. The authors found that reaction times were fastest when the visual stimulus appeared near the stimulated hand, an effect more pronounced for looming visual stimuli. These three studies suggest that the coding of nociceptive information in a peripersonal frame of reference may contribute to the representation of a safety margin around the body, designed to protect it from potential physical threat.
A recent review (Van der Stoep et al., 2015) suggests that different combinations of sensory information might be more or less relevant depending on the distance at which this information is presented. For example, touch and vision are dominant in peripersonal space, as they may imply an interaction between the body and the environment (e.g., for grasping or defense), whereas auditory and visual information may be more relevant in extrapersonal space as they provide useful information about farther objects, for spatial orienting, navigation and interaction with others (e.g. during conversation). As tactile stimuli can only be perceived when applied to the body, visuo-tactile and audio-tactile interactions (like impacts onto the body) inherently occur near the body, and the peripersonal space boundary can therefore be explained by the spatial alignment of different stimulus modalities with the body. Another review (Van der Stoep et al., 2016) examined whether multisensory integration operates according to the same rules throughout the whole of 3-D space. The authors found not only that the space around us seems to be divided into distinct functional regions, but also that multisensory interactions are modulated by the region of space in which stimuli happen to be presented. Therefore, future studies on peripersonal space, and notably on impact prediction onto the body, need to take into account the spatial constraints on multisensory integration (moving stimuli, 3D space, moving subjects or isolated observers).
4 SOME MODULATIONS OF PERIPERSONAL SPACE
Peripersonal space thus appears to be particularly important in our representation of space: within it, some processes are enhanced (shorter reaction times, greater sensitivity). In recent years, many studies have investigated the flexibility and plasticity of this peripersonal space (for reviews, see Cléry et al., 2015b; de Vignemont and Iannetti, 2015).
Using a tool to reach objects in far space can extend the boundaries of the PPS representation. In monkeys, Iriki et al. (1996) showed that, after a training period of using a rake to retrieve pieces of food placed at a distance, the hand-centered visual RFs of neurons located in the intraparietal sulcus extended along the tool. In humans, neuropsychological (Farnè and Làdavas, 2000; Maravita et al., 2001) and psychophysical (Holmes et al., 2004; Maravita and Iriki, 2004; Serino et al., 2007; Galli et al., 2015) studies demonstrated that, after tool-use, crossmodal interactions increase between tactile stimuli at the hand and visual or auditory stimuli presented in far space, more specifically at the location where the tool has been used, suggesting an extension of the PPS representation. Taken together, these findings suggest that the extent of the PPS representation is dynamically shaped by experience, extending the action possibilities of the body beyond its structural limits (Maravita and Iriki, 2004; Gallese and Sinigaglia, 2010; Costantini et al., 2011). Initially, some of these studies suggested that actively using the tool is necessary to extend the PPS representation. Since then, studies have shown that neither a physical nor a functional interaction between near and far space is necessary to extend the PPS representation (Bassolino et al., 2010; Goldenberg and Iriki, 2007; Serino et al., 2015a). The latter study (Serino et al., 2015a) supports the unconventional hypothesis, generated by a neural network model, that plasticity in the PPS representation after tool-use does not strictly depend on the function of the tool, nor on the actions performed with it, but is triggered by the sensory feedback of tool-use, i.e., synchronous tactile stimulation at the hand, due to holding the tool, and multisensory (auditory or visual) stimulation from the far space where the tool is operated (for a review on tool-use, see Martel et al., 2016).
Many studies thus show that tools can remap the peripersonal space. This defines the peripersonal space from the point of view of a "goal-directed action", in which we want to reach and grasp something (for review, see de Vignemont and Iannetti, 2015). However, evidence shows that other aspects, such as fear, anxiety and social factors, can also remap this space and contribute rather to a "protective and defensive" peripersonal space.
Claustrophobia is a situational phobia featuring intense anxiety in relation to enclosed spaces and physically restrictive situations (American Psychiatric Association, 2000). Lourenco et al. (2011) investigated whether the size of near space relates to individual differences in claustrophobic fear, as measured by reported anxiety about enclosed spaces and physically restrictive situations, and showed that claustrophobic fear is associated with an increased size of the near space immediately surrounding the body. Vagnoni et al. (2012) found the same results and extended them by showing that emotion not only alters the perception of space as a static entity, but also affects the perception of dynamically moving objects, such as those on a collision course with the observer. Previous studies had described a facilitation of tactile processing when a physically threatening picture (for instance a snake or a knife) was presented nearby, leading to faster responses than when it was presented further away (for instance near a different body part) (Poliakoff et al., 2007; Van Damme et al., 2009). The distance from a visual stimulus has a stronger influence on tactile reaction times if the stimulus is perceived as threatening, which indicates that the distance to a threatening visual stimulus matters more for visuo-tactile interaction than the distance to a non-threatening one (de Haan et al., 2016). Other studies found that noise sounds eliciting a negative emotion, and negative ecological sounds (dogs barking, a screaming woman, …), induced faster reaction times when the threatening sounds appeared close to the subject, and influenced responses earlier in the trial than sounds with a neutral or positive valence (Taffou and Viaud-Delmon, 2014; Ferri et al., 2015).
Moreover, whereas individuals low in claustrophobic fear demonstrated the expected expansion of peripersonal space when using a stick during a line bisection task, individuals high in claustrophobic fear showed less expansion, suggesting decreased flexibility (Hunley et al., 2017). Whatever the estimated level of an object's danger, the extent of peripersonal space was reduced when the threatening part of dangerous objects was oriented towards participants, but not when it was oriented away (Coello et al., 2012). This suggests that the characteristics of the here-and-now body-object interaction are crucial in affecting the boundary of peripersonal space. These different studies show that emotional aspects, and the characteristics of the danger itself in relation to the body, can influence the defensive peripersonal space, and thereby the body's safety margin, but can also decrease peripersonal plasticity depending on the context.
Defensive reflex responses can be finely modulated by the position of the stimulus within the peripersonal space, particularly in relation to the area of the body that the reflex response protects (Sambo et al., 2012a, 2012b). Moreover, the size of an individual's peripersonal space is correlated with trait anxiety, with a larger peripersonal space in more anxious individuals (Sambo and Iannetti, 2013; for review, see de Vignemont and Iannetti, 2015). This is also observed in empathic subjects. A subcortical defensive response, the hand-blink reflex (HBR), is enhanced when the threat is brought close to the face by one's own stimulated hand, by another person's hand, or when the participant's hand enters the PPS of another individual. The enhancement of the HBR is larger in participants with strong empathic tendencies when the other individual is seen from a third-person perspective, suggesting that interpersonal interactions shape the perception of threat and defensive responses (Fossataro et al., 2016).
Other evidence shows that the presence of, and interaction with, others shapes the peripersonal space representation (Teneggi et al., 2013). Indeed, when comparing peripersonal space measurements while subjects face another individual versus a mannequin placed in far space, peripersonal space boundaries shrink when subjects face another person, suggesting that one's own peripersonal space accommodates to the presence of others. Other experiments show that peripersonal space boundaries between self and other merge after an economic game with another person, but only if the other behaved cooperatively. Peripersonal space is shaped by the valuation of other people's behaviour during the interaction, so there is a direct link between the peripersonal representation and the feelings generated by interaction with others. Therefore, the peripersonal space representation is sensitive to social modulation, showing a link between low-level sensorimotor processing and high-level social cognition. However, the expansion and contraction of our PPS representation may not be the only change induced by the presence of others. Some studies suggest that we remap the observed sensory and motor experiences of others onto our own bodily representations. Indeed, a similar mirror system exists in the brain of both human and non-human primates: this system is activated when we are touched ourselves, when we view another person being touched, and also when events occur in the space near the other's body (Blakemore et al., 2005; Keysers and Gazzola, 2009; Serino et al., 2008; Cardini et al., 2010). Ishida et al. (2009), using single-cell recordings in non-human primates, showed that bimodal parietal neurons encode sensory events occurring in the space around the monkey's own hand as well as in the space around another monkey's hand. Similar results have been observed in the premotor cortex in humans (Brozzoli et al., 2013; Holt et al., 2014). However, Maister et al.
(2015) showed that increased multisensory integration near another person involves a remapping of peripersonal space onto the other person, without including the space between the two people. A review by Ishida et al. (2015), based on monkey neurophysiology as well as human fMRI studies, reports shared self-other body representation coding in multiple brain areas, including visuo-tactile neurons in the parietal cortex (Ishida et al., 2009), the secondary somatosensory cortex (Keysers et al., 2004, 2010; Blakemore et al., 2005; Ebisch et al., 2008; Keysers and Gazzola, 2009) and the insular cortex (Fitzgibbon et al., 2010, 2012; Lamm and Singer, 2010; Krahé et al., 2013), associated with affective touch and interoception.
Recent studies have investigated the link between the peripersonal space for acting and the interpersonal space, i.e. the space in which we maintain a distance around our bodies and in which any intrusion by others may cause discomfort. This space can be modified by emotional and socially relevant interactions. As we have seen previously, PPS is modulated by social factors, but also by more complex social information such as the perceived morality, age and gender of another person (Iachini et al., 2015, 2016). Peripersonal space for acting and interpersonal space share a common motor nature and are sensitive, to different degrees, to social modulation. Therefore, social processing seems to be embodied and grounded in the body acting in space (Iachini et al., 2014). However, tool-use remapped the action-related PPS, measured by a reaching distance toward a confederate, but did not affect the social-related interpersonal space, measured by a comfort-distance task, suggesting that these two space representations do not fully overlap functionally (Patané et al., 2016). Yet, using another paradigm in which participants observed a point-light walker approaching them from different directions and passing near them at different distances from their right or left shoulder, Quesque et al. (2016) found that comfortable interpersonal distance is linked to the representation of peripersonal space. As a consequence, increasing peripersonal space through tool use has the immediate consequence that the comfortable interpersonal distance from another person also increases, suggesting that interpersonal-comfort space and peripersonal-reaching space share a common motor nature (Iachini et al., 2014, 2016; Coello and Fischer, 2015).
Peripersonal space is not a fixed space but a dynamic one, with its own characteristics, continuously modulated by our environment (social, emotional, functional). The fact that this “boundary” varies suggests that the body tends toward the most appropriate behaviour (protective, pro-active), using all the information available around us and particularly multisensory integration cues (visual, tactile, auditory, proprioceptive…) (Cléry et al., 2015b; de Vignemont and Iannetti, 2015). Consequently, the rules and effects of impact prediction on the body described in section 3 depend strongly on numerous factors: the type, speed and congruence of the stimuli, and the relevance of the stimulus impacting the subject (static, moving, neutral, negative, social…). Do they also depend on the body part stimulated, or is there a unique whole-body PPS representation?
5 DIFFERENT REPRESENTATIONS OF BODY-RELATED PPS
Most studies on peripersonal space have focused on the face and, even more, on the hand. We have seen that the “boundary” of the peripersonal space representation is modulated both by action (for example, after tool use) and by the emotional/social context (fear, anxiety, cooperation). Moreover, these modulations can vary within individuals across contexts and situations, as well as between individuals. Is the representation of peripersonal space, however, the same for the different parts of the body?
Serino et al. (2015b) applied the paradigm described in section 2 (Canzoneri et al., 2012, 2013a, 2013b; Teneggi et al., 2013; Galli et al., 2015; Noel et al., 2015a, 2015b) to measure behavioural responses in humans to a tactile stimulus delivered to different parts of the body, with sounds presented at different positions in space. In a first experiment, they tested the effect of auditory stimuli looming toward or receding from the stimulated trunk. They showed that looming sounds modulated tactile processing as a function of the distance of the sound from the body, and that this effect was selective for looming sounds. Since the majority of experiments on peripersonal space are conducted only in the frontal space of the subject, in a second experiment they tested the effect of auditory stimuli looming and receding in front of or behind the stimulated trunk. This confirmed that only sounds looming toward the trunk are mapped into the peri-trunk PPS representation; they found no effect of the region of space (front vs. back). In a third experiment, they tested the effect of auditory stimuli looming toward and receding from the stimulated hand. Sounds modulated tactile processing as a function of the distance of the sound from the hand, but in this case the effect was observed not only for looming but also for receding sounds, although it was stronger for looming stimuli. Moreover, the distance at which sounds modulated tactile processing was shorter for the peri-hand than for the peri-trunk. In experiments 4 and 5, they directly compared the peri-hand and peri-trunk PPS representations: still using sounds looming toward and receding from the stimulated body part, the tactile stimulation was applied either to the trunk or to the hand placed close to the trunk (experiment 4), or to the hand placed either far from the trunk (as in experiment 3) or close to it (experiment 5).
They found that when the hand is close to the trunk, only looming sounds modulate tactile processing and the PPS boundary distance is roughly the same for hand and trunk, whereas when the hand is far from the trunk, they observed the same results as in experiment 3, i.e., both looming and receding sounds have an effect, but at a shorter distance. To summarize, there are two different PPS representations: one for the peri-hand, sensitive to looming and receding stimuli at close distance, and another for the peri-trunk, sensitive only to looming stimuli and extending further than the peri-hand. Importantly, these two representations are not independent. In the last two experiments, they tested the effect of looming and receding stimuli (auditory or visual) toward the trunk or the face while one or the other was touched. When the tactile stimulus was applied to the trunk, only looming stimuli elicited a significant speeding of tactile processing, at roughly the same distance as in the other peri-trunk experiments, and regardless of whether the looming stimuli moved toward the face or the trunk. When the tactile stimulus was applied to the face, only stimuli looming toward the face elicited a significant speeding of tactile processing, and at a shorter distance than for the peri-trunk.
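In this family of paradigms, the PPS boundary is operationalized as the distance at which a looming sound begins to speed up tactile reaction times, typically estimated by fitting a sigmoidal function to mean RTs as a function of sound distance (Canzoneri et al., 2012). The following is only an illustrative sketch of that logic: the data, slope value and grid-search fitting procedure are hypothetical, not those of Serino et al. (2015b).

```python
import math

def sigmoid_rt(d, rt_min, rt_max, d_c, slope):
    """Sigmoidal model of tactile RT as a function of sound distance d:
    RTs are fast (rt_min) when the sound is within PPS and slow (rt_max)
    far from the body; the central point d_c is taken as the PPS boundary."""
    return rt_min + (rt_max - rt_min) / (1.0 + math.exp(-slope * (d - d_c)))

def fit_boundary(distances, rts, slope=0.15):
    """Estimate d_c by a coarse grid search minimizing squared error
    (illustrative; published studies use nonlinear least-squares fits)."""
    rt_min, rt_max = min(rts), max(rts)
    best_dc, best_err = None, float("inf")
    for dc10 in range(0, 1001):           # candidate boundaries 0..100 cm
        d_c = dc10 / 10.0
        err = sum((sigmoid_rt(d, rt_min, rt_max, d_c, slope) - rt) ** 2
                  for d, rt in zip(distances, rts))
        if err < best_err:
            best_dc, best_err = d_c, err
    return best_dc

# Hypothetical data: sound distances (cm) and mean tactile RTs (ms);
# RTs drop once the looming sound enters the peri-trunk space.
distances = [10, 20, 30, 40, 50, 60, 70, 80]
rts       = [352, 355, 360, 378, 412, 428, 431, 433]

boundary = fit_boundary(distances, rts)
print(f"estimated PPS boundary: {boundary:.1f} cm")
```

On this toy dataset the fitted central point falls between the 40 cm and 50 cm sound positions, where the RT facilitation appears; a larger peri-trunk than peri-hand representation would show up as a larger fitted d_c.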
To summarize this very thorough study, the authors showed that the size of the PPS representation varies according to the stimulated body part, being progressively bigger from the hand to the face and largest for the trunk (Figure 1A). Tactile processing is modulated in a space-dependent manner by looming stimuli for these different body parts, and also, with a smaller effect, by receding stimuli for the hand. More importantly, whereas the size of the PPS representation around the trunk is relatively constant, the PPS representation around the hand or the face varies according to their relative position and to stimulus congruency (Figure 1B). These findings are compatible with the function of the peripersonal space as a multisensory-motor interface for body-object interactions (Brozzoli et al., 2012b), for instance to plan an approaching movement.
The authors suggest that there is not a unique representation of the body’s peripersonal space but at least three body-part-specific PPS representations, with different extents and directional tuning, referenced to the common reference frame of the trunk. Future studies will need to take these new characteristics of the PPS representation into account when designing experiments. This first extensive mapping of human PPS representations opens a new perspective on peripersonal space research: how do these three body-part-specific PPS representations interact with the other PPS representations, namely the “goal-directed action” and the “protective/defensive” spaces?
6 PERIPERSONAL SPACE AND BODILY SELF-CONSCIOUSNESS
The trunk PPS representation integrates body-related signals as well as information related to external stimuli potentially interacting with the body, in a global, egocentric reference frame. This representation may constitute a basic neural representation relevant for the self and self-consciousness (Blanke and Metzinger, 2009; Blanke, 2012; Blanke et al., 2015; Tsakiris et al., 2007; Tsakiris, 2010; Serino et al., 2015b). What, then, is the link between peripersonal space and bodily self-consciousness?
Bodily self-consciousness (BSC), that is, the feeling that the physical body and its parts are one’s own, is argued to be one of the cardinal features of subjective experience (Gallagher, 2000; Blanke and Metzinger, 2009). In recent years, multisensory bodily illusion paradigms have been developed to study BSC in the laboratory, describing in detail the behavioural mechanisms underlying the sensation of ownership: for the hand using the rubber hand illusion (Botvinick and Cohen, 1998), for the face using the enfacement illusion (Tsakiris, 2008; Sforza et al., 2010), and for the entire body using the full-body, out-of-body or body-swap illusions (Ehrsson et al., 2007; Lenggenhager et al., 2007; Petkova and Ehrsson, 2008). These illusions are based on the application of visuo-tactile stimuli to the body (or body part) of the participant and to a virtual body (or fake body part). By manipulating multisensory cues, it is possible to induce ownership over fake or virtual body parts or whole bodies. These studies have led to a growing consensus that ownership over hands, faces and bodies crucially relies on the integration of multiple bodily signals in the brain (Blanke, 2012; Blanke et al., 2015; Ehrsson et al., 2004; Ehrsson, 2012; Makin et al., 2008; Serino et al., 2013; Tsakiris, 2010).
These studies suggest a direct link between the neural mechanisms underlying multisensory PPS processing and BSC. However, to date, most studies have investigated the brain mechanisms of either PPS or BSC processing separately, using many different paradigms. To address this, a recent study (Grivaz et al., 2017) conducted an extensive meta-analysis of functional neuroimaging studies to determine the key neural structures for PPS, for BSC, and their potential common structures in humans.
To do this, the authors performed a systematic quantitative coordinate-based meta-analysis of human functional neuroimaging studies (Eickhoff et al., 2009, 2012; Turkeltaub et al., 2002). They selected 35 PET or fMRI studies: 18 for the PPS category, assessing brain regions involved in encoding unisensory and multisensory stimuli within PPS (peri-hand, peri-face and peri-trunk), and 17 for the BSC category, assessing brain regions involved in the sense of ownership of a body or body part. They identified a bilateral PPS network including superior parietal, temporo-parietal and ventral premotor regions. These parieto-frontal networks are involved in many sensory-motor processes, mediating interactions between the individual and the immediate environment, integrating sensory information and driving potential motor responses (Graziano and Cooke, 2006; Làdavas and Serino, 2008; Cléry et al., 2015b; Grivaz et al., 2017). The BSC network includes the posterior parietal cortex (right intraparietal sulcus, IPS; left IPS and superior parietal lobule, SPL), the right ventral premotor cortex, and the left anterior insula. These regions are involved in multisensory integration, attention and awareness. The insula plays a key role in the integration of exteroceptive body-related cues and interoceptive signals considered important for generating subjective experience (Craig, 2009; Damasio and Meyer, 2009; Park and Tallon-Baudry, 2014; Seth, 2013; Seth and Friston, 2016; Tsakiris, 2010). Although the BSC and PPS representations were not associated with exactly the same functions, they rely on largely overlapping fronto-parietal networks.
The conjunction analysis showed that PPS and BSC tasks anatomically overlapped in only two clusters located in the left parietal cortex (dorsally, at the intersection between the SPL, the IPS and area 2, and ventrally, between area 2 and the IPS). The activation of this dorsal SPL/IPS region supports the hypothesis that multisensory integration of bodily cues in the PPS is a crucial mechanism involved in BSC (Brozzoli et al., 2012a; Gentile et al., 2013; Grivaz et al., 2017).
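The logic of such a conjunction analysis is simply that a voxel counts as "shared" only if it survives thresholding in both meta-analytic maps, i.e., a voxelwise logical AND (equivalently, the minimum statistic). A toy sketch of that operation, with invented 1-D maps standing in for the real 3-D thresholded ALE maps of Grivaz et al. (2017):

```python
import numpy as np

# Toy thresholded maps (1 = significant voxel) on a tiny 1-D "parietal
# strip"; the values are purely illustrative, not real ALE results.
pps_map = np.array([0, 1, 1, 1, 1, 0, 0, 0, 1, 1])
bsc_map = np.array([0, 0, 0, 1, 1, 1, 1, 0, 0, 0])

# Conjunction: a voxel is kept only if significant in BOTH maps.
conjunction = np.logical_and(pps_map, bsc_map).astype(int)

print("overlap voxels:", np.flatnonzero(conjunction).tolist())
```

In the meta-analysis, this kind of intersection of largely overlapping PPS and BSC networks is what reduces to the two left parietal clusters described above.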
The premotor and insular clusters involved in BSC were systematically co-activated with the parietal clusters involved in PPS processing during several cognitive tasks, suggesting that these regions are largely functionally interconnected. The fronto-parietal areas involved in PPS are located more proximal to the central sulcus, whereas the BSC areas appear more distal. This anatomical distinction may subserve the different functions of the two processes, whereby PPS areas underlie a multisensory-motor interface for body-object interactions and BSC areas are involved in bodily awareness and self-consciousness.
The peripersonal space representation, subserved principally by a parieto-frontal network, involves complex mechanisms and depends on numerous factors. Our PPS representation continuously changes with sensory information, experience and feelings, so as to adapt as well as possible to our environment (Figure 1). Moreover, this PPS representation network is strongly connected with the BSC network, and we may suggest that impairments of the PPS representation can have consequences for self-consciousness. This opens new research directions for the years to come.
Figure 1: The peripersonal space representation is modulated by numerous factors, as much by impact prediction and approaching stimuli as by social, emotional and action components. A/ There are at least three sub-representations of the PPS: the trunk, the face and the hand. B/ These representations can merge depending on their relative distance from the trunk.