1. Introduction
Understanding human-robot interaction (HRI) has become increasingly important with the rise in development and use of social robots in fields such as education, elderly care, and therapy [1-4]. A large body of existing studies examines how humans interact with and perceive social and humanoid robots, and specifically, to what extent such interactions are comparable to human-human interactions (HHI). Touch, a common form of interpersonal communication between humans, remains understudied within HRI. Yet studies show that people regularly seek to engage with robots through touch [5-7] and that robot touch may have influences similar to those of touch between humans [8-10]. Understanding the role of touch in HRI, including its similarities to HHI and its social and physiological impact on humans, is essential to the future development and implementation of social robotics, ultimately improving the overall quality and effectiveness of human-robot interactions.
We aim to contribute to this process by exploring the physiological effect of human-initiated robot touch, and further, the influence of anthropomorphic framing on this interaction.
Touch is defined as physical contact between two or more individuals [11], and historically, it has served both communicative and relational functions for humanity. Touch allows for the nonverbal communication of messages and emotions, as well as the creation of intimacy and trust between individuals [11]. It has been shown to influence the trust and liking of others [12] and to encourage or discourage different prosocial behaviors and performances [13]. Touch between humans also has significant physiological effects: in his arousal model of interpersonal intimacy, Patterson explains how touch evokes measurable changes in physiological arousal [14]. Touch in social communication, for example in the form of handshaking and hugging, has also been shown to reduce stress and anxiety [15-18] and to produce a positive care effect [19].
The impact of touch is highly dependent on the social context of the interaction, including which body part is touched. The concept of body accessibility, coined by S. M. Jourard, addresses people's willingness to let others touch various regions of their body [20]. Jourard assessed body accessibility by how frequently people touched and were touched in 24 different body regions. While touching hands and arms during handshakes or hugs is generally more acceptable, touching more vulnerable areas, such as the head, neck, torso, lower back, buttocks, and genitalia, can be seen as less positive, and even as an invasion of privacy, depending on the relationship between the individuals [20-21].
Gender dynamics have also been shown to influence touch behavior, which is unsurprising given the prevalent role of gender norms in society. In their study, Richard Heslin et al. found that women often derive the meaning of touch from their relationship with the other individual, while men tend to be more affected by the other person's gender [22]. For example, women in this study found touch from an opposite-sex stranger unpleasant and touch from an opposite-sex close friend pleasant. Male participants, on the other hand, were equally comfortable with touch from a female stranger and from a female close friend. A study conducted by Hubbard et al. found that cross-gender touch results in more favorable perceptions and reactions in contexts such as waitressing and counseling [23]. At the same time, it has been shown that men tend to perceive physical contact more positively than women [24].
How much of what is understood about touch transfers to robots? Social robots have already been implemented in various fields, and we know that touching social, pet-like robots such as PARO reduces pain and stress [9-10], and that robotic arms performing touches can enhance positive emotional responses in human participants [19]. Humans also touch robots in ways similar to how they touch other humans, as shown in a study by Andreasson et al. on affective touch in HRI, in which the tactile conveyance of emotions on the humanoid robot NAO was observed [25], and in a study by Yohanan et al. on how humans touch a haptic creature robot, in which participants were found to convey nine different emotions through touch [8]. Such studies suggest a similarity in the use and impact of human-human and human-robot touch interactions, but more studies are needed to understand the true extent of this claim, including an exploration of which variables impact the effects of touch within HRI.
One of the more significant studies in this area was conducted by Jamy Jue Li et al. [26]. It explores the physiological impact of touching robots on humans through the lens of body accessibility. In their study, participants were instructed to touch or point to various body parts of a humanoid robot that varied in their level of accessibility, and their skin conductance response was recorded to measure physiological arousal. The authors found that touching less accessible regions (e.g., genitals, thigh, buttocks) resulted in higher physiological arousal than touching more accessible regions (e.g., shoulder, arm, hand). Pointing to these same regions, however, did not produce differences in arousal. The study demonstrates the physiological impact of touch in HRI and showcases the transfer of accessibility zones to robots.
The results of this study raise a number of questions about the physiological arousal effect found from touching robots. It is not clear how much of the physiological arousal observed in the study is due to perceiving the robot’s body as human-like and how much of it is due to the semantics used to describe the robot's body, in this case, human anatomical body part names. Does the observed process of anthropomorphization take place through what we see? Or rather, through the words we hear?
Therefore, the study presented in this paper is a replication of the robot touch study conducted by Li et al., with several changes and additional conditions. We wish to study the effect of anthropomorphization on physiological arousal through the anthropomorphic framing of a robot's body parts (anatomical names vs. numbered body parts) and gender (male robot vs. female robot). Additionally, we will conduct the experiment with the humanoid robot Pepper instead of NAO.
Anthropomorphization is defined as attributing human qualities, characteristics, and behaviors to nonhuman entities. Various variables have been shown to influence human anthropomorphization of robots, namely robot characteristics such as physical embodiment [27], movements and gestures [28], and language [29]. Anthropomorphization can also be affected by humans' mental models of robots, the internal representations that dictate how we perceive and understand them [30]. These models are shaped by our individual experiences and characteristics, such as gender, age [31-32], and level of technological experience [33-35], but they can also be altered through the use of language, or linguistic framing.
The way we speak about robots has the power to change our perception and understanding of them. Thus, if we use language to frame robots as if they were human, for example by introducing a robot with a human name, backstory, and even gender, we can trigger anthropomorphization. The effects of linguistic framing for the purpose of anthropomorphization, i.e., anthropomorphic framing, have been shown in several studies. Reactions to kicking Spot, a robot dog, were significantly more negative after the dog was given a name and backstory [36]. Similarly, study participants hesitated more before striking the Hexbug Nano, a robotic insect, with a mallet after it was given a name and backstory [37]. Tobias Kopp et al. explored the effects of linguistic framing on anthropomorphization and human-robot trust in industrial environments and found that human-like framing of robots in the workplace increased employee trust when the human-robot relation was perceived as cooperative [38]. Even the pronouns we use when referring to robots have a significant impact: using "he" and "she" instead of "it" can indicate that the robot appears to us as a "quasi-other" rather than just an object, as Coeckelbergh discusses in his study on the linguistic construction of artificial others [39]. Westlund et al. also examined the use of pronouns and found that when experimenters introducing a robot to children addressed the robot with the personal pronoun "you" instead of the impersonal pronoun "it", the children showed more signs of social interaction with the robot [40].
Attributing gender to a robot through the use of names and pronouns can itself be a form of anthropomorphic framing, with significant impacts. For example, in their experiment on anthropomorphism in autonomous vehicles, Waytz et al. found that providing a name, gender, and voice helped users anthropomorphize the vehicle and, in turn, trust it more [41].
Touch is heavily influenced by context, and it is therefore important to explore the role of framing in touch between humans and robots. To address the questions raised by the original study, the robot in our study will be anthropomorphically framed in two ways: through the use of anatomical body part names and through gendered names and pronouns. During the study, either human body part names or numerical digits will be used when instructing participants to touch different regions of the robot. The robot will also be given either a male or female name (Adam, Ada) and the corresponding personal pronouns (he, she) during its introduction. This language frames the robot as human-like, affecting the way it is perceived and encouraging anthropomorphization.
We expect to observe the following outcomes:
H1: When the robot's body parts are anthropomorphically framed through the use of anatomical names, participants will experience stronger physiological arousal than in the control condition, in which the parts are referred to using numerical digits.
The anthropomorphic framing of the robot through the use of body part names should increase anthropomorphization, leading participants to apply human-human interaction frameworks. Given what is known about touch and accessibility zones between humans, we hypothesize increased levels of physiological arousal compared to participants instructed to touch body parts referred to with numbers.
H2: Physiological arousal is inversely related to the accessibility of a given part of the robot's body for touch.
Physiological arousal was defined as the change in electrodermal arousal from the prompt stage to the action stage [42]. Li et al. [26] reported differences in skin conductance response when they categorized the robot's body parts into high, medium, and low accessibility tertiles according to how frequently each region is touched in interpersonal communication, following Jourard [20]. Within this study, we wish to verify those findings.
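Operationally, the dependent measure for each touched region is a simple change score (the notation below is introduced here for clarity; the exact SCR quantification follows [42]):

ΔSCR = SCR(action stage) − SCR(prompt stage)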
Attributing a gender to the robot in our study will also allow us to explore the impact of gender on human-robot touch interactions. In general, there is a well-documented influence of gender in HRI. For example, Kuchenbrandt et al. showed that the gender typicality of HRI tasks substantially influences human-robot interactions as well as human perception and acceptance of a robot [43]. Further studies have shown that people evaluate a robot of the opposite gender more positively than a same-gender robot [43], and that men tend to trust and engage with female robots more [44].
With regard to anthropomorphization, men seem to anthropomorphize robots more than women do [31] and may also be more affected by anthropomorphization. A study conducted by Pelau et al. indicates that men are more sensitive to the anthropomorphic characteristics of AI devices [45], and Cheng and Chen's results indicate that robots with anthropomorphic appearances generate higher pleasure among men than among women [46].
However, there are few HRI studies on the relationship between gender and touch, and no clear results regarding the role of gender in the physiological arousal of humans touching robots. Considering the documented gender effect in HRI and the role of gender in touch between humans, there is reason to believe that robot and participant gender will play a significant role in our study. For this reason, we formulated the following research question:
RQ1: How will the physiological arousal experienced from cross-gender touch differ from the arousal experienced from same-gender touch?
Finally, a person's attitude towards robots has been shown to influence the way they interact with and are affected by robots. For example, a study conducted by Cramer et al. found that participants' attitudes towards robots influenced how they perceived human-robot touch interactions: participants with more positive attitudes towards robots found robots engaging in touch less machine-like [47]. Attitudes were evaluated using the Negative Attitudes towards Robots Scale (NARS), developed by Nomura et al. [48]. Picarra et al. also used this scale to predict future intentions to work with social robots [49]. With this understanding, we wish to further explore how our participants' attitudes towards robots affect the physiological arousal they may experience during human-robot touch interaction, and thus we formulated the following research question:
RQ2: Will physiological arousal when touching a robot be related to participants' attitudes towards and beliefs about robots?
3. Results
To account for the sampling hierarchy and to handle missing data, the analysis employed multilevel modeling (MLM) with restricted maximum likelihood estimation, performed in jamovi 2.3 with the GAMLj package. The fixed-effects structure included four a priori selected factors: the naming of robot parts (body parts vs. numbers), participant gender (female vs. male), robot gender (female vs. male), and the accessibility of robot parts (high vs. medium vs. low, see [20], with high accessibility as the reference level), together with their interactions, while the random-effects structure was selected using a bottom-up model-building strategy. First, a model with a minimal random-effects structure was created, i.e., only random intercepts for participants. Next, random effects of each factor (random slopes), along with their interactions, were added to the model. All models that converged were then compared using the Akaike Information Criterion (AIC). The best-fitting model included, in addition to the random intercepts, random effects of accessibility. The covariance structure was set as correlated (unstructured).
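For illustration, the following is a minimal sketch of an analogous multilevel model in Python's statsmodels. The actual analysis was conducted in jamovi 2.3 with the GAMLj package; the data file and all column names used here (scr_change, naming, participant_gender, robot_gender, accessibility, participant_id) are hypothetical placeholders, not the variables from our data set.

```python
# Sketch of a multilevel model analogous to the one described above:
# random intercepts per participant, random slopes for accessibility,
# fit by restricted maximum likelihood (REML).
import pandas as pd
import statsmodels.formula.api as smf

# Long-format data: one row per participant x touched robot part (hypothetical file).
df = pd.read_csv("touch_trials_long.csv")

model = smf.mixedlm(
    "scr_change ~ naming * participant_gender * robot_gender * accessibility",
    data=df,
    groups=df["participant_id"],   # random intercepts for participants
    re_formula="~accessibility",   # random slopes for accessibility
)
result = model.fit(reml=True)      # restricted maximum likelihood
print(result.summary())

# Reference levels can be fixed explicitly, e.g. C(accessibility, Treatment('high')),
# to mirror the contrasts reported in the text. Candidate random-effects structures
# (intercepts only, then slopes for each factor and their interactions) would be fit
# in the same way and the converging models compared on AIC.
```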
The results show that, in general, participants asked to touch robot regions referred to with anatomical body part names were more physiologically aroused than those asked using numerical digits - main effect of the naming of robot parts, B = 0.08 [0.01, 0.15], t(132) = 2.25, p = .027. Touching the male robot also caused participants to be more aroused than touching the female robot - main effect of robot gender, B = 0.08 [0.02, 0.15], t(132) = 2.41, p = .017. Additionally, we found an interaction between the naming of robot parts and robot gender - B = 0.21 [0.07, 0.64], t(132) = 2.92, p = .004. A simple effects analysis revealed that when touching the male robot, referring to parts with anatomical body part names significantly increased arousal compared to naming them with numbers (numbers - M = 0.28, SE = 0.04 vs. body parts - M = 0.46, SE = 0.03; B = 0.18 [0.08, 0.28], t(130) = 3.70, p < .001), but there were no differences in the case of the female robot (numbers - M = 0.30, SE = 0.04 vs. body parts - M = 0.28, SE = 0.03) (see Figure 3). Finally, the accessibility of robot parts also produced differences in participants' arousal - less accessible parts caused higher arousal (high vs. medium - B = 0.04 [0.01, 0.07], t(692) = 2.19, p = .029 and high vs. low - B = 0.04 [0.01, 0.07], t(448) = 2.33, p = .020) (see Figure 4).
Additional exploratory analyses were conducted to determine whether attitudes toward robots (as measured by the NARS and BHNU scales) were related to participants' reported feelings during the study and their level of physiological arousal. On average, participants' attitudes toward robots were relatively negative - NARS M = -0.69, SD = 0.64 (for our calculations, the 5-point scale was coded from -2 to +2) - while their beliefs in human nature uniqueness were slightly positive - M = 0.07, SD = 0.78 (same -2 to +2 coding).
Correlation analysis showed a negative relationship between the feelings reported after finishing the experimental procedure and the NARS index - Pearson's r = -.30, p < .001 - which suggests that participants with more negative attitudes towards robots reported more negative feelings about their interaction with the robot.
In turn, there was no relationship between the feelings reported by participants after the experiment and the BHNU index (Pearson's r = .03, p = .699). There were also no significant relationships between the NARS and BHNU indexes and the average arousal experienced during the experiment (as measured by SCR) - respectively, Pearson's r = -.07, p = .407 and Pearson's r = -.06, p = .467. Importantly, adding the NARS and BHNU indexes as covariates to the tested linear models did not improve model fit.
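These exploratory checks can be sketched in code as follows; this is not the code used for the reported analysis, and the per-participant column names (nars_index, bhnu_index, post_feelings, mean_scr_arousal) are hypothetical.

```python
# Illustrative sketch of the exploratory Pearson correlations between
# questionnaire indexes and (a) reported feelings, (b) mean SCR-based arousal.
import pandas as pd
from scipy.stats import pearsonr

participants = pd.read_csv("participants.csv")  # one row per participant (hypothetical file)

for index_col in ["nars_index", "bhnu_index"]:
    for outcome_col in ["post_feelings", "mean_scr_arousal"]:
        r, p = pearsonr(participants[index_col], participants[outcome_col])
        print(f"{index_col} vs. {outcome_col}: r = {r:.2f}, p = {p:.3f}")
```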