
Touching a Mechanical Body: The Role of Anthropomorphic Framing in Physiological Arousal When Touching a Robot

Submitted: 01 June 2023
Posted: 01 June 2023

Abstract
The growing prevalence of social robots in various fields necessitates a deeper understanding of touch in Human-Robot Interaction (HRI). This study investigates how human-initiated touch influences physiological responses during interactions with robots, considering factors such as anthropomorphic framing of robot body parts and attributed gender. Two types of anthropomorphic framing are applied: the use of anatomical body part names and the assignment of male or female gender to the robot. Higher physiological arousal is observed when touching less accessible body parts than when touching more accessible body parts in both conditions. Results also indicate that using anatomical names intensifies arousal compared to the control condition. Additionally, touching the male robot elicits higher arousal across participants, especially when anatomical body part names are used. This study contributes to the understanding of how anthropomorphic framing and gender impact physiological arousal in touch interactions with social robots, offering valuable insights for social robotics development.
Keywords: 
Subject: Social Sciences - Psychology

1. Introduction

Understanding human-robot interaction (HRI) has become increasingly important with the rise in development and use of social robots in fields such as education, elderly care, and therapy [1-4]. There is a large body of existing studies looking into how humans interact with and perceive social and humanoid robots, and specifically, to what extent such interactions are comparable to human-human interactions (HHI). Touch, a common form of interpersonal communication between humans, remains understudied within HRI. Yet, studies show that people regularly seek to engage with robots through touch [5-7] and that robot touch may have effects similar to touch between humans [8-10]. Understanding the role of touch in HRI, including its similarities to HHI and its social and physiological impact on humans, is essential to the future development and implementation of social robotics, ultimately improving the overall quality and effectiveness of human-robot interactions.
We aim to contribute to this process by exploring the physiological effect of human-initiated robot touch, and further, the influence of anthropomorphic framing on this interaction.
Touch is defined as physical contact between two or more individuals [11], and historically, it has served both communicative and relational functions for humanity. Touch allows for the nonverbal communication of messages and emotions, as well as the creation of intimacy and trust between individuals [11]. It has been shown to influence trust and liking of others [12] and to encourage or discourage different prosocial behaviors and performances [13]. Touch between humans also has significant physiological effects: in his arousal model of interpersonal intimacy, Patterson explains how touch evokes measurable changes in physiological arousal [14]. Touch in social communication, for example in the form of handshaking and hugging, has also been shown to reduce stress and anxiety [15-18] and to produce a positive care effect [19].
The impact of touch is highly dependent on the social context of the interaction, including which body part is touched. The concept of body accessibility addresses people’s willingness to let others touch various regions of their body [20]. The concept was coined by S. M. Jourard, who assessed body accessibility by how frequently people touched and were touched in 24 different body regions. While touching hands and arms during handshakes or hugs is generally more acceptable, touching more vulnerable areas, such as the head, neck, torso, lower back, buttocks, and genitalia, can be seen as less positive, and even as an invasion of privacy, depending on the relationship between individuals [20-21].
Gender dynamics have also been shown to influence touch behavior, which is unsurprising considering the prevalent role of gender norms in society. In their study, Richard Heslin et al. found that women often derive meaning from touch based on their relationship with the other individual, while men tend to be more impacted by the other person’s gender [22]. For example, women in this study found touch from an opposite-sex stranger unpleasant and touch from an opposite-sex close friend pleasant. Male participants, on the other hand, were just as comfortable with touch from a female stranger as from a female close friend. A study conducted by Hubbard et al. found that cross-gender touch results in more favorable perceptions and reactions in contexts such as waitressing or counseling [23]. At the same time, it has been shown that men tend to perceive physical contact more positively than women do [24].
How much of what is understood about touch transfers to robots? Social robots have already been implemented in various fields, and thus we know that touching social, pet-like robots such as PARO reduces pain and stress [9-10], and that robotic arms performing touches can enhance positive emotional responses from human participants [19]. Humans touch robots in ways similar to how they would touch other humans, as shown in a study by Andreasson et al. on affective touch in HRI, in which the tactile conveyance of emotions to the humanoid robot NAO was observed [25], and in a study by Yohanan et al. on how humans touch a haptic creature robot, in which humans were found to convey nine different emotions through touch [8]. Such studies suggest a similarity in the use and impact of human-human and human-robot touch interactions, but more studies are needed to understand the true extent of this claim, including an exploration of which variables impact the effects of touch within HRI.
One of the more significant studies in this area was conducted by Jamy Jue Li et al. [26], who explored the physiological impact of touching robots on humans by examining body accessibility. In their study, they instructed participants to touch or point to various body parts of a humanoid robot, varying in levels of accessibility. Participants’ skin conductance responses were recorded in order to measure physiological arousal. They found that touching less accessible regions (e.g., genitals, thigh, buttocks) resulted in higher physiological arousal of human participants compared to more accessible regions (e.g., shoulder, arm, hand). Pointing to these same regions, however, did not result in differences in arousal. The study demonstrates the physiological impact of touch in HRI and showcases the transfer of accessibility zones to robots.
The results of this study raise a number of questions about the physiological arousal effect found from touching robots. It is not clear how much of the physiological arousal observed in the study is due to perceiving the robot’s body as human-like and how much of it is due to the semantics used to describe the robot's body, in this case, human anatomical body part names. Does the observed process of anthropomorphization take place through what we see? Or rather, through the words we hear?
Therefore, the study presented in this paper is a replication of the robot touch study conducted by Li et al., with several changes and additional conditions. We wish to study the effect of anthropomorphization on physiological arousal through the anthropomorphic framing of a robot’s body parts (anatomical names vs. numbered body parts) and gender (male robot vs. female robot). Additionally, we will conduct the experiment with the humanoid robot Pepper instead of NAO.
Anthropomorphization is defined as attributing human qualities, characteristics, and behaviors to nonhuman entities. Various variables have been proven to influence human anthropomorphization of robots, namely robot characteristics including physical embodiment [27], movements, gestures [28], and language [29]. Anthropomorphization can also be impacted by humans’ mental models of robots, an internal representation that dictates how we perceive and understand them [30]. This is impacted by our individual experiences and characteristics, such as gender, age [31-32], and technological experience level [33-35], but it can also be altered through the use of language, or linguistic framing.
The way we speak about robots has the power to change our perception and understanding of them. Thus, if we use language to frame robots as if they were human, for example by introducing them with a human name, backstory, and even gender, we can trigger anthropomorphization. The effects of linguistic framing for the purpose of anthropomorphization, i.e., anthropomorphic framing, have been shown in several studies. Reactions to kicking Spot, a robot dog, were significantly more negative after the dog was given a name and backstory [36]. Similarly, research study participants exhibited more hesitancy when striking the Hexbug Nano, a robotic insect, with a mallet after it was given a name and backstory [37]. In their study, Tobias Kopp et al. explored the effects of linguistic framing on anthropomorphization and human-robot trust in industrial environments, and found that human-like framing of robots in the workplace increased employee trust when the human-robot relation was perceived as cooperative [38]. Even the pronouns we use when referring to robots have a significant impact: using “he” and “she” instead of “it” can indicate the robot appears to us as a “quasi-other” as opposed to just an object, as Coeckelbergh discusses in his study on the linguistic construction of artificial others [39]. Westlund et al. examined the use of pronouns as well, and found that when experimenters introducing a robot to children spoke to the robot using the personal pronoun “you” instead of the impersonal pronoun “it”, children showed more signs of social interaction with the robot [40].
Attributing gender to the robot through the use of names and pronouns can also itself be a form of anthropomorphic framing, with significant impacts. For example, in their experiment on anthropomorphism in autonomous vehicles, Waytz et al. discovered that providing a name, gender, and voice helped users anthropomorphize the vehicle and, in turn, trust it more [41].
Touch is heavily influenced by context, and it is therefore important to explore the role of framing in touch between humans and robots. To address the questions raised by the original study, the robot in our study will be anthropomorphically framed in two ways: through the use of anatomical body part names and through gendered names and pronouns. During the study, either human body part names or numerical digits will be used when instructing participants to touch different regions of the robot. The robot will also be attributed either a male or female name (Adam, Ada) and personal pronouns (he, she) during its introduction. The use of this language will frame the robot as human-like, impacting the way it is perceived and encouraging anthropomorphization.
We expect to observe the following outcomes:
H1: In the condition of anthropomorphically framing the robot’s body parts through the use of anatomical names, participants will show stronger physiological arousal in comparison to the control condition, in which the parts are referred to using numerical digits.
The anthropomorphic framing of the robot through use of body part names should increase anthropomorphization, leading to participants utilizing human-human interaction frameworks. Due to what we know about touch and accessibility zones between humans, we hypothesize increased levels of physiological arousal when compared to participants instructed to touch body parts referred to with numbers.
H2: Physiological arousal is inversely related to the “availability” of a given part of the robotic body for touch.
Physiological excitation is defined as the change in electrodermal arousal from the prompt stage to the action stage [42]. Researchers [26] reported differences in skin conductance response when they categorized the robot’s body parts by their body-accessibility rating into high, medium, and low tertiles, according to how frequently each region is touched in interpersonal communication as described by Jourard [20]. Within this study, we wish to verify those findings.
Attributing a gender to the robot in our study will also allow us to explore the impact of gender on human-robot touch interactions. In general, there is a well-documented influence of gender in HRI. For example, Kuchenbrandt et al. showed that the gender typicality of HRI tasks substantially influences human-robot interactions as well as human perception and acceptance of a robot [43]. Further studies have shown that people evaluate a robot of the opposite gender more positively than a same-gender robot [43], and that men tend to trust and engage with female robots more [44].
With regard to anthropomorphization, it seems that men tend to anthropomorphize robots more than women [31] and that they may be more impacted by anthropomorphization. A study by Pelau et al. indicates that men are more sensitive to anthropomorphic characteristics of AI devices [45], and Cheng and Chen’s results indicate that robots with anthropomorphic appearances generate higher pleasure among men than among women [46].
However, there are limited HRI studies on the relationship between gender and touch, and no clear results regarding the role of gender on the physiological arousal of humans when touching robots. When considering the documented gender effect in HRI and the role of gender in touch between humans, there is reason to believe that robot and participant gender will play a significant role in our study. It is for this reason we formulated the following research question:
RQ1: How will the physiological arousal experienced from cross-gender touch vary from arousal experienced from same-gender touch?
Finally, a person’s attitude towards robots has been shown to affect the way they interact with and are influenced by robots. For example, in a study conducted by Cramer et al. it was found that participants’ attitudes towards robots influenced how they perceived human-robot touch interactions: participants with more positive attitudes towards robots found the robots engaging in touch less machine-like [47]. Attitudes were evaluated using the NARS (Negative Attitudes towards Robots Scale), developed by Nomura et al. [48]. In their study, Piçarra et al. also used this scale to predict future intentions to work with social robots [49]. With this understanding, we wish to further explore how our participants’ attitudes towards robots impact the physiological arousal they may experience during human-robot touch interaction, and thus we formulated the following research question:
RQ2: Will physiological arousal when touching a robot be related to the attitudes and beliefs of the subjects about robots?

2. Materials and Methods

2.1. Participants

One hundred sixty adults were recruited for this study, including 80 females and 80 males. Participants were randomly recruited and were between 18 and 58 years old. After cleaning the data, 141 participants had valid data: 83 females and 58 males. A majority of participants were university students. All participants consented to participating in the study and were unaware of its true goal until the experiment was complete and the purpose was clearly explained to them. After the experiment, participants were asked to fill out a questionnaire in which they reported their prior experience with humanoid robots and technology and their general attitudes and beliefs towards such devices.

2.2. Design

A 2 (participant sex: female vs. male) x 2 (robot sex: robot-female vs. robot-male) x 2 (instruction: body part names vs. digits) between-participants study was conducted in which people were asked to touch a humanoid robot.

2.3. Materials

2.3.1. Pepper Humanoid Robot

The robot used in this study was Pepper from Aldebaran Robotics, owned by the HumanTech Center at SWPS University. This is a humanoid robot standing 1.20 meters tall, with an articulated head, eyes, arms, and fingers. The robot does not have a distinct nose, ears, legs, genitals, or buttocks. The robot was programmed by our research team using QiSDK and Android Studio. The application was deployed on the robot and remotely controlled by the experimenter. Pepper also has a 10.1-inch tablet embedded in its chest, which was used to display instructions to participants. Marked diagrams of the robot were displayed, corresponding to the places where participants were to touch the robot.
Figure 1. Examples of diagrams displaying the robot and the body part that should be touched, using either a digit (a) or the anatomical body part name (b) (“Oko” means “eye” in Polish).

2.3.2. Skin Conductance Response Measure and Signal Processing

Electrodermal activity was recorded using a BIOPAC MP160, digitized with 24-bit resolution, sampled at 1 kHz, and recorded on a PC. All digital transformations and further data extractions were performed with NeuroKit2 [50]. The EDA signal was filtered with a 3 Hz cutoff frequency and a 4th-order Butterworth filter. Skin conductance response (SCR) was measured as the peak amplitude of the first SCR in each epoch, i.e., the 7-second period during which participants were attempting to touch the robot. Since the recorded signal could have contained artifacts due to additional body movements (e.g., loss of balance when touching different parts of the robot, scratching the hand around the electrode placement area, additional movements of the hand with electrodes, body turning), the recorded videos were analyzed to detect and remove data from such trials.
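As an illustration, a minimal sketch of this SCR extraction step using NeuroKit2 is given below; the function name, the epoching input (touch_onset_sample), and the exact call sequence are assumptions for illustration, and the actual analysis pipeline may have differed in detail.

import numpy as np
import neurokit2 as nk

SAMPLING_RATE = 1000  # Hz, matching the BioPac recording

def first_scr_amplitude(raw_eda, touch_onset_sample):
    """Peak amplitude of the first SCR within the 7-second touch epoch."""
    # Low-pass Butterworth filter, 3 Hz cutoff, 4th order (as described above)
    filtered = nk.signal_filter(raw_eda, sampling_rate=SAMPLING_RATE,
                                highcut=3, method="butterworth", order=4)
    # Decompose the EDA signal and detect SCR peaks
    _, info = nk.eda_process(filtered, sampling_rate=SAMPLING_RATE)
    peaks = np.asarray(info["SCR_Peaks"])
    amplitudes = np.asarray(info["SCR_Amplitude"])
    # Keep only peaks falling inside the 7 s touch window and take the first one
    start, end = touch_onset_sample, touch_onset_sample + 7 * SAMPLING_RATE
    in_epoch = (peaks >= start) & (peaks < end)
    if not in_epoch.any():
        return 0.0  # no SCR detected in this epoch
    return float(amplitudes[in_epoch][0])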

2.3.3. Final Questionnaire

The final questionnaire used in this study consisted of two parts. The first section collected information about the participant, including questions regarding age, gender, year of study, and field of study. Participants were also asked about their dominant hand (left/right) and whether they had had any interactions with robots (“Have you ever had personal, direct contact with a humanoid (human-like) robot?”). Next, they were asked about their well-being during the experiment in the form of semantic differentials. Participants were prompted to complete the statement “In general, while in contact with the robot I felt…” on a 5-point Likert scale for five pairs of descriptors (bad to good, unnatural to natural, tense to relaxed, threatened to safe, and uncomfortable to comfortable). Cronbach's alpha for this scale, which we named “Feelings during the interaction”, was 0.847.
The second section of the questionnaire contained questions from the NARS (Negative Attitudes towards Robots Scale) and the BHNU (Belief in Human Nature Uniqueness) scale. Both questionnaires are available in a Polish adaptation [51]. The NARS-PL scale consists of two subscales: the subscale of negative attitudes towards interactions with robots (NATIR) and the subscale of negative attitudes towards robots with human features (NARHT). The questionnaire contains 13 statements such as "I would feel relaxed talking with robots" (NARHT) or "I would feel very nervous just standing in front of a robot" (NATIR). Cronbach's alpha for this scale was 0.815.
The BHNU scale consists of six questions concerning beliefs about the uniqueness of human nature. Examples of statements include: "A robot will never be considered human" or "A robot will never have morality." For both tools, respondents rated the statements on a 5-point Likert scale (1 – totally disagree to 5 – totally agree). Cronbach's alpha for this scale was 0.717.
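The reported reliability coefficients can be reproduced from item-level responses. The snippet below is a minimal sketch using the pingouin package; the paper does not state which software was used for this step, and the DataFrame layout and toy responses are purely illustrative.

import pandas as pd
import pingouin as pg

# Hypothetical layout: one row per participant, one column per Likert item
# of a given scale (here, the five "Feelings during the interaction" items),
# with responses coded 1-5. The values below are toy data for illustration only.
feelings_items = pd.DataFrame({
    "bad_good": [4, 5, 3, 4],
    "unnatural_natural": [3, 4, 3, 5],
    "tense_relaxed": [4, 5, 2, 4],
    "threatened_safe": [5, 5, 4, 4],
    "uncomfortable_comfortable": [4, 4, 3, 5],
})

alpha, ci = pg.cronbach_alpha(data=feelings_items)
print(f"Cronbach's alpha = {alpha:.3f}, 95% CI = {ci}")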

2.4. Procedure

A preliminary test of the experiment was executed before the actual data collection began. The entire procedure was tested three times in the target environment, and necessary corrections were made to the software, experimental setup, laboratory arrangement, and procedure timing.
The experiment took place in a small laboratory room at SWPS University. Prior to beginning, all participants consented to being recorded (without their face being visible) and were informed about the course of the experiment. To give credibility and rationality to the experiment, it was presented as "testing the sensitivity of the robot's sensors to the touch of a human hand".
The procedure went as follows. A set of Ag/AgCl electrodes was placed on the fingers of the participant’s non-dominant hand. Participants were positioned standing in front of the robot and then instructed to avoid sudden movements, keep a distance of around 30 cm from the robot, and not move the hand connected to the measurement device. They were told that the robot would display information on its tablet about where it should be touched, which they should follow using the hand not attached to the measurement device. This process took around 5 minutes. They were then left alone in the room, and the experimental sequence began.
At the beginning of the experiment, the robot was introduced to the participant as either a robot-male or a robot-female. This introduction was made up of three components: 1) the experimenter verbally introduced the robot using “he” or “she” pronouns; 2) after the experimenter left the room, the robot presented itself as a woman (“Cześć, jestem Ada” – “Hi, I'm Ada”) or as a man (“Cześć, jestem Adam” – “Hi, I’m Adam”); 3) the voice used by the robot varied depending on gender, being either lower-pitched and distinctively male or higher-pitched and distinctively female. The gender of the robot (robot-female or robot-male) and the body part labels (anatomical names or digits) were randomly selected for each participant.
The robot then displayed a pre-programmed sequence of body parts on its embedded tablet. Each participant was asked to touch 11 different places, each labeled either with anatomical terminology or with a numerical digit, three separate times in randomized order. Each trial involved a 3-second countdown, followed by a diagram displaying the robot and the body part to be touched shown for 7 seconds, and a 10-second cooldown at the end. The timing of the sequence was as follows:
[(3 s synchronization + 7 s body part image + 10 s cooldown) x 11 body parts] x 3 repetitions
This process took 11 minutes in total. The body parts shown were randomized for each sub-sequence (see Figure 2).
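For clarity, the trial structure described above can be expressed as a short sketch; the function name and numeric labels are illustrative and are not taken from the deployed QiSDK application.

import random

N_BODY_PARTS = 11
SYNC_S, IMAGE_S, COOLDOWN_S = 3, 7, 10  # per-trial timing in seconds

def build_touch_sequence(n_repetitions=3):
    # Each repetition presents all 11 touch targets in a freshly randomized order
    sequence = []
    for _ in range(n_repetitions):
        parts = list(range(1, N_BODY_PARTS + 1))
        random.shuffle(parts)
        sequence.extend(parts)
    return sequence

total_minutes = (SYNC_S + IMAGE_S + COOLDOWN_S) * N_BODY_PARTS * 3 / 60
print(build_touch_sequence())
print(total_minutes)  # 11.0 minutes, matching the reported duration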
After the experimental part of the study, participants were invited to a second room where they were asked to complete a final questionnaire on a computer. This process took no longer than 10 minutes.
Once the questionnaire was complete, participants were informed about the true purpose of the experiment and had the opportunity to ask questions about the robot and the experimental procedure. This debriefing process took approximately 5 minutes. Furthermore, each participant was rewarded with a book for their participation. In total, the entire procedure took around 30 minutes.

3. Results

In order to account for the sampling hierarchy and to handle missing data, the analysis employed multilevel modeling (MLM) with restricted maximum likelihood, performed in Jamovi 2.3 with the gamlj package. The fixed-effects structure included four a priori selected factors: naming of robot parts (body part names vs. numbers), participant gender (female vs. male), robot gender (female vs. male), and the accessibility of robot parts (high vs. medium vs. low, see [20], with high accessibility as the reference level), together with their respective interactions, while the random-effects structure was selected based on a bottom-up model-building strategy. First, a model with a minimal random-effects structure was created, i.e., only random intercepts for participants. Next, random effects of each factor (random slopes) along with their interactions were added to the model. All models that converged were then compared based on the Akaike Information Criterion (AIC). The model that fit the data best included, in addition to random intercepts, random effects of accessibility. The covariance structure was set as correlated (unstructured).
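The analysis itself was run in Jamovi with the gamlj package; as a rough analogue, the best-fitting model could be specified with the mixed-effects formula interface of Python's statsmodels as sketched below. The column and file names are hypothetical, and gamlj/lme4 handles the unstructured random-effects covariance more flexibly than statsmodels, so this is a sketch of the model structure rather than a reproduction of the reported estimates.

import pandas as pd
import statsmodels.formula.api as smf

# Assumed layout: one row per valid touch trial, with columns scr_amplitude,
# naming (body parts vs. numbers), participant_gender, robot_gender,
# accessibility (high/medium/low), and participant_id.
df = pd.read_csv("scr_trials.csv")  # hypothetical file name

model = smf.mixedlm(
    "scr_amplitude ~ naming * participant_gender * robot_gender * accessibility",
    data=df,
    groups=df["participant_id"],   # random intercepts per participant
    re_formula="~accessibility",   # random slopes of accessibility
)
result = model.fit(reml=True)      # restricted maximum likelihood, as in the paper
print(result.summary())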
The results show that, in general, participants asked to touch robot regions referred to with anatomical body part names were more physiologically aroused than those instructed using numerical digits – main effect of naming of robot parts, B = 0.08 [0.01, 0.15], t(132) = 2.25, p = .027. Touching the male robot also caused participants to be more aroused than touching the female robot – main effect of robot gender, B = 0.08 [0.02, 0.15], t(132) = 2.41, p = .017. Additionally, we found an interaction between the way robot parts were named and robot gender – B = 0.21 [0.07, 0.64], t(132) = 2.92, p = .004. A simple-effects analysis revealed that when touching the male robot, referring to its parts with anatomical body part names significantly increased arousal compared to naming body parts with numbers (numbers: M = 0.28, SE = 0.04 vs. body parts: M = 0.46, SE = 0.03; B = 0.18 [0.08, 0.28], t(130) = 3.70, p < .001), but there were no differences in the case of the female robot (numbers: M = 0.30, SE = 0.04 vs. body parts: M = 0.28, SE = 0.03) (see Figure 3). Finally, the accessibility of robot parts also caused differences in participants' arousal – less accessible parts caused higher arousal (high vs. medium: B = 0.04 [0.01, 0.07], t(692) = 2.19, p = .029 and high vs. low: B = 0.04 [0.01, 0.07], t(448) = 2.33, p = .020) (see Figure 4).
Additional exploratory analyses were conducted to determine whether attitudes toward robots (as measured by the NARS and BHNU scales) impacted participants' reported feelings during the study and their level of physiological arousal. On average, participants’ attitudes toward robots were relatively negative (NARS: M = -0.69, SD = 0.64; for these calculations the 5-point scale was coded from -2 to +2), while their beliefs in human nature uniqueness were slightly positive (M = 0.07, SD = 0.78, same -2 to +2 coding).
Correlation analysis showed a negative relationship between the feelings reported after finishing the experimental procedure and the NARS scale index (Pearson’s r = -.30, p < .001), which suggests that participants with more negative attitudes towards robots reported more negative feelings about the interaction they experienced with the robot.
In turn, there was no relationship between the feelings reported by participants after the experiment and the BHNU scale index (Pearson’s r = .03, p = .699). There were also no significant relationships between the NARS and BHNU scale indexes and the average arousal experienced during the experiment (as measured by SCR) – respectively, Pearson’s r = -.07, p = .407 and Pearson’s r = -.06, p = .467. Importantly, adding the NARS and BHNU scale indexes as covariates to the tested linear models did not improve model fit.

4. Discussion

Our study results contribute to the understanding of touch in human-robot interaction (HRI) by examining the effects of anthropomorphic framing and gender on physiological arousal during touch interactions with robots.

4.1. Body Part Availability

In line with our second hypothesis (H2), we found that physiological arousal was inversely related to the “availability” of the robot’s body part, supporting the findings of Li et al. [26]. This indicates that the concept of body accessibility, established in human-human interaction (HHI), can also apply to HRI.

4.2. Anthropomorphic Framing of Body Parts

Regarding our first hypothesis (H1), our results demonstrate that anthropomorphic framing of the robot’s body parts through the use of anatomical names leads to stronger physiological arousal compared to the control condition, in which body parts are referred to using numerical digits. These findings suggest that anthropomorphic framing influences touch interactions between humans and robots, supporting previous research on the effects of anthropomorphic framing [27-29, 36-40]. These results give us reason to believe that anthropomorphic framing of a robot will lead to human participants being more physiologically affected during touch interactions with it.

4.3. Anthropomorphic Framing of Gender

Our research question (RQ1) explores the impact of gender on physiological arousal during touch interactions with robots. We found that both male and female participants were more physiologically aroused when touching the robot-male than when touching the robot-female. This could be attributed to the physical embodiment of the robot Pepper, as well as to societal norms regarding cross-gender and same-gender touch present in Western cultures.
Although in our study we attributed gender to Pepper the robot using gendered pronouns and names, it is possible that Pepper’s physical features (shoulder and waist proportions, lack of hair) were perceived as more traditionally male. These physical features may have made Pepper’s male gender attribution more convincing, in turn increasing the physiological impact on participants.
Societal norms surrounding gender also likely play a role in these results. Men engage in less same-sex interpersonal touch than women do [51, 52]. Discomfort in same-sex male touch interactions could stem from the fact that men are socialized to restrain emotional expression, especially among other men [52-53]. Additionally, men who engage in social touching of other men are more likely to be perceived as homosexual, as shown in a study by Roese et al., which explores the role of homophobic attitudes in same-sex touching behavior [54]. Thus, men's reluctance to engage in touch could be due to homophobic attitudes and the fear of being perceived as homosexual. This is particularly relevant in our study, as it was conducted in Poland, a country with relatively strong homophobic attitudes [55]. The discomfort men experience during same-sex interactions in HHI could be the reason for male participants’ increased physiological arousal when touching a robot framed as male, especially when the robot’s body parts are anthropomorphically framed.
On the other hand, there is reason to believe women may be more physiologically aroused when initiating touch with a male robot because of the social dynamics of cross-sex touch interactions in HHI. Henley explored the role of power and status in touch and found that initiators of interpersonal touch are often higher in social status, while recipients of touch tend to be lower in status [56,57]. Henley and Major both found that men initiate touch with women more often than the reverse [56,53]. Female-initiated touch goes against this norm, which could play a role in the potentially increased discomfort and higher physiological arousal experienced by women in female-to-male-robot touch interactions.

4.4. Attitudes Towards Robots

The analysis of our final questionnaire showed that participants' attitudes towards robots were relatively negative and their belief in human nature uniqueness was slightly positive, but that there was no significant relationship between the NARS and BHNU scale indexes and the physiological arousal experienced during the experiment. This may be explained by the fact that participant attitudes were measured directly after the experimental procedure. Their evaluations were likely strongly influenced by the experiences that took place moments before in the laboratory, and may therefore reflect their beliefs about Pepper during the experiment rather than their beliefs about robots in general. In a next experiment, it would be worth exploring the importance of attitudes towards robots by having participants complete the questionnaire before they engage in the human-robot touch interaction.

4.5. Significance

This exploration of how body accessibility and anthropomorphic framing impact human physiological arousal in robot touch interactions offers valuable insights for social robotics development. Understanding which body parts can make a user uncomfortable is important when designing a humanoid robot's physical features, and further, when choosing the placement of tactile sensors that the user is expected to interact with. In certain settings it may not be necessary to attribute such human-like features to robots, knowing that they can generate additional physiological stimulation in people.
Understanding this relationship can also help guide the process of appropriately implementing social robots in different settings, be it education, elderly care, or hospitals. For example, while educational and recreational implementations may benefit from touch that increases physiological arousal, designers of robots for hospital settings will likely want to avoid the arousal created by certain touch interactions. Knowing that gender attribution and anatomical body part names affect the physiological state of participants tells us we must be intentional about how we frame robots in each context in which they are used, depending on the desired result of the interaction.

4.6. Limitations

Although the study showcases a strong relationship between anthropomorphic framing of body parts and physiological arousal of participants during human-robot touch interaction, a more thorough exploration of participants’ anthropomorphization levels is necessary in future studies. The effect of anthropomorphic framing on anthropomorphization could be investigated using various methods of measurement, such as questionnaires and behavioral measures.
Another limitation of our study is the so-called novelty effect, whereby people can respond differently to new technologies than they do after sustained use of those technologies over time [58]. Because of this, in future research it may be worth exploring the physiological impact of robot touch once the novelty effect has worn off. Perhaps then no physiological stimulation will occur in the subjects, regardless of the anthropomorphic framing of body parts and gender.
Wider inference from the obtained results is also limited by the specific construction and design of our robot and the materials from which it is made. Such machines have very different characteristics, and our results may not replicate with another robot.

Author Contributions

Conceptualization, visualization, supervision, project administration, funding acquisition - K.M.; software, validation, formal analysis - M.O.; investigation, resources, data curation - W.D.; writing - original draft, writing - review and editing - P.G., K.M., M.O. All authors have read and agreed to the published version of the manuscript.

Funding

The research was financed from the funds of the Faculty of Psychology of the SWPS University in Warsaw.

Institutional Review Board Statement

This study was approved by the Research Ethics Committee of the Faculty of Psychology in Warsaw of SWPS University (protocol code 19/2020 and date of approval 24.03.2020).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are openly available in FigShare at https://doi.org/10.6084/m9.figshare.23272214.v1.

Acknowledgments

We would like to thank Paweł Zarzycki for his involvement in the initial phase of research and technical support.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Toh, L.P.E., Causo, A., Tzuo, P.W., Chen, I.M., & Yeo, S.H. A Review on the Use of Robots in Education and Young Children. Educational Technology & Society 2016, 19, 148–163.
  2. Broekens, J., Heerink, M., & Rosendal, H. Assistive social robots in elderly care: A review. Gerontechnology 2009, 8, 94–103. [CrossRef]
  3. Wada, K. & Shibata, T. Robot therapy in a care house - its sociopsychological and physiological effects on the residents. Proceedings 2006 IEEE International Conference on Robotics and Automation 2006, 2006. ICRA 2006, 3966-3971. [CrossRef]
  4. Shibata, T. & Wada, K. Robot therapy: a new approach for mental healthcare of the elderly - a mini-review. Gerontology. 2011, 57, 378–386. [CrossRef]
  5. Ogawa, K., Nishio, S., Koda, K., Balistreri, G., Watanabe, T., & Ishiguro, H. Exploring the natural reaction of young and aged person with Telenoid in a real world. Journal of Advanced Computational Intelligence and Intelligent Informatics 2011, 15, 592–597. [CrossRef]
  6. Turkle, S., Breazeal, C., Dasté, O., & Scassellati, B. (2023). Encounters with Kismet and Cog: Children Respond to Relational Artifacts.
  7. Tanaka, F., Cicourel, A., & Movellan, J.R. Socialization between toddlers and robots at an early childhood education center. Proceedings of the National Academy of Sciences of the United States of America 2007, 104, 17954–17958. [CrossRef]
  8. Yohanan, S., & MacLean, K.E. The Role of Affective Touch in Human-Robot Interaction: Human Intent and Expectations in Touching the Haptic Creature. International Journal of Social Robotics 2012, 4, 163–180. [CrossRef]
  9. Geva, N., Uzefovsky, F., & Levy-Tzedek, S. Touching the social robot PARO reduces pain perception and salivary oxytocin levels. Scientific Reports 2020, 10. [CrossRef]
  10. Geva, N., Hermoni, N., & Levy-Tzedek, S. Interaction Matters: The Effect of Touching the Social Robot PARO on Pain and Stress is Stronger When Turned ON vs. OFF. Frontiers in Robotics and AI 2022, 9. [CrossRef]
  11. Hoffmann, L., & Krämer, N.C. The persuasive power of robot touch. Behavioral and evaluative consequences of non-functional touch from a robot. PLoS ONE 2021, 16. [CrossRef]
  12. van Erp, J.B.F., & Toet, A. Social Touch in Human-Computer Interaction. Frontiers in Digital Humanities 2015, 2. [CrossRef]
  13. Paulsell, S. & Goldman, M. The Effect of Touching Different Body Areas on Prosocial Behavior. The Journal of Social Psychology 1984, 122, 269–273. [CrossRef]
  14. Patterson, M.L. An arousal model of interpersonal intimacy. Psychological Review 1976, 83, 235–245. [CrossRef]
  15. Burleson, M.H., & Davis, M.C. (2014). Social touch and resilience. In M. Kent, M.C. Davis, & J. W. Reich (Eds.), The resilience handbook: Approaches to stress and trauma (pp. 131–143). Routledge/Taylor & Francis Group.
  16. Ditzen, B., Germann, J., Meuwly, N., Bradbury, T.N., Bodenmann, G., & Heinrichs, M. Intimacy as Related to Cortisol Reactivity and Recovery in Couples Undergoing Psychosocial Stress. Psychosomatic Medicine 2019, 81, 16–25. [CrossRef]
  17. Ditzen, B., Neumann, I.D., Bodenmann, G., von Dawans, B., Turner, R.A., Ehlert, U., & Heinrichs, M. Effects of different kinds of couple interaction on cortisol and heart rate responses to stress in women. Psychoneuroendocrinology 2007, 32, 565–574. [CrossRef]
  18. Morrison, I. Keep Calm and Cuddle on: Social Touch as a Stress Buffer. Adaptive Human Behavior and Physiology 2016, 2, 344–362. [CrossRef]
  19. Sawabe, T., Honda, S., Sato, W., Ishikura, T., Kanbara, M., Yoshikawa, S., Fujimoto, Y., & Kato, H. Robot touch with speech boosts positive emotions. Scientific Reports 2022, 12. [CrossRef]
  20. Jourard, S.M. An Exploratory Study of Body-Accessibility. British Journal of Social and Clinical Psychology 1966, 5, 221–231. [CrossRef]
  21. Jones, S.E., & Yarbrough, A.E. A naturalistic study of the meanings of touch. Communication Monographs 1985, 52, 19–56. [CrossRef]
  22. Heslin, R., Nguyen, T.D., & Nguyen, M.L. Meaning of touch: The case of touch from a stranger or same sex person. Journal of Nonverbal Behavior 1983, 7, 147–157. [CrossRef]
  23. Ebesu Hubbard, A.S., Tsuji, A.A., Williams, C., & Seatriz, V. Effects of Touch on Gratuities Received in Same-Gender and Cross-Gender Dyads. Journal of Applied Social Psychology 2003, 33, 2427–2438. [CrossRef]
  24. Chaplin, W.F., Phillips, J.B., Brown, J.D., Clanton, N.R., & Stein, J.L. Handshaking, gender, personality, and first impressions. Journal of personality and social psychology 2000, 79, 110–117. [CrossRef]
  25. Andreasson, R., Alenljung, B., Billing, E., & Lowe, R. Affective Touch in Human–Robot Interaction: Conveying Emotion to the Nao Robot. International Journal of Social Robotics 2018, 10, 473–491.
  26. Li, J.J., Ju, W., Reeves, B. Touching a Mechanical Body: Tactile Contact with Intimate Parts of a Humanoid Robot is Physiologically Arousing. Journal of Human-Robot Interaction Steering Committee 2017, 6, 118–130. [CrossRef]
  27. Onnasch, L., & Roesler, E. Anthropomorphizing Robots: The Effect of Framing in Human-Robot Collaboration. Proceedings of the Human Factors and Ergonomics Society Annual Meeting 2019, 63, 1311–1315. [CrossRef]
  28. Salem, M., Eyssel, F., Rohlfing, K., Kopp, S., & Joublin, F. To Err is Human(-like): Effects of Robot Gesture on Perceived Anthropomorphism and Likability. International Journal of Social Robotics 2013, 5, 313–323. [CrossRef]
  29. Kahn, P.H., Kanda, T., Ishiguro, H., Freier, N.G., Severson, R.L., Gill, B.T., Ruckert, J.H., & Shen, S. “Robovie, you’ll have to go into the closet now”: Children’s social and moral relationships with a humanoid robot. Developmental Psychology 2012, 48, 303–314. [CrossRef]
  30. Hoff, K.A. & Bashir, M. Trust in automation: integrating empirical evidence on factors that influence trust. Hum Factors 2015, 57, 407–434. [CrossRef]
  31. Schermerhorn, P., Scheutz, M., & Crowell, C.R. Robot social presence and gender: Do females view robots differently than males? HRI 2008 - Proceedings of the 3rd ACM/IEEE International Conference on Human-Robot Interaction: Living with Robots 2008, 263–270. [CrossRef]
  32. Beran, T.N., Ramirez-Serrano, A., Kuzyk, R., Fior, M., & Nugent, S. Understanding how children understand robots: Perceived animism in child robot interaction. International Journal of Human Computer Studies 2011, 69, 539–550. [CrossRef]
  33. Bernstein, D., & Crowley, K. Searching for signs of intelligent life: An investigation of young children’s beliefs about robot intelligence. Journal of the Learning Sciences 2008, 17, 225–247. [CrossRef]
  34. Epley, N., Waytz, A., & Cacioppo, J.T. On seeing human: A three-factor theory of anthropomorphism. Psychological Review 2007, 114, 864–886. [CrossRef]
  35. Epley, N., Waytz, A., Akalis, S., & Cacioppo, J.T. When we need a human: Motivational determinants of anthropomorphism. Social Cognition 2008, 26, 143–155. [CrossRef]
  36. Sparrow, R. (2016). Kicking a robot dog. 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 229-229). [CrossRef]
  37. Darling, K. (2017). ‘Who’s Johnny?’ Anthropomorphic Framing in Human-Robot Interaction, Integration, and Policy. In P. Lin, G. Bekey, K. Abney, R. Jenkins (Eds.) Robot Ethics 2.0: From Autonomous Cars to Artificial Intelligence (pp. 173-188). Oxford University Press. [CrossRef]
  38. Kopp, T., Baumgartner, M., & Kinkel, S. How Linguistic Framing Affects Factory Workers’ Initial Trust in Collaborative Robots: The Interplay Between Anthropomorphism and Technological Replacement. International Journal of Human Computer Studies 2022, 158. [CrossRef]
  39. Coeckelbergh, M. You, robot: On the linguistic construction of artificial others. AI and Society 2011, 26, 61–69. [CrossRef]
  40. Westlund, J.M.K., Martinez, M., Archie, M., Das, M., & Breazeal, C. (2016). Effects of Framing a Robot as a Social Agent or as a Machine on Children’s Social Behavior. 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) (pp.688-693). IEEE Press. [CrossRef]
  41. Waytz, A., Heafner, J., & Epley, N. The mind in the machine: Anthropomorphism increases trust in an autonomous vehicle. Journal of Experimental Social Psychology 2014, 52, 113–117. [CrossRef]
  42. Dawson, M.E., Schell, A.M., & Filion, D.L. (2017) The electrodermal system. In Handbook of psychophysiology, 4th ed (pp.217-243). Cambridge University Press.
  43. Kuchenbrandt, D., Häring, M., Eichberg, J., Eyssel, F., & André, E. Keep an Eye on the Task! How Gender Typicality of Tasks Influence Human-Robot Interactions. International Journal of Social Robotics 2014, 6, 417–427. [CrossRef]
  44. Siegel, M., Breazeal, C., Norton, M.I. (2009). Persuasive Robotics: The influence of robot gender on human behavior. 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems( pp. 2563–2568). [CrossRef]
  45. Pelau, C., Anica-Popa, L., Bojescu, I., & Niculescu, M. (2022). Are Men More Affected by AI Anthropomorphism? Comparative Research on the Perception of AI Human-like Characteristics Between Genders. In R. Pamfilie, V. Dinu, C. Vasiliu, D. Pleșea, L. Tăchiciu (Eds.), 8th BASIQ International Conference on New Trends in Sustainable Business and Consumption (pp.680-687). ASE. [CrossRef]
  46. Cheng, T. -Y. & Chen, C. -C. Gender perception differences of robots with different degrees of anthropomorphism. 2021 IEEE International Conference on Consumer Electronics-Taiwan (ICCE-TW) 2021, 1-2. [CrossRef]
  47. Cramer, H., Kemper, N., Amin, A., Wielinga, B. and Evers, V. ‘Give me a hug’: the effects of touch and autonomy on people's responses to embodied social agents. Computer Animation and Virtual Worlds 2009, 20, 437–445. [CrossRef]
  48. Nomura, T., Kanda, T., Suzuki, T. & Kato, K. Altered attitudes of people toward robots: Investigation through the Negative Attitudes toward Robots Scale. Proceedings of the AAAI - 06 Workshop on Human Implications of Human Robot Interaction 2006, 29-35.
  49. Piçarra, N., Giger, J.-C., Pochwatko, G., & Gonçalves, G. Validation of the Portuguese version of the Negative Attitudes towards Robots Scale. Revue Européenne de Psychologie Appliquée/European Review of Applied Psychology 2014, 65. [CrossRef]
  50. Makowski, D., Pham, T., Lau, Z.J., Brammer, J.C., Lespinasse, F., Pham, H., Schölzel, C., & Chen, S.H.A. NeuroKit2: A Python toolbox for neurophysiological signal processing. Behavior research methods 2021, 53, 1689–1696. [CrossRef]
  51. Pochwatko, G., Giger, J.- C., Różańska-Walczuk, M., Świdrak, J., Kukielka, K., Mozaryn, J. & Piçarra, N. Polish Version of the Negative Attitude Toward Robots Scale (NARS-PL). Journal of Automation, Mobile Robotics and Intelligent Systems. 2015, 9, 65–72. [CrossRef]
  52. Nilsen, W.J., & Vrana, S.R. Some touching situations: The relationship between gender and contextual variables in cardiovascular responses to human touch. Annals of Behavioral Medicine 1998, 20, 270–276. [CrossRef]
  53. Major, B. (1981). Gender Patterns in Touching Behavior. In Mayo, C., Henley, N.M. (eds) Gender and Nonverbal Behavior (pp. 15–37). Springer New York. [CrossRef]
  54. Roese, N.J., Olson, J.M., Borenstein, M.N., Martin, A., Ison, A., Shores, L., & Shores, A.J. Same-sex touching behavior: The moderating role of homophobic attitudes. Journal of Nonverbal Behavior 1992, 16, 249–259. [CrossRef]
  55. Dolinski, D. Male homophobia, touch, and compliance: A matter of the touched, not the toucher. Polish Psychological Bulletin 2013, 44, 457–461. [CrossRef]
  56. Henley, N.M. Status and sex: Some touching observations. Bulletin of the Psychonomic Society 1973, 2, 91–93. [CrossRef]
  57. Henley, N.M. (1977) Body Politics: Power, Sex and Nonverbal Communication. Englewood Cliffs, NJ: Prentice-Hall.
  58. Willemse, C.J.A.M., Toet, A., & van Erp, J.B.F. Affective and behavioral responses to robot-initiated social touch: Toward understanding the opportunities and limitations of physical contact in human-robot interaction. Frontiers in ICT 2017, 4. [CrossRef]
Figure 2. Screenshot from the recording of the procedure. Person is standing next to the robot in a laboratory and touching the robot according to the displayed instructions.
Figure 3. The level of physiological arousal when touching the robot depends on the sex of the subject and the "gender" of the robot.
Figure 4. The level of physiological arousal when touching the robot depends on parts accessibility.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.