Abstract
Children with autism need innovative solutions that help them learn to master everyday experiences and cope with stressful situations. We propose that socially assistive robot companions could better understand and react to a child’s needs if they utilized tactile sensing. We examined the existing relevant literature to create an initial set of six tactile-perception requirements, and we then evaluated these requirements through interviews with 11 experienced autism specialists from a variety of backgrounds. Thematic analysis of the comments shared by the specialists revealed three overarching themes: the touch-seeking and touch-avoiding behavior of autistic children, their individual differences and customization needs, and the roles that a touch-perceiving robot could play in such interactions. Using the interview study feedback, we refined our initial list into seven qualitative requirements that describe robustness and maintainability, sensing range, feel, gesture identification, spatial, temporal, and adaptation attributes for the touch-perception system of a robot companion for children with autism. Finally, by utilizing the literature and current best practices in tactile sensor development and signal processing, we transformed these qualitative requirements into quantitative specifications. We discuss the implications of these requirements for future human–robot interaction research in the sensing, computing, and user research communities.
1 Introduction
Autism spectrum disorder (ASD) is a complex condition that affects many systems in the human body, from neurological aspects to physical comorbidities. Children with autism often endure sensory overload from everyday stimuli [1,2]. They may be nonverbal and may have difficulty understanding and relaying emotions. These combined experiences can cause the child to engage in repetitive or self-injuring behavior as well as self-isolation and heightened stress during social interactions [3]. Traditional techniques to help these children cope with stress and overstimulation must be administered by a trained adult [4]. As the rate of autism diagnosis continues to rise [5], and as the relative supply of caregivers, therapists, and paraeducators dwindles, there is an urgent need for new mechanisms to help children with autism learn to cope with stressful or unfamiliar situations.
Our overall goal is to help refine and validate robot-mediated intervention as a coherent, unified, evidence-based practice within the clinical autism community [6]. Robots are ideal assistants in autism intervention because they have simplified features, can utilize a variety of sensory outputs to reinforce communication (such as colored lights and sound effects to signify different emotions), and are less intimidating to the children than a human interaction partner [7]. While much work is already being done to bring robots into autism therapy and care, the use of tactile sensing is severely lacking in comparison with other sensing modalities. Children with ASD often struggle with speech and visual emotion cues, so touch is too crucial a communication channel to ignore [2].
Taking inspiration from existing methods such as deep-touch pressure (DTP) therapy [8] and animal-assisted intervention [9], we aim to broaden the tactile interaction capabilities of socially assistive robots in autism intervention. We thus investigated guidelines for the touch-sensing capabilities of a robot companion for autistic children. Our approach combines an initial literature review, an in-depth interview study, and current best practices in tactile sensor development and signal processing. We translated the literature into a set of initial requirements for touch sensing, and we then explored these requirements through hour-long interviews with 11 autism specialists from a variety of backgrounds. We systematically examined these interviews using the method of thematic analysis [10].
Our results highlight three overarching themes: the touch tendencies of children with autism, the importance of their individual differences, and the role of therapists in each child’s development. Participants recommended inclusive design strategies that would enable the companion robot and its tactile-sensing system to work for both touch-seeking and touch-averse children with autism. Specialists frequently alluded to or specifically requested customizable features that could be adjusted by the child’s caregivers and/or therapists. They also described a variety of roles that a tactile-sensing robot could perform to support a child with autism, including a teacher, a companion, a tool for regulating emotional state, and a potential tool for communication. Based on these findings, we provide the following four contributions:
Informed by our in-depth interviews with 11 autism specialists, we recommend seven key touch-sensing requirements that a robot should meet in order to be a good companion for children with autism.
We translate these qualitative requirements into quantitative touch-sensing specifications for human–robot interaction (HRI) researchers and future robot creators.
We present guidelines for the robot itself to promote successful interaction, including its role, responses, and form.
Additionally, we identify the areas of the robot that a child with ASD is most likely to touch and the types of gestures they are likely to utilize.
The remainder of this article is divided into the following sections: Section 2 provides a summary of related work on ASD, existing intervention methods, and socially assistive robots. Section 3 describes our methods for deriving a set of preliminary touch-sensing requirements from the literature, followed by detailed procedures of our interview study and analysis. Section 4 presents the three overarching themes that emerged from our analysis of the interviews. Section 5 synthesizes these themes into seven qualitative requirements and translates them into quantitative specifications for touch-perceiving robots. Section 6 discusses all of our results and their implications for the future of socially assistive robotics (SAR) in the autism community.
2 Related work
2.1 Sensory processing in children with autism
Over 96% of children with autism have disordered sensory processing [3]. They may experience sensory overload, where one or more senses overreact to a stimulus, or the opposite: an undersaturated response to stimuli. The senses affected, as well as the over- or underresponsiveness of each sensory system, depend on the individual child. These sensory imbalances can be distracting, frustrating, and even painful. Nonverbal children with autism may be particularly affected; unable to communicate their needs or sensory pains, they may inflict self-injury or act aggressively.
Touch is an essential component of early childhood development [11]. Affective touch, or touch with an emotional component, promotes social bonding, secure attachment, and social communication skills. Touch is also key for environment exploration [2]. While touch is one of the most commonly affected sensory systems in children with autism, tactile processing issues in autistic children are less studied than issues with visual or auditory processing [2,3]; this asymmetric focus within autism research mirrors the broader trend in our current scientific understanding of different senses [12].
Cascio et al. observe that individuals with autism may generally be underresponsive to pleasant tactile sensations and overresponsive to unpleasant sensations [2]. This combination leads to an overall tactile defensiveness, causing children with autism to gravitate toward controlled, predictable, and repetitive situations and stimuli. Therefore, children with autism may show a blend of input-seeking and input-avoiding qualities. For example, a child may regularly chew on the sleeve of his or her sweater but avoid unfamiliar foods with unknown textures [1].
2.2 Existing intervention methods
A diverse set of autism specialists focus on different aspects of the development of children with ASD as well as their care. Some of the most common services children with autism receive include occupational therapy, which focuses on cultivating sensorimotor skills and skills needed for daily life; physical therapy, which develops mobility and the muscular system and improves gross motor skills; and speech therapy, which aims to remedy language and communication impairments [13,14]. A child with autism may receive other services besides these, such as behavioral services, orientation and mobility services, and medical services, along with any other related services as deemed necessary by their individualized education program [15]. One important issue all therapists must face is calming down a child who becomes stressed while in their care.
DTP therapy is a tactile-based intervention method for helping children with autism reduce their stress. It is the current gold standard of treatment and utilizes a wide range of therapy tools, such as weighted blankets, weighted vests, and Wilbarger brushes [8]. However, the large range of therapeutic tools available within the classification of DTP means that customization to find the method most beneficial for an individual child may take many attempts [4]. Furthermore, DTP must be administered and monitored by a trained adult.
Animal-assisted intervention is a promising alternative to DTP that encourages independent social and physical interaction between autistic children and an animal companion [9]. Guinea pigs [16], dogs [17], and horses [18] have been found to improve the stress levels, mood, and behavior of children with ASD. Additional benefits included increased independent sensory seeking and social contact with peers. However, a live animal companion is not always feasible for many reasons, including cost, allergen or hygiene concerns, and availability. Additionally, even with robust training, an animal’s behavior cannot be predicted entirely. To further support families, there is growing interest in incorporating socially assistive robots into the therapy, routines, and care of children with autism.
2.3 SAR for children with autism
The field of SAR aims to use interaction with robots to improve daily life, often for those with impairments [19]. For example, PARO, a robot baby seal, reduces stress and improves various symptoms in elderly patients with dementia [20]. Huggable, a blue teddy-bear-inspired robot, was designed to comfort and entertain children during long-term hospitalization [21]. The Haptic Creature, a furry, lap-sized robot animal, uses calm physical breathing patterns to improve mood and decrease stress; thus far, its benefits have been validated in neurotypical participants [22]. Therabot, a robot dog inspired by animal-assisted therapy, is designed to help survivors of trauma reduce the stress they experience [23].
The use of SAR is especially being investigated for children with autism. While preferences may vary greatly between individuals, children with autism generally seem to prefer child-sized robots [7] with simple, often exaggerated, features [24,25]. To reward further interactions, the robot should produce behavior that is somewhat predictable and that can be controlled by the child [7,24,25]. Robot appearances and/or behaviors are often customized to align with these findings. For example, the IROMEC robotic toy encourages children with autism to engage in robot-assisted play and promotes growth in several developmental categories, including cognitive development, through cause-and-effect activities [26]. In another research effort, five children with autism interacted with the humanoid robot NAO through two different play modes – dancing and an interactive touching game [27]. The play modes encourage prosocial behaviors and promote a sense of bodily awareness in the child.
Socially assistive robots can positively impact children with autism through interaction in many roles: as social mediators to encourage communication [28], as therapy assistants to elicit positive behaviors such as joint attention [29] and emotion recognition and expression [30], and as playmates [7,26,31]. In these roles, timely responses from the robot can promote the child’s continued learning and interest [28]. It is also important to assess how children interact with a robot in a long-term setting. Three different longitudinal studies were recently conducted with different robot form factors and children between 3.5 and 12 years old; two of these studies focused on autistic children, and the third did not report the neurodevelopmental status of the participants. These studies found improved social skills [32], increased communication attempts [33], and maintained interest in the robot at the end of a 2-month-long interaction period [34].
As physical contact and tactile exploration are key tools for child development [2], endowing a robot with robust touch-sensing capabilities could greatly enhance its interaction possibilities and, by extension, its potential benefit for an autistic child. Tactile sensing varies greatly across SAR technology, ranging from no sensing to sensing at a large number of discrete points on the robot’s body. Even when the robot is equipped with some tactile sensors, the majority of existing robot behavior modes rely almost exclusively on voice recognition, visual cues, and/or remote control of the robot by a trained operator [21,28,29,35,36]. These approaches either miss important touch input or include it at the cost of a human operator. Other robots, such as PARO, the robot dog AIBO, and the humanoids NAO and Pepper, limit physical tactile perception to a few key areas, such as the head and limb extremities, often with only binary output at each location. The childlike robot KASPAR utilizes force-sensing resistors (FSRs) at several discrete points on its body [37]. Finally, some research prototypes in SAR include a large number of contact-sensing points. An initial model of the Haptic Creature detected touch using an internal accelerometer and 56 FSRs distributed across its body [38]. A later model utilized pressure-sensing piezoresistive fabric [39]. The mobile ball robot CARBO, intended to provide an automated therapy experience similar to DTP, collects tactile input from 67 miniature trackballs spread across its shell [40]. While these studies show growing interest in touch sensing for SAR, the variety of designs suggests a lack of design guidelines in this area.
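To illustrate the difference between the binary contact detection of many commercial platforms and the richer pressure sensing used by robots like KASPAR and the Haptic Creature, the following minimal sketch (not drawn from any of the cited systems) shows how raw readings from a hypothetical FSR array might be normalized into graded contact intensities and then reduced to binary contact flags; the calibration values and contact threshold are illustrative assumptions.

```python
# Minimal sketch, not from any cited system: converting raw ADC readings from a
# hypothetical FSR array into graded contact intensities and binary contact flags.
# The calibration values and the contact threshold are illustrative assumptions.

from typing import List

CONTACT_THRESHOLD = 0.05  # assumed minimum normalized intensity that counts as contact


def normalize(raw_readings: List[int],
              baselines: List[int],
              spans: List[int]) -> List[float]:
    """Map each raw reading to a 0-1 intensity using per-sensor calibration."""
    intensities = []
    for raw, baseline, span in zip(raw_readings, baselines, spans):
        value = (raw - baseline) / span if span > 0 else 0.0
        intensities.append(min(max(value, 0.0), 1.0))
    return intensities


def contacts(intensities: List[float]) -> List[bool]:
    """Binary contact map, analogous to robots that only report touched/not touched."""
    return [i >= CONTACT_THRESHOLD for i in intensities]


# Example: three taxels with different resting baselines and calibration spans.
raw = [120, 840, 95]
base = [100, 110, 90]
span = [800, 780, 820]
graded = normalize(raw, base, span)
print(graded)            # graded intensities, useful for distinguishing pokes from squeezes
print(contacts(graded))  # binary output at each location
```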
Existing observation studies with autistic children provide preliminary guidelines for touch sensing in autism therapy. Notably, in a study by Robins et al., three child participants with autism were given unconstrained interaction time with the KASPAR robot for a preliminary case study [44]. Instances of physical contact ranged from less than 1 s to more than 2 s. Intensities of forces applied ranged from a very gentle touch to forceful squeezing. Furthermore, several instances of affective touch were observed, such as kisses, gentle hand-holding, and stroking the robot’s cheek. In a 3-year-long longitudinal study of over 100 sessions, a small snowman-shaped robot called Keepon was placed in a day care center and teleoperated by a remote person. More than 30 preschool-aged children with autism ultimately interacted with the robot. The results highlight a variety of touch interactions among a select set of participants over time. One participant initially guided her therapist’s hand to put a hat on Keepon’s head (her fifth session with the robot, S5), but she became bolder in her own physical interaction with Keepon, first by poking it with a xylophone stick, later touching its belly, and finally giving Keepon a kiss (S14). Another participant showed initial disinterest in Keepon but slowly gained willingness to physically interact, touching it for the first time during S10 and poking the robot frequently to prompt responses by S17. The third participant initially kicked Keepon over, but he became much gentler and friendlier toward the robot in later sessions [41].
3 Methods
We derived an initial list of requirements for a tactile-sensing robot companion by thoroughly reviewing the relevant literature that was summarized in the previous section. In order to validate and fine-tune these initial requirements, we then conducted an interview study with autism professionals from a wide range of specialties and with several years of field experience with children across the spectrum. After conducting the interviews, we transcribed, coded, and analyzed the data using an approach called thematic analysis. We then augmented our initial requirements based on the experts’ input. Finally, we further utilized the literature and established methods in the fields of robotics, sensors, signal processing, and machine learning to translate the resulting qualitative requirements into a set of proposed quantitative specifications for tactile-perceiving systems in companion robots.
3.1 Initial touch-sensing requirements
As detailed in Section 2, we explored a breadth of studies related to robot-mediated autism intervention and studied the variety of physical child–robot interactions that occurred in each study, whether they were planned or spontaneous. We started this process by searching for relevant works in review papers, such as those by Begum et al. [6] and Cabibihan et al. [7]. Once we found a relevant publication that discussed the concept or instance of physical interaction, we then searched for additional sources in both its forward references (new publications that cite the article) and its backward references (older studies cited in the article). Table 1 shows the initial requirements that we derived from this examination of the literature. We used this requirement list as a major discussion point for our interviews with autism specialists. As detailed later in this section, we asked each participant to both rank the requirements we proposed and share suggestions for their improvement. For each requirement, we also identified the gaps in the literature and included them as follow-up questions for the specialists. The follow-up questions can also be seen in Table 1 but were not shown to the participant during the ranking task.
Initial list of six touch-sensing requirements that we derived from the literature. Specific sources that inspired each requirement are cited in the “Motivation” column. Study participants viewed the requirement name, motivation, and qualitative description; they then ranked the requirements in order of importance and gave feedback about each one and the entire list. For each requirement, we also show the follow-up question that the experimenter asked to spark additional feedback after completion of the ranking task
Requirement | Motivation | Qualitative description | Follow-up question |
---|---|---|---|
Gesture identification | Touch applied to an interaction partner can be used to relay immediate physical needs or requests (deictic or instrumental gestural communication) [41,42] | The robot should be able to feel physical communication gestures, such as a poke | What kinds of physical communication gestures should the robot be able to feel? Participants could select gestures from the touch dictionary by Yohanan et al. [43]. We omitted one gesture (“finger idly”) from the 30-item list, to avoid participant confusion |
Sensitivity | The child may use soft, gentle touches to communicate positive feelings [41,44] | The robot should be sensitive enough to detect affective touch communication, such as a gentle hand rest | What kinds of affective touch communication should the robot be able to feel? |
Spatial | Children with autism respond best to direct rewards. When a child touches the robot’s tactile sensor, the robot’s response serves as a reward [7,25,26,37] | The robot should detect touch across a high proportion of its body, so that it may react and therefore encourage future interaction attempts | How much of its body should be touch sensitive? |
Temporal | If time elapses between the child’s touch and the robot’s response, the child may not form a meaningful connection between their touch action and the robot’s resulting behavior [26,28,37] | The robot should provide a near-immediate response to touch interaction, similar to a human’s responses | What kinds of responses should a touch-sensitive robot provide? |
Adaptation | A variety of assistive robots are being studied for use in robot-assisted therapy [7,27,32,35,41,44] | A general tactile sensing system should be easily adaptable to fit robots of different shapes and sizes | What kinds of robots do you think are most important to be able to use? (Adjectives, such as animal, humanoid, big, small, etc.) |
Robustness | Children with autism may be rough during an interaction [7,41] | The tactile sensing system should be robust and keep working properly even after rough treatment | What kind of rough treatment would you expect? |
3.2 Participants
We focused on accessing a breadth of autism expertise by recruiting specialists with different focuses through e-mail and snowball sampling. We started by recruiting professionals in applied behavior analysis and speech-language pathology. After multiple participants stressed that their occupational therapist (OT) colleagues would be very well suited to answer the study questions, we decided to recruit OTs as well. We ultimately interviewed 11 participants (4 male and 7 female). Four were OTs and seven were in other occupations.
All 11 participants were located in the United States or Canada. Their occupations included adaptive physical education teacher, board-certified behavior analyst, neurology professor, OT (two focusing on pediatrics, one on early childhood special education, and one on sensory processing disorders), paraprofessional/paraeducator, physical therapist, relationship development intervention consultant, and speech-language pathologist. Participant age ranged from 26 to 55 years (average: 41, median: 46, SD: 12). Their years of experience working with children with autism ranged from 2 to 30 (average: 13, median: 10, SD: 10). Table 2 summarizes their backgrounds, as self-reported during the interviews.
Summary of participant experience and backgrounds
P1 was a board-certified behavior analyst (BCBA) who also had experience as a paraprofessional and as an applied behavior analyst (ABA). P1 worked in several settings over 7 years, including a residential facility, in-home services, various school districts, and a school specifically for students with an autism diagnosis.
P2 was a PhD student in Human Development and a relationship development intervention (RDI) consultant at a private practice. P2 studied inclusive work for autistic individuals entering the workforce and coached parents who are adjusting to their child’s autism diagnosis. P2 had 10 years of experience with children with autism.
P3 was a speech-language pathologist (SLP) in a public school who also previously worked as a paraeducator (also referred to as a paraprofessional). P3 had about 8 years of experience with autistic children aged 5 to 10.
P4 was a paraprofessional who assisted children with special needs with navigating their day at school. P4 worked in a public school district, in an elementary school in a typical classroom setting for 5 years and in a self-contained autism room for 2 years, and later in a middle school classroom setting for 8 years. P4 had 5–7 years of experience specifically with children with autism.
P5 was a neurology professor who conducted interdisciplinary research on various aspects of autism, including genes, brain development, social interaction, and treatment studies. P5 had roughly 20 years of experience working with children with autism.
P6 was a pediatric occupational therapist (OT) who worked in a pediatric clinic and a day treatment center specifically for students with an autism diagnosis, both in one-on-one and group therapy settings. P6 had 3 years of experience working with special needs individuals from age 0 to 22.
P7 was an OT in an early childhood education setting in a public school district. P7 had 15 years of experience with children with special needs, with and without autism diagnoses.
P8 was an adaptive physical education teacher who taught physical education classes for children with special needs. P8 had 17 years of experience with children with autism, working both in a public school setting and at an alternative behavior school.
P9 was a physical therapist (PT) in a public school district with 2 years of experience working with children with autism. P9 used physical therapy to help students access their educational environment and curriculum.
P10 was an OT at a school specifically for individuals with an autism diagnosis between ages 4 and 21. P10 had 30 years of experience, worked with children with autism in four different countries, and specialized in sensory processing disorders.
P11 was an OT in a public school system that serves individuals with autism ages 3–22. P11 had 30 years of experience working with children with autism in a variety of settings, such as a residential summer camp, an early intervention program, and home-based instruction.
3.3 Procedure
This research study was approved by the Ethics Council of the Max Planck Society under the Haptic Intelligence Department’s framework agreement as protocol number F003A. Participants gave informed consent and did not receive compensation for their participation. Each interview was conducted in English over WebEx, which is a secure video-conferencing system [45]. Interviews generally lasted about 60 min. As an exception, the interview with P10 had a duration of about 100 min, as this participant was excited to share many examples from their own experiences. The audio and video of these interviews were recorded with each participant’s explicit consent. The interviews were then transcribed for coding and analysis.
The scripted interview questions were designed to gather detailed feedback in a structured manner. The experimenter guided each autism specialist to provide information on their education background and work experience, describe the touch behavior of autistic children, and rank and comment on touch sensing and response requirements for a companion robot. A description of the six stages of the interview, as well as the general time allotment for each stage, is as follows:
Study setup and demographic questionnaire (10 min) – The experimenter explained the motivation behind this research, the goals we hoped to accomplish through the study, and the general setup for the interview session. We then collected demographic data from participants. Next, we asked participants about their experiences in relation to children with autism, including current and previous related jobs, years of experience, and countries in which they had worked with children with autism.
Children with ASD and general touch interactions (15 min) – We asked participants to comment on how children with autism utilize touch, including whether there are differences compared to neurotypical peers as well as similarities and differences within the autistic population. We also asked how participants respond to children’s requests that are presented through touch, and what methods they have used to calm an autistic child who was stressed. We then showed them two examples of an NAO robot (Figure 1), with one wearing a thin shirt and the other wearing a koala suit, and we asked what types of physical touches the robot should be able to detect. To show the robots, the interviewer either redirected her camera or shared a picture of the robots in the WebEx browser. In both scenarios, the robots were powered off and in the same pose.
Ranking task (15 min) – We sent the participant a link to the requirement-ranking interface, where the initial requirements were presented in an order that was randomized for each participant. We asked the participant to rearrange the requirements, ranking them from most to least important, and to explain their thought process out loud. The ranking-task interface was a shared online presentation application similar to Microsoft PowerPoint (Figure 2). Each initial requirement from Table 1 was displayed on its own slide, along with a brief explanation of the concepts that motivated the requirement. Participants could rearrange the requirements into their desired order by clicking and dragging the slide navigation thumbnails on the left side. If a participant was not able to navigate the ranking interface, they could also tell the interviewer the sequence in which they wanted the requirements rearranged. Participants were encouraged to tell the interviewer if they felt a requirement should be edited, removed, or added.
Follow-up questions (10 min) – Next, we asked the participant a specific follow-up question for each requirement (Table 1), in order to initiate a more detailed dialogue and derive further specifications. While the initial requirements were worded generally, the follow-up questions prompted the participant to give specific details about their ideal robot companion. The question order followed the participant’s ranking, from most important to least important requirement.
Comments on tactile sensor design (5 min) – To elicit further comments on the existing sensing technology, we showed the participant a prototype fabric-based tactile sensor (Figure 3) and explained how it worked in layman’s terms. We asked for their thoughts on the design as well as related questions and concerns.
Closing questions and recommendations (5 min) – Finally, we asked the participant what movements and sounds the robot should provide in response to a child’s input. We also asked what other factors we should consider in creating a touch-sensing robot companion for children with ASD and what closing comments or questions the participant had for us.

The prototype robot companion for children with autism that was presented to the therapists in our interview study. It is an NAO robot (SoftBank Robotics); the NAO robot shown on the left is wearing a thin fabric shirt, and the one on the right is enclosed in a soft koala suit.

The requirement-ranking interface enabled participants to prioritize tactile-sensing requirements for the robot as they preferred. Participants could reorder the six provided requirements by clicking on and dragging the respective slides in the navigation pane on the left. The main view enabled the participant to clearly see the individual requirements and the motivations that inspired them.

A photograph of the prototype fabric-based tactile sensor for a robot companion that was shown to participants. Participants were told that several such sensors could be placed across the robot’s body, and that they could be used to detect the intensity and/or type of touch being enacted on the sensor. Participants were asked to give feedback and share any concerns they might have with such a design.
Similar to how we used the initial touch-sensing requirement list in stage 3, the robots presented in stage 2 and the tactile sensor presented in stage 5 were meant to provide tangible starting points for the discussion. Given available resources and time limitations, we presented only one type of robot and one sensor design. However, we encouraged the participant to think about a variety of form factors and approaches in both cases, not limiting their thinking to the presented examples.
3.4 Qualitative analysis of interviews
We utilized thematic analysis to analyze the data from the interviews. Thematic analysis is a method for interpreting and organizing qualitative data, such as a series of interviews, into meaningful patterns [10]. A block diagram illustrating our workflow can be seen in Figure 4. After transcribing each interview, two authors analyzed each complete transcript line by line and labeled the participant’s responses with descriptive codes, using an open, iterative coding approach. The two coders met frequently to discuss findings and compare codes. We started to notice repetitions across participant responses and data saturation around the seventh participant. At this point, the coders began to aggregate all the codes into related groupings in order to identify the overarching themes. The coders separately searched for themes, and they then discussed findings and merged theme results, applying focused coding methods to finalize the analysis. We converged on three themes that captured the vast majority of the comments shared by our 11 participants.

A diagram illustrating the flow of our process and the links between our literature review, interview study, and resulting themes. The thematic analysis portion of our study is highlighted in green. The identified themes were used to refine our initial requirement list into finalized qualitative requirements, which we then translated into quantitative specifications.
4 Results: overarching themes
We present the three overarching themes that we identified in our data: the touch tendencies of children with autism, the importance of individual differences, and the role of therapists in each child’s development. Within each theme, we report the relevant requirements that the participants described for both touch sensing and child–robot interactions. Quotes taken directly from a participant’s transcribed interview are marked in italics and enclosed in double quotation marks. We use boldface to highlight terminology from the autism specialists that may be new to the reader.
4.1 Touch in autistic children
Children with autism frequently experience sensory stimuli differently than neurotypical children. Several of our participants explained that children with autism are often on one of the two tail ends of the touch-sensitivity spectrum. As P5 described, “if there is a distribution [of touch preference], the kids with autism will be out here on the two tails, and typically developing kids will fill that average. You would hardly ever meet a typically developing child who just seemed to really crave being touched, or hating it, but with autism you generally won’t find them to be in the middle.” Participants used several different words to describe these two ends of the distribution, including under- or overresponsive to touch, hypo- or hypersensitive to touch, loving or hating touch, and touch seeking and touch averse. For this study, we will refer to them as “touch seekers” and “touch avoiders.”
Touch seekers are hyposensitive to touch. They crave contact and use it to investigate their surroundings and function in the world. They enjoy deep pressure touches, like squeezes and firm hugs. Touch avoiders are hypersensitive to touch. For them, touch can be upsetting, unpleasant, and even painful at times. Touch avoiders may use a very light touch or get very close to the person or object of interest without touching. If there is touching, a touch avoider would prefer to control and initiate the interaction, such as guiding another person’s hand to request assistance. While a child will generally display traits for one of these tail ends (a touch seeker or avoider), the child’s receptiveness to touch also fluctuates based on external factors such as events in their day and their familiarity with the interaction partner or object. P6 explained this topic particularly clearly, stating, “just like the autism spectrum, touch and all of our sensory systems are also on a spectrum. Some kids [with autism] seek out touch as their way of functioning in life. Some kids completely avoid it. A lot of our kids who are over-responsive to touch kind of hold back. They don’t use it as much. They have a really low threshold, so it can be upsetting to them. They don’t like to be touched in certain places. They don’t like to touch things with their hands and they could be super overresponsive to heat, pain, cold, stuff like that. And then on the other hand, people who are underresponsive, as you probably know, they kind of seek that out. They’ll touch literally anything to kinda get that input to their brain of what they’re doing, how they’re doing it, and kind of function throughout their day.”
Touch can help children with autism explore the world, self-regulate, and communicate their needs. To investigate objects, autistic children frequently use their hands, their mouth or lips (P8: “There is a lot of mouth in my class.”), and even their entire body (P2: “I have other clients that like to take fuzzy things and roll in them.”). While touch seekers typically need firm squeezes to calm down, touch avoiders may also use touch for self-regulation, utilizing predictable and controlled interactions like hand-holding. Children with autism, regardless of whether they are a touch seeker or avoider, often have favorite toys or sensory items. Schools and therapy clinics usually also have a variety of objects that can create different tactile sensations for the children to choose from. Finally, children with ASD may be nonverbal or have limited speaking skills. Therefore, touch becomes especially important for communication. They may use touch to communicate needs (e.g., guide the caregiver to an object), to express feelings (e.g., distress), and to socialize with others (e.g., seek attention or space).
Importantly, children with ASD often do not understand social conventions. As such, they may use socially inappropriate touch. When interacting with someone, a child may get very close to the person – in their personal “bubble” (P2). If they find a feature or object on the person’s body interesting, such as a mole or an ID badge, they may touch or grab it without asking. They may unknowingly touch private body parts or use inappropriate touch gestures (e.g., a slap or pinch), typically with the wrong pressure level or a high frequency. P6 explained, “[…] a lot of students will go up to another person and when they’re trying to be really nice and you know, to tap them on the shoulder, they’re actually going to give them a good slap because they’re not fully understanding their touch, their proprioception.”
4.1.1 Sensor and robot recommendations
The participants recommended requirements that would enable a tactile-sensing robot companion to work for touch seekers and touch avoiders alike. Importantly, the tactile sensors should be able to detect a wide range of contact intensities, from the light touches of touch avoiders to deep squeezes from touch seekers. Sensitivity should be consistent across a single sensor’s area and also across different sensors. Additionally, depending on the child or children using the robot, the same type of gesture might occur with different intensities.
The robot should detect touch across a wide range of its body to encourage future touch interactions, especially for touch avoiders. Of the 11 participants, 10 requested that at least 80% of the robot’s body be touch sensitive. They said that the location of the touch, not just its intensity, should be detected. They would want the robot to know the general region of its body that had been touched, such as the arms or face, or whether the child had touched a region that was appropriate or inappropriate for social interaction. Notably, the participants did not ask for exact contact location discrimination within a general body region. Often, while participants preferred the idea of whole-body sensing, they also specified which body regions they would prioritize to equip with touch sensing, in case whole-body sensing was not possible. Figure 5 presents a summary of the locations suggested by the participants.

Locations that participants specifically prioritized to equip with touch sensing displayed on a diagram of a generalized robot companion. The number of therapists who suggested each region is shown in parentheses. Most of our 11 participants requested whole-body touch sensing, but they also provided these specific locations in case whole-body sensing is not possible.
The robot and its sensors must be very robust. They must survive rough exploration, excited interactions, and tantrums. The sensor material should be durable and not tear easily. Sensors should be adhered securely to the robot to withstand rough squeezing, picking, and pulling. The robot and its sensors as a whole system must also be safe for oral exploration, without sharp edges or risk of electric shock.
The tactile feel of everyday items or the robot itself became an important topic of discussion during many of our interviews. Four participants (P1, P3, P5, and P8) noted that children with autism may be sensitive to the feel of their clothes or objects (e.g., the tag on their shirt, the feel of jeans, or the texture of a stuffed animal). Five other participants (P2, P4, P6, P7, and P10) emphasized that much attention should be paid to the passive feel of the robot. They recommended the robot provide a range of tactile sensations, such as a firm base and a soft and squishy outer covering. Visibly and tactilely noticeable seams and wiring should be avoided. The outer texture of the robot should have a soft, neutral sensation – not extremely fluffy or rough.
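As a concrete illustration of the coarse, region-level touch localization and wide intensity range recommended above, the following is a minimal sketch; the sensor-to-region mapping, intensity bands, and list of socially appropriate regions are hypothetical assumptions for illustration, not part of the specialists’ recommendations.

```python
# Minimal sketch (illustrative assumptions, not a specification): mapping individual
# tactile sensors to coarse body regions and classifying contact intensity into the
# broad bands participants described, from the light touches of touch avoiders to
# the deep squeezes of touch seekers.

from typing import Dict

# Hypothetical layout: sensor id -> coarse body region.
SENSOR_REGIONS: Dict[str, str] = {
    "s01": "head", "s02": "head",
    "s03": "arm_left", "s04": "arm_right",
    "s05": "hand_left", "s06": "hand_right",
    "s07": "torso_front", "s08": "torso_back",
}

# Regions treated as appropriate for social touch (illustrative choice only).
SOCIALLY_APPROPRIATE = {"arm_left", "arm_right", "hand_left", "hand_right", "torso_back"}


def describe_touch(sensor_id: str, intensity: float) -> dict:
    """Summarize a contact event at the coarse level participants asked for."""
    region = SENSOR_REGIONS.get(sensor_id, "unknown")
    if intensity < 0.15:
        band = "light"        # e.g., a gentle hand rest from a touch avoider
    elif intensity < 0.6:
        band = "moderate"
    else:
        band = "deep"         # e.g., a firm squeeze from a touch seeker
    return {
        "region": region,
        "intensity_band": band,
        "socially_appropriate": region in SOCIALLY_APPROPRIATE,
    }


print(describe_touch("s05", 0.08))  # light touch on the left hand
print(describe_touch("s07", 0.85))  # deep squeeze on the front torso
```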
4.2 Understanding individual needs and differences
All of the participants referred to the wide range of characteristics that can be found in children on the autism spectrum. These individual needs and differences need to be taken into consideration when designing a robot companion.
The specialists noted that children on the spectrum may differ in their preference for touch sensations, their ability to utilize touch gestures, and their processing of sensory input. Specifically, children with ASD may fixate on a favorite body part or tactile sensation and use that repeatedly. They may have other diagnoses and symptoms comorbid with their autism that affect how they are able to interact with the robot companion. For example, a child may have low muscle mass, have poor motor coordination, fatigue easily, or have a vision impairment. Thus, the child may be able to apply only a light touch, use his or her whole hand instead of a single finger, or touch a general region on the robot rather than a specific area. Finally, children on the spectrum may need additional time to process information or sensory input. As explained by P5, “If you could code social interactions between kids with autism and adults, it’s almost as though the person with autism is one step behind, even though they are responding, and they get kind of out of phase.”
Children with autism communicate in a variety of ways. Their level of verbal communication can range between nonverbal, nonverbal using augmentative and alternative communication, and verbal. They also may have developed custom sign language or gestures to communicate with their caregivers. They will have different cues to express distress or to signal the start of a “meltdown” (P1) – when the child is very upset, their senses are overstimulated and further information cannot easily be processed. Therapists work closely with each child to understand the child’s needs and communication methods. When a therapist first begins working with a new client, much of their communication with the child may be guesswork.
Like any child, a child with autism will have his or her own interests – a favorite color, toy, cartoon character, etc. They will have their own motivators, skills, and fears. The child’s education program is customized by their individual therapists and education team. The therapist encourages the child to complete their therapy session by using motivators – activities or items the child likes, which can therefore be used as a reward. Motivators are personalized for each child. Therapists constantly monitor and update motivators to reflect the child’s current interests.
4.2.1 Sensor and robot recommendations
Most participants either alluded to or explicitly described the need for customization, agreeing that customizability will be key for creating a robot companion that is adaptable to children across the autism spectrum. They requested the option for the parent or therapists to customize several features, such as the pressure levels and types of gestures the robot can detect, the time delay of the robot’s responses, and the robot’s physical appearance.
Along with touch avoiders, some children may also have physical conditions that restrict how much they can control their physical interactions with the robot. As such, the participants reiterated the importance of the tactile sensors reliably detecting touch, whether it is a very light touch or a tight squeeze, so that the child can be rewarded for their interaction attempt in either scenario. Additionally, some children cannot apply fine-grained gestures, and the implied meanings of their gestures may vary. Consequently, the participants suggested that the parents or therapists be able to inform the robot about their child’s actions. They could specify what touch gestures the child commonly uses or avoids. In addition, they could specify the intent of a gesture – a hard slap might be a friendly interaction from some children and a negative interaction coming from others.
The timing of the robot’s reactions was our most controversial point of discussion. Participants were divided on how quickly the robot should respond, wanting either a near-immediate response or a time delay. Several participants suggested a compromise – equipping the response setting with a customizable time-delay window. It is important to note that the delay was requested only for the robot’s response to the touch. No delay was requested for the speed at which the tactile sensors detect the child’s touch.
One group of participants strongly agreed with our initial temporal requirement that the robot should provide a near-immediate response to touch (P1, P4, P6, P10, and P11). They explained that this near-immediate response would help the child form a direct correlation between their actions and the robot’s responses. A near-immediate response would also provide tangible positive reinforcement for the child’s touch and encourage additional interaction. Finally, P1 said that delaying the response could reinforce the wrong interaction, especially if the child was frustrated by the robot’s silence and began using more force.
Other participants disliked the “near-immediate” wording of our initial temporal requirement (P3, P5, and P9). They recommended having the robot respond after a time delay. As the child might have a processing delay, a near-immediate response might be confusing to the child, happening too early or too fast for the child to understand. Participants with this opinion often recommended that the robot respond with a customizable time delay, as set by the parent or therapist. Finally, P3 was concerned that the child could become dependent on the fast response and noted that the caregiver could gradually increase the time delay to help the child build tolerance. The remaining participants (P2, P7, and P8) did not provide specific comments for or against the “near-immediate” timing of the temporal requirement.
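One way to reconcile these views is the compromise the participants suggested: detect touch immediately but let a caregiver configure the response delay and the meaning of each gesture for a particular child. The sketch below outlines such a per-child configuration; the field names, default values, and response strings are hypothetical and shown only for illustration.

```python
# Minimal sketch (illustrative assumptions only): a per-child configuration that a
# caregiver or therapist could edit, covering the customizable response delay and
# the intent assigned to specific gestures (e.g., a hard slap may be friendly from
# one child and negative from another).

import time
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class ChildProfile:
    # The delay applies only to the robot's response; touch detection itself is immediate.
    response_delay_s: float = 0.0
    # Caregiver-specified meaning of gestures for this particular child.
    gesture_intent: Dict[str, str] = field(default_factory=dict)


def respond_to_touch(profile: ChildProfile, gesture: str) -> str:
    """Detect immediately, then respond after the configured delay."""
    intent = profile.gesture_intent.get(gesture, "unknown")
    time.sleep(profile.response_delay_s)  # caregiver-tunable; can be increased over time
    if intent == "friendly":
        return "warm greeting"
    if intent == "negative":
        return "calm, firm verbal response"
    return "neutral acknowledgment"


# Example: one child for whom a slap is friendly, with a 2-second processing window.
profile = ChildProfile(response_delay_s=2.0,
                       gesture_intent={"slap": "friendly", "hug": "friendly"})
print(respond_to_touch(profile, "slap"))
```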
Customization of the robot’s appearance is desired, but it is less important than other requirements. Participants differed in opinion on what form factor is the most desirable for a robot companion. Some participants preferred an animal appearance (P1, P4, P6, P7, P8, and P11). They suggested that interacting with an animal would be more inviting and calming than interacting with a humanoid partner. Others further specified that a toy-like, stuffed animal form would be ideal, citing that children with autism often already have a particular treasured toy or stuffed animal. As P7 explained, “Well, just make it something that they really want to hold onto […]. Something that they want; they want it to be their special animal, their special fuzzy, you know? Kids cling onto an animal for forever.”
Other participants suggested a humanoid appearance (P5 and P11). While P1 felt that a humanoid robot would be perceived as creepy, P5 suggested that interacting with a humanoid companion robot would provide good social interaction practice and enable the child to transfer skills to human interactions with greater ease. P5 stated this preference as follows: “Humanoid for sure, because […] you know, I think what you are trying to train up is using touch in a social or communicative sense.” Finally, P11 noted that an animal form would be more comforting for children, while a humanoid form might be more appropriate for children at the high-functioning end of the spectrum as well as when a child’s treatment plan progresses.
Other participants suggested giving the robot a form factor similar to objects that specifically interested that child, such as a toy truck or a cartoon character (P9 and P10). Many participants suggested having several interchangeable options for an outside skin on the robot. Such a design would enable therapists to easily customize one robot’s appearance for several different children. A washable outer skin was also recommended to keep the robot sanitary. Several participants noted that the robot should be of a portable size (P3, P4, P6, P7, and P10), and one emphasized that the tactile sensors should be scaled to match the robot’s size (P5).
4.3 Supporting the individual and promoting independence
Therapists help children with autism learn socially appropriate behaviors, navigate meltdowns, build tolerance to uncomfortable stimuli, and eventually cope with these stimuli through self-regulation skills. Regardless of the exact details of their occupational title, their underlying goal is always the same: to support the child and to promote the child’s independence. As P1 described, “We’ll talk to the parents, we’ll talk to pertinent people in their life, to try and determine what are the skills that are lacking […] because our goal is always to have them achieve their highest functioning level, whatever that might be at that moment in time.”
Therapists help children with autism improve their communication skills by using a variety of methods such as gestures, picture cards, sign language, augmentative and alternative communication (AAC) devices, and speech modeling. Therapists also often teach children with autism proprioceptive skills, such as understanding their own body parts and the strength of their touch. Therapists improve these skills through a process called shaping.
Shaping refers to the therapist guiding the child toward an ideal behavior, often through several incremental steps of accepted behavior. For example, a therapist may encourage a child to change their mode of communication over time, gradually migrating from using a guiding touch on their caregiver’s arm, to selecting a picture card symbolizing their request, with the end goal of asking verbally, if possible. The therapist may also help shape the child’s touch behavior, guiding them to utilize socially appropriate intensity, duration, and location when touching others. The therapists may use a “token system” (P1 and P3) to help shape behavior: every time the child successfully completes a therapy task, they are awarded a token – a positive visual marker such as a sticker or a check mark – on a token board. When the child finishes the therapy activity and fills the token board, they are rewarded with one of their motivators.
Therapists utilize various methods to safely navigate a child with ASD out of a meltdown. To start with, the therapist will often reduce sensory input if possible, such as dimming the lights or moving to a quiet room. As the child may have a difficult time processing additional input during a meltdown, the therapist will limit their talking, using language that is as simple and direct as possible. They may provide the child with picture cues, so the child can express what is wrong or what they need. The therapist may provide positive distractions in the form of calming physical sensations (e.g., deep hugs/squeezing, access to toys with different sensations, using a swing, riding a wagon, or taking a walk). A therapist can also try to remind the child of the awaiting motivator to help them get through the situation.
Ultimately, the therapist aims to help the child build their independence. They pursue this goal by giving control and choices to the child where possible. They help build tolerance to unpleasant stimuli. They provide tools and teach strategies that the children can later use on their own. The goal is to help the child thrive and become as self-sufficient as possible. As articulated by P3, “You’re trying to build up that tolerance ability to the unpreferred or less preferred situation. So I think, as you know, in the education setting, we’re doing as much as we can to help them gain independence, or gain the strategies that they will need to be functional. Just that little bit at a time.” In order to reach this goal, the therapist must first build rapport, establish a relationship, and gain the child’s trust. Trust is key for building the child’s tolerance to activities or objects they dislike and for helping the child when he or she is particularly distressed.
4.3.1 Sensor and robot recommendations
The participants discussed the various roles the robot companion could play in order to support the child and promote the child’s independence. Participants seemed to see the robot serving as a teacher, a companion, a tool for regulating the child’s emotional state, or a tool for communication. These roles are not necessarily mutually exclusive: some participants thought the robot could fulfill multiple roles.
Several participants saw an opportunity for the robot to act as a teacher (P5, P7, and P10). P10 praised the idea of using a robot to reinforce concepts from educators through play, saying, “You have to give [children with autism] time to play and process if they’re behind […]. You gotta give them time to process it, integrate it, to feel comfortable, to repeat it […]. The summary is very important. So, I think tools like a robot could be that, that moment where they get to practice the goals that are coming from the educators, teachers, and therapists. So I think it’s very, very important. So they can repeat and do it in a more calm and enjoyable way.” The robot could be used to teach about body parts and promote understanding of the child’s own body (P7). The robot could also teach about socially acceptable locations to touch others. The spatial settings could be adapted and reduced over time to promote touching the robot in areas that are socially acceptable for human interactions.
Additionally, the robot could teach what kinds of gestures are socially acceptable to use when touching others. The robot could slowly shape the appropriate touch type, location, intensity, and frequency during interactions with the child. Certain reactions, such as clapping and positive verbal responses, could be used to reinforce positive behavior. Participants gave a variety of suggestions for how the robot should react to less desired touches, such as giving a firm verbal response for negative behavior, or turning off and giving no response at all. The robot’s gesture identification feature could help reward the appropriate type of touch at the start of the behavior shaping, even if the intensity is wrong (e.g., a tight hug is OK at the start, because a hug is a socially acceptable gesture). P5 suggested matching the force detected by the tactile sensors to the human perceptions of pleasure and pain. They suspected that matching the robot’s responses to expected human responses at the same force levels could help the child transfer their practice with the robot to human interaction. Figure 6 presents the touch gestures that the therapists observed in autistic children and the top five gestures that they recommended the robot detect. While most of the gestures are from the touch dictionary by Yohanan et al. [43], some participants also suggested gestures of their own: “light touch,” “squish,” “heavy catch,” and “gentle hand rest.”

A visualization of the touch gestures that specialists recommended (or did not recommend) a touch-perceiving robot companion should be able to detect from children with autism. An asterisk above a gesture indicates that it was selected as a “top five” gesture by more than three participants.
Multiple participants requested that the robot detect whether a gesture is repeatedly occurring, in order to prevent the robot from quickly repeating a response over and over. A repeated gesture could be an indicator that the child is stimming. Even if it is not a stim behavior, repeating a poke over and over, for example, is not a desirable social interaction.
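Detecting such repetition could be handled with a simple sliding-window check over recent gesture events, as in the sketch below; the window length and repetition count are illustrative assumptions rather than recommended values.

```python
# Minimal sketch (window length and count are illustrative assumptions): detecting
# when the same gesture recurs rapidly, so the robot can switch to a single measured
# response instead of replaying its reaction over and over.

from collections import deque
from typing import Deque, Tuple

REPEAT_WINDOW_S = 5.0   # assumed look-back window in seconds
REPEAT_COUNT = 3        # assumed number of occurrences that counts as "repetitive"


class RepetitionDetector:
    def __init__(self) -> None:
        self._history: Deque[Tuple[float, str]] = deque()

    def is_repetitive(self, timestamp: float, gesture: str) -> bool:
        """Return True if this gesture has occurred REPEAT_COUNT times within the window."""
        self._history.append((timestamp, gesture))
        # Drop events that fall outside the look-back window.
        while self._history and timestamp - self._history[0][0] > REPEAT_WINDOW_S:
            self._history.popleft()
        same = sum(1 for _, g in self._history if g == gesture)
        return same >= REPEAT_COUNT


detector = RepetitionDetector()
for t in (0.0, 1.2, 2.0, 2.6):          # four pokes within a few seconds
    print(t, detector.is_repetitive(t, "poke"))
# The robot could react to the first poke or two, then withhold further responses.
```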
Some participants saw the robot as a companion for the child (P3, P4, P5, P6, and P11). They felt it was well suited to act as a friend and source of comfort. P4 extended this idea to helping the child befriend other children, saying “But that would be awesome because, you know […] it’s hard for those kids to make friends, and this could be a friend, you know, an extra friend for them […] and I can see it drawing attention too, to get other children interested in communicating more with that child also.” Therapists highlighted that the robot should behave in a manner that was calm, reassuring, and gentle. In particular, the child’s initial interaction with the robot is very important to gaining trust. P3 and P4 suggested that the robot should start with very small predictable responses or no movement at all, so as not to frighten the child. P4, P5, and P11 suggested the robot build rapport by reciprocating and mirroring the child’s communicative actions, such as giving and receiving hugs, holding the child’s hand, and playfully poking back. They also suggested other actions the robot should perform, such as greeting the child and looking in the direction of interest (either at the child or where the child touched).
Participants also suggested that the robot could serve as a tool for regulating emotional state (P1, P7, P9, P10, and P11). Three participants (P7, P10, and P11) felt that the robot could calm the child down if he or she was overexcited, perhaps by giving hugs, playing calm music, or playing white noise. They also suggested having the robot play comforting custom audio messages recorded by the child’s family members. Conversely, the robot could help energize the child if they were feeling lethargic, using colorful lights (P10) and singing and dancing (P7 and P9). Two participants (P1 and P7) also suggested the robot could replicate some of the cooldown methods therapists use to navigate the child out of a meltdown.
Finally, participants saw the robot as a potential tool for communication (P6, P8, P9, and P11). The robot could encourage communication, perhaps acting as an augmentative and alternative communication (AAC) device. The child could use the robot as a safe companion for practicing communication requests. The robot could use its cameras and/or touch sensors to identify a child’s touch requests, and it could then verbalize those requests out loud (P6). The robot could also use its sensors to detect how the child is feeling and then verbalize this observation to help give the child vocabulary for what they are feeling (P11).
5 Results: qualitative and quantitative requirements
Building on the above themes, we present seven qualitative requirements and further translate them into quantitative specifications for a touch-perceiving robot.
5.1 Qualitative tactile-perception requirements
The results of the requirement-ranking task can be seen in Figure 7. Five participants explicitly stressed the importance of reviewing and carefully selecting the tactile properties of the robot companion and its tactile-sensing system, and an additional four participants separately mentioned sensitivity in autistic children toward certain tactile textures and sensations. We therefore added the feel requirement to our initial list. Based on the participants’ input, we provide a finalized version of these seven qualitative tactile-sensing requirements in Table 3. We revised the requirements’ initial descriptions from Table 1 to better reflect the recommendations gathered from participants. Additionally, the robustness requirement was renamed robustness and maintainability, and the sensitivity requirement was renamed sensing range, to better match their revised descriptions. The finalized requirements are boldfaced and italicized in Tables 3 and 4 and in the text to allow readers to distinguish them from our initial requirement list. The order of the final requirements corresponds to the median ranking across all participants. As the feel requirement was consistently requested or implied by almost all participants, we placed it third in the requirement priority order.

Figure 7: The participants’ individual responses for the requirement-ranking task. A ranking of 1 indicates the most important requirement, and 6 indicates the least important. A summary of each requirement’s rankings is also displayed at the top of each panel using a green opacity overlay: the darkness of a number’s green background indicates the number of participants who selected that rank for that requirement. White means that no participants chose that ranking. The asterisk above each overall summary indicates that requirement’s median ranking. Participant 10 completed two different versions of the ranking, one for using the robot in a school setting (P10S) and the second for using it in a home (P10H).
Table 3: Our final qualitative guidelines for a touch-sensing robot companion for children with autism, as derived from the recommendations of 11 autism specialists. Requirements are listed in descending order of importance, with 1 indicating most important.
Requirement | Qualitative description |
---|---|
1. Robustness and maintainability | The tactile-sensing system and robot should be robust and keep working properly even after rough treatment. The sensor material should be robust to vigorous mechanical interactions as well as oral exploration. The sensor attachment and wires should be robust to pulling and rubbing gestures. The sensor and/or its outer cover should be easy to wash and repair by caregivers. |
2. Sensing range | The robot should be sensitive enough to detect a wide range of contact intensities, from light touches to deep squeezes, similar to humans. |
3. Feel | The sensing system should be pleasant to touch (e.g., soft and squishy). The sensors and wires should be minimally detectable and seamless to touch. |
4. Gesture identification | The robot should be able to differentiate physical communication gestures. The five most recommended gestures to detect include hug, poke, squeeze, hold, and tickle. Gestures should be identifiable at different intensity levels, at different execution rates and speeds (e.g., to detect stimming behavior), and at different locations. |
5. Spatial | The robot should detect touch across all (or a high proportion) of its body. The robot should discriminate which body part or region was touched but does not need exact contact localization within each region. |
6. Temporal | The robot should respond to touch interaction with timing that can be customized to the child’s processing needs. The sensor’s measuring capabilities should be fast enough to capture all human contacts. |
7. Adaptation | The sensors should be easy to scale to different sizes and adapt to different curvatures present on a robot body. The entire robot should also be portable. Adapting to different robot types would be nice but not essential. The appearance of the sensing system should be customizable to the child’s preference. |
Table 4: The minimum quantitative specifications we propose for a touch-perceiving companion robot for children with autism. These quantitative specifications were translated from our final qualitative requirements (Table 3).
Requirement | Quantitative specifications |
---|---|
1. Robustness and maintainability | ▪ Minimum duration of consistent operation: ≥ 45 min |
 | ▪ Minimum force to withstand without malfunction: ≥ 30 N |
 | ▪ Compliant with all relevant safety standards in the country of use |
2. Sensing range | ▪ Minimum detectable force: ≤ 0.4 N |
 | ▪ Maximum detectable force: ≥ 25 N |
 | ▪ Minimum signal-to-noise ratio (SNR): ≥ 3.3 |
3. Feel | ▪ We did not find any references that provide quantitative specifications for tactile pleasantness |
4. Gesture identification | ▪ Gesture recognition: ≤ 5% confusion between socially appropriate and socially inappropriate gestures |
 | ▪ Intensity perception: able to detect each gesture at ≥ 2 intensity levels |
 | ▪ Spatial consistency: ≤ 5% change in recognized gesture for application of the same gesture in different locations on the same sensor |
 | ▪ Temporal consistency: ≤ 5% change in recognized gesture for repeated application of the same gesture over time |
5. Spatial | ▪ Surface area capable of touch detection: ≥ 80% |
 | ▪ Number of contact-sensing regions: ≥ 14 different areas distributed across the robot’s body, without needing to localize contact within each area |
6. Temporal | ▪ Minimum cutoff frequency for detecting dynamic contacts: ≥ 20 Hz |
 | ▪ Maximum delay between tactile interaction and recognition: ≤ 155 ms |
7. Adaptation | ▪ Minimum sensor size: ≤ 9 cm² |
 | ▪ Maximum sensor size: ≥ 100 cm² |
 | ▪ Maximum convex curvature: ≥ 0.4 cm⁻¹ |
 | ▪ Maximum concave curvature: ≥ 0.05 cm⁻¹ |
5.2 Quantitative specifications
Next, we returned to the literature and to established methods for implementing touch perception in order to translate our qualitative requirements from Table 3 into the quantitative specifications shown in Table 4. Rather than translating the qualitative requirements into specifications that would work for only one particular tactile-sensing technology, we have attempted to provide quantitative specifications that are as “approach agnostic” as possible. It is our hope that engineers developing many different sensing technologies and processing methods can use these specifications to create successful robot companions. Below, we describe the rationale and terminology associated with each requirement in further detail.
In terms of robustness and maintainability, we propose that the tactile-sensing system remain fully functional for at least the total duration of a child’s therapy session. This specification ensures that the robot does not stop working during a session, which could distress the child and discourage both the child and the therapist from using the robot in the future. The typical duration of an autism therapy session is between 30 and 45 min, with some sessions lasting even longer [46]. The sensing system and the robot itself must also withstand the maximum pressing force of a human finger (30 N) without breaking [47], and they should continue to operate normally after being subjected to such treatment. Lastly, all devices used with children must follow applicable local safety standards.
The sensing range requirement indicates that the robot should be sensitive enough to detect a wide range of contact intensities, from light touches to deep squeezes. We propose a minimum detectable force based on the force reported to deliver a light, pleasant touch in affective touch studies, 0.4 N [48]. For the high end of the force-sensing range, we suggest that the sensor be able to detect at least the force at which pressure on the human arm becomes painful, 25 N [49]. These values help align the robot’s force-sensing capabilities with human perception, which can in turn help a child learn socially acceptable touch through interaction with the robot. The standard minimum acceptable signal-to-noise ratio (SNR) in sensor development is 3.3, following the limit-of-detection convention [50]; we thus propose this value as a reasonable guideline for ensuring good performance across the sensing range.
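To make this criterion concrete, the following minimal Python sketch (our own illustration with hypothetical function and variable names, not a validated test procedure) shows how one could check a candidate sensor against the sensing-range specification, taking the SNR as the signal produced by the lightest target force divided by the standard deviation of the no-contact noise.

```python
import numpy as np

def meets_sensing_range_spec(light_touch_readings, baseline_readings,
                             calibrated_min_force=0.4, calibrated_max_force=25.0,
                             snr_threshold=3.3):
    """Illustrative check of a tactile sensor against the sensing-range spec.

    light_touch_readings: repeated sensor outputs while a 0.4 N reference force is applied.
    baseline_readings: repeated sensor outputs recorded with no contact.
    calibrated_min_force / calibrated_max_force: the force range (in N) over which
        the sensor has been calibrated to produce usable output.
    """
    # Signal amplitude relative to the no-contact baseline.
    signal = np.mean(light_touch_readings) - np.mean(baseline_readings)
    # Noise estimated as the standard deviation of the no-contact readings.
    noise = np.std(baseline_readings, ddof=1)
    snr = signal / noise if noise > 0 else float("inf")

    # The calibrated range must span 0.4 N (light touch) to 25 N (painful squeeze).
    range_ok = calibrated_min_force <= 0.4 and calibrated_max_force >= 25.0
    return snr >= snr_threshold and range_ok
```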
The sensing system should enable reliable gesture identification despite different intensity levels, execution rates, and locations. As such, we propose that there be no more than a 5% change in gesture classification when the same gesture is applied repeatedly over time or at a different position on the same sensor. Furthermore, many participants expressed interest in using the robot to teach socially appropriate and inappropriate behaviors. It is thus crucial that the robot be able to differentiate appropriate gesture types (e.g., hug and stroke) from inappropriate ones (e.g., pinch and slap). Therefore, we recommend a confusion rate of at most 5% when discriminating between these two categories of gestures; confusion rates between gestures within the same category may be higher. For all three of these quantitative specifications, we recommend the value of 5% to match the accepted level of error that scientists most commonly use when performing statistical analyses (i.e., p ≤ 0.05 in significance testing).
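As an illustration of how this specification could be evaluated, the short Python sketch below (our own example; the gesture category sets are assumptions based only on the examples named above, and a real system would use a full validated taxonomy) computes the cross-category confusion rate from paired true and predicted gesture labels.

```python
# Illustrative category sets based on the examples mentioned in the text.
APPROPRIATE = {"hug", "stroke", "hold", "tickle"}
INAPPROPRIATE = {"pinch", "slap"}

def social_category(gesture):
    """Map a gesture label to its social category (assumed mapping)."""
    return "appropriate" if gesture in APPROPRIATE else "inappropriate"

def cross_category_confusion_rate(true_labels, predicted_labels):
    """Fraction of samples whose predicted gesture lands in the wrong social category."""
    crossed = sum(social_category(t) != social_category(p)
                  for t, p in zip(true_labels, predicted_labels))
    return crossed / len(true_labels)

# Proposed specification: cross_category_confusion_rate(y_true, y_pred) <= 0.05
```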
For the spatial requirement, we based our quantitative specifications on the participants’ requests during the interviews: tactile-sensing coverage of more than 80% of the robot’s body and 14 distinct body regions. The robot’s arms, head, and hands were the most requested body regions (Figure 5).
According to the temporal requirement, the robot should respond to touch interaction with timing that can be customized to the child’s processing needs. To accommodate variation across the autism spectrum, we recommend that the tactile-sensing system’s bandwidth (low-pass cutoff frequency) be at least twice the fastest human movement frequency of 10 Hz [51,52,53], so that it adequately captures even energetic or violent touch actions that a child performs. Since textural contact generates large high-frequency vibrations [54], even higher bandwidth may benefit perception of gestures that involve motion across the robot’s surfaces, such as stroking or tickling. While the participants stressed the importance of providing a customizable time delay, some children with autism may be capable of interacting at the same rate as neurotypical individuals. In this case, the robot should detect the child’s touch with near-immediate recognition. We propose that the recognition delay match the average human touch reaction time of approximately 155 ms, to promote the child’s transfer of skills from robot interaction to human interaction [55].
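The rough Python check below (our own sketch under the assumptions stated in its comments, not a validated tool) summarizes how these numbers interact: the anti-aliasing cutoff should be at least 20 Hz, the sampling rate at least twice that cutoff by the Nyquist criterion, and the default recognition latency should stay within the 155 ms budget before any customized delay is added.

```python
def meets_temporal_spec(sample_rate_hz, lowpass_cutoff_hz,
                        sensing_delay_ms, processing_delay_ms):
    """Rough check of the proposed temporal specification.

    Assumption: any customizable response delay is applied after gesture
    recognition, so only sensing and processing delays count against the
    155 ms recognition budget.
    """
    # Cutoff of at least 20 Hz (twice the ~10 Hz fastest human movement frequency),
    # and a sampling rate of at least twice the cutoff (Nyquist criterion).
    bandwidth_ok = lowpass_cutoff_hz >= 20.0 and sample_rate_hz >= 2.0 * lowpass_cutoff_hz
    # Recognition should finish within the average human touch reaction time.
    latency_ok = (sensing_delay_ms + processing_delay_ms) <= 155.0
    return bandwidth_ok and latency_ok
```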
For the feel requirement, several studies describe tactile pleasantness in qualitative terms and give examples of surfaces that people commonly perceive as pleasant to touch [2,48,56,57]; however, we were unable to locate any work that provides quantitative specifications for tactile pleasantness.
The adaptation requirement states that the tactile-sensing system should be scalable for robots of different shapes and sizes. We translated this qualitative goal into quantitative specifications by referencing the measurements of the NAO, a child-sized robot by SoftBank Robotics that has been used in autism research [27,58,59] and was shown to participants during the study. We estimate that a minimum sensor size of 9 cm² (3 cm by 3 cm) is small enough to cover the smallest body regions requested by several participants, such as the hands of the robot. Beyond a maximum sensor size of 100 cm² (10 cm by 10 cm, approximating a robot’s belly or upper back), researchers may find that they lose the ability to discriminate between robot body regions, which was heavily prioritized by the participants. The most highly curved robot body parts that several participants requested are the arms; each arm of the NAO has an approximate radius of 2.5 cm, which corresponds to a maximum convex curvature of 0.4 cm⁻¹. Many robot surfaces have lower curvature than the arms, including flat surfaces with zero curvature. Some robot surfaces that need tactile sensing may also be gently concave. The most highly curved concave surfaces on the NAO robot are at the front of each leg, below the knee; these surfaces have an approximate radius of 20 cm, which corresponds to a maximum concave curvature of 0.05 cm⁻¹.
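For readers adapting these numbers to robots other than the NAO, the curvature values above are simply the reciprocals of the surface radii; the brief sketch below (our own illustration, using the approximate NAO dimensions cited in the text) shows the conversion.

```python
def curvature_from_radius(radius_cm):
    """Curvature of a cylindrical surface (in cm^-1) is the reciprocal of its radius (in cm)."""
    return 1.0 / radius_cm

# Approximate NAO dimensions used in the text:
arm_convex_curvature = curvature_from_radius(2.5)     # = 0.4 cm^-1 (convex spec)
shin_concave_curvature = curvature_from_radius(20.0)  # = 0.05 cm^-1 (concave spec)
```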
6 Discussion
This article has presented guidelines for tactile perception in a robot companion for children with autism, encompassing both the qualitative requirements and the quantitative specifications that emerged from our investigation. This section reflects on the scientific methods we employed and discusses how our results contribute to the fields of SAR and robot-mediated autism intervention. With our key tactile-perception guidelines, we attempt to form a bridge between tactile sensor developers, HRI researchers, and the target populations: children with autism, their families, and their therapist teams. Although many socially assistive robots currently exist to help children with autism, they rarely incorporate touch perception and thus typically cannot feel or react to the contacts that children apply. The added feature of rich touch sensing, designed specifically with autistic children in mind, could make these interactions even more meaningful.
6.1 Reflecting on our methods
We explored the existing literature to develop an initial set of six tactile-sensing requirements for a robot companion for children with autism (Table 1). Interviewing 11 autism specialists enabled us to verify the importance of these requirements and expand them into a set of seven richer and more complex descriptions than could be derived from the literature alone (Table 3). The autism specialists were eager to work with us to design a robot specifically catered to the needs of autistic children. Participants actively encouraged us to take into account the tactile properties of the robot (new feel requirement), which will heavily influence the child’s interest in interacting with it, no matter whether they are touch seekers or touch avoiders. The participants alerted us if a requirement needed a shift in thinking, such as changing the approach for the temporal requirement to feature a customizable reaction delay. They also told us if they considered a requirement to be only a low priority. This open dialogue allowed us to form a ranked list of requirements that are focused on the therapeutic needs and capabilities of children with autism; however, as in any human-subject study, our chosen sampling method, interview format, and materials contributed to and influenced our results. Here, we reflect on the rationale for and the limitations caused by these choices.
Our participant pool consisted of 11 specialists from the United States and Canada. As children across the autism spectrum behave differently and have different needs, one could argue that a larger pool of specialists could have been interviewed. However, as each participant adds about 1 hour of interview footage and several hours of transcription and data analysis, large-scale recruitment in a study of this format is not feasible. More importantly, we found that we reached saturation in the input from these 11 specialists, most likely because most of them have interacted with dozens or hundreds of children with autism over their careers. Therefore, we believe our results effectively reflect autism care in the United States and Canada, where similar therapy methods are practiced [60]. Future work could verify and revise our recommendations for autism care practices in other countries.
The remote format of the interviews allowed us to collect data from a variety of geographically distributed autism specialists, but it also prevented participants from physically interacting with the study materials. If the interviews had been conducted in person, we could have asked the specialists to demonstrate preferred gestures and locations directly on a robot or a sensor. Furthermore, the participants could have felt the presented robot and sensor prototype. However, given that recruiting specialists is challenging, we opted for the remote format to be able to recruit widely. In retrospect, we believe that maintaining some distance from the specific robot and sensor we showed may have enabled the participants to think more freely about the technology when answering our questions.
Seeing the example robots and tactile sensor seemed to help the participants better understand our research goals. On the other hand, showing only one robot model and one sensor design may have limited what the participants perceived to be their options and thereby influenced their responses. However, we did alter NAO’s appearance to present both a humanoid and an animal form. Furthermore, when asked to describe an ideal appearance for the robot companion, participants gave a variety of unique answers beyond NAO’s traditional humanoid form, suggesting additional forms such as cartoon characters, stuffed toys, trucks, and other objects unique to the child’s specific interests. Participants also answered all of the questions regarding their recommendations for the robot’s tactile perception before seeing the prototype tactile sensor.
Presenting the therapists with an initial set of touch-sensing requirements supported their thought process, but it did not seem to bias their responses. For example, the adaptation requirement was not perceived as important by the majority of the participants, and they thus suggested significant revisions to its qualitative definition. New requirements about the sensor’s robustness and maintainability and feel were also suggested during many interviews.
We used the final list of seven qualitative requirements to propose corresponding quantitative specifications, but further studies would be needed to validate the proposed specifications. We hope that our methods, findings, and recommendations can guide the design of such studies in the future.
6.2 Implications for future HRI research
The necessity of touch perception depends on the robot’s role. Therapist responses in our study suggest that all four recommended roles (teacher, companion, emotion regulation, and communication) would benefit from tactile interaction. However, the benefits of adding touch sensing may vary widely. For some roles, such as teaching math skills, this capability may fall into the category of “nice to have” or “not necessary.” For others, such as emotion regulation and social companionship, touch sensing brings additional information that can help the robot better judge the child’s needs and respond to them more appropriately (e.g., reciprocating a hug). Finally, for teaching acceptable touch interactions and facilitating nonverbal communication, touch sensing is a crucial requirement that should take primary focus in child–robot interaction design.
Our touch-sensing guidelines span the hardware, software, and user-interaction components of a robotic system that must be tightly integrated. For example, the sensing range and spatial requirements reference hardware specifications, while other recommendations, such as having a customizable temporal delay for robot response, lean more on software and interaction-design approaches. Building a touch-perceiving robot for autism intervention poses different challenges for the sensing, computing, and HRI research communities. Here we present the main challenges that we foresee in these areas, and we suggest future work to address them.
The biggest challenge for the sensing hardware is ensuring robust and reliable measurements in dynamic, uncontrolled environments, such as home or school settings. High-quality force sensors are typically rigid and fragile. In contrast, recently developed stretchable fabric-based sensors (e.g., ref. [61,62,63,64,65]) can provide a promising solution, as they tend to be robust to high-force contacts and impacts and can cover nonplanar surfaces. However, their soft materials often result in nonlinear sensing performance, which may make gesture recognition more difficult. HRI researchers should stress-test soft sensors with the touch gestures that are common among children on the autism spectrum to identify their potential failure points before field deployment. If using an existing commercial social robot, researchers should also test whether the robot demonstrates suitable robustness and maintainability to withstand being touched by energetic users. Unfortunately, most existing commercial robots do not have enough tactile-sensing capabilities onboard to meet any of our quantitative specifications. However, one does not necessarily need to design and build a whole new robot. An external tactile-sensing system could be developed, mechanically fitted to the robot, and also intelligently integrated into the robot’s existing processing technology.
Importantly, the tactile-perception guidelines we propose here are in reference to passive touch, meaning the robot is being touched by the child. Defining the requirements for active touch, where the robot touches the child, is beyond the scope of this study and would require a future study that considers robot mechanics, motion, and control alongside sensing capabilities.
On the software level, identifying touch gestures applied by children with autism is not trivial. The robot will need to process simultaneous tactile sensor inputs from regions all across its body. Some of these tactile sensors will be activated by the robot’s own movement or pose, rather than the child’s contact; developing strategies to screen out these self-caused tactile sensations is still a nascent research topic. Also, large individual differences among children with ASD (e.g., physical abilities, communication abilities, and intent) further add to the challenge of recognizing diverse gestures as they are applied in a realistic setting. Here, HRI research can build on the ongoing work in the artificial intelligence and personalization research communities. The solutions can range from designing user interfaces for manual calibration and customization of the robot by caregivers to developing algorithms that learn and adapt to a child’s actions over time.
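One possible starting point, sketched below in Python purely as an illustration (the class, its pose keys, and the fixed detection margin are our own assumptions rather than an established method), is to record the tactile baseline that the robot’s own body and pose produce and then treat only readings that exceed that baseline by a margin as external, child-applied contact.

```python
import numpy as np

class SelfTouchFilter:
    """Illustrative self-contact screening: store a per-pose tactile baseline
    recorded while no one touches the robot, then flag taxels whose readings
    exceed that baseline by a margin as likely external (child-applied) contact."""

    def __init__(self, margin=0.1):
        self.baselines = {}   # pose_key -> per-taxel baseline vector
        self.margin = margin  # assumed detection margin in calibrated sensor units

    def calibrate(self, pose_key, taxel_samples):
        """Record the self-induced baseline for one robot pose (no external touch)."""
        self.baselines[pose_key] = np.mean(np.asarray(taxel_samples), axis=0)

    def external_contact(self, pose_key, taxel_reading):
        """Return a boolean mask of taxels likely activated by external touch."""
        baseline = self.baselines.get(pose_key, 0.0)
        return np.asarray(taxel_reading) - baseline > self.margin
```

More sophisticated variants could interpolate baselines between poses or learn them online, but even this simple scheme illustrates why the robot’s own pose information must be integrated with its tactile pipeline.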
Further exploration is needed to collect quantitative tactile interaction information between robots and children with autism. We were unable to identify any literature that provided quantified measurements of physical contact between these interaction partners, so it seems that new physical experiments are needed. Touch locations and touch gesture data could be further characterized through an observation study with autistic children and a large online survey with their caregivers.
Conducting this study has convinced us that touch is a worthwhile sense to pursue when creating a robot companion for children with autism. Although such an endeavor poses major engineering challenges, we believe that a touch-perceiving robot that follows the qualitative requirements and quantitative specifications we have formulated will be able to engage in meaningful passive-touch interactions with children with ASD. In turn, such progress will hopefully increase the level of education and companionship that socially assistive robots are able to provide for this population and more broadly.
Acknowledgments
The authors thank the International Max Planck Research School for Intelligent Systems (IMPRS-IS) for supporting Rachael Burns and the Natural Sciences and Engineering Research Council (NSERC) of Canada for providing funding for Hasti Seifi. The authors also thank all the participants who contributed their insights and time to this study and the reviewers who provided constructive comments on this manuscript. Finally, they thank Joey Burns for his technical support with WebEx and the study setup.
References
[1] J. Ashburner, J. Ziviani, and S. Rodger, “Sensory processing and classroom emotional, behavioral, and educational outcomes in children with autism spectrum disorder,” Amer. J. Occup. Ther., vol. 62, no. 5, pp. 564–573, 2008. doi: 10.5014/ajot.62.5.564.
[2] C. J. Cascio, E. J. Moana-Filho, S. Guest, M. B. Nebel, J. Weisner, G. T. Baranek, and G. K. Essick, “Perceptual and neural response to affective tactile texture stimulation in adults with autism spectrum disorders,” Autism Res., vol. 5, no. 4, pp. 231–244, 2012. doi: 10.1002/aur.1224.
[3] E. J. Marco, L. B. Hinkley, S. S. Hill, and S. S. Nagarajan, “Sensory processing in autism: A review of neurophysiologic findings,” Pediatric Res., vol. 69, no. 8, pp. 48–54, 2011. doi: 10.1203/PDR.0b013e3182130c54.
[4] L. Bestbier and T. I. Williams, “The immediate effects of deep pressure on young people with autism and severe intellectual difficulties: Demonstrating individual differences,” Occup. Ther. Int., vol. 2017, 2017. doi: 10.1155/2017/7534972.
[5] Centers for Disease Control and Prevention, “Data & statistics on autism spectrum disorder,” 2020. https://www.cdc.gov/ncbddd/autism/data.html.
[6] M. Begum, R. W. Serna, and H. A. Yanco, “Are robots ready to deliver autism interventions? A comprehensive review,” Int. J. Soc. Robot., vol. 8, no. 2, pp. 157–181, 2016. doi: 10.1007/s12369-016-0346-y.
[7] J.-J. Cabibihan, H. Javed, M. Ang, and S. M. Aljunied, “Why robots? A survey on the roles and benefits of social robots in the therapy of children with autism,” Int. J. Soc. Robot., vol. 5, no. 4, pp. 593–618, 2013. doi: 10.1007/s12369-013-0202-2.
[8] M. Losinski, S. A. Sanders, and N. M. Wiseman, “Examining the use of deep touch pressure to improve the educational performance of students with disabilities: A meta-analysis,” Res. Pract. Pers. Severe Disabil., vol. 41, no. 1, pp. 3–18, 2016. doi: 10.1177/1540796915624889.
[9] M. E. O’Haire, “Animal-assisted intervention for autism spectrum disorder: A systematic literature review,” J. Autism Dev. Disord., vol. 43, no. 7, pp. 1606–1622, 2013. doi: 10.1007/s10803-012-1707-5.
[10] V. Braun and V. Clarke, “Thematic analysis,” in APA Handbook of Research Methods in Psychology, Vol. 2: Research Designs: Quantitative, Qualitative, Neuropsychological, and Biological, Washington, DC, USA: American Psychological Association, 2012, pp. 57–71. doi: 10.1037/13620-004.
[11] C. J. Cascio, D. Moore, and F. McGlone, “Social touch and human development,” Dev. Cogn. Neurosci., vol. 35, pp. 5–11, 2019. doi: 10.1016/j.dcn.2018.04.009.
[12] K. J. Kuchenbecker, “Haptics and haptic interfaces,” in Encyclopedia of Robotics, Berlin, Germany: Springer, 2018. doi: 10.1007/978-3-642-41610-1_19-1.
[13] J. Neal, L. Bigby, and R. Nicholson, “Occupational therapy, physical therapy, and orientation and mobility services in public schools,” Interv. Sch. Clin., vol. 39, no. 4, pp. 218–222, 2004. doi: 10.1177/10534512040390040301.
[14] L. Peranich, K. B. Reynolds, S. O’Brien, J. Bosch, and T. Cranfill, “The roles of occupational therapy, physical therapy, and speech/language pathology in primary care,” J. Nurse Practitioners, vol. 6, no. 1, pp. 36–43, 2010. doi: 10.1016/j.nurpra.2009.08.021.
[15] Legal Information Institute, “20 U.S. Code § 1401. Definitions,” https://www.law.cornell.edu/uscode/text/20/1401.
[16] M. E. O’Haire, S. J. McKenzie, S. McCune, and V. Slaughter, “Effects of classroom animal-assisted activities on social functioning in children with autism spectrum disorder,” J. Alt. Complementary Med., vol. 20, no. 3, pp. 162–168, 2014. doi: 10.1089/acm.2013.0165.
[17] S. C. Mey, “Animal assisted therapy for children with autism,” Int. J. Child. Dev. Ment. Health., vol. 5, no. 1, pp. 29–42, 2017.
[18] M. M. Bass, C. A. Duchowny, and M. M. Llabre, “The effect of therapeutic horseback riding on social functioning in children with autism,” J. Autism Dev. Disord., vol. 39, no. 9, pp. 1261–1267, 2009. doi: 10.1007/s10803-009-0734-3.
[19] D. Feil-Seifer and M. J. Mataric, “Defining socially assistive robotics,” in Proc. IEEE Int. Conf. Rehabil. Robot. (ICORR), 2005, pp. 465–468.
[20] T. Shibata and K. Wada, “Robot therapy: A new approach for mental healthcare of the elderly – a mini-review,” Gerontology, vol. 57, no. 4, pp. 378–386, 2011. doi: 10.1159/000319015.
[21] S. Jeong, K. D. Santos, S. Graca, B. O’Connell, L. Anderson, N. Stenquist, et al., “Designing a socially assistive robot for pediatric care,” in Proc. Int. Conf. Interact. Design and Children, 2015, pp. 387–390. doi: 10.1145/2771839.2771923.
[22] Y. S. Sefidgar, K. E. MacLean, S. Yohanan, H. M. Van der Loos, E. A. Croft, and E. J. Garland, “Design and evaluation of a touch-centered calming interaction with a social robot,” IEEE Trans. Affect. Comput., vol. 7, no. 2, pp. 108–121, 2015. doi: 10.1109/TAFFC.2015.2457893.
[23] C. L. Bethel, Z. Henkel, S. Darrow, and K. Baugus, “Therabot – an adaptive therapeutic support robot,” in Proc. IEEE World Symp. Digit. Intell. Syst. and Mach. (DISA), 2018, pp. 23–30. doi: 10.1109/DISA.2018.8490642.
[24] A. Peca, R. Simut, S. Pintea, C. Costescu, and B. Vanderborght, “How do typically developing children and children with autism perceive different social robots?” Comput. Hum. Behav., vol. 41, pp. 268–277, 2014. doi: 10.1016/j.chb.2014.09.035.
[25] B. Robins, N. Otero, E. Ferrari, and K. Dautenhahn, “Eliciting requirements for a robotic toy for children with autism – results from user panels,” in Proc. IEEE Int. Symp. Robot and Human Interact. Commun. (RO-MAN), 2007, pp. 101–106. doi: 10.1109/ROMAN.2007.4415061.
[26] E. Ferrari, B. Robins, and K. Dautenhahn, “Therapeutic and educational objectives in robot assisted play for children with autism,” in Proc. IEEE Int. Symp. Robot and Human Interact. Commun. (RO-MAN), 2009, pp. 108–114. doi: 10.1109/ROMAN.2009.5326251.
[27] R. Suzuki and J. Lee, “Robot-play therapy for improving prosocial behaviours in children with autism spectrum disorders,” in Proc. IEEE Int. Symp. Micro-NanoMechatron. and Human Sci. (MHS), 2016, pp. 1–5. doi: 10.1109/MHS.2016.7824238.
[28] E. S. Kim, L. D. Berkovits, E. P. Bernier, D. Leyzberg, F. Shic, R. Paul, and B. Scassellati, “Social robots as embedded reinforcers of social behavior in children with autism,” J. Autism Dev. Disord., vol. 43, no. 5, pp. 1038–1049, 2013. doi: 10.1007/s10803-012-1645-2.
[29] A. Duquette, F. Michaud, and H. Mercier, “Exploring the use of a mobile robot as an imitation agent with children with low-functioning autism,” Auton. Robot., vol. 24, no. 2, pp. 147–157, 2008. doi: 10.1007/s10514-007-9056-5.
[30] H. Javed and C. H. Park, “Interactions with an empathetic agent: Regulating emotions and improving engagement in autism,” IEEE Robot. Autom. Mag., vol. 26, no. 2, pp. 40–48, 2019. doi: 10.1109/MRA.2019.2904638.
[31] C. M. Stanton, P. H. Kahn, R. L. Severson, J. H. Ruckert, and B. T. Gill, “Robotic animals might aid in the social development of children with autism,” in Proc. ACM/IEEE Int. Conf. Human-Robot Interact. (HRI), 2008, pp. 271–278. doi: 10.1145/1349822.1349858.
[32] B. Scassellati, L. Boccanfuso, C.-M. Huang, M. Mademtzi, M. Qin, N. Salomons, et al., “Improving social skills in children with ASD using a long-term, in-home social robot,” Sci. Robot., vol. 3, no. 21, art. eaat7544, 2018. doi: 10.1126/scirobotics.aat7544.
[33] R. Pakkar, C. Clabaugh, R. Lee, E. Deng, and M. J. Matarić, “Designing a socially assistive robot for long-term in-home use for children with autism spectrum disorders,” in Proc. IEEE Int. Symp. Robot and Human Interact. Commun. (RO-MAN), 2019, pp. 1–7. doi: 10.1109/RO-MAN46459.2019.8956468.
[34] J. M. K. Westlund, H. W. Park, R. Williams, and C. Breazeal, “Measuring young children’s long-term relationships with social robots,” in Proc. ACM Conf. Interact. Design and Children (IDC), 2018, pp. 207–218. doi: 10.1145/3202185.3202732.
[35] H. Javed, R. Burns, M. Jeon, A. M. Howard, and C. H. Park, “A robotic framework to facilitate sensory experiences for children with autism spectrum disorder: A preliminary study,” ACM Trans. Human-Robot Interact. (THRI), vol. 9, no. 1, pp. 1–26, 2019. doi: 10.1145/3359613.
[36] S. Shamsuddin, H. Yussof, L. Ismail, F. A. Hanapiah, S. Mohamed, H. A. Piah, and N. I. Zahari, “Initial response of autistic children in human-robot interaction therapy with humanoid robot NAO,” in Proc. IEEE Int. Colloq. Signal Process. and its Applications, 2012, pp. 188–193. doi: 10.1109/CSPA.2012.6194716.
[37] B. Robins and K. Dautenhahn, “Tactile interactions with a humanoid robot: Novel play scenario implementations with children with autism,” Int. J. Soc. Robot., vol. 6, no. 3, pp. 397–415, 2014. doi: 10.1007/s12369-014-0228-0.
[38] J. Chang, K. MacLean, and S. Yohanan, “Gesture recognition in the haptic creature,” in Haptics: Generating and Perceiving Tangible Sensations, Proc. EuroHaptics, Part I, Lecture Notes in Comp. Sci., Berlin, Germany: Springer, 2010, pp. 385–391. doi: 10.1007/978-3-642-14064-8_56.
[39] X. L. Cang, P. Bucci, A. Strang, J. Allen, K. MacLean, and H. S. Liu, “Different strokes and different folks: Economical dynamic surface sensing and affect-related touch recognition,” in Proc. ACM Int. Conf. Multimodal Interact., 2015, pp. 147–154. doi: 10.1145/2818346.2820756.
[40] J. L. Krichmar and T.-S. Chou, “A tactile robot for developmental disorder therapy,” in TechMindSociety '18: Proc. Tech., Mind, and Soc., New York, NY, USA: ACM, 2018, pp. 1–6. doi: 10.1145/3183654.3183657.
[41] H. Kozima, M. P. Michalowski, and C. Nakagawa, “Keepon,” Int. J. Soc. Robot., vol. 1, no. 1, pp. 3–18, 2009. doi: 10.1007/s12369-008-0009-8.
[42] M. Mastrogiuseppe, O. Capirci, S. Cuva, and P. Venuti, “Gestural communication in children with autism spectrum disorders during mother–child interaction,” Autism, vol. 19, no. 4, pp. 469–481, 2015. doi: 10.1177/1362361314528390.
[43] S. Yohanan and K. E. MacLean, “The role of affective touch in human-robot interaction: Human intent and expectations in touching the haptic creature,” Int. J. Soc. Robot., vol. 4, no. 2, pp. 163–180, 2012. doi: 10.1007/s12369-011-0126-7.
[44] B. Robins, F. Amirabdollahian, Z. Ji, and K. Dautenhahn, “Tactile interaction with a humanoid robot for children with autism: A case study analysis involving user requirements and results of an initial implementation,” in Proc. IEEE Int. Symp. Robot and Human Interact. Commun. (RO-MAN), 2010, pp. 704–711. doi: 10.1109/ROMAN.2010.5598641.
[45] Cisco WebEx, “Four key security features of Cisco web conferencing,” 2019. https://blog.webex.com/video-conferencing/four-key-security-features-of-cisco-webex/.
[46] R. Watling, J. Deitz, E. M. Kanny, and J. F. McLaughlin, “Current practice of occupational therapy for children with autism,” Amer. J. Occup. Ther., vol. 53, no. 5, pp. 498–505, 1999. doi: 10.5014/ajot.53.5.498.
[47] N. Miyata, K. Yamaguchi, and Y. Maeda, “Measuring and modeling active maximum fingertip forces of a human index finger,” in Proc. IEEE/RSJ Int. Conf. Intell. Robots and Syst. (IROS), 2007, pp. 2156–2161. doi: 10.1109/IROS.2007.4399243.
[48] P. Taneja, H. Olausson, M. Trulsson, P. Svensson, and L. Baad-Hansen, “Defining pleasant touch stimuli: A systematic review and meta-analysis,” Psychol. Res., pp. 1–16, 2019. doi: 10.1007/s00426-019-01253-8.
[49] M. Melia, M. Schmidt, B. Geissler, J. König, U. Krahn, H. J. Ottersbach, et al., “Measuring mechanical pain: The refinement and standardization of pressure pain threshold measurements,” Behav. Res. Methods, vol. 47, no. 1, pp. 216–227, 2015. doi: 10.3758/s13428-014-0453-3.
[50] D. A. Armbruster and T. Pry, “Limit of blank, limit of detection and limit of quantitation,” Clin. Biochem. Rev., vol. 29, no. Suppl 1, pp. S49–S52, 2008.
[51] P. D. Neilson, “Speed of response or bandwidth of voluntary system controlling elbow position in intact man,” Med. Biol. Eng., vol. 10, no. 4, pp. 450–459, 1972. doi: 10.1007/BF02474193.
[52] P. Fischer, R. Daniel, and K. Siva, “Specification and design of input devices for teleoperation,” in Proc. IEEE Int. Conf. Robot. and Autom. (ICRA), 1990, pp. 540–545. doi: 10.1109/ROBOT.1990.126036.
[53] E. Foxlin, “Motion tracking requirements and technologies,” Handb. Virtual Env. Tech., vol. 8, pp. 163–210, 2002.
[54] S. Choi and K. J. Kuchenbecker, “Vibrotactile display: Perception, technology, and applications,” Proc. IEEE, vol. 101, pp. 2093–2104, 2013. doi: 10.1109/JPROC.2012.2221071.
[55] R. Milo, P. Jorgensen, U. Moran, G. Weber, and M. Springer, “BioNumbers – the database of key numbers in molecular and cell biology,” Nucleic Acids Res., vol. 38, no. suppl_1, pp. D750–D753, 2010. doi: 10.1093/nar/gkp889.
[56] C. J. Cascio, J. Lorenzi, and G. T. Baranek, “Self-reported pleasantness ratings and examiner-coded defensiveness in response to touch in children with ASD: Effects of stimulus material and bodily location,” J. Autism Dev. Disord., vol. 46, no. 5, pp. 1528–1537, 2016. doi: 10.1007/s10803-013-1961-1.
[57] R. Etzi, C. Spence, and A. Gallace, “Textures that we like to touch: An experimental study of aesthetic preferences for tactile stimuli,” Conscious. Cogn., vol. 29, pp. 178–188, 2014. doi: 10.1016/j.concog.2014.08.011.
[58] A. Tapus, A. Peca, A. Aly, C. Pop, L. Jisa, S. Pintea, et al., “Children with autism social engagement in interaction with NAO, an imitative robot: A series of single case experiments,” Interact. Stud., vol. 13, no. 3, pp. 315–347, 2012. doi: 10.1075/is.13.3.01tap.
[59] J. Greczek, E. Kaszubski, A. Atrash, and M. Matarić, “Graded cueing feedback in robot-mediated imitation practice for children with autism spectrum disorders,” in Proc. IEEE Int. Symp. Robot and Human Interact. Commun. (RO-MAN), 2014, pp. 561–566. doi: 10.1109/ROMAN.2014.6926312.
[60] M. Keenan, K. Dillenburger, H. R. Röttgers, K. Dounavi, S. L. Jónsdóttir, P. Moderato, et al., “Autism and ABA: The gulf between North America and Europe,” Rev. J. Autism Dev. Disord., vol. 2, no. 2, pp. 167–183, 2015. doi: 10.1007/s40489-014-0045-2.
[61] G. H. Büscher, R. Kõiva, C. Schürmann, R. Haschke, and H. J. Ritter, “Flexible and stretchable fabric-based tactile sensor,” Robot. Auton. Syst., vol. 63, pp. 244–252, 2015. doi: 10.1016/j.robot.2014.09.007.
[62] H. Lee, K. Park, J. Kim, and K. J. Kuchenbecker, “Internal array electrodes improve the spatial resolution of soft tactile sensors based on electrical resistance tomography,” in Proc. IEEE Int. Conf. Robot. and Automat. (ICRA), Montreal, Canada, 2019, pp. 5411–5417. doi: 10.1109/ICRA.2019.8794276.
[63] T. Yoshikai, H. Fukushima, M. Hayashi, and M. Inaba, “Development of soft stretchable knit sensor for humanoids’ whole-body tactile sensibility,” in Proc. IEEE-RAS Int. Conf. Humanoid Robots, 2009, pp. 624–631. doi: 10.1109/ICHR.2009.5379556.
[64] S. Pyo, J. Lee, W. Kim, E. Jo, and J. Kim, “Multi-layered, hierarchical fabric-based tactile sensors with high sensitivity and linearity in ultrawide pressure range,” Adv. Func. Mater., vol. 29, no. 35, art. 1902484, 2019. doi: 10.1002/adfm.201902484.
[65] Y. Song, W. Huang, C. Mu, X. Chen, Q. Zhang, A. Ran, et al., “Carbon nanotube-modified fabric for wearable smart electronic-skin with exclusive normal-tangential force sensing ability,” Adv. Mater. Technol., vol. 4, no. 5, art. 1800680, 2019. doi: 10.1002/admt.201800680.
© 2021 Rachael Bevill Burns et al., published by De Gruyter
This work is licensed under the Creative Commons Attribution 4.0 International License.