Service robotics, a branch of robotics that entails the development of robots able to assist humans in their environment, is of growing interest in the hospitality industry. Designing effective autonomous service robots, however, requires an understanding of Human–Robot Interaction (HRI), a relatively young discipline dedicated to understanding, designing, and evaluating robotic systems for use by or with humans. HRI has not yet received sufficient attention in hospitality robotic design, much like Human–Computer Interaction (HCI) in property management system design in the 1980s. This article proposes a set of introductory HRI guidelines with implementation standards for autonomous hospitality service robots.
A set of key user-centered HRI guidelines for hospitality service robots was extracted from 52 research articles. These are organized into service performance categories to provide more context for their application in hospitality settings.
Based on an extensive literature review, this article presents some HRI guidelines that may drive higher levels of acceptance of service robots in customer-facing situations. Deriving meaningful HRI guidelines requires an understanding of how customers evaluate service interactions with humans in hospitality settings and to what degree those will differ with service robots.
Robots are challenging assumptions on how hospitality businesses operate. They are being increasingly deployed by hotels and restaurants to boost productivity and maintain service levels. Effective HRI guidelines incorporate user requirements and expectations in the design specifications. Compilation of such information for designers of hospitality service robots will offer a clearer roadmap for them to follow.
Emerald Publishing Limited
Copyright © 2020, Galen R. Collins
Published in International Hospitality Review. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode
The rapidly evolving world of technology has manifested itself in innumerable ways. Perhaps the most profound is the way in which humans interact with robots, non-carbon life forms that are capable of performing a range of tasks with varying levels of complexity. The 2013 U.S. Robotics Roadmap, authored by more than 150 researchers, predicts that robotics could be as transformative as the internet. Robots are challenging assumptions on how hospitality businesses operate. They are being increasingly deployed by hotels and restaurants to boost productivity and maintain service levels. Consequently, service robotics, a branch of robotics that entails the development of robots able to assist humans in their environment, is of growing interest in the hospitality industry. Designing effective autonomous service robots, however, requires an understanding of Human–Robot Interaction (HRI), a relatively young discipline dedicated to understanding, designing, and evaluating robotic systems for use by or with humans. HRI has not yet received sufficient attention in hospitality robotic design, much like Human–Computer Interaction (HCI) in property management system design in the 1980s. This article proposes a set of introductory HRI guidelines with implementation standards for autonomous hospitality service robots based on an extensive literature review. These are organized into service performance categories to provide more context for their application in hospitality settings.
Robots from industrial to human spaces
Robots can be broadly categorized as industrial or service robots. Industrial robots are used in the manufacturing and production of goods. The first industrial robot, invented in 1954 and later sold to General Motors in 1961, was used for a singular task: lifting pieces of metal from die casting machines. The first hospitality-related service robot, named Sepulka, was a tour guide at the Polytechnic Museum in Moscow, Russia. It was put into service in 1963 and showed visitors around the museum for over five decades. The International Organization for Standardization (ISO 8373) defines a “service robot” as a robot “that performs useful tasks for humans or equipment excluding industrial automation applications.” According to ISO 8373, service robots require a degree of autonomy or the ability to perform intended tasks based on external sensors, without human intervention. For service robots this ranges from partial autonomy, including human robot interaction, to full autonomy, without active human robot intervention. Joseph Engleberger, the developer of the first industrial robot in the United States, predicted that service robots would one day become the largest class of robot applications, outnumbering the industrial uses by several times (Engleberger, 1989). This reality is on the horizon due to technological advancements in service robotics enabling robots to move from predictable environments (e.g. manufacturing) into customer-facing roles.
A major difference between industrial and service robots is the environment in which they operate. Industrial robots are typically deployed in highly structured and well-contained environments. Employees receive special training on how to interact with the robots for accomplishing preprogrammed and narrow tasks. However, for service robots this is not typically possible. Their tasks are often carried out in ever-changing environments (e.g. delivering luggage to a particular room), requiring navigational capabilities for maneuvering through populated and sometimes constricted areas (e.g. hotel elevator). They often interact with people to carry out their tasks (e.g. taking a food order or answering a question), requiring varying levels of capability and artificial intelligence (AI).
Travelzoo (2016), a global media commerce company, published a study on the acceptance of robots working in the travel industry based on a survey of 6,211 travelers in Brazil, Canada, China, France, Germany, Japan, Spain, the United Kingdom, and the United States. Almost two-thirds of respondents were comfortable with the use of robots in the travel industry. Less than half of the French and German respondents were at ease with robotic hospitality. The vast majority of respondents felt that robots were superior to humans in efficiency, data retention, recall, data handling, and language capabilities but inferior in personalized interactions. Another key finding from this study is that consumers presently see the combination of robots and humans working in tandem in customer-facing roles as the ideal solution.
Service robot attributes and capabilities
Service robot AI entails cognitive and affective computing. The required range of robotic cognitive and affective skills or attributes is dependent on the robotic design and the task complexity and environment. Cognitive computing enables robots to mimic the way the human brain works. It involves machine learning, reasoning, natural language processing, speech and object recognition, and human–computer interaction (Kelly, 2015). According to Trevelyan (1999), robots need more than cognitive intelligence to interact with humans. They must also possess social and emotional skills.
Affective computing, an interdisciplinary field spanning computer science, psychology, and cognitive science, enables robots to recognize, interpret, process, and simulate human emotions and interact with humans in a socially acceptable and empathetic manner (Tao and Tieniu, 2005). In other words, they can be programmed to be emotionally intelligent, which is of particular importance in handling complex customer service interactions (e.g. handling an upset customer). However, this capability is in the early stages of development (Picard, 2007).
Parnas (2017) maintains that the application of AI methods can result in untrustworthy systems. The trust and confidence (assurance) that customers have in a business is of paramount importance. Everything about a business has to be trustworthy, including the procedures and systems (Collins, 2016). Hospitality service interactions (e.g. handling a customer complaint) are driven by service quality standards (e.g. reliability and responsiveness) or rules-based actions. AI is sometimes described as heuristic programming. Heuristics are typically simple, efficient rules for making decisions and solving problems. Parnas states that heuristics can be safely used by an application if there is more than one acceptable solution to select from. For example, more than one resolution option may be required to calm and satisfy an upset customer, which may require the intervention of a supervisor (another option) if none of the robot's resolution options are acceptable.
An intelligent service robot's model of the world is created through its programming and senses. Its purpose is defined by human-created algorithms, which may generate subgoals depending on the situation (Kuipers, 2018). Performance requirements for a service robot depend on the number and types of steps in a particular service process. A given service process specifies the service request elements (e.g. guest requests a snack to be delivered to Room X at Y time of day), the sequence of required actions and applicable service standards (e.g. requested snack is loaded into the robot's storage bin and delivered to room X within Z minutes), the service-interaction execution (e.g. guest is notified of delivery and retrieves the snack from the robot's secure storage bin), and the service evaluation methodology (e.g. the guest accepts the snack and then rates the experience via the robot touchscreen). Menial tasks do not require much robot reasoning or HRI unless there is a service failure (Rodriguez-Lizundia et al., 2015). For example, if the guest rejects the snack, the service robot must then respond appropriately and identify a remedy that satisfies the guest.
The first emotionally intelligent robot, named Pepper, was unveiled by Softbank Robotics in 2014. This autonomous robot is capable of interpreting facial expressions, tone of voice, and body movements and then responding accordingly (e.g. comforting you when you are sad). It also can recognize people by their voices and faces and is equipped with hand sensors for playing games and social interaction. Pepper is being used by businesses in Europe, Asia, and North America. For example, Société Nationale des Chemins de Fer (SNCF), France's official railway operator, has deployed Pepper in three stations for providing customers with information on trains and the surrounding areas, entertaining customers (dances and games) while they wait, and recording customer satisfaction with train services. In Asia, Pizza Hut is using Pepper for greeting and interacting with customers, taking and placing food orders, and settling checks. Costa Cruise Lines uses Pepper for assisting passengers and is trilingual in English, German, and Italian. Pepper is known as a technical ambassador at the Mandarin Oriental Hotel in Las Vegas, Nevada, providing answers to property-specific questions, giving directions, telling stories, and posing for selfies.
Pepper is a human-like robot or humanoid. Humanoid robots that are built to aesthetically resemble humans are called androids. Japan's Hotel Henn na (which means weird in Japanese) has a female android interacting with guests as she checks them in and out. Research has shown that for people to relate to robots, their outward appearance must be human-like and aesthetically pleasing (Schermerhorn and Scheutz, 2011). However, the noted Japanese roboticist Masahiro Mori introduced a theory, known as the Uncanny Valley, which predicts that robots with appearances almost, but not exactly, like those of real human beings will elicit uncanny, or strangely familiar, feelings of eeriness and repulsion in observers if their behaviors are not human enough. Mori recommends avoiding this valley by building robots that do not resemble people too much, but are still human-like in behavior (Mori, 1970). This raises important questions regarding the level of robotic humanization required for various types of service interactions with hospitality customers. Nieuwenhuisen et al. (2010) maintain that the multimodal communication (e.g. speech, facial expressions, gestures, and body language) capabilities of a service robot are the key to successful interactions with human users.
Need for HRI design
Robots are evolving quickly, and so is the need for HRI design, especially as robots are challenged to undertake more sophisticated tasks in a variety of environments (Burke et al., 2004). Even two robots based on the same technology can be experienced in different ways by users depending on the type of interaction system deployed (Kim et al., 2011). Robots that appear too human may not be ideal for social interaction. Various physical and behavioral design factors need to be carefully addressed, such as size and eye design. A robot that is too large may overwhelm a customer. If it is too small, it may be ignored. Eye contact and gaze need to be right to emotionally connect with customers and draw them into conversations. Robots must be able to move and act in safe, understandable, and appropriate ways, taking into account social rules like proxemics or the amount of space that people feel it necessary to set between themselves and others (Beer et al., 2011). HRI design not based on the physical and social models humans expect hinders acceptance of robots (Breazeal, 2003; Kanda et al., 2008). These are examples of some of the challenges that need to be translated into clear and meaningful HRI design guidelines for improving human-robot interactions and acceptance during hospitality service encounters.
The user interface (UI) design for robots (HRI) is both similar to and different from HCI. The Special Interest Group on Computer–Human Interaction (SIGCHI) of the Association for Computing Machinery (ACM) defines HCI as the discipline concerned with the design, evaluation and implementation of interactive computing systems for human use, and with the study of major phenomena surrounding them. Yanco and Drury (2002) view HRI as a subset of HCI because robots are interactive computing systems designed to benefit humans. Scholtz (2003) maintains that HRI differs from HCI because it concerns complex, dynamic systems that exhibit autonomy and cognition and physically operate in changing, real-world environments. Differences also occur in the types and nature of interactions or interaction roles. For example, hotel robots delivering items to rooms (e.g. towels) need to interact with employees providing the delivery items and instructions, with guests receiving the items, and with people blocking delivery paths. Each of these interactions has different tasks and hence, different situational roles and awareness needs.
Molich and Nielsen (1990) maintain that any system designed for people should be easy to interact with, effective, and pleasant to use. Ideally, a human-machine interface should be transparent to the user, requiring little or no cognitive effort (Rodriguez-Lizundia et al., 2015). Having a set of UI design guidelines and recommendations for how user inputs and interaction mechanisms work in particular environments helps identify likely UI problems before robotic products go into production. The cost of early interface changes is 25 percent less than that of late changes (Mantei and Teorey, 1988). UI design guidelines can also be translated into criteria or questions for evaluating usability (e.g. task easiness, intuitiveness, and efficiency) and user experiences (e.g. task meaningfulness and value and user perceptions and emotional connections).
The article proposes a set of introductory HRI guidelines with implementation standards for autonomous hospitality service robots with varying functions, social capabilities, and appearances. These were based on an extensive literature review that addressed human-centered rather than robot-centered HRI design. Robot-centered design addresses how robots recognize, understand, and react effectively in their given environment in order to interact well with humans from the perspectives of robot engineers. In contrast, human-centered design addresses the responses, requirements, and understandings or perceptions of robot users. It is concerned with how robots can fulfill their tasks in a way that is acceptable and comfortable to customers (Dautenhahn, 2007). However, ideal HRI implementation standards for particular robotic applications may be difficult to achieve for heterogeneous customer populations (e.g. auditory requirements for older users vs younger users) (https://www.interaction-design.org/literature/topics/design-guidelines).
Studies were collected and reviewed on HRI guidelines and implementation standards specific to autonomous service robots, especially those deployed in hospitality settings. Searches were performed on three online bibliographic databases: ACM's digital library, IEEE's Xplore digital library, and Elsevier's ScienceDirect. These databases were queried using general keywords, such as service robots, service robot UI, HRI design guidelines, HRI usability, HRI UI guidelines, affective robots, robot aesthetics, robot functionality, social robots, hotel robots, and restaurant robots.
After abstract inspection and duplicate removal, 125 articles were read. Articles that did not address user-centered HRI guidelines and implementation standards were excluded. The resulting 89 articles with overlapping HRI guidelines and implementation standards were reduced to 52 articles, from which a set of key user-centered HRI guidelines with supporting research and implementation standards for hospitality service robots was extracted.
Deriving meaningful HRI guidelines requires an understanding of how customers evaluate service interactions with humans in hospitality settings and to what degree those will differ with service robots. According to Shechtman and Horowitz (2003), creating effective social technologies requires designers to have an understanding of human-human interactions. Designers must pay attention to not only the social cues that robots emit but also to the information people use to create mental models of a robot (Lee et al., 2005). Social psychological research suggests that customers to some extent will judge service robot performances based on their mental models or expectations of human performances (Bargh et al., 1996; Breazeal, 2003). After extensive research, Zeithaml et al. (1991) found five variables customers use when evaluating service performance or quality: reliability, responsiveness, empathy, assurance, and tangibles (see Table I). These were used for organizing the HRI guidelines to provide more context for how they apply in hospitality settings. The proposed set of HRI guidelines points to the original source for each guideline.
Of the service performance variables denoted in Table I, research studies have shown that reliability is the most important determinant of the perception of service quality among U.S. customers. It addresses how customers judge the service outcome or the delivered service, whereas responsiveness, empathy, assurance, and tangibles address how customers judge the service process or the service being delivered (Zeithaml et al., 1990).
Customers use their sensory desktops (vision, sound, touch, etc.) to evaluate hospitality experiences. These senses combine to produce a customer's perceptual experience of his or her surroundings. Consequently, it is important to understand how customers take in hospitality encounters with service robots via their sensory desktops when developing HRI guidelines. Service robots in hospitality environments can vary greatly in form and function.
HRI guidelines with implementation standards
The guidelines are numbered sequentially for each service performance variable to permit convenient referencing. Each guideline is stated as a single sentence. A stated guideline is illuminated with one or more research findings and implementation standard considerations for hospitality environments. Depending on the task and its complexity, a particular HRI guideline may not be applicable. In some instances, guidelines are specifically related to one another because of the underlying robotic technologies involved. Much of the research cited is based on experimental studies.
Guideline: Service tasks are accomplished with designed autonomy (effectiveness) and within the required time frames (efficiency) (Steinfeld et al., 2006).
Research: A critical construct related to HRI is autonomy or the ability for service robots to operate in hospitality environments without any form of external control for extended periods of time (Beer et al., 2014). Autonomy addresses the number of changes that service robots must detect (perceiving environment complexity) and adapt to (making appropriate decisions) for completing tasks. As service robot autonomy increases so does the level of HRI sophistication or the richness of its interactions with customers (Thrun, 2004). If service robots are designed to be fully autonomous but require human intervention 20 percent of the time to successfully complete tasks, they are only 80 percent effective and will be judged accordingly by customers. If a service robot completes in 30 minutes a task that should have been completed in 15 minutes (e.g. a snack delivery to room X), it is only 50 percent efficient and will be judged accordingly by customers (Steinfeld et al., 2006). Time to task completion is a critical reliability metric in the hospitality industry. In order to achieve the optimal effectiveness in interactions, measures (e.g. help and feedback mechanisms) for informing the customer on how to interact efficiently with the service robot may be needed.
Implementation standard: With the assistance of hospitality professionals, develop hospitality-specific standard operating procedures and metrics for each robotic task. Standard operating procedures specify the following (Collins, 2016):
The standard (e.g. guest in room X receives requested item within Y minutes).
How the standard is met (e.g. service robot notifies guest via phone call when it arrives to room X with requested item and enables the guest to retrieve the requested item once it senses that the guestroom door is opened).
The necessary robotic functionality to implement the service standard (e.g. service robot is equipped with the appropriate sensors, control and navigational systems, and electric propulsion and speed – 1.5 mph) to deliver the requested item to room X within the time frame communicated to the guest.
What corrective action should be taken when the standard is not met (e.g. service robot notifies a hotel employee to call the guest, apologizes for delivering the incorrect requested item, and informs the guest that a hotel employee will be calling within X minutes to correct the problem).
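The effectiveness and efficiency arithmetic described in the research above can be illustrated with a short sketch. The function names are hypothetical, not from Steinfeld et al. (2006):

```python
def effectiveness(autonomous_completions: int, total_tasks: int) -> float:
    """Share of tasks completed without human intervention."""
    return autonomous_completions / total_tasks


def efficiency(target_minutes: float, actual_minutes: float) -> float:
    """Target completion time over actual completion time, capped at 1.0."""
    return min(1.0, target_minutes / actual_minutes)


# A robot needing help on 20 of 100 tasks is 80 percent effective;
# a 15-minute delivery that takes 30 minutes is 50 percent efficient.
print(effectiveness(80, 100))  # 0.8
print(efficiency(15, 30))      # 0.5
```

Such metrics, logged per task, would let operators track whether a robot's designed autonomy holds up in a live hospitality environment.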
Guideline: Communication is initiated when the service robot detects the presence of a customer (Rodriguez-Lizundia et al., 2015).
Research: Service robots greeting customers when expected to do so causes engagement rather than rejection (Satake et al., 2009).
Implementation standard: Marriott International has a 15/5 customer-contact rule. When customers come within 15 feet, service robots acknowledge their presence with eye contact or some other gesture. Service robots provide verbal greetings when customers are within 5 feet.
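The 15/5 rule reduces to a simple distance-triggered behavior, sketched below with a hypothetical helper that assumes customer distance is available from the robot's sensors:

```python
def contact_behavior(distance_ft: float) -> str:
    """Map a sensed customer distance to the 15/5 customer-contact rule:
    verbal greeting within 5 ft, nonverbal acknowledgment within 15 ft."""
    if distance_ft <= 5:
        return "verbal greeting"
    if distance_ft <= 15:
        return "nonverbal acknowledgment"
    return "no contact"


print(contact_behavior(3))   # verbal greeting
print(contact_behavior(12))  # nonverbal acknowledgment
```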
Guideline: The service robot's interaction with a customer, in terms of length and dialogue, is appropriate for the given task (Scheutz et al., 2011).
Research: More research is needed on human receptivity to different task-based dialogues with service robots and the constraints they impose at any given point during the exchange (Scheutz et al., 2011). The service interaction needs scripting beginning with the greeting through the closing dialogue. Do not keep customers guessing about what the next step is in the interaction (Shneiderman, 1998). The ideal interaction time, the difference between the service robot's first verbal contact with the customer and the end of the dialogue, will vary by task. Interaction time is an essential metric for human-robot interaction efficiency (Beaudouin-Lafon, 2004). Finally, conversations need to be at acceptable human interaction rates (Fong et al., 2003).
Implementation standard: Service robots use customer names in greetings whenever possible and phrases that yield closure to dialogues, such as: “It is my pleasure to be of service.” People appreciate having their names used by service robots (Kanda and Ishiguro, 2013). For most hospitality tasks, short interaction times are preferred. For example, the target interaction time for checking a guest in might be three minutes. Service robot speech rates should mirror those of customers, which vary between 110 and 150 words per minute.
Guideline: The service robot promptly responds to customer requests and needs, providing them with the required service and information to achieve their goals (Stock and Merkle, 2017).
Research: Responsiveness has a significant impact on both customer satisfaction (fast response) and dissatisfaction (slow response) (Athanassopoulos, 2000). Quick responses communicate to customers that they are important. An appropriate response requires service robots to listen attentively to requests or problems. Service robots may need to ask additional questions to fully understand customer inquiries or issues: who? what? when? where? and how? Customers will stop interacting with service robots if they do not get helpful information (informativeness) or the expected responses (Stock and Merkle, 2017; Satake et al., 2009). Furthermore, service robots that interrupt conversations in progress will more than likely decrease the level of comfort and discourage customers from interacting with them (Rodriguez-Lizundia et al., 2015). Ideally, service robots providing customer-intensive services should handle common requests and complaints without any human assistance. First contact resolutions are an important customer satisfaction factor. Research shows that 95 percent of complaining customers will return if the complaint is resolved on the spot (Collins, 2016).
Implementation: Establish service robot reaction times to common queries and activities of customers (e.g. room service robot answers phone within three rings, food service robot checks on table three minutes after food is delivered, etc.). Service robots should restate customer questions to assure confirmation. Build domain- or task-specific knowledge databases that enable service robots to easily analyze inquiries and generate appropriate responses and actions. This task increases in complexity for multifaceted jobs (e.g. front desk agent), especially if they require problem solving and complaint handling skills. Evaluate service robot responsiveness by tracking the average problem resolution times and by calculating the first contact resolution ratio or the number of issues resolved through a single response divided by the total number of issues handled.
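The two responsiveness metrics named in this implementation standard can be computed directly. The function name and sample figures below are illustrative only:

```python
from statistics import mean


def first_contact_resolution_ratio(resolved_first_contact: int,
                                   total_issues: int) -> float:
    """Issues resolved through a single response, as a share of all issues."""
    return resolved_first_contact / total_issues


# Sample month: 95 of 100 complaints resolved on the spot,
# with individual resolution times recorded in minutes.
print(first_contact_resolution_ratio(95, 100))  # 0.95
print(mean([2.0, 4.0, 3.0]))                    # average resolution time: 3.0
```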
Guideline: The service robot provides understandable and relevant information (Molich and Nielsen, 1990).
Implementation: Service robots use vocabulary of the task domain, void of hospitality jargon or technical terms, and handle contextual responses, such as inquiries about same-day jet ski rentals. Communication errors are tracked and used to update the natural language database, which consists of a standardized list of words (e.g. occupied), phrases (e.g. all-inclusive hotel), and sentences (e.g. Do you have a reservation?) applicable to the service robot's job role (e.g. food server), tasks (e.g. order taking, delivery of food to tables, check settlement, etc.) and work environment (e.g. casual dining restaurant).
Guideline: The service robot senses the emotions and behaviors of customers to appropriately acknowledge their feelings and intentions verbally (Fong et al., 2003).
Research: Service robots' ability to act according to social norms, even for simple utilitarian ones, can become critical for their long-term acceptance (Sung et al., 2010). In order to not violate interaction norms in service encounters, service robots must possess some level of cognitive empathy, the extent to which customers' thoughts and feelings are accurately perceived and understood (Hodges and Myers, 2007). Masuyama et al. (2018) maintain that this requires service robots to recognize basic customer emotions (e.g. happiness, sadness, anger, disgust, and fear) expressed through their words, tone of voice, and body language. Customer characteristics, such as gender and age, are intervening variables in a service robot's assessment of the interplay of customer emotions and intentions (Ihasz and Kryssanov, 2018). For example, if a child requests three candy bars, the service robot should recognize that this is a child and say something like, “Great treat but let's first check with your parents.” The appropriate use of empathy by service robots can alleviate negative emotional states, such as frustration (Klein et al., 2002). However, it is important to note that short-term interactions may only require service robots to be superficially socially or empathetically competent (Fong et al., 2003).
Implementation: Service robots use empathetic phrases when responding to troubled and upset customers, such as:
I apologize but…
I know how you feel…
That must be frustrating…
I can appreciate…
Service robots mirror the communication styles of customers by using their words. To the visual say: “I see what you mean.” To the auditory say: “I hear what you say.” To the feeling say: “I feel the same way.” They are programmed to recognize common subtle clues that are prevalent in particular service environments or interactions, such as a restaurant customer glancing at a watch, a slumped-over hotel guest, or a complaining customer with poor eye contact.
Guideline: The service robot uses nonverbal actions to build rapport and interactive understanding with customers (Hellstrom and Bensch, 2018).
Research: As service robots become increasingly autonomous and complex, non-speech modalities will be required to support customers' understanding of them (Hellstrom and Bensch, 2018). This entails vocal elements, eye contact, facial expressions, gestures, and proximity to customers:
Vocal elements. The dimensions of robotic speech (volume, pitch, and rhythm) or expressive utterances help service robots convey cognitive empathy and its affective state (Fong et al., 2003; Breazeal and Scassellati, 2002). From service robot voice patterns, customers will make judgments about their attitudes toward them, affecting the quality of service interactions. Not much research in the field of HRI has focused on the psychological effects of voice patterns on users' perceptions (Niculescu et al., 2013). Although a monotone voice may be appropriate for limited service interactions, more elaborate service interactions (e.g. dealing with upset customers) may require variations in pitch to convey the appropriate meaning. One study found that higher-pitched robots were perceived by users to have better social skills and were preferred by those with higher expectations. It also showed that introvert users found lower-pitched robots more empathetic and easier to interact with than extrovert users did (Niculescu et al., 2013). Another variable is the customer's emotional state. For example, lowering the pitch and pace of voice is more effective when dealing with hostile customers.
Eye contact. Service robot eye contact with customers influences the flow of communication and conveys interest, empathy, and credibility. It is an important social behavior in building intelligent human-robot interactions (Xu et al., 2016). Ideally, conversational service robots should orient their gazing direction to customer faces (Kanda and Ishiguro, 2013). However, customers may not know that service robots are looking at them until they change their facial expressions, such as smiling. One study found that human understanding of robot speech could be improved when the robot's language-related gaze behavior was similar to that of humans (Staudte and Crocker, 2011).
Facial expressions. Humans may take interest in robots and interact with them more when they feel aliveness in their facial expressions. For example, smiling service robots are perceived as more friendly and approachable. Most robots, however, are quite limited in the number of human expressions they can mimic (Park et al., 2015). While advancements in material technologies will significantly expand the number of possible android facial expressions, one study found that as many as 44 percent of respondents in the U.S. preferred or somewhat preferred a fixed face (Nomura Research Institute, 2017).
Gestures. Suppose the customer would like to place a food order in a restaurant. The service robot makes eye contact with the customer who makes a small gesture with his hand to come over to the table. Hand gestures are a good means of nonverbal communication but may mean nothing if there is no eye contact (Miyauchi et al., 2004). Service robot head movements also affect interpersonal communication. For example, customers could observe service robot head movements (e.g. nod) as a sign of interest or agreement in service interactions (Lanillos et al., 2017).
Proximity. Cultural norms dictate comfortable distances for interactions with others. Mobile service robots move around hotel and restaurant environments, sharing the same space as customers. According to Harrigan (2005), social relationships for humans are reflected in their use of space in face-to-face encounters, a subcategory of non-verbal communication called proxemics. Proxemics specifically addresses the measurable distances between people when they interact with each other. Different studies have identified various factors, such as robot voice style, gestures, gender, age, gaze, culture, and size, affecting human-robot proxemics (Rodriguez-Lizundia et al., 2015). Customers will also have expectations about how service robots approach their personal spaces. Service robots must select paths that make their intentions easier for customers to interpret (Arunin and Simmons, 2014). People are most comfortable with frontal approaches (Ball et al., 2014).
Implementation: Service robots recognize customer vocal patterns to respond appropriately. For example, the vocal patterns of sad customers may be slow and low-pitched, with weak high frequencies.
Service robots change vocal patterns and pace to appear engaged, interested, and capable. For example, ending tones in a low pitch give customers the impression that service robots are capable of helping them or solving their problems.
Stationary service robots avoid the “Mona Lisa” effect, or the perception that they are looking at customers when they are not, by moving their heads. They only start gesture recognition after making eye contact with customers (Miyauchi et al., 2004).
Service robots directly face customers when speaking to communicate interest. They use body movements and/or facial expressions to facilitate understanding when interacting with customers. For example, a slight nod or smile when greeting customers from a distance sends the message: “Yes, I acknowledge you.”
Service robot facial expressions and communication tones reflect understanding of customer emotional states, encouraging customers to interact with them more. When service robots speak, they look at customer faces 40 percent of the time. When listening to customers, they look at their faces 75 percent of the time (Xu et al., 2016).
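As a rough illustration, the gaze proportions above can be expressed as a simple probabilistic gaze scheduler. This is only a sketch: the 40 and 75 percent targets come from Xu et al. (2016), while the function name and the per-interval sampling scheme are assumptions.

```python
import random

# Target proportion of gaze intervals spent on the customer's face,
# per conversational role (Xu et al., 2016).
GAZE_RATIO = {"speaking": 0.40, "listening": 0.75}

def should_look_at_face(role: str, rng: random.Random) -> bool:
    """Decide, for the next gaze interval, whether to fixate the
    customer's face so that the long-run proportion matches the role."""
    return rng.random() < GAZE_RATIO[role]

# Over many intervals, face-directed gaze approaches the target
# proportion for the given role.
rng = random.Random(0)
looks = sum(should_look_at_face("listening", rng) for _ in range(10_000))
```

A production gaze controller would also need to schedule fixation durations and aversion targets, which this sketch leaves out.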
Service robots are aware of signals of discomfort caused by invading customer spaces. Examples include customers wincing, stepping back, talking faster, giving one-word answers, and gaze aversion. Adults typically feel comfortable when service robots are positioned 18–48 inches away. The comfort distance increases if the service robot is producing gestures or interacting with children (Walters et al., 2005). However, proxemics preferences of customers may change over time as they interact and come to understand service robots (Mead and Mataric, 2016).
Service robots approach customers from the front-left or front-right, directions considered more comfortable for customers when they are sitting or standing in the center of a room or with their backs against a wall (Ball et al., 2014).
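A minimal sketch of how these proxemics rules might be encoded for a navigation planner: the 18-48 inch comfort range and the frontal-approach preference come from the cited studies, while the function names, the Customer type, and the 12-inch gesture/child margin are illustrative assumptions.

```python
from dataclasses import dataclass

COMFORT_MIN_IN = 18   # Walters et al. (2005): adult comfort range, inches
COMFORT_MAX_IN = 48
PREFERRED_APPROACHES = ("front-left", "front-right")  # Ball et al. (2014)

@dataclass
class Customer:
    is_child: bool = False

def stopping_distance(customer: Customer, robot_gesturing: bool) -> float:
    """Conservative stopping distance (inches) for an approach."""
    distance = COMFORT_MIN_IN
    # The cited comfort distance grows when the robot gestures or the
    # customer is a child; the +12 inch margin here is an assumption.
    if robot_gesturing or customer.is_child:
        distance += 12
    return min(distance, COMFORT_MAX_IN)

def choose_approach(free_directions: set) -> str:
    """Prefer frontal-oblique approaches; fall back to any free one."""
    for direction in PREFERRED_APPROACHES:
        if direction in free_directions:
            return direction
    return next(iter(free_directions))
```

As Mead and Mataric (2016) note, these preferences can drift as customers habituate, so in practice such constants would be tuned or learned per deployment.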
Guideline. The service robot entertains customers to positively influence their moods and interaction experiences (Nijholt et al., 2017).
Research. Service robots are increasingly being used for psychological enrichment or bringing joy and comfort to customers (Shibata and Wada, 2011). Humor is helpful in building new relationships and solidifying social bonds. Service robots that use humor could evoke feelings of uncanniness or familiarity. According to Nijholt et al. (2017), implementing robotic humor, given its subtleties and nuanced facets, is one of the major challenges in computer science. Niculescu et al. (2013) demonstrated that humor appeared to improve the users' perception of the service robot's personality and speaking style and overall task enjoyment. Another study found that the non-verbal humor of a service robot had a positive effect on the users' ratings of its different characteristics, as well as on the perceived interaction quality (Wendt and Berg, 2009). Hospitality service robots are increasingly being designed with entertainment capabilities, such as telling stories and jokes, posing for selfies, and performing dances. Despite the positive effects of humor in customer service interactions, the HCI field holds a somewhat negative view about the use of humor in interfaces. Consequently, interaction design has historically been focused on maximizing task performance and minimizing task duration (Shneiderman, 1998).
Implementation. Service robots detect and semantically understand humor and deliver it at the right moment and in the appropriate situation (Nijholt et al., 2017). This requires the establishment of rules for what kinds of behaviors are perceived as humorous by what kinds of customers in what kinds of situations (Wendt and Berg, 2009).
Guideline. Customers understand and are comfortable with the service robot's thinking, knowledge, communications, and actions (Hellstrom and Bensch, 2018).
Research: As service robots become more competent and autonomous in hospitality environments, there is an increasing need to study how customers understand and perceive them. Service robots that do not adequately communicate their intentions or actions during service encounters may leave customers anxious and uncomfortable. For example, experiments have shown that pedestrians feel unsafe when they encounter autonomous vehicles. External interfaces are being explored to communicate to pedestrians the vehicle's intentions (e.g. flashing light – “I'm about to yield”) (https://www.viktoria.se/projects/avip-automated-vehicle-interaction-principles). Service robots using loud warning sounds to avoid collisions with customers would probably frighten many customers and be perceived as socially unacceptable (Rodriguez-Lizundia et al., 2015). Carff et al. (2009) maintain that service robots navigating in visually complex environments (e.g. delivery of food on a college campus with various vehicle intersections and pedestrian and bike paths) is challenging and may require computational thinking to solve navigational challenges. There are many tasks, besides movement in uncertain and variable environments, that will require service robots to demonstrate a higher level of competence, such as the identification of solution options for non-routine customer complaints.
Computational thinking is a set of problem-solving methods that involves expressing problems and their solutions in ways that service robots could execute. It involves the thought processes in modeling a situation and specifying the ways service robots can operate within it to reach the desired outcomes (Nardelli, 2019). For example, a restaurant server robot, in addition to order taking, delivery, and payment, could be programmed to respond to inquiries about menu item nutrition, calories, and allergens in an empathetic manner. Verbal expressivity is one of the most important social cues that makes a robot seem trustworthy and believable (Beer et al., 2011). According to Strapparava and Mihalcea (2008), affective word selection and understanding is crucial for realizing appropriate and expressive conversations in human-computer interactions.
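The restaurant example above can be made concrete with a small sketch of how such an inquiry task might be expressed for a server robot to execute. The menu data, field names, and reply phrasings are invented for illustration; only the task itself (responding empathetically to allergen inquiries) comes from the text.

```python
# Hypothetical menu knowledge base a server robot might query.
MENU = {
    "mushroom risotto": {"calories": 620, "allergens": {"dairy"}},
    "pad thai": {"calories": 700, "allergens": {"peanuts", "egg", "soy"}},
}

def allergen_answer(dish: str, allergen: str) -> str:
    """Answer an allergen inquiry with an empathetic, actionable reply."""
    info = MENU.get(dish)
    if info is None:
        return "I'm sorry, I don't have that dish on today's menu."
    if allergen in info["allergens"]:
        return (f"Thanks for checking - the {dish} does contain {allergen}. "
                "I'd be glad to suggest an alternative.")
    return f"Good news: the {dish} is listed without {allergen}."
```

The computational-thinking step is the modeling itself: deciding what situations can arise (known dish, unknown dish, allergen present or absent) and specifying an appropriate response for each.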
Computational thinking and AI thinking are interwoven as computing-enabled machine intelligence is a primary means to realizing intelligence (Zeng, 2013). AI is evolving and enabling service robots to simulate elements of human thinking. The level of AI in service robots can range from weak to strong AI. Weak AI focuses on superior execution of narrow tasks, such as a robotic brewing machine producing 100 cups of coffee in an hour. In contrast, service robots with strong AI are capable of executing cognitive functions: perceiving, reacting, making decisions, and responding appropriately. For example, the Hilton robot concierge, which uses an AI platform developed by IBM, interacts with customers and responds to their questions. It learns and adapts with each interaction, improving the answers it provides.
Implementation: To ensure the safety of customers, physical service robot design (e.g. weight and form) should not cause injuries if collisions happen. Service robots need to be equipped with mechanisms for slowing down, stopping, and alerting customers when they impede their paths. They need to provide the necessary visual, verbal, and/or sound cues that communicate movement and interaction intentions in socially acceptable and understandable forms. They need to be connected to the appropriate databases (e.g. guest history, menu item ingredient, concierge, etc.) for the necessary contextual knowledge and information required in service interactions. Verify that service robots are providing consistently correct responses and actions to customers' inquiries by testing them repeatedly (interactive proofs) with variations of customer requests unique to the hospitality environment where the service robots will be deployed.
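The repeated-testing step above might look like the following sketch: querying the robot's dialog interface with paraphrased versions of the same request and checking that the answers stay consistent. The robot interface, the phrasings, and the expected answer are all hypothetical stand-ins.

```python
def answer(request: str) -> str:
    """Stand-in for the deployed robot's question-answering interface."""
    normalized = request.lower()
    if "check-out" in normalized or "checkout" in normalized:
        return "Check-out is at 11 a.m."
    # Fall back gracefully for requests the robot cannot handle.
    return "Let me connect you with the front desk."

# Paraphrases a hotel guest might plausibly use for one request.
CHECKOUT_VARIANTS = [
    "What time is check-out?",
    "When do I need to checkout?",
    "Could you tell me the check-out time?",
]

def consistent(variants, expected) -> bool:
    """All phrasings of the same request must yield the expected answer."""
    return all(answer(v) == expected for v in variants)
```

In practice the variant lists would be built from request logs of the specific hospitality environment, and the suite rerun whenever the robot's dialog model is updated.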
Guideline. The service robot has a likeable, appealing appearance (Mathur and Reichling, 2016).
Research: Customers are more likely to interact with service robots that have appealing embodiments (size, shape, color, materials, facial features, and motion) (Breazeal, 2002; Fong et al., 2003). A 2016 study asked 64 workers on the online marketplace Amazon Mechanical Turk to rate 80 real-world robot faces, from the most mechanical (industrial robot arm) to the most human (humanoid robot with skin), for likeability (Mathur and Reichling, 2016). The study found that as the robot faces became more human than mechanical, they became more unlikable. But as the robot faces became nearly human, likability sharply increased. The study's researchers warn that certain small flaws in robotic human resemblance may result in Mori's Uncanny Valley effect (robots look creepy), negatively impacting social interactions (Mori, 1970). They recommend additional research on design choices to circumvent the Uncanny Valley effect. A 2015 survey of consumers in Germany, Japan, and the U.S. revealed that the respondents were more receptive to robots whose shapes are human-like but whose surfaces are different from humans' (Nitto et al., 2017).
Implementation: Avoid the Uncanny Valley appearance. Design conversational service robots with human shapes and carefully chosen surface materials, especially for those tasks that require a higher degree of interaction with customers.
Guideline. The service robot's appearance is appropriate for the tasks and interactions (Lohse et al., 2008).
Research: Service robot appearances influence the assumptions that people make about their applications and functionalities. Service robots with anthropomorphic (human-like) features put customers at ease with their forms and functions, making them more compelling to interact with (Lohse et al., 2008; Duffy, 2003; Breazeal, 2002). Users perceive such robots as more serious in nature and are more likely to comply with their instructions (Goetz et al., 2003). One HRI study found that users perceived mechanical-looking robots as less functional and socially interactive than those with more humanlike appearances (Hinds et al., 2004). The design of a service robot's head is an important HRI consideration because most non-verbal cues are mediated through its face; without a face, a robot is perceived as anonymous (Blow et al., 2006).
Research conducted by Honda found that the ideal height for a service robot was between four feet and the height of an average adult, making it easy to communicate with and operate within human living spaces (https://world.honda.com/ASIMO/technology/2000/page02.html). Hiroi and Ito (2016) found that the comfortableness of dialog with service robots declines as their height becomes much shorter or taller than the height of users.
An experimental study evaluated the appearance design of robot eyes suitable both for gaze reading and for conveying friendly impressions. The study analyzed three types of eye shapes, “round,” “ellipse,” and “squint,” and the ratio of the iris and sclera area. The study concluded that a robot's gaze is most readable by human users when performed by robot eyes with a round outline shape and a large iris (Onuki et al., 2013).
Implementation: Service robots have anthropomorphic appearances for supporting tasks that require conversations with customers. A service robot's capabilities (simple to complex) and its form (mechanistic to humanistic) should be closely matched to prevent ambiguity and misinterpretations about its role and abilities. In other words, service robots should not look more sophisticated than they really are (Tsui et al., 2009).
The ideal height of conversational service robots is about 12 inches lower than the eye height of customers for both standing and sitting postures (Hiroi and Ito, 2016). Service robot eyes are round with large irises to convey a welcoming appearance (Onuki et al., 2013).
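As a simple arithmetic sketch of the height heuristic, assuming customer eye heights are known or estimated (the function name and the sample eye heights are assumptions; only the 12-inch offset comes from Hiroi and Ito, 2016):

```python
# Hiroi and Ito (2016): place the conversational robot's eye level
# about 12 inches below the customer's eye height.
EYE_HEIGHT_OFFSET_IN = 12

def robot_eye_height(customer_eye_height_in: float) -> float:
    """Target robot eye height (inches) for comfortable dialog."""
    return customer_eye_height_in - EYE_HEIGHT_OFFSET_IN

# For a standing adult with eyes at roughly 60 inches, this suggests a
# robot eye height of about 48 inches; for a seated customer at roughly
# 46 inches, about 34 inches.
standing_target = robot_eye_height(60)
seated_target = robot_eye_height(46)
```

A fixed-height robot would instead be sized for the posture, standing or seated, most common in its deployment area.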
The HRI guidelines and implementation standards for service robots presented in this article are incomplete and untested. They are a starting point for further discussion and research. The next step is to apply these in real-world practice to determine their appropriateness and the need for refined and/or new HRI guidelines and implementation standards. How HRI guidelines are operationalized for a specific service robot application in a given service environment will depend on the type of robot (form, function, movement, and intelligence), the required task(s) for the target audience, and the desired service quality levels.
A major challenge in designing service robots is understanding how they will be experienced by customers. What mental constructs will customers use to evaluate those experiences? Unquestionably, psychology plays a much deeper role in how these experiences are perceived when compared to human-computer interactions. Why? Service robots are autonomous physical entities with a range of human-like attributes and capabilities. As a result, customers often judge their performances based on previous service experiences with humans. Therefore, it is important to understand what dimensions customers use when evaluating service quality and their implications for service robot design in hospitality settings. Unlike industrial robots, more sophisticated hospitality service robots ideally require a multitalented design team (e.g. engineer, psychologist, linguist, etc.) that includes someone with hospitality expertise and knowledge of service performance standards and specifications.
Effective HRI guidelines incorporate user requirements and expectations in the design specifications, driving higher levels of acceptance of service robots in customer-facing situations. A set of universally accepted HRI guidelines with implementation standards has yet to emerge, however (Zenk et al., 2017). Compilation of such information for designers of hospitality service robots will offer a clearer roadmap for them to follow. This article provided a set of introductory HRI guidelines and implementation standards to hopefully jumpstart this undertaking. These are presented within a service quality framework to provide hospitality context for their future development and evolution. Shneiderman (2008, p. 3) maintains that “guidelines are not a comprehensive academic theory that has strong predictive value, rather they should be prescriptive, in the sense that they prescribe practice with useful sets of dos and don'ts.” They should be presented with justifications and examples (https://www.usability.gov/sites/default/files/documents/guidelines_book.pdf).
HRI guidelines and implementation standards will grow in importance as hospitality companies explore and find ways for utilizing service robots to add value to customer experiences. These will need to address service quality attributes that are important to customers in service delivery and interactions. Perhaps this was the lesson that Japan's Henn na Hotel learned in 2019 when it decommissioned half of the 243 robots it deployed to enhance operational performance and reduce its dependency on human employees. Their service robots were unable to complete a variety of tasks reliably, such as delivering luggage to rooms, answering basic guest questions, and conversing with guests. As service robots become more prevalent in hospitality settings, robot vendors and researchers are hard at work exploring different ways of strengthening interactions between humans and robots.
Service performance variables or categories
X1: Reliability. Performing promised service dependably and accurately. This addresses what is necessary for service robots to successfully complete service tasks according to specifications communicated to customers.

X2: Responsiveness. Assisting customers promptly. This addresses what is necessary for service robots to engage in reciprocal interactions and to appropriately respond to customer requests, questions, complaints, and problems in a timely manner. This requires communicating with customers in language they can understand (informational responsiveness) and listening to them.

X3: Empathy. Providing caring, individualized attention to customers. This addresses emotional and social intelligence, or what is necessary for service robots to interpret and appropriately respond to customer emotional states (emotional responsiveness) and to establish a sense of connectedness with them.

X4: Assurance. Establishing credibility and trust with customers. This addresses robot safety (e.g. navigation, spatial intelligence, and no physical harm) and customer confidence in robot decision-making capabilities and competence (knowledge and skills) in performing services.

X5: Tangibles. Ensuring that the appearances of service providers are well-received by customers. This addresses a service robot's external features (e.g. size, shape, material, and sound), which are first seen, heard, and felt through the user's five senses, and how they impact service interactions.
Arunin, E. and Simmons, R. (2014), “Socially-appropriate approach paths using human data”, Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication in Edinburgh, UK, IEEE, New York, NY, pp. 1037-1042.
Athanassopoulos, A.D. (2000), “Customer satisfaction cues to support market segmentation and explain switching behavior”, Journal of Business Research, Vol. 47 No. 3, pp. 191-207.
Ball, A., Silvera-Tawil, D., Rye, D. and Velonaki, M. (2014), “Group comfortability when robot approaches”, Proceedings of the 6th International Conference on Social Robotics in Sydney, Australia, Springer International Publishing, Sydney, Australia, pp. 44-53.
Bargh, J.A., Chen, M. and Burrows, L. (1996), “Automaticity of social behavior: direct effects of trait construct and stereotype activation on action”, Journal of Personality and Social Psychology, Vol. 71 No. 2, pp. 230-244.
Beaudouin-Lafon, B. (2004), “Designing interactions, not interfaces”, Proceedings of the Working Conference on Advanced Visual Interfaces in Gallipoli, Italy, ACM, New York, NY, pp. 15-22.
Beer, J.M., Fisk, A.D. and Rogers, W.A. (2014), “Toward a framework for levels of robot autonomy in human-robot interaction”, Journal of Human-Robot Interaction, Vol. 3 No. 2, pp. 74-99.
Beer, J.M., Prakash, A., Mitzner, T. and Rogers, W. (2011), “Understanding robot acceptance”, Technical Report HFA-TR-1103, Georgia Institute of Technology, Atlanta, available at: https://smartech.gatech.edu/bitstream/handle/1853/39672/HFA-TR-1103-RobotAcceptance.pdf?sequence=1&isAllowed=y (accessed 3 April 2017).
Blow, M., Dautenhahn, K., Appleby, A., Nehaniv, C.L. and Lee, D.C. (2006), “Perception of robot smiles and dimensions for human-robot interaction design”, Proceedings of the 15th IEEE International Symposium on Robot and Human Interactive Communication in Hatfield, UK, IEEE, New York, NY, pp. 469-474.
Breazeal, C. (2002), Designing Sociable Robots, MIT Press, Cambridge, MA.
Breazeal, C. (2003), “Emotion and sociable humanoid robots”, International Journal of Human Computer Interaction, Vol. 59, pp. 115-119.
Breazeal, C. and Scassellati, B. (2002), “Robots that imitate people”, Trends in Cognitive Sciences, Vol. 6 No. 11, pp. 481-487.
Burke, J.L., Murphy, R.R., Rogers, E., Lumelsky, V.J. and Scholtz, J. (2004), “Final report for the DARPA/NSF interdisciplinary study on human-robot interaction”, IEEE Transactions on Systems, Man, and Cybernetics - Part C: Applications and Reviews, Vol. 34 No. 2, pp. 103-112.
Carff, J., Johnson, M., El-Sheikh, E.M. and Pratt, J.E. (2009), “Human-robot team navigation in visually complex environments”, Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems in St. Louis, MO, IEEE, New York, NY, pp. 3043-3050.
Collins, G.R. (2016), Overcoming the Customer Service Syndrome: How to Achieve & Sustain High Customer Satisfaction, Kendall Hunt Publishing Company, Dubuque, Iowa, IA.
Cruz-Sandoval, D., Eyssel, F., Favela, J. and Sandoval, E.B. (2017), “Towards a conversational corpus for human-robot conversations”, Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction in Vienna, Austria, IEEE, New York, NY, pp. 99-100.
Duffy, B. (2003), “Anthropomorphism and the social robot”, Robotics and Autonomous Systems, Vol. 42, pp. 177-190.
Dautenhahn, K. (2007), “Socially intelligent robots: dimensions of human-robot-interaction”, Philosophical Transactions B, Vol. 362 No. 1480, pp. 679-704.
Engelberger, J. (1989), Robotics in Service, MIT Press, Cambridge, Massachusetts, MA.
Fong, T., Nourbakhsh, I. and Dautenhahn, K. (2003), “A survey of socially interactive robots”, Robotics and Autonomous Systems, Vol. 42 Nos 3-4, pp. 143-166.
Goetz, J., Kiesler, S. and Powers, A. (2003), “Matching robot appearance and behavior to tasks to improve human-robot cooperation”, Proceedings of the IEEE International Workshop on Robot and Human Interactive Communication in Milbrae, CA, IEEE, New York, NY, pp. 55-60.
Harrigan, J.A. (2005), “Proxemics, kinesics, and gaze”, in Harrigan, J.A., Rosenthal, R. and Scherer, K.R. (Eds), The New Handbook of Methods in Nonverbal Behavior Research, Oxford University Press, Oxford.
Hellstrom, T. and Bensch, S. (2018), “Understandable robots”, Journal of Behavioral Robotics, Vol. 9 No. 1, pp. 110-123.
Hinds, P.J., Roberts, T.L. and Jones, H. (2004), “Whose job is it anyway? A study of human robot interaction in a collaborative task”, Human-Computer Interaction, Vol. 19 No. 1, pp. 151-181.
Hiroi, Y. and Ito, A. (2016), “Influence of the height of a robot on comfortableness of verbal interaction”, International Journal of Computer Science, Vol. 43 No. 4, pp. 1-9.
Hodges, S.D. and Myers, M.W. (2007), “Empathy”, in Baumeister, R.F. and Vohs, K.D. (Eds), Encyclopedia of Social Psychology, Sage, Thousand Oaks, California, CA, pp. 296-298.
Ihaz, P.L. and Kryssanov, V. (2018), ”Emotions and intentions mediated with dialogue acts”, Proceedings of 2018 5th International Conference on Business and Industrial Research in Bangkok, Thailand, IEEE, New York, NY, pp. 125-13.
Kanda, T. and Ishiguro, H. (2013), Human-Robot Interaction in Social Robotics, CRC Press, Boca Raton, Florida, FL.
Kanda, T., Miyashita, T., Osada, T., Haikawa, Y. and Ishiguro, H. (2008), “Analysis of humanoid appearance in human-robot interaction”, IEEE Transactions on Robotics, Vol. 24 No. 3, pp. 725-735.
Kelly, J. (2015), “Computing, cognition and the future of knowing: how humans and machines are forging a new age of understanding”, IBM Research and Solutions Portfolio, available at: http://researchweb.watson.ibm.com/software/IBMResearch/multimedia/Computing_Cognition_WhitePaper.pdf (accessed 23 August 2017).
Kim, M., Oh, J., Choi, J. and Kim, Y. (2011), “User‐centered HRI: HRI research methodology for designers”, in Wang, X. (Ed.), Mixed Reality and Human‐Robot Interaction, Springer, New York, NY, pp. 13-33.
Klein, J., Moon, Y. and Picard, R. (2002), “This computer responds to user frustration: theory, design, results, and implications”, Interacting with Computers, Vol. 14, pp. 119-140.
Kuipers, B. (2018), “How can we trust a robot?”, Communications of the ACM, Vol. 61 No. 3, pp. 86-95.
Lanillos, P., Ferreira, J.F. and Dias, J. (2017), “A bayesian hierarchy for robust gaze estimation in human-robot interaction”, International Journal of Approximate Reasoning, Vol. 87 No. 1, pp. 1-22.
Lee, S.L., Lau, I.Y., Kiesler, S. and Chiu, C.Y. (2005), “Human mental models of humanoid robots”, Proceedings of the 2005 IEEE International Conference on Robotics and Automation in Barcelona, Spain, IEEE, New York, NY, pp. 2767-2772.
Mantei, M. and Teorey, T. (1988), “Cost/benefit analysis for incorporating human factors in the software lifecycle”, Communications of the ACM, Vol. 31 No. 4, pp. 428-439.
Mathur, M.B. and Reichling, D.B. (2016), “Navigating a social world with robot partners: a quantitative cartography of the Uncanny Valley”, Cognition, Vol. 146, pp. 22-32.
Masuyama, N., Loo, C.K. and Seera, M. (2018), “Personality affected robotic emotional model with associative memory for human-robot interaction”, Neurocomputing, Vol. 272, pp. 213-225.
Mead, R. and Mataric, M.J. (2016), “Robots have needs too: how and why people adapt their proxemics behavior to improve robot signal understanding”, Journal of Human-Robot Interaction, Vol. 5 No. 6, pp. 48-68.
Miyauchi, D., Sakurai, A., Nakamura, A. and Kuno, Y. (2004), “Active eye contact for human-robot communication”, Proceedings of the CHI '04 Extended Abstracts on Human Factors in Computing Systems in Vienna, Austria, ACM, New York, NY, pp. 1099-1102.
Molich, R. and Nielsen, J. (1990), “Improving a human-computer dialogue”, Communications of the ACM, Vol. 33 No. 3, pp. 338-348.
Mori, M. (1970), “The uncanny valley”, Energy, Vol. 7 No. 4, pp. 33-35.
Nardelli, E. (2019), “Do we really need computational thinking?”, Communications of the ACM, Vol. 62 No. 2, pp. 32-35.
Niculescu, A., van Dijk, B., Nijholt, A., Li, H. and Lan, S.L. (2013), “Making social robots more attractive: the effects of voice pitch, humor, and empathy”, International Journal of Social Robotics, Vol. 5 No. 2, pp. 171-191.
Nijholt, A., Valitutti, A., Niculescu, A.I. and Banchs, R.E. (2017), “Humor in human-computer interaction: a short survey”, Proceedings of the Workshop on Designing Humor in Human-Compute Interaction in Mumbai, India, Industrial Design Centre, Indian Institute of Technology, Bombay, pp. 199-220.
Nieuwenhuisen, M., Stückler, J. and Behnke, S. (2010), “Intuitive multimodal interaction for service robots”, Proceedings of the 5th ACM/IEEE International Conference on Human-Robot Interaction in Osaka, Japan, IEEE Press, Piscataway, NJ, pp. 177-178.
Nitto, H., Taniyama, D. and Inagaki, H. (2017), Social Acceptance and Impact of Robots and Artificial Intelligence, Nomura Research Institute, Tokyo, Japan.
Onuki, T., Ishinoda, T., Kobayashi, Y. and Kuno, Y. (2013), “Designing robot eyes for gaze communication”, Proceedings of the IEEE Korea-Japan Joint Workshop on Frontiers of Computer Vision in Fukuoka, Japan, IEEE, New York, NY, pp. 97-102.
Park, J.W., Lee, H.S. and Chung, M.J. (2015), “Generation of realistic robot facial expressions for human robot interaction”, Journal of Intelligent and Robotic Systems, Vol. 78 Nos 3-4, pp. 443-462.
Parnas, D.L. (2017), “The real risks of artificial intelligence”, Communications of the ACM, Vol. 60 No. 10, pp. 27-31.
Parasuraman, A., Berry, L.L. and Zeithaml, V.A. (1991), “Understanding customer expectations of service”, Sloan Management Review, Vol. 32 No. 3, p. 42.
Picard, R. (2007), “Toward machines with emotional intelligence”, in Matthews, G., Zeidner, M. and Roberts, R.D. (Eds), The Science of Emotional Intelligence: Knowns and Unknowns, Oxford University Press, Oxford.
Rodriguez-Lizundia, E., Marcos, S., Zalama, E., Gómez-García-Bermejo, J. and Gordaliza, A. (2015), “A bellboy robot: study of the effects of robot behaviour on user engagement and comfort”, International Journal of Human-Computer Studies, Vol. 82 No. 1, pp. 83-95.
Satake, S., Kanda, T., Glas, D.F., Imai, M., Ishiguro, H. and Hagita, N. (2009), “How to approach humans? Strategies for social robots to initiate interaction”, Proceedings of the 4th ACM/IEEE Conference on Human-Robot Interaction in San Diego, CA, ACM, New York, NY, pp. 109-116.
Schermerhorn, P. and Scheutz, M. (2011), “Disentangling the effects of robot affect, embodiment, and autonomy on human team members in a mixed-initiative task”, Proceedings of the International Conference on Advances in Computer-Human Interactions in Gosier, France, IARIA XPS Press, pp. 236-241.
Scheutz, M., Cantrell, R. and Schermerhorn, P. (2011), “Toward humanlike task-based dialogue processing for human robot interaction”, AI Magazine, Vol. 32 No. 4, pp. 77-84.
Scholtz, J. (2003), “Theory and evaluation of human robot interactions”, Proceedings of the 36th Annual Hawaii International Conference on System Sciences in Big Island, Hawaii, IEEE Computer Society, Washington, DC, p. 125.
Shechtman, N. and Horowitz, L. (2003), “Media inequality in conversation: how people behave differently when interacting with computers and people”, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems in Fort Lauderdale, ACM, New York, NY, pp. 281-288.
Shibata, T. and Wada, K. (2011), “Robot therapy: a new approach for mental healthcare of the elderly – a mini-review”, Gerontology, Vol. 57, pp. 378-386.
Shneiderman, B. (1998), Designing the User Interface: Strategies for Effective Human-Computer Interaction, Addison-Wesley Longman Publishing Company, Boston, Massachusetts, MA.
Shneiderman, B. (2008), “Foreword on HHS web usability guidelines”, U.S. Department of Health and Human Services: Research-Based Web Design and Usability Guidelines, available at: https://www.usability.gov/sites/default/files/documents/guidelines_book.pdf (accessed 20 January 2019).
Steinfeld, A., Fong, T., Kaber, D., Lewis, M., Scholtz, J., Schultz, A. and Goodrich, M. (2006), “Common metrics for human-robot interaction”, Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction in Salt Lake City, UT, ACM, New York, NY, pp. 33-40.
Sung, J., Grinter, R.E. and Christensen, H.I. (2010), “Domestic robot ecology: an initial framework to unpack long-term acceptance of robots at home”, International Journal of Social Robotics, Vol. 2 No. 4, pp. 417-429.
Staudte, M. and Crocker, M.W. (2011), “Investigating joint attention mechanisms through spoken human–robot interaction”, Cognition, Vol. 120 No. 2, pp. 268-291.
Stock, R.M. and Merkle, M. (2017), “A service robot acceptance model: user acceptance of humanoid robots during the service encounter”, Proceedings of the IEEE International Conference on Pervasive Computing and Communications Workshops in Kona, Hawaii, IEEE, New York, NY, pp. 339-344.
Strapparava, C. and Mihalcea, R. (2008), “Learning to identify emotions in text”, Proceedings of the 2008 ACM Symposium on Applied Computing in Fortaleza, Brazil, ACM, New York, NY, pp. 1556-1560.
Tao, J. and Tieniu, T. (2005), “Affective computing: a review”, in Tao, J., Tan, T. and Picard, R.W. (Eds), Affective Computing and Intelligent Interaction, Springer International Publishing, Cham.
Thrun, S. (2004), “Toward a framework for human-robot interaction”, Human-Computer Interaction, Vol. 19 Nos 1-2, pp. 9-24.
Tsui, K., Abu-Zahra, K., Casipe, R., M'Sadoques, J. and Drury, J.L. (2009), “A process for developing specialized heuristics: case study in assistive robotics”, University of Massachusetts Lowell Technical Report, Department of Computer Science, available at: https://www.researchgate.net/publication/228753659_A_Process_for_Developing_Specialized_Heuristics_Case_Study_in_Assistive_Robotics (accessed 7 February 2019).
Trevelyan, J. (1999), “Redefining robotics for the next millennium”, The International Journal of Robotics Research, Vol. 18 No. 12, pp. 1211-1223.
Travelzoo (2016), “Travelers expect robots on their holidays by 2020”, Travelzoo Press Release, available at: https://press.travelzoo.com/robophiles--robophobes--britons-divided-use-of-robots-in-travel (accessed 16 January 2018).
Walters, M.L., Dautenhahn, K., Koay, K.L., Kaouri, C., te Boekhorst, R., Nehaniv, C., Werry, I. and Lee, D. (2005), “Close encounters: spatial distances between people and a robot of mechanistic appearance”, Proceedings of the 2005 5th IEEE-RAS International Conference on Humanoid Robots in Tsukuba, Japan, IEEE, New York, NY, pp. 450-455.
Wendt, C.S. and Berg, G. (2009), “Nonverbal humor as a new dimension of HRI”, Proceedings of the 18th International Symposium on Robot and Human Interactive Communication in Oyama, Japan, IEEE, New York, NY, pp. 183-188.
Xu, T., Zhang, H. and Yu, C. (2016), “See you see me: the role of eye contact in multimodal human-robot interaction”, ACM Transactions on Interactive Intelligent Systems, Vol. 6 No. 1, pp. 1-22.
Yanco, H.A. and Drury, J.L. (2002), “A taxonomy for human-robot interaction”, Proceedings of the AAAI Fall Symposium on Human-Robot Interaction in North Falmouth, Massachusetts, AAAI Press, Palo Alto, CA, pp. 111-119.
Zeithaml, V.A., Parasuraman, A. and Berry, L.L. (1990), Delivering Quality Service: Balancing Customer Perceptions and Expectations, The Free Press, New York, NY.
Zenk, J., Crowell, C.R., Villano, M., Kaboski, J., Tang, K. and Diehl, J.J. (2017), “Unconventional students in HRI education: a review of two initiatives”, Journal of Human-Robot Interaction, Vol. 6 No. 2, pp. 92-110.
Zeng, D. (2013), “From computation thinking to AI thinking”, IEEE Intelligent Systems, Vol. 28 No. 6, pp. 2-4.
Breazeal, C., Kidd, C.D., Thomaz, A.L., Hoffman, G. and Berlin, M. (2005), “Effects of nonverbal communication on efficiency and robustness in human-robot teamwork”, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems in Edmonton, Canada, IEEE, New York, NY, pp. 708-713.
Lohse, M., Hegel, F. and Wrede, B. (2008), “Domestic applications for social robots: an online survey on the influence of appearance and capabilities”, Journal of Physical Agents, Vol. 2 No. 2, pp. 21-32.
Royakkers, L. and Van Est, R. (2016), Just Ordinary Robots: Automation from Love to War, CRC Press, Boca Raton, FL.