Emotional communication by service robots: a research agenda

Marc Becker (Department of Marketing and Supply Chain Management, School of Business and Economics, Maastricht University, Maastricht, The Netherlands)
Emir Efendić (Department of Marketing and Supply Chain Management, School of Business and Economics, Maastricht University, Maastricht, The Netherlands)
Gaby Odekerken-Schröder (Department of Marketing and Supply Chain Management, School of Business and Economics, Maastricht University, Maastricht, The Netherlands)

Journal of Service Management

ISSN: 1757-5818

Article publication date: 29 April 2022

Issue publication date: 8 July 2022

Abstract

Purpose

Many service industries are facing severe labor shortages. As a result, service providers are turning to new sources of labor, such as service robots. Critics, however, often point out that service robots lack emotional communication capabilities, without which they cannot be expected to truly replace human employees and fill the emerging labor market gaps. Here, a research agenda for the investigation of the role of emotional communication by service robots and its effects on customers and their service experience is laid out. This paper aims to propose that research in this area will further understanding of how service robots can add value to service frontlines, engage customers, increasingly replace service employees and ultimately help overcome pressing labor shortages.

Design/methodology/approach

A research agenda is conceptualized, structured around the three-step emotional communication process (i.e. read, decide and express) and the four emotional communication strategies crucial for service interactions (i.e. mimicking, alleviating, infusing and preventing).

Findings

Three contributions are made. First, the importance of emotional communication by service robots during service interactions is highlighted. Second, interdisciplinary research priorities and opportunities in this emerging field are mapped out. Third, a theoretical structure to connect the findings of future studies is provided.

Originality/value

Service research investigating the role and implications of emotional communication by service robots is scarce. A research agenda to guide the exploration of this crucial, yet underresearched component of customer-robot service interactions is structured and mapped out.

Citation

Becker, M., Efendić, E. and Odekerken-Schröder, G. (2022), "Emotional communication by service robots: a research agenda", Journal of Service Management, Vol. 33 No. 4/5, pp. 675-687. https://doi.org/10.1108/JOSM-10-2021-0403

Publisher

Emerald Publishing Limited

Copyright © 2022, Marc Becker, Emir Efendić and Gaby Odekerken-Schröder

License

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode.


Introduction

Many service industries are facing severe labor shortages (Decker et al., 2017). For example, due to an aging population, the USA expects a shortage of 151,000 caregivers by 2030 (Osterman, 2017). Similarly, after large pandemic-related layoffs in the hospitality industry, about 50% of service employees in the USA (Dmitrieva, 2021) and about 16% in Germany (Sullivan, 2021) have left the industry. Consequently, service providers are seeking alternative sources of labor, such as service robots (Decker et al., 2017). Service robots are "system-based autonomous and adaptable interfaces that interact, communicate and deliver services to an organization's customers" (Wirtz et al., 2018, p. 909). Simply put, service robots are an interactive communication technology, particularly amenable to communication interventions and strategies. As such, it can be argued that service robots can augment and sometimes even replace human service employees. In the hospitality industry, they greet customers, serve dishes and handle customer check-ins (Pieskä et al., 2013). In elderly care, they are used for health monitoring, encouragement of health-promoting exercises and general companionship (Čaić et al., 2018). Service robots are also used in health care, retailing and office settings (Calderone, 2019).

However, a crucial limitation critics often point out is that service robots lack emotional communication capabilities (e.g. Reis et al., 2020). Emotional communication refers to the process of recognizing another person's emotional state and influencing it through one's own emotional displays (Reis and Sprecher, 2009). It is a multidisciplinary concept relevant in diverse fields, such as communication, marketing and psychology (Keltner and Haidt, 1999; van der Meer and Verhoeven, 2014; Zhang et al., 2014). Emotional communication is reflected in our language and relationships, in how we are perceived by others and in what influences our choices. Most relevant for our research agenda, service research repeatedly shows how vital it is that service employees understand customer emotions and control their own emotions. These capabilities are associated with more positive customer service outcomes (Delcourt et al., 2013) and higher employee sales performance (Kidwell et al., 2021). Similarly, emotional service failure recovery is more effective than economic service failure recovery in achieving consumer forgiveness (Wei et al., 2020). Such positive effects are contingent on the authenticity of the displayed emotions (Hennig-Thurau et al., 2006). Consequently, recognizing and authentically expressing emotions may be a crucial ability for service robots, one that could lead to their smoother integration into service frontlines.

To advance our understanding of the role of emotional communication by service robots, as well as its effect on customers and their service experience, we propose a research agenda. We argue that research in this area will provide insights into how service robots can add value to service frontlines, engage customers, increasingly replace service employees and ultimately help overcome pressing labor shortages. We structure our research agenda around four emotional communication strategies and a three-step emotional communication process.

The four emotional communication strategies

In a service encounter, different emotional communication strategies are required for different types of interactions. For example, sometimes a service robot must proactively express emotions, while other times it must react to customer emotions. Therefore, a service robot must command both proactive and reactive emotional communication strategies. In addition, sometimes a service robot must react to or prevent negative customer emotions, while other times it can infuse or mimic positive customer emotions. Consequently, a service robot must command emotional communication strategies for different emotional valences (i.e. positive and negative customer emotions). From the combination of these two dimensions (i.e. reactivity and valence), four unique strategies emerge (see Figure 1).
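As a purely illustrative sketch (not an established implementation), the two-by-two structure behind Figure 1 can be encoded as a simple lookup from the two dimensions (reactivity and valence) to the four strategies; all identifiers in the snippet are assumptions introduced here for illustration only.

```python
from enum import Enum

class Reactivity(Enum):
    REACTIVE = "reactive"    # responds to an observed customer emotion
    PROACTIVE = "proactive"  # acts before any customer emotion is observed

class Valence(Enum):
    POSITIVE = "positive"
    NEGATIVE = "negative"

# Crossing the two dimensions yields the four strategies (cf. Figure 1).
STRATEGY = {
    (Reactivity.REACTIVE, Valence.POSITIVE): "mimicking",    # amplify positive emotions
    (Reactivity.REACTIVE, Valence.NEGATIVE): "alleviating",  # address negative emotions
    (Reactivity.PROACTIVE, Valence.POSITIVE): "infusing",    # induce positive emotions
    (Reactivity.PROACTIVE, Valence.NEGATIVE): "preventing",  # avert negative emotions
}

print(STRATEGY[(Reactivity.REACTIVE, Valence.NEGATIVE)])  # -> alleviating
```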

Alleviating: The aim of the alleviating strategy is to alleviate negative customer emotions. For example, a customer may be disappointed about having received an incorrect dish. Here, customers expect the service robot to address these emotions (Gruber, 2011; Wei et al., 2020). Based on their empirical study in a service employee context, Chebat and Slusarczyk (2005, p. 670) conclude that “even if the problem, which triggered the complaint, can be fixed, the customers do not necessarily remain loyal if the emotions are not properly attended to”.

Mimicking: The aim of the mimicking strategy is to strengthen or amplify customers' positive emotions. For example, if a customer happily smiles, a service robot could make them feel even better by smiling back. Such imitation and matching of another person's nonverbal behavior is called mimicry (Hess, 2021). Chartrand and Lakin (2013) review many of its positive effects, such as increased affiliation, rapport and collaboration.

Infusing: The aim of the infusing strategy is to proactively infuse positive customer emotions. For instance, a service robot that greets guests cheerfully could put them in a positive mood as they enter. This "flow of emotions from one person to another, with the receiver 'catching' the emotions that the sender displays" is called emotional contagion (Hennig-Thurau et al., 2006, p. 58). The contagious effect of smiling, in particular, has been investigated, showing a positive association between a smiling (vs nonsmiling) service employee and customer satisfaction (Otterbring, 2017).

Preventing: The aim of the preventing strategy is to prevent or minimize negative customer emotions before they occur. For example, a service robot might have to convey a message that could put the customer into a negative emotional state, such as a stockout or a lost reservation. By adapting its emotional demeanor, the service robot may prevent or at least minimize negative customer emotions. For example, emotional communication is crucial not only for doctors when giving bad news (Baile and Beale, 2003) but also for airlines when informing customers of an overbooked flight (Nazifi et al., 2020).

The emotional communication process

To determine what emotional communication should look like for service robots under each of these four strategies, we briefly review the emotional communication process. We model it in three steps, based on the emotion model for social robots by Paiva et al. (2014) (see Figure 2); a minimal illustrative sketch of the resulting loop follows the three steps below.

  • Step 1: The first step differs between reactive and proactive emotional communication strategies. For reactive strategies, the service robot reads the customer's emotional state to later either mimic or alleviate it. For example, it recognizes that a customer is disappointed. In contrast, for proactive strategies, the service robot reads a service situation to spot opportunities to either prevent negative customer emotions or infuse positive ones. For example, it anticipates that informing a customer of a lost reservation can result in disappointment or even anger.

  • Step 2: In the second step, the service robot decides on an adequate emotional expression. For example, in situations in which the customer appears to be disappointed, it determines that it is more appropriate to display an apologetic rather than a cheerful demeanor.

  • Step 3: In the third step, the service robot expresses the identified emotion. For example, it shows a sad face on its built-in display. After the expression of the emotion, the process starts over with Step 1.
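The loop described above can be summarized in the following minimal sketch, assuming a very simple rule-based decision step; the sensor input format, the example rules and the output channel are illustrative assumptions rather than part of the Paiva et al. (2014) model.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Percept:
    """What the robot reads in Step 1: a customer emotion (reactive strategies)
    or a service situation (proactive strategies)."""
    customer_emotion: Optional[str] = None  # e.g. "disappointment"
    situation: Optional[str] = None         # e.g. "lost_reservation"

def read(sensor_input: dict) -> Percept:
    """Step 1: read the customer's emotional state or the service situation."""
    return Percept(customer_emotion=sensor_input.get("emotion"),
                   situation=sensor_input.get("situation"))

def decide(percept: Percept) -> str:
    """Step 2: decide on an adequate emotional expression (illustrative rules only)."""
    if percept.customer_emotion == "disappointment":
        return "apologetic"   # alleviating: address the negative emotion
    if percept.customer_emotion == "joy":
        return "smile"        # mimicking: match the positive emotion
    if percept.situation == "lost_reservation":
        return "concerned"    # preventing: soften the bad news
    return "friendly"         # infusing: default positive demeanor

def express(emotion: str) -> None:
    """Step 3: express the identified emotion through the robot's output channels."""
    print(f"[robot display] showing a(n) {emotion} expression")

# One pass through the loop; afterwards the process starts over with Step 1.
express(decide(read({"emotion": "disappointment"})))
```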

Although service research has devoted considerable attention to understanding communication in service encounters, studies on communication (and in particular on emotional communication) for service robots are largely lacking (Peter and Kühne, 2018). In fact, we found only a few studies looking at emotionally communicative service robots. Yu and Ngan (2019) observe that service robot smiles are perceived more positively by men. Further, analyzing sentiments towards the highly anthropomorphic robot “Sophia”, Chuah and Yu (2021) show that its expressions of surprise and happiness are perceived most positively. With our research agenda, we seek to spur investigation of this important research area.

Contributions

Our research agenda makes three major contributions. First, we highlight the importance of emotional communication by service robots, a crucial, yet underresearched aspect of customer-robot interactions. Second, we map out a wide range of interdisciplinary research priorities crucial to determining how service robots can add value to service frontlines, engage customers, increasingly replace service employees and ultimately help overcome pressing labor shortages. Third, by structuring the research agenda around the emotional communication process and the four emotional communication strategies, we provide a structure that will act as a backbone to connect the findings of future studies.

Research agenda

We organize the research agenda around the emotional communication process (see Figure 2). We propose that each step comes with unique research needs. Because of strategy-specific differences, we further organize the discussion around the four emotional communication strategies (see Figure 1). To inspire future research in this burgeoning topic, we discuss strategy-specific research areas of interest for each step of the emotional communication process below (see Figure 3).

Step 1: reading emotions and situations

The first step of the emotional communication process largely differs between reactive (i.e. alleviating and mimicking) and proactive (i.e. infusing and preventing) emotional communication strategies. While reactive strategies require the service robot to recognize customer emotions, proactive strategies require it to be aware of the service situation.

Alleviating and mimicking

Before a service robot can alleviate or mimic a customer's emotional state, that state must first be identified. There are multiple techniques with which artificial systems can detect human emotions, such as facial recognition, voice analysis and heart rate detection (Egger et al., 2019). Beforehand, though, how should service robots best categorize emotions? An argument could be made that service robots ought to focus on discrete (rather than dimensional) emotion detection because its fine-grained distinctions between emotions allow for better responsive actions (Nabi, 2010). This raises the question of whether distinguishing discrete emotions is not only feasible for service robots but also superior. In addition, how granular do those distinctions need to be in (different) service contexts? Importantly, service robots often operate in unstructured environments. We then need to ask how service robots can detect customer emotions under varying conditions.
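To illustrate the difference between dimensional and discrete emotion categorization, the following sketch maps a dimensional reading (valence and arousal) onto coarse discrete labels. The thresholds and labels are illustrative assumptions only; a real system would rely on validated emotion models and far more fine-grained, service-relevant categories.

```python
def to_discrete(valence: float, arousal: float) -> str:
    """Map a dimensional reading (valence/arousal, both in [-1, 1])
    to a coarse discrete emotion label (illustrative thresholds only)."""
    if valence >= 0:
        return "excitement" if arousal >= 0 else "contentment"
    return "anger" if arousal >= 0 else "disappointment"

# Example: a low-arousal, negative-valence reading is labeled as disappointment.
print(to_discrete(valence=-0.6, arousal=-0.4))  # -> disappointment
```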

These questions are not purely technical challenges (Ostrom et al., 2021). We suggest that service, communication and psychology researchers can make important contributions as well. For example, researchers from these fields could investigate how service providers can optimize the servicescape to facilitate emotion recognition. They could also contribute by mapping service-relevant emotions or by identifying cues for emotions that are difficult to distinguish.

Capturing, aggregating and analyzing customer emotions offers great opportunities for service marketers, but it can also have wide-reaching social, societal and business implications (Helberger et al., 2020). Regarding customers' right to privacy, how can service providers responsibly use emotional data collected by service robots, and what actions must policymakers take before widespread misuse occurs? Further, feeling monitored or surveilled could result in adverse customer reactions. How, then, will customers perceive service robots that analyze every muscle movement for emotional cues? Importantly, will privacy and monitoring concerns decrease or increase as customers become used to the technology, and how can service providers minimize adverse customer reactions?

Infusing and preventing

For proactive emotional communication strategies (i.e. infusing and preventing), a service robot reads the service situation to determine whether a proactive expression of emotions could be beneficial. For example, a service robot must recognize situations in which it can use its own emotional displays to either infuse positive or prevent negative customer emotions. It is then important to ask in which situations a preventing or an infusing strategy is beneficial, necessary or even damaging. Only once these questions have been investigated can service robots be programmed to recognize opportunities for proactive emotional displays.
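A minimal sketch of such situation reading, assuming a simple rule-based mapping from recognized service events to proactive strategies, is shown below; the event names and mappings are hypothetical and would need to be grounded in the research questions raised above.

```python
from typing import Optional

# Hypothetical mapping from recognized service events to proactive strategies.
PROACTIVE_TRIGGERS = {
    "customer_enters": "infusing",     # opportunity to infuse positive emotions
    "stockout": "preventing",          # bad news ahead: try to prevent negative emotions
    "lost_reservation": "preventing",
    "long_waiting_time": "preventing",
}

def proactive_strategy(event: str) -> Optional[str]:
    """Return the proactive strategy for a recognized service event, if any."""
    return PROACTIVE_TRIGGERS.get(event)

print(proactive_strategy("customer_enters"))  # -> infusing
```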

Step 2: Deciding on appropriate emotional expressions

In the second emotional communication step, a service robot must decide which emotional display is most appropriate to adopt. Before discussing strategy-specific research questions, we highlight several general research areas of interest. A crucial question is how to define and measure success for emotional communication by service robots. Beyond management applications, the identification of (new) methods and metrics is essential for service robots to learn and improve their emotional communication. What ground truths can best be used for this purpose? In particular, what objective should service robots' emotional communication pursue – maximizing interaction-based or organizational performance measures? Also, we know that carefully curated training data is crucial for service robots to successfully express empathetic responses (Rashkin et al., 2019). Which (field) data is then best suited for training and improving service robots' emotional communication?

However, until service robots can learn for themselves when best to display a specific emotion, service providers must make these decisions upfront when designing the entire customer service experience. This makes it essential to determine how different emotional displays by service robots impact customers and how service robots' emotional communication can be integrated into the service design process. For example, which service design methods can be used, and which skills and expertise are necessary? Also, how do service robot hardware and software producers impact service differentiation, and to whom do customers attribute emotional communication failures?

Furthermore, emotionally communicative service robots' direct face-to-face customer contact requires service providers to become highly customer-centric when designing their emotional communication (Finne and Grönroos, 2017). What (new) methodologies and approaches are required to design these highly customized and emotional messages? Importantly, service robots might need to autonomously adapt their emotional messages to unique situations. How, then, can service providers ensure consistent branding and messaging? Also, how much autonomy should service robots have when mediating the service provider's emotional messages (e.g. see Hancock et al., 2020)?

Beyond service and communication design, emotionally communicative service robots might greatly impact other stakeholders in the broader service delivery network and ecosystem (Barile et al., 2016). We need to understand who these actors are and how they impact the service provider. We know that service robots can create but also destroy value for partners in the service delivery network (Čaić et al., 2018). For which partners could emotionally communicative service robots become value creators/destroyers? Value might also be destroyed when emotionally communicative service robots bring imbalance to service delivery networks (Verleye et al., 2017). For example, past studies have generally suggested that service employees should take on emotional and relational roles, while service robots should be assigned mechanical and analytical ones (Huang and Rust, 2020). How do emotional communication capabilities impact these allocations (especially in the long term), which new roles become not only feasible but also socially acceptable, and how does performance in these roles compare between service robots and service employees?

Roles could also shift because of new affordances enabled by service robots' emotional communication capabilities (Evans et al., 2017). What will these affordances look like, how will they impact customer behaviors and attitudes, and what new tasks and roles will they enable? In the long term, as emotionally communicative service robots are uniquely positioned to take over jobs from humans, how will they impact the overall labor market, and which skills do service employees need to develop to stay relevant (Ostrom et al., 2021)?

When deciding on an adequate emotional response, service robots need to consider relevant others around them (Abboud et al., 2021). For example, in restaurant settings, customers often do not order and eat alone. Consequently, how does emotional communication by service robots need to differ between interactions with individual customers and with groups of customers? Similarly, in situations in which service robots work side by side with service employees, when and how should service robots synchronize their emotional expressions with the service employee?

Lastly, emotionally communicative service robots beget a host of moral and ethical issues. How can (especially vulnerable) customers be protected against (emotional) manipulations by service robots? Further, how can service providers use these new technological possibilities responsibly and what measures must policymakers take to prevent misuse?

Besides these general questions, the different emotional communication strategies come with their unique challenges and research priorities.

Alleviating

For the alleviating strategy, a service robot needs to decide which emotional display will most likely alleviate the customer's negative emotional state. It is, thus, important to look at what emotional response a customer finds appropriate in different situations, especially when this response comes from a machine. Importantly, errors may arise; research therefore needs to investigate what adverse effects an inappropriate emotional display by a service robot can have, especially if the customer is already in a negative emotional state. Furthermore, for service failure recoveries, emotional communication is often even more important than economic compensation (Wei et al., 2020). How effective and important, then, are emotional displays by service robots for restoring customer service outcomes? And as customers get used to these emotional displays, does their effectiveness increase or decrease?

Mimicking

Once a service robot has recognized that a customer is in a positive emotional state, it must decide whether to mimic the customer's emotions. Which positive customer emotions should a service robot mimic? An important caveat is to investigate which emotions customers perceive as genuine and which as unsettling when mimicked by a service robot. However, the key question is whether mimicry has equally positive effects on customer emotions and customer service outcomes for service robots as it does for service employees (see Chartrand and Lakin, 2013). Also, does the (in)appropriateness of mimicry depend on the task and service situation? Longitudinal studies could help determine whether these perceptions and effects change as customers get used to emotional communication by service robots.

Infusing

After identifying an opportunity to infuse positive emotions, a service robot must decide which emotion to display. Here, the first question is which emotions are effective and perceived as genuine. Even more importantly, different service interactions require different kinds of emotional displays. Consequently, which proactive emotional displays are the most appropriate for different tasks, interactions and situations?

The proposed mechanism underlying the infusing strategy is emotional contagion. However, a crucial question is whether emotional displays by service robots are, in fact, contagious and, if so, which ones are. To support the effectiveness of infusing strategies, it is also important to investigate other potential underlying mechanisms [e.g. feelings-as-information; Otterbring (2017)].

Service interactions seldom occur in isolation. Consequently, we need to ask to what extent proactive emotional communication must take past service encounters, customer characteristics and external events into account. Also, how important is personalization? In particular, the effect of a cheerful robot greeting might disappear if every guest is greeted in the same way.

Preventing

If a service robot recognizes that an interaction could soon provoke negative customer emotions, it needs to preemptively decide on an appropriate emotional message to prevent or minimize these negative emotions. We need to investigate whether service robots can, in fact, achieve this goal through emotional communication. Similarly, it needs to be investigated whether emotional displays are sufficient to prevent the deterioration of customer service outcomes. Here, a crucial question is which proactive emotional displays customers expect in different situations, especially considering that they might have different expectations toward a service robot than toward a service employee. How do these expectations change over time?

Step 3: Expressing emotions

In the third and last step of the emotional communication process, the service robot needs to (convincingly) express the decided-on emotion. Humans express emotions through verbal and nonverbal cues (Reis and Sprecher, 2009). Service robots, too, can communicate using verbal behaviors, nonverbal behaviors and appearance characteristics (Van Pinxteren et al., 2020). Here, service robots can use human-like types of emotional expression (e.g. smiles and tone of voice) or machine-specific forms of emotional expression (e.g. sounds and color changes). Which modes of emotional expression are perceived most positively across different contexts? Further, how does their appropriateness change once customers get used to (the idea of) service robots expressing emotions?
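The distinction between human-like and machine-specific modes of expression can be sketched as follows; the cue catalogs and the fallback logic are illustrative assumptions, since the available modes depend on a robot's actual hardware.

```python
from typing import List

# Illustrative cue catalogs; which cues exist depends on the robot's hardware.
HUMAN_LIKE_MODES = {
    "joy": ["smile on built-in display", "upbeat tone of voice"],
    "apology": ["sad face on built-in display", "softer tone of voice"],
}
MACHINE_SPECIFIC_MODES = {
    "joy": ["green LED ring", "short rising chime"],
    "apology": ["dim blue LED ring", "low descending tone"],
}

def expression_plan(emotion: str, prefer_human_like: bool = True) -> List[str]:
    """Assemble the cues used to express an emotion, falling back to the other mode set."""
    primary = HUMAN_LIKE_MODES if prefer_human_like else MACHINE_SPECIFIC_MODES
    secondary = MACHINE_SPECIFIC_MODES if prefer_human_like else HUMAN_LIKE_MODES
    return primary.get(emotion) or secondary.get(emotion, [])

print(expression_plan("apology"))  # -> ['sad face on built-in display', 'softer tone of voice']
```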

In addition, emotional displays can differ greatly in their abstraction and realism. The uncanny valley theory posits that overly human-like service robots provoke eerie user reactions (Wirtz et al., 2018). Inconsistencies in realism also appear to be greatly unsettling (Kätsyri et al., 2015). We therefore need to ask which level of human-likeness and realism is desired in emotional expressions by service robots, and how a service robot's emotional expressions must match its appearance and behaviors.

Besides these general questions, this step of the emotional communication process also comes with multiple strategy-specific research priorities.

Alleviating

For the alleviating strategy, it is crucial that a service robot's emotional response is convincing, as the customer is in a negative emotional state. Which mode of emotional expression do customers respond to best in such a situation? We also need to ask how and when the service robot ought to match the intensity of its emotional expression to that of the customer. In addition, since the customer is in a negative emotional state, an indiscreet service robot might not only further anger the customer but might also cause a spillover effect on other customers. This raises the question of how discreet the service robot's emotional displays should be.

Mimicking

After having decided to mimic the customer, the service robot must use its emotional expression capabilities to convincingly express the identified emotion. Importantly, in the foreseeable future, service robots will not be able to perfectly mimic every muscle movement of the customer. How much detail is then needed for successful mimicking? Moreover, many of today's service robots do not look anything like human beings. How much human-likeness, therefore, is needed for customers to feel mimicked?

In addition, because of such limited versatility in expressing emotions, there is a high chance that the service robot does not exactly mimic the customer but instead displays a different emotion. This raises the question of how such mismatches impact customers and their service experience.

Lastly, all these considerations and technological limitations point to one critical question: Besides building service robots that can copy all human muscle movements, what can service providers do to make mimicking by their service robots more effective?

Infusing

To put the customer into a positive emotional state, a service robot could proactively express positive emotions. As current service robots need to rely on more basic and abstracted forms of emotional expression, we ought to understand whether such abstracted emotions are contagious and how important human-likeness is for emotional contagion. Moreover, when a service robot smiles all the time, the smile might be perceived as a design element rather than an emotional expression. This means it is important to understand how customers perceive proactive emotional displays in the long-term, whether it is necessary for a service robot to have an emotional baseline, and where this baseline should be. From this, it stands to reason that we need to understand which default emotional demeanor a service robot should display while not engaging in emotional communication.

Preventing

Once a service robot has decided on an emotional expression to prevent negative customer emotions, it must convincingly express it. Two crucial questions are which mode of expression customers respond to most favorably and whether the service robot has to match the intensity of its emotional expression to the severity of the situation. In addition, a crucial consideration for such proactive displays is timing. It is currently not clear when the service robot should change its emotional expression. If a service robot must inform a customer of a stockout, should it adapt its emotional demeanor already when it approaches the table, when it begins talking, or is it necessary to synchronize the emotional expression precisely with the spoken words?

Conclusion

While technology is increasingly integrated into service delivery, services' interactive and interpersonal nature has thus far made (human) service employee involvement indispensable. However, this might soon change as, for example, emotionally communicative service robots could deliver highly engaging and interactive service experiences. Consequently, we are on the verge of a paradigm shift in which technology not only facilitates service interactions but spearheads them. Understanding these new types of service encounters is a daunting task for researchers, one that requires the integration of knowledge from fields beyond communication and service research. The good news is that emotional communication is investigated in many fields, such as linguistics, social and organizational psychology, marketing and communication. There are, thus, plenty of findings and insights to draw from and learn from. How scholars from these diverse fields collaborate and share insights will greatly determine whether the technology can unfold its full potential. With our research agenda, we aim to spur and structure future multidisciplinary research efforts on this important topic.

To conclude, we want to highlight three additional considerations. First, we would like to emphasize that research on the different steps of the emotional communication process (i.e. read, decide and express) does not need to be done chronologically. Second, the role, effectiveness and implementation of emotional communication will most likely, and probably often substantially, differ across contexts, industries and business strategies. Lastly, not all customers are equal, nor are their perceptions of emotions in service robots. For example, there might be demographic or cultural differences (Egger et al., 2019; Yu and Ngan, 2019). Being aware of such differences is essential for enabling service robots to successfully engage in emotional communication with different customers in different contexts.

Figures

Figure 1: The four emotional communication strategies

Figure 2: The emotional communication process

Figure 3: Summary of research agenda with exemplary research questions

References

Abboud, L., As'ad, N., Bilstein, N., Costers, A., Henkens, B. and Verleye, K. (2021), “From third party to significant other for service encounters: a systematic review on third-party roles and their implications”, Journal of Service Management, Vol. 32 No. 4, pp. 533-559, doi: 10.1108/JOSM-04-2020-0099.

Baile, W.F. and Beale, E.A. (2003), “Giving bad news to cancer patients: matching process and content”, Journal of Clinical Oncology, Vol. 21 No. 9, pp. 49-51, doi: 10.1200/jco.2003.01.169.

Barile, S., Lusch, R., Reynoso, J., Saviano, M. and Spohrer, J. (2016), “Systems, networks, and ecosystems in service research”, Journal of Service Management, Vol. 27 No. 4, pp. 652-674, doi: 10.1108/JOSM-09-2015-0268.

Čaić, M., Odekerken-Schröder, G. and Mahr, D. (2018), “Service robots: value co-creation and co-destruction in elderly care networks”, Journal of Service Management, Vol. 29 No. 2, pp. 178-205, doi: 10.1108/JOSM-07-2017-0179.

Calderone, L. (2019), “More industrial automation, robots and unmanned vehicles resources”, available at: https://www.roboticstomorrow.com/article/2019/02/what-are-service-robots/13161 (accessed 26 October 2021).

Chartrand, T.L. and Lakin, J.L. (2013), “The antecedents and consequences of human behavioral mimicry”, Annual Review of Psychology, Vol. 64 No. 1, pp. 285-308, doi: 10.1146/annurev-psych-113011-143754.

Chebat, J.-C. and Slusarczyk, W. (2005), “How emotions mediate the effects of perceived justice on loyalty in service recovery situations: an empirical study”, Journal of Business Research, Vol. 58 No. 5, pp. 664-673, doi: 10.1016/j.jbusres.2003.09.005.

Chuah, S.H.-W. and Yu, J. (2021), “The future of service: the power of emotion in human-robot interaction”, Journal of Retailing and Consumer Services, Vol. 61, 102551, doi: 10.1016/j.jretconser.2021.102551.

Decker, M., Fischer, M. and Ott, I. (2017), “Service Robotics and Human Labor: a first technology assessment of substitution and cooperation”, Robotics and Autonomous Systems, Vol. 87, pp. 348-354, doi: 10.1016/j.robot.2016.09.017.

Delcourt, C., Gremler, D.D., van Riel, A.C.R. and van Birgelen, M. (2013), “Effects of perceived employee emotional competence on customer satisfaction and loyalty”, Journal of Service Management, Vol. 24 No. 1, pp. 5-24, doi: 10.1108/09564231311304161.

Dmitrieva, K. (2021), “Half of U.S. Hospitality workers won't return in job crunch”. available at: https://www.bloomberg.com/news/articles/2021-07-08/half-of-u-s-hospitality-workers-won-t-return-in-job-crunch (accessed 26 October 2021).

Egger, M., Ley, M. and Hanke, S. (2019), “Emotion recognition from physiological signal analysis: a review”, Electronic Notes in Theoretical Computer Science, Vol. 343, pp. 35-55, doi: 10.1016/j.entcs.2019.04.009.

Evans, S.K., Pearce, K.E., Vitak, J. and Treem, J.W. (2017), “Explicating affordances: a conceptual framework for understanding affordances in communication research”, Journal of Computer-Mediated Communication, Vol. 22 No. 1, pp. 35-52, doi: 10.1111/jcc4.12180.

Finne, Å. and Grönroos, C. (2017), “Communication-in-use: customer-integrated marketing communication”, European Journal of Marketing, Vol. 51 No. 3, pp. 445-463, doi: 10.1108/EJM-08-2015-0553.

Gruber, T. (2011), “I want to believe they really care”, Journal of Service Management, Vol. 22 No. 1, pp. 85-110, doi: 10.1108/09564231111106938.

Hancock, J.T., Naaman, M. and Levy, K. (2020), “AI-mediated communication: definition, research agenda, and ethical considerations”, Journal of Computer-Mediated Communication, Vol. 25 No. 1, pp. 89-100, doi: 10.1093/jcmc/zmz022.

Helberger, N., Huh, J., Milne, G., Strycharz, J. and Sundaram, H. (2020), “Macro and exogenous factors in computational advertising: key issues and new research directions”, Journal of Advertising, Vol. 49 No. 4, pp. 377-393, doi: 10.1080/00913367.2020.1811179.

Hennig-Thurau, T., Groth, M., Paul, M. and Gremler, D.D. (2006), “Are all smiles created equal? How emotional contagion and emotional labor affect service relationships”, Journal of Marketing, Vol. 70 No. 3, pp. 58-73, doi: 10.1509/jmkg.70.3.058.

Hess, U. (2021), “Who to whom and why: the social nature of emotional mimicry”, Psychophysiology, Vol. 58 No. 1, p. e13675, doi: 10.1111/psyp.13675.

Huang, M.-H. and Rust, R.T. (2020), “Engaged to a robot? The role of AI in service”, Journal of Service Research, Vol. 24 No. 1, pp. 30-41, doi: 10.1177/1094670520902266.

Kätsyri, J., Förger, K., Mäkäräinen, M. and Takala, T. (2015), “A review of empirical evidence on different uncanny valley hypotheses: support for perceptual mismatch as one road to the valley of eeriness”, Frontiers in Psychology, Vol. 6 No. 390, doi: 10.3389/fpsyg.2015.00390.

Keltner, D. and Haidt, J. (1999), “Social functions of emotions at four levels of analysis”, Cognition and Emotion, Vol. 13 No. 5, pp. 505-521, doi: 10.1080/026999399379168.

Kidwell, B., Hasford, J., Turner, B., Hardesty, D.M. and Zablah, A.R. (2021), “Emotional calibration and salesperson performance”, Journal of Marketing, Vol. 85 No. 6, pp. 141-161, doi: 10.1177/0022242921999603.

Nabi, R.L. (2010), “The case for emphasizing discrete emotions in communication research”, Communication Monographs, Vol. 77 No. 2, pp. 153-159, doi: 10.1080/03637751003790444.

Nazifi, A., Gelbrich, K., Grégoire, Y., Koch, S., El-Manstrly, D. and Wirtz, J. (2020), “Proactive handling of flight overbooking: how to reduce negative eWOM and the costs of bumping customers”, Journal of Service Research, Vol. 24 No. 2, pp. 206-225, doi: 10.1177/1094670520933683.

Osterman, P. (2017), Who Will Care for Us?: Long-Term Care and the Long-Term Workforce, Russell Sage Foundation, New York, NY.

Ostrom, A.L., Field, J.M., Fotheringham, D., Subramony, M., Gustafsson, A., Lemon, K.N., Huang, M.-H. and McColl-Kennedy, J.R. (2021), “Service research priorities: managing and delivering service in turbulent times”, Journal of Service Research, Vol. 24 No. 3, pp. 329-353, doi: 10.1177/10946705211021915.

Otterbring, T. (2017), “Smile for a while: the effect of employee-displayed smiling on customer affect and satisfaction”, Journal of Service Management, Vol. 28 No. 2, pp. 284-304, doi: 10.1108/JOSM-11-2015-0372.

Paiva, A., Leite, I. and Ribeiro, T. (2014), "Emotion modeling for social robots", in Calvo, R., D'Mello, S., Gratch, J. and Kappas, A. (Eds), The Oxford Handbook of Affective Computing, Oxford Library of Psychology, pp. 296-308.

Peter, J. and Kühne, R. (2018), "The new frontier in communication research: why we should study social robots", Media and Communication, Vol. 6 No. 3, p. 4, doi: 10.17645/mac.v6i3.1596.

Pieskä, S., Luimula, M., Jauhiainen, J. and Spiz, V. (2013), “Social service robots in wellness and restaurant applications”, Journal of Communication and Computer, Vol. 10, pp. 116-123.

Rashkin, H., Smith, E.M., Li, M. and Boureau, Y.-L. (2019), “Towards empathetic open-domain conversation models: a new benchmark and dataset”, Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy, Association for Computational Linguistics, pp. 5370-5381, doi: 10.18653/v1/P19-1534.

Reis, H.T. and Sprecher, S. (2009), Encyclopedia of Human Relationships, Sage Publications, Thousand Oaks, CA.

Reis, J., Melão, N., Salvadorinho, J., Soares, B. and Rosete, A. (2020), “Service robots in the hospitality industry: the case of Henn-na hotel, Japan”, Technology in Society, Vol. 63, 101423, doi: 10.1016/j.techsoc.2020.101423.

Sullivan, A. (2021), “'Staff wanted' as pandemic forces hospitality workers to rethink”. available at: https://www.dw.com/en/staff-wanted-as-pandemic-forces-hospitality-workers-to-rethink/a-59118045 (accessed 26 October 2021).

van der Meer, T.G.L.A. and Verhoeven, J.W.M. (2014), “Emotional crisis communication”, Public Relations Review, Vol. 40 No. 3, pp. 526-536, doi: 10.1016/j.pubrev.2014.03.004.

Van Pinxteren, M.M.E., Pluymaekers, M. and Lemmink, J.G.A.M. (2020), “Human-like communication in conversational agents: a literature review and research agenda”, Journal of Service Management, Vol. 31 No. 2, pp. 203-225, doi: 10.1108/JOSM-06-2019-0175.

Verleye, K., Jaakkola, E., Hodgkinson, I.R., Jun, G.T., Odekerken-Schröder, G. and Quist, J. (2017), “What causes imbalance in complex service networks? Evidence from a public health service”, Journal of Service Management, Vol. 28 No. 1, pp. 34-56, doi: 10.1108/JOSM-03-2016-0077.

Wei, C., Liu, M.W. and Keh, H.T. (2020), “The road to consumer forgiveness is paved with money or apology? The roles of empathy and power in service recovery”, Journal of Business Research, Vol. 118, pp. 321-334, doi: 10.1016/j.jbusres.2020.06.061.

Wirtz, J., Patterson, P.G., Kunz, W.H., Gruber, T., Lu, V.N., Paluch, S. and Martins, A. (2018), “Brave new world: service robots in the frontline”, Journal of Service Management, Vol. 29 No. 5, pp. 907-931, doi: 10.1108/JOSM-04-2018-0119.

Yu, C.-E. and Ngan, H.F.B. (2019), “The power of head tilts: gender and cultural differences of perceived human vs human-like robot smile in service”, Tourism Review, Vol. 74 No. 3, pp. 428-442, doi: 10.1108/TR-07-2018-0097.

Zhang, H., Sun, J., Liu, F. and Knight, G.J. (2014), “Be rational or be emotional: advertising appeals, service types and consumer responses”, European Journal of Marketing, Vol. 48 Nos 11/12, pp. 2105-2126, doi: 10.1108/EJM-10-2012-0613.

Corresponding author

Marc Becker is the corresponding author and can be contacted at: m.becker@maastrichtuniversity.nl

About the authors

Marc Becker is a PhD candidate at the Department of Marketing and Supply Chain Management at Maastricht University's School of Business and Economics. His research interests are in service research, particularly concerning the impact of service robots on customers, businesses and society at large.

Dr. Emir Efendić is an assistant professor at Maastricht University's School of Business and Economics with the Department of Marketing and Supply Chain Management. His work mostly focuses on how people make decisions and judgments. Lately, his focus has increasingly been on algorithmic judgments and how people interact with artificial/robotic systems.

Prof. Dr. Gaby Odekerken-Schröder is a full professor in Customer-Centric Service Science with the Department of Marketing and Supply Chain Management at Maastricht University's School of Business and Economics. Her main research interests are service innovation, service robots, healthcare services, relationship management, customer loyalty, and service failure and recovery. She is one of the cofounders of Maastricht University's Service Science Factory, as she loves to bridge theory and practice. Her research has been published in Journal of Marketing, Management Information Systems Quarterly (MISQ), Journal of Retailing, Journal of Service Research, International Journal of Social Robotics, Journal of the American Medical Directors Association, Journal of Service Management, Journal of Services Marketing, Journal of Business Research and many more.
