Cobotic service teams and power dynamics: Understanding and mitigating unintended consequences of human-robot collaboration in healthcare services

  • Original Empirical Research
  • Published in the Journal of the Academy of Marketing Science

Abstract

In cobotic service teams, employees and robots collaborate to serve customers. As cobotic teams become more prevalent, a key question arises: How do consumers respond to cobotic teams, as a function of the roles shared by employees and robots (robots in superordinate roles as team leaders and humans in subordinate roles as assistants, or vice versa)? Six studies, conducted in different healthcare settings, show that consumers respond less favorably to robot-led (vs. human-led) teams. In delineating the process underlying these responses, the authors demonstrate that consumers ascribe less power to robot (vs. human) team leaders, which increases consumer anxiety and drives downstream responses through serial mediation. Further examining the power dynamics in cobotic service encounters, the authors identify boundary conditions that help mitigate negative consumer responses (increasing consumers’ power by letting them choose the robot in the service team, leveraging consumers’ power distance beliefs, and reinforcing the robot’s performance capabilities).


Notes

  1. Robots can be defined in multiple ways (Table 1 provides several illustrative definitions). Wirtz et al. (2018, p. 909) provide a nuanced conceptualization of ‘service robots’, which they define as “system-based autonomous and adaptable interfaces that interact, communicate and deliver service to an organization’s customers.” The authors also consider the role of multiple important robot design attributes and characteristics, such as human-like physical embodiment (vs. virtual representation). In line with current real-world examples in services (e.g., robots such as Pepper and Nao), we study embodied, humanoid robots; that is, devices housed in mobile, human-like bodies, which operate via powerful software that enables them to perform in a rational, seemingly human way (Broadbent et al., 2009; Duffy, 2003). In our General Discussion, we will return to the importance of robot characteristics and design features as well as additional specific roles of robots (e.g., companion robots) as a platform for future marketing research (Table 4). For broader reviews of the corresponding robotics literature in marketing and beyond, see Henschel et al. (2021), Lu et al. (2020), and Bartneck et al. (2020). For a recent review of how human and robot personalities affect human-robot interactions, see Robert et al. (2020).

  2. For example, in 2018, a hospital in Singapore began using a Nao robot to guide seniors with dementia in rehabilitation sessions and recovery therapy (Mosaic 2021).

  3. The field of cobotics is concerned with the design and evaluation of robotic systems built to collaborate with humans (Moulières-Seban et al., 2016; Peshkin & Colgate, 1999). In their review of human-robot collaboration, Bauer et al. (2008, p. 47) note that “humans and robots collaborating on a common task form a team,” and define collaborating as working with someone to reach a common goal. Accordingly, drawing on research in robotics (e.g., Bauer et al., 2008), we study cobotic service teams that include a human and a robot, collaborating in a co-located space, to serve a customer (also see Table 1). More complex configurations of cobotic teams are possible (i.e., multiple humans × multiple robots × multiple customers), but for this initial empirical investigation, we seek to establish the predicted effects in a comparatively simple setting: the triad of one employee, one robot, and one customer.

  4. Notably, this is not the first case of robots being used in roles of authority. For example, during the coronavirus pandemic, Singapore employed robotic dogs “to enforce strict social distancing” in public (Chen, 2023).

  5. To explore this notion empirically, we conducted a study (N = 80, MTurk, MAge = 38.05, 37 females) to examine consumers’ perceptions of doctors and assistants. Participants rated the relative extent to which power, autonomy, warmth, and competence (using the same scales as in the main studies) are usually exhibited by a doctor or an assistant in a typical medical visit (1 = “doctor has this trait” to 7 = “assistant has this trait”). The results of t-tests relative to the scale midpoint (4.0) revealed that participants typically perceive a doctor (relative to the assistant) as more powerful (M = 2.70; t(79) = -6.91, p < .001), more autonomous (M = 2.91; t(79) = -5.89, p < .001), and more competent (M = 3.61; t(79) = -2.36, p = .02), but less warm (M = 4.45; t(79) = 3.50, p < .001). These results are consistent with the notion that patient–doctor encounters are characterized by the doctor being perceived as having more power in the encounter, as they provide professional diagnoses and treatment (Calnan, 1984; Schei, 2006). (A brief analysis sketch of these midpoint tests follows these notes.)

  6. For the remaining studies (3–5), we develop the corresponding hypotheses as the studies are presented.

  7. These exercise classes are designed to be relatively standardized, considering the needs of the elderly participants and their (physical) abilities. Moreover, the exercise class occurred weekly, and respondents had participated in prior classes with a human-only instructor (i.e., neither the class nor the exercises were new to the patients).

  8. We calculated the alpha for the first wave (exercise class led by a human) and the second wave (exercise class led by a human-robot team) separately. The small sample size may account for the differences in alphas. (A reliability-computation sketch follows these notes.)

  9. Our results remain the same when performing a nonparametric Wilcoxon signed-rank test (favorability Z = -2.80, p = .005; behavioral intentions Z = -2.04, p = .042). (A corresponding test sketch follows these notes.)

  10. Further discussions of analyses related to the role of autonomy, warmth, and competence for this study and our other studies are provided in Web Appendix E. We explored these variables because an alternative perspective in robotics examines the extent to which humans perceive robots as social actors (Dautenhahn, 1999) and, therefore, judge robots along social perception dimensions, such as those that humans use to assess other humans (van Doorn et al., 2017). Considering this lens, some of our studies included measures of a robot’s warmth and competence, which represent the two fundamental dimensions of social perception (see Cuddy et al., 2008). We present these exploratory results in Web Appendix E as a potential inspiration for future work on cobotics.

  11. For completeness, for Study 1, we also examined the mediation path doctor type → anxiety → behavioral intentions (Hayes, 2015, Model 4, 5000 resamples). The analysis reveals a total indirect effect (a × b = -1.0556, 95% CI: [-1.5263, -.6344]). (A bootstrapped indirect-effect sketch follows these notes.)

  12. Preregistration is available at https://osf.io/jqzph/?view_only=c6de335a4921470b8d199e44d821fdcc

  13. In a mediation analysis (Hayes, 2015, Model 4, 5000 resamples), we use doctor type as the (multi-categorical) independent variable, anxiety as the mediator, and behavioral intentions as the dependent variable. The indirect effect of the human-led versus the robot-led cobotic team through anxiety was significant (a × b = -1.2298, 95% CI: [-1.6675, -.8192]), consistent with Study 1. The indirect effect of the human-only team versus the robot-led cobotic team was significant (a × b = -1.0430, 95% CI: [-1.4300, -.6711]). The indirect effect of the human-only team versus the human-led cobotic team was not significant (a × b = .1868, 95% CI: [-.1373, .5251]).

  14. For completeness, a mediation analysis (Hayes, 2015, Model 7, 5000 resamples) that included doctor type as the independent variable, choice as the moderator, anxiety as the mediator, behavioral intentions as the dependent variable, and pandemic concerns and gender as control variables revealed significant moderated mediation (a × b = .6564, 95% CI: [.1270, 1.2513]). Anxiety mediated the effect of doctor type on behavioral intentions in the control condition (a × b = -.5694, 95% CI: [-1.0593, -.1281]), as in previous studies, but not in the choice condition (a × b = .0871, 95% CI: [-.2944, .4470]). Anxiety also mediated the effect of choice on behavioral intentions when the doctor was a robot (a × b = .4551, 95% CI: [.0079, .9222]) but not when the doctor was a human (a × b = -.2100, 95% CI: [-.5681, .1381]). (An index-of-moderated-mediation sketch follows these notes.)

  15. We conduct a mediation analysis (Hayes, 2015, Model 4, 5000 resamples), with doctor type as the independent variable, anxiety as the mediator, and behavioral intentions as the dependent variable. We confirm the mediation path (a × b = -.5427, 95% CI: [-.9239, -.1905]).

  16. However, under some circumstances, customers with high power distance beliefs might feel superior to technology and therefore might be less likely to accept a robot physician; future research could examine this alternative idea (we thank one of the five reviewers for raising this point).

  17. We requested N = 150 MTurk participants, but we received N = 153. Prior to exposure to the main study, 12 participants failed an attention screener and were directed to the end of the study without participating, leaving a final sample of N = 141.

  18. For completeness, a mediation analysis (Hayes, 2015, Model 4, 5000 resamples) that included team leader as the independent variable, anxiety as the mediator, behavioral intentions as the dependent variable, and pandemic concerns and gender as control variables revealed significant mediation when comparing the human versus robot team leader conditions (a × b = -1.1221, 95% CI: [-1.6039, -.6665]) and the human versus robot-with-performance-information conditions (a × b = -.2330, 95% CI: [-.4066, -.0791]). As expected, the mediation for the robot versus robot-with-performance-information conditions was not significant (a × b = .4069, 95% CI: [.0205, .8121]).
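
The following is a minimal Python sketch for Note 5, not the authors' analysis code: it runs one-sample t-tests of each trait rating against the 7-point scale midpoint (4.0). The simulated ratings data and variable names are illustrative assumptions.

```python
# Hedged sketch for Note 5 (not the authors' code): one-sample t-tests of each
# trait rating against the scale midpoint of 4.0, where 1 = "doctor has this
# trait" and 7 = "assistant has this trait". The data are simulated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ratings = {                                   # hypothetical data, 80 respondents
    "power":      rng.normal(2.70, 1.5, 80),
    "autonomy":   rng.normal(2.91, 1.6, 80),
    "competence": rng.normal(3.61, 1.5, 80),
    "warmth":     rng.normal(4.45, 1.2, 80),
}

MIDPOINT = 4.0
for trait, values in ratings.items():
    t, p = stats.ttest_1samp(values, MIDPOINT)
    print(f"{trait}: M = {values.mean():.2f}, t({len(values) - 1}) = {t:.2f}, p = {p:.4f}")
```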
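
For Note 8, a minimal sketch under the assumption that the reliability coefficient is Cronbach's alpha, computed separately for each wave; the item matrices below are simulated placeholders, not the study data.

```python
# Hedged sketch for Note 8, assuming Cronbach's alpha as the reliability
# coefficient: alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
# computed separately per wave. The item matrices are simulated placeholders.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: a respondents-by-items matrix for one multi-item scale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

rng = np.random.default_rng(0)
latent = rng.normal(5.0, 1.0, size=(15, 1))               # 15 hypothetical patients
wave1_items = latent + rng.normal(0, 0.5, size=(15, 3))   # 3 correlated scale items
wave2_items = latent + rng.normal(0, 0.9, size=(15, 3))   # noisier second wave
print(f"alpha wave 1: {cronbach_alpha(wave1_items):.2f}, "
      f"alpha wave 2: {cronbach_alpha(wave2_items):.2f}")
```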
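
For Note 9, a minimal sketch of the nonparametric Wilcoxon signed-rank tests on paired wave 1 versus wave 2 ratings; the arrays are hypothetical placeholders, not the study data.

```python
# Hedged sketch for Note 9 (not the study data): Wilcoxon signed-rank tests on
# paired within-subject ratings from the two waves.
import numpy as np
from scipy import stats

favorability_wave1 = np.array([6, 7, 6, 5, 7, 6, 6, 5, 7, 6, 5, 7], dtype=float)
favorability_wave2 = np.array([5, 6, 5, 4, 6, 5, 6, 4, 6, 5, 4, 6], dtype=float)
intentions_wave1 = np.array([6, 6, 7, 5, 6, 7, 6, 5, 6, 6, 5, 6], dtype=float)
intentions_wave2 = np.array([5, 6, 6, 5, 5, 6, 6, 4, 6, 5, 5, 5], dtype=float)

for label, w1, w2 in [("favorability", favorability_wave1, favorability_wave2),
                      ("behavioral intentions", intentions_wave1, intentions_wave2)]:
    statistic, p = stats.wilcoxon(w1, w2)
    print(f"{label}: W = {statistic:.1f}, p = {p:.3f}")
```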
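
For the simple mediation results in Notes 11, 13, and 15, a hedged sketch of a bootstrapped indirect effect (a × b) with 5000 resamples and a percentile confidence interval, in the spirit of a Model 4 analysis; it is not the PROCESS macro the authors used, and the simulated variables (doctor_type, anxiety, intentions) are illustrative assumptions.

```python
# Hedged sketch for Notes 11, 13, and 15 (not the authors' PROCESS code):
# bootstrapped indirect effect a*b for doctor type -> anxiety -> behavioral
# intentions, with 5000 resamples and a percentile CI. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
doctor_type = rng.integers(0, 2, n).astype(float)        # 0 = human-led, 1 = robot-led
anxiety = 2.0 + 1.5 * doctor_type + rng.normal(0, 1, n)  # mediator
intentions = 6.0 - 0.8 * anxiety + rng.normal(0, 1, n)   # dependent variable

def indirect_effect(x, m, y):
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]                        # path a: x -> m
    b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit().params[1]  # path b: m -> y, controlling for x
    return a * b

boot = [indirect_effect(doctor_type[idx], anxiety[idx], intentions[idx])
        for idx in (rng.integers(0, n, n) for _ in range(5000))]
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
print(f"a*b = {indirect_effect(doctor_type, anxiety, intentions):.3f}, "
      f"95% CI [{ci_low:.3f}, {ci_high:.3f}]")
```

Covariates and multicategorical independent variables (as in Notes 13 and 18) are omitted here for brevity; PROCESS handles those directly.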
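
For Note 14, a hedged sketch of the index of moderated mediation (Hayes, 2015) for a first-stage moderated mediation model in the spirit of Model 7: the X × W interaction's effect on the mediator multiplied by the mediator's effect on the outcome. The control variables from the note are omitted, and all variable names and simulated data are illustrative assumptions.

```python
# Hedged sketch for Note 14 (not the authors' PROCESS code): the index of
# moderated mediation (Hayes, 2015) for first-stage moderated mediation,
# i.e., the X*W interaction's effect on the mediator times the mediator's
# effect on the outcome. Covariates are omitted; data are simulated.
import numpy as np
import statsmodels.api as sm

def index_of_moderated_mediation(x, w, m, y):
    m_design = sm.add_constant(np.column_stack([x, w, x * w]))  # M ~ X + W + X*W
    a3 = sm.OLS(m, m_design).fit().params[3]                    # interaction coefficient
    y_design = sm.add_constant(np.column_stack([m, x]))         # Y ~ M + X
    b = sm.OLS(y, y_design).fit().params[1]                     # mediator coefficient
    return a3 * b

rng = np.random.default_rng(2)
n = 300
doctor_type = rng.integers(0, 2, n).astype(float)  # 0 = human doctor, 1 = robot doctor
choice = rng.integers(0, 2, n).astype(float)       # 0 = no choice, 1 = consumer chose the robot
anxiety = 2.0 + 1.2 * doctor_type - 1.0 * doctor_type * choice + rng.normal(0, 1, n)
intentions = 6.0 - 0.7 * anxiety + rng.normal(0, 1, n)
print(index_of_moderated_mediation(doctor_type, choice, anxiety, intentions))
```

A confidence interval for this index could be obtained by resampling rows, as in the previous sketch.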

References

  • Anderson, J., & Rainie, L. (2023). The Future of Human Agency. Pew Research, 2/24/2023, [available at https://www.pewresearch.org/internet/2023/02/24/the-future-of-human-agency/]. Accessed 6/30/2023.

  • Anderson, C., John, O. P., & Keltner, D. (2012). The personal sense of power. Journal of Personality, 80(2), 313–344.

  • Asimov, I. (1950/2004). I, Robot. Vol. 1. Spectra.

  • Atreja, A., Bellam, N., & Levy, S. R. (2005). Strategies to enhance patient adherence: Making it simple. Medscape General Medicine, 7(1), 4.

  • Bartneck, C., Belpaeme, T., Eyssel, F., Kanda, T., Keijsers, M., & Šabanović, S. (2020). Human-robot interaction: An introduction. Cambridge University Press.

  • Bauer, A., Wollherr, D., & Buss, M. (2008). Human-robot collaboration: A survey. International Journal of Humanoid Robotics, 5(1), 47–66.

  • Biron, B. (2022). Robot Orders Increase 40% in First Quarter as Desperate Employers Seek Relief from Labor Shortages, Report Says. Business Insider, 5/29/2022, [available at https://www.businessinsider.com/robot-orders-up-40-percent-employers-seek-relief-labor-shortage-2022-5]. Accessed 6/30/2023.

  • Bradshaw, J. M., Feltovich, P., Johnson, M., Breedy, M., Bunch, L., Eskridge, T., Jung, H., Lott, J., Uszok, A., & van Diggelen, J. (2009). From Tools to Teammates: Joint Activity in Human-Agent-Robot Teams. In International Conference on Human Centered Design, 935–44.

  • Breazeal, C., Gray, J., Hoffman, G., & Berlin, M. (2004). Social robots: Beyond tools to partners. In 13th IEEE International Workshop on Robot and Human Interactive Communication (IEEE Catalog No. 04TH8759), 551–556. IEEE.

  • Broadbent, E., Stafford, R., & MacDonald, B. (2009). Acceptance of healthcare robots for the older population: Review and future directions. International Journal of Social Robotics, 1(4), 319–330.

  • Brown, J. H., & Raven, B. H. (1994). Power and compliance in doctor/patient relationships. Journal of Health Psychology, 6(1), 1–22.

  • Burgoon, M., Birk, T. S., & Hall, J. R. (1991). Compliance and satisfaction with physician-patient communication an expectancy theory interpretation of gender differences. Human Communication Research, 18(2), 177–208.

  • Čaić, M., Avelino, J., Mahr, D., Odekerken-Schroder, G., & Bernardino, A. (2020). Robotic versus human coaches for active aging: An automated social presence perspective. International Journal of Social Robotics, 12, 867–882.

  • Calnan, M. (1984). Clinical uncertainty: Is it a problem in the doctor-patient relationship? Sociology of Health & Illness, 6(1), 74–85.

  • Cantrell, S., Davenport, T. H., Hatfield, S., & Kreit, B. (2022). Strengthening the bonds of human and machine collaboration. Deloitte Insights, November 22. Retrieved on 1/31/2024 from https://www2.deloitte.com/xe/en/insights/topics/talent/human-machine-collaboration.html?trk=public_post_comment-text

  • Castelo, N., Bos, M. W., & Lehmann, D. R. (2019). Task-dependent algorithm aversion. Journal of Marketing Research, 56(5), 809–825.

  • Castelo, N., Boegershausen, J., Hildebrand, C., & Henkel, A. P. (2023). Understanding and improving consumer reactions to service bots. Journal of Consumer Research. https://academic.oup.com/jcr/advance-article/doi/10.1093/jcr/ucad023/7100346

  • Chafkin, M. (2018). Robots Could Replace Surgeons in the Battle Against Cancer. Bloomberg Businessweek, [available at https://www.bloomberg.com/news/features/2018-03-23/robots-could-replace-surgeons-in-the-battle-against-cancer]. Accessed 14 May 2019.

  • Chen, H. (2023). ‘Like something out of Black Mirror’: Police robots go on patrol at Singapore airport. CNN, [available at https://edition.cnn.com/2023/06/18/asia/police-robots-singapore-security-intl-hnk/index.html]. Accessed 6/18/2023.

  • Christoforou, E. G., Avgousti, S., Ramdani, N., Novales, C., & Panayides, A. S. (2020). The upcoming role for nursing and assistive robotics: Opportunities and challenges ahead. Frontiers in Digital Health, 2, 585656.

  • Cramer, H., Kemper, N., Amin, A., & Evers, V. (2009). The Effects of Robot Touch and Proactive Behaviour on Perceptions of Human-Robot Interactions. In 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI), La Jolla, California, 275–76.

  • Cuddy, A. J. C., Fiske, S. T., & Glick, P. (2008). Warmth and competence as universal dimensions of social perception: The stereotype content model and the BIAS map. Advances in Experimental Social Psychology, 40(January), 61–149.

  • Dautenhahn, K., Woods, S., Kaouri, C., Walters, M. L., Koay, K. L., & Werry, I. (2005). What is a Robot Companion – Friend, Assistant or Butler? In 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2005), 1192–1197.

  • Dautenhahn, K. (1999). Robots as Social Actors: Aurora and the Case of Autism. In Proc. CT99, The Third International Cognitive Technology Conference, August, San Francisco, 374.

  • de Graaf, M. M. A., & Allouch, S. B. (2013). The Relation Between People’s Attitude and Anxiety Towards Robots in Human-Robot Interaction. In IEEE RO-MAN, South Korea, 632–37.

  • De Visser, E., & Parasuraman, R. (2011). Adaptive aiding of human-robot teaming: Effects of imperfect automation on performance, trust, and workload. Journal of Cognitive Engineering and Decision Making, 5(2), 209–31.

  • Dietvorst, B. J., Simmons, J. P., & Massey, C. (2015). Algorithm aversion: People erroneously avoid algorithms after seeing them err. Journal of Experimental Psychology: General, 144(1), 114–126.

  • Dietvorst, B. J., Simmons, J. P., & Massey, C. (2018). Overcoming algorithm aversion: People will use imperfect algorithms if they can (even slightly) modify them. Management Science, 64(3), 1155–1170.

  • Djuric, A. M., Urbanic, R. J., & Rickli, J. L. (2016). A framework for collaborative robot (CoBot) integration in advanced manufacturing systems. SAE International Journal of Materials and Manufacturing, 9(2), 457–464.

  • Duffy, B. R. (2003). Anthropomorphism and the Social Robot. Robotics and Autonomous Systems, 42(3–4), 177–190.

  • ECRI. (2017). The Robot Will See You Now: Meet Xiaoyi, the First Robot to Pass China's Medical Licensing Exam. 11/29/2017, [available at https://www.ecri.org/components/HRCAlerts/Pages/HRCAlerts112917_Robots.aspx]. Accessed 6/30/2023.

  • Edwards, A., Omilion-Hodges, L., & Edwards, C. (2017). How Do Patients in a Medical Interview Perceive a Robot Versus Human Physician?. In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, 109–10.

  • Fosch-Villaronga, E., & Albo-Canals, J. (2019). ‘I’ll Take Care of You’, Said the Robot. Paladyn, Journal of Behavioral Robotics, 10(1), 77–93.

  • Galinsky, A. D., Gruenfeld, D. H., & Magee, J. C. (2003). From power to action. Journal of Personality and Social Psychology, 85(3), 453–466.

  • Geletkanycz, M., & Tepper, B. J. (2012). Publishing in AMJ–part 6: Discussing the implications. Academy of Management Journal, 55(2), 256–260.

  • Giebelhausen, M., Robinson, S. G., Sirianni, N. J., & Brady, M. K. (2014). Touch versus tech: When technology functions as a barrier or a benefit to service encounters. Journal of Marketing, 78(4), 113–124.

  • Glaa, B., Kristensson, P., & Witell, L. (2019). Service Teams and Understanding of Customer Value Creation. Service Innovation for Sustainable Business: Stimulating, Realizing and Capturing the Value from Service Innovation, 117–133.

  • Gombolay, M. C., Gutierrez, R. A., Clarke, S. G., Sturla, G. F., & Shah, J. A. (2015). Decision-making authority, team efficiency and human worker satisfaction in mixed human-robot teams. Autonomous Robots, 39(3), 293–312.

  • Gombolay, M., Yang, X. J., Hayes, B., Seo, N., Liu, Z., Wadhwania, S., Yu, T., Shah, N., Golen, T., & Shah, J. (2018). Robotic assistance in the coordination of patient care. The International Journal of Robotics Research, 37(10), 1300–1316.

  • Goodrich, M. A., & Schultz, A. C. (2007). Human-robot interaction: A survey. Foundations and Trends in Human-Computer Interaction, 1(3), 203–275.

  • Goodyear-Smith, F., & Buetow, S. (2001). Power issues in the doctor-patient relationship. Health Care Analysis, 9(4), 449–462.

  • Görür, O. C., Rosman, B., Sivrikaya, F., & Albayrak, S. (2018). Social Cobots: Anticipatory Decision-Making for Collaborative Robots Incorporating Unexpected Human Behaviors. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, Chicago, IL, 398–406.

  • Graber, M. A., Pierre, J., & Charlton, M. (2003). Patient opinions and attitudes toward medical student procedures in the emergency department. Academic Emergency Medicine, 10(12), 1329–1333.

  • Grandey, A. A., & Morris, K. (2023). Robots are changing the face of customer service. Harvard Business Review, March 22. Retrieved on 1/31/2024 at https://hbr.org/2023/03/robots-are-changing-the-face-of-customer-service

  • Gray, M. L., & Suri, S. (2017). The humans working behind the AI curtain. Harvard Business Review, 9(1), 2–5.

  • Güntürkün, P., Haumann, T., & Mikolon, S. (2020). Disentangling the differential roles of warmth and competence judgments in customer-service provider relationships. Journal of Service Research, 23(4), 476–503.

  • Hamilton, R. (2016). Consumer-based strategy: Using multiple methods to generate consumer insights that inform strategy. Journal of the Academy of Marketing Science, 44, 281–285.

  • Han, D. H., Lalwani, A. K., & Duhachek, A. (2017). Power distance belief, power, and charitable giving. Journal of Consumer Research, 44(1), 182–195.

  • Hancock, P. A., Billings, D. R., Schaefer, K. E., Chen, J. Y. C., de Visser, E. J., & Parasuraman, R. (2011). A meta-analysis of factors affecting trust in human-robot interaction. Human Factors, 53(5), 517–527.

  • Haseltine, W. A. (2018). Aging Populations Will Challenge Healthcare Systems All Over The World. https://www.forbes.com/sites/williamhaseltine/2018/04/02/aging-populations-will-challenge-healthcare-systems-all-over-the-world/?sh=58f1a4302cc3. Accessed Dec 2021.

  • Hashemian, M., Paiva, A., Mascarenhas, S., Santos, P. A., & Prada, R. (2019). The Power to Persuade: A Study of Social Power in Human-Robot Interaction. In 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), pp. 1–8. IEEE.

  • Hayes, A. F. (2015). An index and test of linear moderated mediation. Multivariate Behavioral Research, 50(1), 1–22.

  • Hennes, R. (2019). How a Medical Assistance Robot Named ‘Moxi’ is Helping UTMB Galveston Nurses. Chron, [available at https://www.chron.com/neighborhood/bayarea/news/article/Moxi-robot-UTMB-Galveston-Diligent-Robotics-13577952.php]. Accessed 5/13/2019.

  • Henschel, A., Laban, G., & Cross, E. S. (2021). What makes a robot social? A review of social robots from science fiction to a home or hospital near you. Current Robotics Reports, 2, 9–19.

  • Hinds, P. J., Roberts, T. L., & Jones, H. (2004). Whose job is it anyway? A study of human-robot interaction in a collaborative task. Human-Computer Interaction, 19(1), 151–181.

  • Hodgson, A. J., & Emrich, R. (2002). Control of minimally constrained cobots. Journal of Robotic Systems, 19(7), 299–314.

  • Hoffman, G., & Breazeal, C. (2004). Collaboration in Human-Robot Teams. In AIAA 1st Intelligent Systems Technical Conference, Chicago, IL, 1–18.

  • Hofstede, G. (2001). Culture’s Consequences: Comparing Values, Behaviors, Institutions and Organizations Across Nations, (J. Brace-Thompson, ed.). SAGE.

  • Holland, J., Kingston, L., McCarthy, C., Armstrong, E., O’Dwyer, P., Merz, F., & McConnell, M. (2021). Service robots in the healthcare sector. Robotics, 10(1), 47.

  • Holthöwer, J., & van Doorn, J. (2023). Robots do not judge: Service robots can alleviate embarrassment in service encounters. Journal of the Academy of Marketing Science, 51, 767–784.

  • Hoorn, J. F., & Winter, S. D. (2018). Here comes the bad news: Doctor robot taking over. International Journal of Social Robotics, 10(4), 519–535.

  • Hou, Y. T.-Y., & Jung, M. (2018). Robots in Power. In Proceedings of Longitudinal Human-Robot Teaming Workshop at HRI18, 325–38.

  • Hou, Y. T.-Y., Lee, W.-Y., & Jung, M. (2023). ‘Should I Follow the Human, or Follow the Robot?’—Robots in Power Can Have More Influence Than Humans on Decision-Making. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–13.

  • Howard, J. (2023). Concern Grows Around US Health-Care Workforce Shortage: ‘We Don’t Have Enough Doctors’. CNN, 5/16/2023, [available at: https://www.cnn.com/2023/05/16/health/health-care-worker-shortage/index.html]. Accessed 6/30/2023.

  • Huang, M.-H., & Rust, R. T. (2018). Artificial intelligence in service. Journal of Service Research, 21(2), 155–172.

  • Hui, M. K., & Bateson, J. E. G. (1991). Perceived control and the effects of crowding and consumer choice on the service experience. Journal of Consumer Research, 18(2), 174–184.

  • Hulland, J. (2019). In through the out door. Journal of the Academy of Marketing Science, 47, 1–3.

  • Hulland, J., & Houston, M. B. (2023). To Boldly Go …. Journal of the Academy of Marketing Science, 50(1), 1–3.

  • Inskeep, S. (2020). Robot Dog in Singapore Reminds People To Socially Distance, [available at https://www.npr.org/2020/05/11/853618557/robot-dog-in-singapore-reminds-people-to-socially-distance]. Accessed 11 May 2020.

  • Intel Technologies. (n.d.). Robotics in Healthcare to Improve Patient Outcomes. https://www.intel.com/content/www/us/en/healthcare-it/robotics-in-healthcare.html. Accessed 1 Sept 2023.

  • International Federation of Robotics (IFR). (2022). China Overtakes USA in Robot Density. https://ifr.org/ifr-press-releases/news/china-overtakes-usa-in-robot-density. Accessed 1 Sept 2023.

  • Intuitive. (n.d.). About Intuitive. Intuitive, [available at https://www.intuitive.com/en-us/about-us/company]. Accessed 25 Apr 2019.

  • Jackson, J. C., Castelo, N., & Gray, K. (2020). Could a rising robot workforce make humans less prejudiced? American Psychologist, 75(7), 969–982.

  • Janssen, C. P., Donker, S. F., Brumby, D. P., & Kun, A. L. (2019). History and future of human-automation interaction. International Journal of Human-Computer Studies, 131(November), 99–107.

  • Javaid, M., Haleem, A., Singh, R. P., Rab, S., & Suman, R. (2022). Significant applications of cobots in the field of manufacturing. Cognitive Robotics, 2, 222–233.

  • Jörling, M., Böhm, R., & Paluch, S. (2019). Service robots: Drivers of perceived responsibility for service outcomes. Journal of Service Research, 22(4), 404–420.

  • Kalis, B., Collier, M., & Fu, R. (2018). 10 Promising AI Applications in Health Care. Harvard Business Review, [available at http://www.ajronline.org/doi/abs/10.2214/AJR.11.7522]. Accessed 2 Aug 2018.

  • Kashyap, V., Antia, K. D., & Frazier, G. L. (2012). Contracts, extracontractual incentives, and ex post behavior in franchise channel. Journal of Marketing Research, 49(2), 260–276.

  • Kavilanz, P. (2018). The US Can’t Keep Up with Demand for Health Aides, Nurses and Doctors. CNN Business, [available at https://money.cnn.com/2018/05/04/news/economy/health-care-workers-shortage/index.html]. Accessed 14 Mar 2020.

  • Kochan, A. (2004). A cobotic solution for surgical applications. Industrial Robot: An International Journal, 31(6), 478–480.

  • Krüger, J., Lien, T. K., & Verl, A. (2009). Cooperation of human and machines in assembly lines. CIRP Annals, 58(2), 628–646.

  • Kuka. (2019). Robot-Assisted Rehabilitation – ROBERT and KUKA Facilitate Mobilization. KUKA, [available at https://www.kuka.com/en-us/industries/solutions-database/2019/08/robert-from-life-science-robotics]. Accessed 14 Mar 2020.

  • Lalwani, A. K., & Forcum, L. (2016). Does a dollar get you a dollar’s worth of merchandise? The impact of power distance belief on price-quality judgments. Journal of Consumer Research, 43(2), 317–333.

  • Lammers, J., Stoker, J. I., Rink, F., & Galinsky, A. D. (2016). To have control over or to be free from others? The desire for power reflects a need for autonomy. Personality and Social Psychology Bulletin, 42(4), 498–512.

  • Laveist, T. A., & Nuru-Jeter, A. (2002). Is doctor-patient race concordance associated with greater satisfaction with care? Journal of Health and Social Behavior, 43(3), 296–306.

  • Leotti, L. A., & Delgado, M. R. (2011). The inherent reward of choice. Psychological Science, 22(10), 1310–1318.

  • Leotti, L. A., Iyengar, S. S., & Ochsner, K. N. (2010). Born to choose: The origins and value of the need for control. Trends in Cognitive Sciences, 14(10), 457–463.

  • Leung, E., Paolacci, G., & Puntoni, S. (2018). Man versus machine: Resisting automation in identity-based consumer behavior. Journal of Marketing Research, 55(6), 818–831.

  • Li, J., Ju, W., & Nass, C. (2015). Observer Perception of Dominance and Mirroring Behavior in Human-Robot Relationships. In 2015 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 133–140.

  • Liberman-Pincu, E., Van Grondelle, E. D., & Oron-Gilad, T. (2021, March). Designing robots with relationships in mind: Suggesting two models of human-socially assistive robot (SAR) relationship. In Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction (pp. 555–558).

  • Logg, J. M., Minson, J. A., & Moore, D. A. (2019). Algorithm appreciation: People prefer algorithmic to human judgment. Organizational Behavior and Human Decision Processes, 151(March), 90–103.

  • Longoni, C., Bonezzi, A., & Morewedge, C. K. (2019). Resistance to medical artificial intelligence. Journal of Consumer Research, 46(4), 629–650.

  • Lu, V. N., Wirtz, J., Kunz, W. H., Paluch, S., Gruber, T., Martins, A., & Patterson, P. G. (2020). Service robots, customers and service employees: What can we learn from the academic literature and where are the gaps? Journal of Service Theory and Practice, 30(3), 361–391.

  • Lumer, E., & Buschmeier, H. (2022). Perception of Power and Distance in Human-Human and Human-Robot Role-Based Relations. In 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 895–899.

  • MacInnis, D. J. (2011). A framework for conceptual contributions in marketing. Journal of Marketing, 75(4), 136–154.

  • MacInnis, D. J., Morwitz, V. G., Botti, S., Hoffman, D. L., Kozinets, R. V., Lehmann, D. R., Lynch Jr, J. G., & Pechmann, C. (2020). Creating boundary-breaking, marketing-relevant consumer research. Journal of Marketing, 84(2), 1–23.

  • Magee, J. C., & Galinsky, A. D. (2008). Social hierarchy: The self-reinforcing nature of power and status. Academy of Management Annals, 2(1), 351–398.

  • Marr, B. (2018). How Is AI Used In Healthcare - 5 Powerful Real-World Examples That Show The Latest Advances. Forbes, [available at https://www.forbes.com/sites/bernardmarr/2018/07/27/how-is-ai-used-in-healthcare-5-powerful-real-world-examples-that-show-the-latest-advances/#62a1c1f75dfb]. Accessed 2 Aug 2018.

  • Marteau, T. M., & Bekker, H. (1992). The development of a six-item short-form of the state scale of the Spielberger State-Trait Anxiety Inventory (STAI). British Journal of Clinical Psychology, 31(3), 301–306.

  • Matthews, K. (2020). Pandemic Proves Utility of a Wide Range of Service Robots. The Robot Report, [available at https://www.therobotreport.com/pandemic-proves-utility-wide-range-service-robots/]. Accessed 11 May 2020.

  • Mattila, A. S. (2006). The power of explanations in mitigating the Ill-Effects of service failures. Journal of Services Marketing, 20(7), 422–428.

  • Mende, M., Scott, M. L., van Doorn, J., Grewal, D., & Shanks, I. (2019). Service robots rising: How humanoid robots influence service experiences and elicit compensatory consumer responses. Journal of Marketing Research, 56(4), 535–556.

  • Meuter, M. L., Ostrom, A. L., Roundtree, R. I., & Bitner, M. J. (2000). Self-service technologies: Understanding customer satisfaction with technology-based service encounters. Journal of Marketing, 64(3), 50–64.

  • Mosaic. (2021). The Future is Nao: Is Robot Therapy the Next Big Thing for Seniors’ Rehab?. https://aic-mosaic.sg/2021/04/07/robot-therapy-nao-future-rehabilitation-yishun-community-hospital/, April 07, 2021. Accessed 28 Nov 2023.

  • Moulières-Seban, T., Bitonneau, D., Salotti, J.-M., Thibault, J.-F., & Claverie, B. (2016). Human factors issues for the design of a cobotic system. In J. Chen (Ed.), Advances in human factors in robots and unmanned systems (pp. 375–385). Springer.

  • Murphy, R. R., Adams, J., & Gandudi, V. B. M. (2020). Robots are Playing Many Roles in the Coronavirus Crisis. https://theconversation.com/robots-are-playing-many-roles-in-the-coronavirus-crisis-and-offering-lessons-for-future-disasters-135527. Accessed 1 Sept 2023.

  • Noble, S. M., & Mende, M. (2023). The future of artificial intelligence and robotics in the retail and service sector: Sketching the field of consumer-robot-experiences. Journal of the Academy of Marketing Science, 51, 747–756.

  • Nomura, T., Kanda, T., & Suzuki, T. (2006). Experimental investigation into influence of negative attitudes toward robots on human-robot interaction. AI & Society, 20(2), 138–150.

  • Oravec, J. A. (2023). Rage against robots: Emotional and motivational dimensions of anti-robot attacks, robot sabotage, and robot bullying. Technological Forecasting and Social Change, 189, 122249.

  • Ostrom, A. L., Field, J. M., Fotheringham, D., Subramony, M., Gustafsson, A., Lemon, K. N., Huang, M.-H., & McColl-Kennedy, J. R. (2021). Service research priorities: Managing and delivering service in turbulent times. Journal of Service Research, 24(3), 329–353.

  • Panesar, S. S. (2018). The Surgical Singularity Is Approaching. Scientific American, [available at https://blogs.scientificamerican.com/observations/the-surgical-singularity-is-approaching/]. Accessed 25 Apr 2019.

  • Passaperuma, K., Higgins, J., Power, S., & Taylor, T. (2008). Do patients’ comfort levels and attitudes regarding medical student involvement vary across specialties? Medical Teacher, 30(1), 48–54.

  • Pauliková, A., Babeľová, Z. G., & Ubárová, M. (2021). Analysis of the impact of human–cobot collaborative manufacturing implementation on the occupational health and safety and the quality requirements. International Journal of Environmental Research and Public Health, 18(4), 1927.

  • Perlmuter, L. C., Scharff, K., Karsh, R., & Monty, R. A. (1980). Perceived control a generalized state of motivation. Motivation and Emotion, 4(1), 35–45.

  • Peshkin, M., & Colgate, J. E. (1999). Cobots. Industrial Robot: An International Journal, 26(5), 463–468.

  • Playle, J. F., & Keeley, P. (1998). Non-compliance and professional power. Journal of Advanced Nursing, 27(2), 304–311.

  • Rae, I. R., Takayama, L., & Mutlu, B. (2013). The Influence of Height in Robot-Mediated Communication. Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI’13). IEEE Press, Piscataway, NJ, 1–8.

  • Rafferty, A.-M., Ball, J., & Aiken, L. H. (2001). Are teamwork and professional autonomy compatible, and do they result in improved hospital care? BMJ Quality & Safety, 10(Suppl 2), ii32–ii37.

  • Reddy, S., Allan, S., Coghlan, S., & Cooper, P. (2020). A governance model for the application of AI in health care. Journal of the American Medical Informatics Association, 27(3), 491–497.

  • Reich-Stiebert, N., & Eyssel, F. (2015). Learning with educational companion robots? Toward attitudes on education robots, predictors of attitudes, and application potentials for education robots. International Journal of Social Robotics, 7(5), 875–888.

  • Robert Jr, L. P., Alahmad, R., Esterwood, C., Kim, S., You, S., & Zhang, Q. (2020). A review of personality in human-robot interactions. Foundations and Trends in Information Systems, 4(2), 107–212.

  • Robinson, B. (2021). ‘The Great Resignation’ Migration And What This Means For Your Career. https://www.forbes.com/sites/bryanrobinson/2021/06/11/the-great-resignation-migration-and-what-this-means-for-your-career/?sh=24bd842069aa. Accessed 12/2021.

  • Rühr, A., Berger, B., & Hess, T. (2019). Can I Control My Robo-Advisor? Trade-Offs in Automation and User Control in (Digital) Investment Management. In Twenty-fifth Americas Conference on Information, 1–10.

  • Ryan, J., & Sysko, J. (2007). The contingency of patient preferences for involvement in health decision making. Health Care Management Review, 32(1), 30–36.

  • Sarker, S., Jamal, L., Ahmed, S. F., & Irtisam, N. (2021). Robotics and artificial intelligence in healthcare during COVID-19 pandemic: A systematic review. Robotics and Autonomous Systems. https://doi.org/10.1016/j.robot.2021.10390

  • Savage, N. (2022). Robots rise to meet the challenge of caring for old people. Nature, 601, S8–S10.

  • Schei, E. (2006). Doctoring as leadership: The power to heal. Perspectives in Biology and Medicine, 49(3), 393–406.

  • Scholtz, J., Young, J., Drury, J. L., & Yanco, H. A. (2004). Evaluation of Human-Robot Interaction Awareness in Search and Rescue. In IEEE International Conference on Robotics and Automation, Barcelona, Spain, 2327–32.

  • Schüle, M., Kraus, J. M., Babel, F., & Reissner, N. (2022). Patients’ Trust in Hospital Transport Robots: Evaluation of the Role of User Dispositions, Anxiety, and Robot Characteristics. HRI '22: Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction, March, 246–255.

  • Shen, J., Zhang, C. J. P., Jiang, B., Chen, J., Song, J., Liu, Z., He, Z., Wong, S. Y., Fang, P.-H., & Ming, W.-K. (2019). Artificial intelligence versus clinicians in disease diagnosis: Systematic review. JMIR Medical Informatics, 7(3), e10010.

  • Specht, J., Egloff, B., & Schmukle, S. C. (2013). Everything Under control? The effects of age, gender, and education on trajectories of perceived control in a nationally representative German sample. Developmental Psychology, 49(2), 353–364.

  • Spiller, S. A., Fitzsimons, G. J., Lynch Jr, J. G., & McClelland, G. H. (2013). Spotlights, floodlights, and the magic number zero: Simple effects tests in moderated regression. Journal of Marketing Research, 50(2), 277–288.

  • Stein, J.-P., Liebold, B., & Ohler, P. (2019). Stay Back, Clever Thing! Linking situational control and human uniqueness concerns to the aversion against autonomous technology. Computers in Human Behavior, 95(June), 73–82.

  • Strickland, E. (2016). Would You Trust a Robot Surgeon to Operate on You? IEEE Spectrum. Accessed 25 Apr 2019.

  • Tafarodi, R. W., Milne, A. B., & Smith, A. J. (1999). The confidence of choice: Evidence for an augmentation effect on self-perceived performance. Personality and Social Psychology Bulletin, 25(11), 1405–1416.

  • Tsai, C.-Y., Marshall, J. D., Choudhury, A., Serban, A., Hou, Y.-Y., Jung, M. F., Dionne, S. D., & Yammarino, F. J. (2022). Human-robot collaboration: A multilevel and integrated leadership framework. The Leadership Quarterly, 33(1), 101594.

  • Tsarouchi, P., Makris, S., & Chryssolouris, G. (2016). On a human and dual-arm robot task planning method. Procedia CIRP, 57, 551–555.

  • van Doorn, J., Mende, M., Noble, S. M., Hulland, J., et al. (2017). Domo Arigato Mr. Roboto: Emergence of automated social presence in organizational frontlines and customers’ service experiences. Journal of Service Research, 20(1), 43–58.

  • van Doorn, J., Smailhodzic, E., Puntoni, S., Li, J., Schumann, J. H., & Holthöwer, J. (2023). Organizational frontlines in the digital age: The Consumer-Autonomous Technology–Worker (CAW) framework. Journal of Business Research, 164, 114000.

  • Walliser, J. C., de Visser, E. J., Wiese, E., & Shaw, T. H. (2019). Team structure and team building improve human-machine teaming with autonomous agents. Journal of Cognitive Engineering and Decision Making, 13(4), 258–278.

  • West, C. (1984). When the doctor is a ‘Lady’: Power, status and gender in physician-patient encounters. Symbolic Interaction, 7(1), 87–106.

  • Westfall, C. (2022). Troubling trend: Great resignation versus AI, robotics and automation. Forbes, January 11. Retrieved on 1/31/2024 from https://www.forbes.com/sites/chriswestfall/2022/01/11/great-resignation-versus-increasing-investment-in-ai-robotics-and-automation-a-troubling-trend/?sh=3a610efa63f6

  • Winfield, A. (2012). Robotics: A very short introduction. OUP Oxford.

  • Wirtz, J., Patterson, P. G., Kunz, W. H., Gruber, T., et al. (2018). Brave new world: Service robots in the frontline. Journal of Service Management, 29(5), 907–931.

  • Yan, L. (2018). Chinese AI Beats Doctors in Diagnosing Brain Tumors. Popular Mechanics. https://www.popularmechanics.com/technology/robots/a22148464/chinese-ai-diagnosed-brain-tumors-more-accurately-physicians/. Accessed 1 Sept 2023.

  • You, S., & Robert, L. (2019). Trusting robots in teams: Examining the impacts of trusting robots on team performance and satisfaction. In: Proceedings of the 52nd Hawaii International Conference on System Sciences, 8–11.

  • Zeithaml, V. A., Berry, L. L., & Parasuraman, A. (1996). The behavioral consequences of service quality. Journal of Marketing, 60(2), 31–46.

  • Zhang, Y., Winterich, K. P., & Mittal, V. (2010). Power distance belief and impulsive buying. Journal of Marketing Research, 47(5), 945–954.

  • Złotowski, J., Yogeeswaran, K., & Bartneck, C. (2017). Can we control it? Autonomous robots threaten human identity, uniqueness, safety, and resources. International Journal of Human-Computer Studies, 100(April), 48–54.

Author information

Corresponding author

Correspondence to Maura L. Scott.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Kirk Plangger served as Guest Editor for this article.

This research is based on the dissertation of the first author, Ilana Shanks, who passed away unexpectedly and is deeply missed. The authors gratefully acknowledge a Customer Experience Grant from the Marketing Science Institute, which helped to support this research.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (DOCX 4426 KB)

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Shanks, I., Scott, M.L., Mende, M. et al. Cobotic service teams and power dynamics: Understanding and mitigating unintended consequences of human-robot collaboration in healthcare services. J. of the Acad. Mark. Sci. (2024). https://doi.org/10.1007/s11747-024-01004-1


  • DOI: https://doi.org/10.1007/s11747-024-01004-1
