Abstract
The implementation of care robotics in care settings is identified by some authors as a disruptive innovation, in the sense that it will upend the praxis of care. Whether this alleged disruption will also have a transformative impact on established ethical concepts and principles remains an open ethical question. One prevalent worry is that the implementation of care robots will turn deception into a routine component of elderly care, at least to the extent that these robots will function as simulacra of something that they are not (i.e. human caregivers). At face value, this may indeed seem to indicate a concern about how this technology may upend existing practices and relationships within a care setting. On closer inspection, however, this reaction may instead point to a rediscovery and revaluation of a particularly well-entrenched value or virtue: veracity. The virtue of veracity is one of the values mobilized to argue against a substitution of human caregivers (whereas a combination of care robots and human caregivers meets with much more acceptance). This paper explores how the moral panic surrounding care robots should be interpreted not so much as anticipating a probable disruption of care settings, but rather as a sensitizing – in a way conservationist – argument that identifies veracity as an established value that is to be protected and advanced in present-day and future care settings.
Notes
This indicates that a so-called principle-based approach can be compatible with a form of virtue ethics, and indeed also with what has been called a care-ethics approach (which Beauchamp and Childress (2019) actually regard as “a form of virtue ethics”), for they explicitly mention the virtues of caring as fundamental to health care practices and relationships. On that reading, and contrary to what is sometimes maintained (without elaborate argumentation), a principlist account need not overlook the virtues (of care), as it may indeed encompass a commitment to those traits that are valued in real-life settings of (aged) care.
Concrete examples of robots in social care practice are Paro (which looks like a baby seal), Pepper (a humanoid robot), MiRo (an animal-like robot companion), and Zora’s virtual robot (embedded in a simulated environment on a computer screen, though compatible with ‘real-life robots’). Other examples are discussed by Sharkey and Sharkey (2011) and, more recently, in a UK parliamentary briefing on robotics in social care (Parliamentary Office of Science and Technology, 2018).
Sparrow and Sparrow (2006) illustrate this point by referring to Nozick’s Experience Machine argument, claiming that we desire the real world to be a certain way, not just our beliefs about the world to be a certain way, which is why most of us would allegedly not be willing to trade our “real-world life” for life connected to a hypothetical machine that could provide any experience one might desire. A full discussion would lead me too far afield, though it is worth mentioning that the assumption – shared by Sparrow and Sparrow (2006) – that people resist connecting to the experience machine because they believe that a good life is about more than having beliefs/experiences about the world is not the consensus view in philosophy. A common criticism of this assumption is that most people would decline to plug in to the machine because they fear that it might malfunction (see e.g. Bramble, 2016). Related, though more directly concerned with the deception objection against “emotional robots”, is Coeckelbergh’s (2012) explicit criticism of Sparrow and Sparrow (2006), in which the author critically engages with the “ideal emotional communication condition” that seems to serve as a normative requirement in the deception objection.
One may find it notable that Sparrow (2002), who played a significant role in raising the ethical issue of deception in the context of robotic companions, carefully noted that this issue may perhaps not be “the most urgent issue facing society today”. That said, it is understandable that some critics today see valid reason for concern, considering how motivations to develop SCREs are linked to addressing the shortage of human caregivers and the challenges of an ageing population, which already seems to include a notion of substitution.
Some would even say that the former is a specification of the latter (Beauchamp & Childress, 2019).
References
Beauchamp, T. L., & Childress, J. F. (2019). Principles of biomedical ethics (8th ed.). Oxford University Press.
Borenstein, J., & Pearson, Y. (2012). Robot caregivers: Ethical issues across the human lifespan. In P. Lin, K. Abney, & G. Bekey (Eds.), Robot ethics: The ethical and social implications of robotics (pp. 251–265). MIT Press.
Bramble, B. (2016). The experience machine. Philosophy Compass, 11, 136–145. https://doi.org/10.1111/phc3.12303.
Coeckelbergh, M. (2012). Are emotional robots deceptive? IEEE Transactions on Affective Computing, 3, 388–393. https://doi.org/10.1109/T-AFFC.2011.29.
Coeckelbergh, M. (2015). Artificial agents, good care, and modernity. Theoretical Medicine and Bioethics, 36, 265–277. https://doi.org/10.1007/s11017-015-9331-y.
Coeckelbergh, M. (2016). Care robots and the future of ICT-mediated elderly care: A response to doom scenarios. AI & Society, 31(4), 455–462. https://doi.org/10.1007/s00146-015-0626-3.
Fox, R. C., & Swazey, J. P. (1984). Medical morality is not bioethics–medical ethics in China and the United States. Perspectives in Biology and Medicine, 27(3), 336–360. https://doi.org/10.1353/pbm.1984.0060.
Fry, S. T. (1989). Toward a theory of nursing ethics. Advances in Nursing Science, 11(4), 9–22.
Fry, S. T. (1995). Nursing ethics. In S. G. Post (Ed.), Encyclopedia of bioethics (pp. 1898–1903). Gale Group.
Isle of Wight Council (2018). Social care digital innovation programme. Discovery phase report for exploring the potential for cobots to support carers. Isle of Wight Council. Retrieved March 7, 2022, from https://www.local.gov.uk/sites/default/files/documents/IoW%20final%20deliverable%20FINAL%20for%20publication.pdf
Jasanoff, S. (2016). The ethics of invention: Technology and the human future. W. W. Norton.
Jecker, N. S., & Reich, W. T. (1995). Care. In S. G. Post (Ed.), Encyclopedia of bioethics (pp. 349–374). Gale Group.
Kalisz, D. E., Khelladi, I., Castellano, S., & Sorio, R. (2021). The adoption, diffusion & categorical ambiguity trifecta of social robots in e-health – insights from healthcare professionals. Futures, 129, 102743. https://doi.org/10.1016/j.futures.2021.102743.
Maibaum, A., Bischof, A., Hergesell, J., & Lipp, B. (2022). A critique of robotics in health care. AI & Society, 37, 467–477. https://doi.org/10.1007/s00146-021-01206-z.
Meyers, C. (2021). Deception and the clinical ethicist. The American Journal of Bioethics, 21(5), 4–12. https://doi.org/10.1080/15265161.2020.1863513.
Ministerie van Volksgezondheid, Welzijn en Sport. (2022). WOZO programma wonen, ondersteuning en zorg voor ouderen. Retrieved October 11, 2022, from https://www.rijksoverheid.nl/documenten/rapporten/2022/07/04/wozo-programma-wonen-ondersteuning-en-zorg-voor-ouderen
Mois, G., & Beer, J. M. (2020). Robotics to support aging in place. In R. Pak, E. J. de Visser, & E. Rovira (Eds.), Living with robots (pp. 49–74). Academic Press.
Parliamentary Office of Science and Technology (2018). Robotics in social care, The Parliamentary Office of Science and Technology. Retrieved March 7, 2022, from https://researchbriefings.files.parliament.uk/documents/POST-PN-0591/POST-PN-0591.pdf
Parviainen, J., Turja, T., Van Aerschot, L. (2019). Social robots and human touch in care: The perceived usefulness of robot assistance among healthcare professionals. In Social robots: Technological, societal and ethical aspects of human-robot interaction. Human–Computer Interaction Series. Springer. https://doi.org/10.1007/978-3-030-17107-0_10
Pekkarinen, S., Hennala, L., Tuisku, O., Gustafsson, C., Johansson-Pajala, R. M., Thommes, K., Hoppe, J. A., & Melkas, H. (2020). Embedding care robots into society and practice: Socio-technical considerations. Futures, 122, 102593. https://doi.org/10.1016/j.futures.2020.102593.
Pellegrino, E. D. (1985). The caring ethic: The relation of physician to patient. In A. H. Bishop, & J. R. Scudder, Jr. (Eds.), Caring, curing, coping (pp. 8–30). The University of Alabama Press.
Pellegrino, E. D., & Thomasma, D. C. (1993). The virtues in medical practice. Oxford University Press.
Pugh, J., Kahane, G., Maslen, H., & Savulescu, J. (2016). Lay attitudes toward deception in medicine: Theoretical considerations and empirical evidence. AJOB Empirical Bioethics, 7(1), 31–38. https://doi.org/10.1080/23294515.2015.1021494.
Schermer, M. (2007). Nothing but the truth? On truth and deception in dementia care. Bioethics, 21(1), 13–22. https://doi.org/10.1111/j.1467-8519.2007.00519.x.
Sharkey, A., & Sharkey, N. (2011). Children, the elderly, and interactive robots. Anthropomorphism and deception in robot care and companionship. IEEE Robotics & Automation Magazine, 18, 32–38. https://doi.org/10.1109/MRA.2010.940151.
Sharkey, N., & Sharkey, A. (2012a). The eldercare factory. Gerontology, 58, 282–288. https://doi.org/10.1159/000329483.
Sharkey, A., & Sharkey, N. (2012b). Granny and the robots: Ethical issues in robot care for the elderly. Ethics and Information Technology, 14(1), 27–40. https://doi.org/10.1007/s10676-010-9234-6.
Sharkey, A., & Sharkey, N. (2021). We need to talk about deception in social robotics! Ethics and Information Technology, 23, 309–316. https://doi.org/10.1007/s10676-020-09573-9.
Sharon, T. (2016). The googlization of health research: From disruptive innovation to disruptive ethics. Personalized Medicine, 13(6), 563–574. https://doi.org/10.2217/pme-2016-0057.
Sparrow, R. (2002). The march of the robot dogs. Ethics and Information Technology, 4(4), 305–318. https://doi.org/10.1023/A:1021386708994.
Sparrow, R., & Sparrow, L. (2006). In the hands of machines? The future of aged care. Minds and Machines, 16(2), 141–161. https://doi.org/10.1007/s11023-006-9030-6.
Technopolis Group (2018). The silver economy. European Commission DG Communications Networks, Content & Technology. Retrieved October 11, 2022, from https://op.europa.eu/en/publication-detail/-/publication/a9efa929-3ec7-11e8-b5fe-01aa75ed71a1
van der Burg, S., & Swierstra, T. (2013). Introduction: Enhancing ethical reflection in the laboratory: How soft impacts require tough thinking. In S. van der Burg, & T. Swierstra (Eds.), Ethics on the laboratory floor (pp. 1–17). Palgrave Macmillan.
van Wynsberghe, A. (2021). Social robots and the risks to reciprocity. AI & Society. Advance online publication. https://doi.org/10.1007/s00146-021-01207-y
Vandemeulebroucke, T., Dierckx de Casterlé, B., & Gastmans, C. (2018). The use of care robots in aged care: A systematic review of argument-based ethics literature. Archives of Gerontology and Geriatrics, 74, 15–25. https://doi.org/10.1016/j.archger.2017.08.014.
Vandemeulebroucke, T., Dierckx de Casterlé, B., & Gastmans, C. (2020). Ethics of socially assistive robots in aged-care settings: A socio-historical contextualisation. Journal of Medical Ethics, 46(2), 128–136. https://doi.org/10.1136/medethics-2019-105615.
Vandemeulebroucke, T., Dierckx de Casterlé, B., & Gastmans, C. (2021). Socially assistive robots in aged care: Ethical orientations beyond the care-romantic and technology-deterministic gaze. Science and Engineering Ethics, 27(2), 17. https://doi.org/10.1007/s11948-021-00296-8.
Veatch, R. M. (1981). Nursing ethics, physician ethics, and medical ethics. Law Medicine and Health Care, 9(6), 17–19. https://doi.org/10.1111/j.1748-720X.1981.tb01916.x.
Veatch, R. M. (1998). The place of care in ethical theory. The Journal of Medicine and Philosophy, 23(2), 210–224. https://doi.org/10.1076/jmep.23.2.210.8925.
Vallor, S. (2011). Carebots and caregivers: Sustaining the ethical ideal of care in the twenty-first century. Philosophy and Technology, 24, 251–268. https://doi.org/10.1007/s13347-011-0015-x.
Verbeek, P. P. (2008). Obstetric ultrasound and the technological mediation of morality: A postphenomenological analysis. Human Studies, 31, 11–26. https://doi.org/10.1007/s10746-007-9079-0.
Verbeek, P. P. (2013). Technology design as experimental ethics. In S. van der Burg, & T. Swierstra (Eds.), Ethics on the laboratory floor (pp. 79–96). Palgrave Macmillan.
Whitby, B. (2012). Do you want a robot lover? The ethics of caring technologies. In P. Lin, K. Abney, & G. Bekey (Eds.), Robot ethics: The ethical and social implications of robotics (pp. 233–248). MIT Press.
Woods, M. (2005). Nursing ethics education: Are we really delivering the good(s)? Nursing Ethics, 12(1), 5–18. https://doi.org/10.1191/0969733005ne754oa.
Acknowledgements
This research has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 949841). I would like to thank Heidi Mertes for commenting on earlier versions of this paper.
Funding
This study was funded by the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 949841).
Author information
Contributions
SS is the sole author.
Ethics declarations
Competing Interests
The author has no competing interests to declare that are relevant to the content of this article.
About this article
Cite this article
Segers, S. Robot Technology for the Elderly and the Value of Veracity: Disruptive Technology or Reinvigorating Entrenched Principles? Sci Eng Ethics 28, 64 (2022). https://doi.org/10.1007/s11948-022-00420-2