
Robot Technology for the Elderly and the Value of Veracity: Disruptive Technology or Reinvigorating Entrenched Principles?

  • Original Research/Scholarship
  • Published:
Science and Engineering Ethics

Abstract

The implementation of care robotics in care settings is identified by some authors as a disruptive innovation, in the sense that it will upend the praxis of care. It is an open ethical question whether this alleged disruption will also have a transformative impact on established ethical concepts and principles. One prevalent worry is that the implementation of care robots will turn deception into a routine component of elderly care, at least to the extent that these robots will function as simulacra for something that they are not (i.e. human caregivers). At face value, this may indeed seem to indicate a concern for how this technology may upend existing practices and relationships within a care setting. Yet, on closer inspection, this reaction may rather point to a rediscovery and revaluation of a particularly well-entrenched value or virtue, namely veracity. The virtue of veracity is one of the values mobilized to argue against a substitution of human caregivers (while a combination of care robots and human caregivers is much more accepted). This paper explores how the moral panic surrounding care robots should be interpreted not so much as an anticipated and probable disruptor in a care setting, but rather as a sensitizing – in a way conservationist – argument that identifies veracity as an established value that is to be protected and advanced in present-day and future care settings.


Notes

  1. This indicates that a so-called principle-based approach can be compatible with a form of virtue ethics, and indeed also with what has been called a care-ethics approach (which Beauchamp and Childress (2019) actually regard as “a form of virtue ethics”), for they explicitly mention the virtues of caring as fundamental in health care practices and relationships. On that reading, and contrary to what is sometimes maintained (without elaborate argumentation), a principlist account need not overlook the virtues (of care), as it may indeed encompass a commitment to those traits that are valued in real-life settings of (aged) care.

  2. Concrete examples of robots in social care practice are Paro (which looks like a baby seal), Pepper (a humanoid robot), MiRo (an animal-like robot companion), and Zora’s virtual robot (embedded in a simulated environment on a computer screen, though compatible with ‘real-life robots’). Other examples are discussed by Sharkey and Sharkey (2011) and, more recently, in the note on robotics in social care by the UK Parliamentary Office of Science and Technology (2018).

  3. Sparrow and Sparrow (2006) illustrate this point by referring to Nozick’s Experience Machine argument: we desire the real world to be a certain way, not merely our beliefs about the world to be a certain way, which is why most of us would allegedly not be willing to exchange our “real-world life” for a life connected to a hypothetical machine that could give us any experience one might desire. A full discussion would lead too far afield, though it is worth mentioning that the assumption – shared by Sparrow and Sparrow (2006) – that people resist connecting to the experience machine because they believe that a good life is about more than having beliefs/experiences about the world is not the consensus view in philosophy. A common criticism of this assumption is that most people would rather not plug in to the machine because they fear that it might malfunction (see e.g. Bramble, 2016). Related, though more directly concerning the deception objection against “emotional robots”, is Coeckelbergh’s (2012) explicit criticism of Sparrow and Sparrow (2006), in which the author critically engages with the “ideal emotional communication condition” that seems to serve as a normative requirement in the deception objection.

  4. It is notable that Sparrow (2002), who played a significant role in raising the ethical matter of deception in the context of robotic companions, was careful to note that this matter may perhaps not be “the most urgent issue facing society today”. That said, it is understandable that some critics today see valid reason for concern, considering how motivations to develop SCREs are linked to meeting the shortage of human caregivers and addressing the challenges of an ageing population, which already seems to imply a notion of substitution.

  5. Some would even say that the former is a specification of the latter (Beauchamp & Childress, 2019).


Acknowledgements

This research has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 949841). I would like to thank Heidi Mertes for commenting on earlier versions of this paper.

Funding

This study was funded by the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 949841).

Author information


Contributions

SS is the sole author.

Corresponding author

Correspondence to Seppe Segers.

Ethics declarations

Competing Interests

The author has no competing interests to declare that are relevant to the content of this article.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Segers, S. Robot Technology for the Elderly and the Value of Veracity: Disruptive Technology or Reinvigorating Entrenched Principles? Sci Eng Ethics 28, 64 (2022). https://doi.org/10.1007/s11948-022-00420-2
