
We are excited to share a new publication by Dr. Lillian Hung, Investigator at the Edwin S.H. Leong Centre for Healthy Aging, titled “Ethical considerations in the use of social robots for supporting mental health and wellbeing in older adults in long-term care,” published in Frontiers in Robotics and AI.
As social robots like Paro and Lovot become more common in long-term care settings, their potential to support mental health and reduce loneliness among older adults is increasingly recognized. However, ethical challenges such as inequitable access, gaps in consent processes, the risk of substituting human care, and concerns about infantilization remain underexplored.
To better understand these issues, Dr. Hung and her team conducted an empirical study across several Canadian long-term care homes. They gathered perspectives through observations, interviews, and participatory workshops involving a wide range of stakeholders—older adults living in care homes, staff members, family caregivers, and healthcare professionals. This approach ensured that voices often marginalized in technology research, such as residents with dementia or communication challenges, were included in the conversation.
Their findings highlighted several practical challenges:
- Language and Cultural Barriers: Access to social robots was sometimes limited for residents who spoke languages other than English, raising concerns about equity.
- Consent Processes: While verbal consent was commonly sought at the outset, there were few mechanisms to revisit or monitor consent during ongoing use, particularly for residents with cognitive decline.
- Impact on Human Care: Staff noted concerns that social robots might, over time, be seen as replacements for human interaction, rather than complements to it.
- Perceptions of Infantilization: Some stakeholders worried that certain robot designs and interactions risked treating older adults in ways that felt patronizing.
To address these issues, the team recommends a focus on “everyday relational ethics”—meaning that ethical care practices should be ongoing, responsive, and grounded in genuine relationships with residents. Practical strategies include:
- Offering robot interactions that respect cultural and linguistic diversity
- Building regular consent check-ins into care routines
- Using robots to enhance, rather than replace, human contact
- Designing robot activities that recognize older adults’ autonomy, capabilities, and preferences
Looking forward, the study emphasizes that as social robots become more integrated into care environments, ethical frameworks must evolve alongside them. Future research and technology design should prioritize relational care, ensuring that innovations support—not undermine—the dignity, rights, and well-being of older adults.
📖 Read the full article here.