Can Artificial Intelligence mitigate the Long-Term Care crisis?
by Dr. Sivan Tamir*
Society seems to be facing a long-term care (LTC) crisis. LTC is a collective term for various services (such as assistance with daily living activities; home (medical) health care; and emergency medical alert systems) designed to meet older persons’ health or personal care needs. The 2017 UN report on world population ageing predicted that, with the rise of life expectancy, by the year 2050 society will see more older persons aged 60 or over than adolescents and youth aged 10-24. This phenomenon, known as ‘population ageing’, poses severe global problems, such as an increase in health expenditure and the need for expansive LTC strategies to accommodate the ageing society.
For instance, it is already apparent that traditional care models relying on face-to-face patient monitoring are no longer sufficient to meet the medical and personal needs of older persons in a timely manner. Another, not strictly medical, feature of population ageing is social isolation and loneliness, strongly felt during the COVID-19 pandemic. Social isolation has been found to gravely affect the physical and cognitive health and emotional well-being of older persons, and is associated with increased morbidity and early mortality.
Population ageing, coupled with an acute shortage of direct care workers and a socially induced drop in informal (typically family) caregivers, invites the development of various technological solutions.
Indeed, one way of addressing the LTC crisis and improving the quality of care provided to older persons is to harness emerging technologies, particularly artificial intelligence (AI)-based tools, to mitigate many of the shortcomings within the LTC ecosystem. For instance, Anita Ho suggests that AI health monitoring technologies “may play a novel and significant role in filling the human resource gaps in caring for older adults by complementing current care provision, reducing the burden on family caregivers, and improving the quality of care.”
Here are a few examples of AI-based tools in LTC use:
- Telehealth/Telemedicine. Telemedicine represents one of the fastest-growing areas for AI in LTC. AI-based telemedicine applications include tele-assessment, tele-diagnosis, tele-interactions, and tele-monitoring, all of which improve both the delivery and the monitoring of health care for older persons. Telemedicine essentially facilitates access to health care, inter alia, for older adults for whom leaving home to get care is complicated or carries health risks. Data-based telemedicine also offers personalised care. Telemedicine allows moving (where needed) from the paradigm of human-to-human interaction to conversational agents and virtual assistants, and provides personalised means for addressing social isolation.
- Remote monitoring technologies (such as tracking wearables and automated systems, using computer vision analytics). Remote monitoring technologies are applied to closely monitor and document older persons’ health status and analyse patterns in their activity. These can help determine, for example, whether the older adult patient exhibits healthy movement patterns or detect diminished capacities for performing everyday tasks. These technologies also include AI-enabled electrocardiogram monitors and blood pressure monitors for the prediction or early detection of conditions such as hypertension or atrial fibrillation.
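To illustrate, the activity-pattern analysis described above can be as simple as comparing each day's measurements against a rolling baseline. The sketch below is illustrative only; the function name, step-count data, and thresholds are hypothetical and not drawn from any system cited in this article. It flags days whose activity (here, step count) falls well below the preceding week's average, the kind of signal a monitoring system might surface to a caregiver:

```python
from statistics import mean, stdev

def flag_activity_decline(daily_steps, window=7, threshold=2.0):
    """Flag days whose step count falls more than `threshold` standard
    deviations below the mean of the previous `window` days.
    Returns a list of (day_index, steps) tuples for flagged days."""
    flagged = []
    for i in range(window, len(daily_steps)):
        baseline = daily_steps[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        # A day is anomalous if it sits far below the rolling baseline.
        if sigma > 0 and daily_steps[i] < mu - threshold * sigma:
            flagged.append((i, daily_steps[i]))
    return flagged

# A week of ordinary activity followed by one sharply reduced day:
alerts = flag_activity_decline(
    [5000, 5200, 4900, 5100, 5000, 5300, 4800, 1200])
```

Real systems would of course fuse many more signals (gait, heart rhythm, blood pressure) and use learned models rather than a fixed statistical rule, but the basic design choice is the same: compare current behaviour against the person's own historical baseline rather than a population norm.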
Four values are imperative in shaping the architecture of technologies designed for older people: human dignity, autonomy, beneficence, and (fair) accessibility. We shall address them below.
Selected ethical considerations
Human dignity. Providing older persons and their caregivers, or physicians, with effective health and well-being monitoring tools potentially allows them (where no further assistance is required) to keep residing in the familiar environment of their home and to independently manage their health care needs. These tools reinforce their human dignity, sense of self-worth, and right to self-determination (as functioning, autonomous, non-burdening members of society), and contribute to their emotional welfare and cognitive well-being.
However, where older adults are continuously monitored by invasive technologies, not only may their autonomy and privacy be adversely impacted, but also their human dignity (owing to the degrading or unwarranted exposure such measures entail).
A University of Michigan poll conducted in June 2020 found that 56% of older persons felt isolated during the peak of the pandemic. As people are social creatures, social isolation and loneliness (whether forced by circumstances or a product of choice) may also be broadly construed as harming their human dignity. To avoid that (and other health-related effects of isolation), various technological tools have been specifically devised for older adults: from video-conferencing and online social media groups to AI-based social robots, conversational agents, artificial language, and virtual reality functions that provide support, companionship and stimulation.
Although some express skepticism about the worth of remote-companionship technologies for battling loneliness, others may draw optimism from evidence showing AI to be a potentially reliable tool for reducing social isolation. Researchers from Nanyang Technological University found that nursing home residents genuinely enjoyed interacting with an AI-empowered humanoid robot. Nadine, a social robot, played Bingo with residents while researchers tracked their emotions and facial expressions. Interestingly, they found that “the elderly participants were significantly happier and more attentive when playing with Nadine”, compared with a control group whose Bingo session was led by a human.
Autonomy. Autonomy as ‘self-rule’ or ‘self-determination’ is a multi-faceted concept, particularly in the context of AI in LTC. It has to do with individual control of decision-making in various areas, including healthcare. In the context before us, autonomy is about allowing a competent older person to choose whether or not to use AI-LTC technologies or be subject to their monitoring, and respecting that decision. Such respect for autonomy is primarily reflected in seeking older persons’ informed consent to be users/subjects of such applications. This requires transparency about the actual operation of such tools, their limitations and risks, and disclosure of existing alternatives. Transparency should be coupled with a basic level of explainability about those systems’ workings. These two features (transparency and explainability) will enhance trust in LTC technologies and encourage their usage. The potential tension between the older person’s autonomy and what she, healthcare professionals or family members perceive to be in, or against, her best interest is a significant challenge for respecting autonomy. Naturally, such a challenge is exacerbated where older people do not have full decision-making capacity. Even in such a case, an autonomy-respecting approach would require that the older person’s opinion be taken into consideration and never utterly dismissed.
So, if, for instance, an older adult cardiac patient refuses to use a wearable tracking device for cardiovascular monitoring, sufficient respect should be given to her refusal, along with consideration of other circumstantial factors, such as decision-making competency, and living conditions (living alone, or with family members).
Lastly, there is the AI-specific issue of artificial autonomy, namely, the decision-making power of an AI-based technology. This becomes relevant where the AI tool makes strong (nearly authoritative) medical recommendations concerning the older adult, which might constrain her personal autonomy. The limits of artificial autonomy seemingly hinge on the technology design and the degree of human involvement (namely, physician-in-the-loop).
Privacy. Personal privacy and medical confidentiality, components of a patient’s autonomy, may be jeopardised by the use of AI-based LTC technologies. These typically involve substantial data collection, transmission and sharing (with family members, healthcare professionals, and external commercial entities), of which the older person is not always aware, or to which she may not consent. LTC technologies enable ongoing, continuous monitoring and surveillance. Their routine application is prone to disproportionate or unlimited use, which intrudes upon the privacy of the older person. Arguably, only the minimal level of privacy intrusion necessary for maintaining the health and combating the solitude of the older person can be justified.
Beneficence. Promoting the health and well-being of older persons is the leading motivation for the development of LTC technologies. Where such technologies are AI-based, they should particularly adhere to AI ethics principles calling for the technology to be one that promotes human well-being. Benefits of such technologies for older persons include:
- Personalised healthcare (owing to the technologies’ data-based design).
- Remote monitoring and caring. Featuring geographic indifference, these technologies enable healthcare delivery (consultation, monitoring, diagnosis and risk-prediction) to older persons residing in rural areas with a shortage of physicians and medical imaging technologies.
- Preservation of human dignity (as described above).
Accessibility. Feasible and equitable (fair, equal) access to LTC technologies is pivotal to their value. Two main barriers stand between older persons and LTC technologies:
- Digital/technological (il)literacy is a highly prominent psychological and practical barrier to the adoption of LTC technologies by older persons, who typically lack the digital/technology literacy needed to use them appropriately. This deficit causes many to shy away from adopting such technologies. LTC technologies should therefore be designed to fit not only the needs of older persons but also their limited technological orientation, by creating user-friendly, intuitive tools. Including the relevant target population, namely older persons, in the design of such technologies is an effective way to mitigate this challenge.
- Cost is another accessibility barrier. In the interest of fairness and equity, LTC technologies should either enjoy substantial health insurance coverage, or be reasonably priced so as to be affordable for older persons. As Kuziemsky et al. assert, “[w]hile AI and technology can enhance access to and delivery of services, it can also increase the divide between the have and have nots.” In the U.S., for instance, where a privatised healthcare insurance system is in place, the COVID-19 pandemic saw insurance coverage extended to telehealth services. This was particularly significant for older persons, who are at increased risk of contracting COVID-19 and had to resort to consuming healthcare remotely.
LTC technologies are clearly beneficial for older persons, as they meet many of their needs in terms of healthcare delivery and social isolation reduction. Maintaining human dignity, autonomy and privacy in the design and usage of LTC technologies is paramount; indeed, these are necessary conditions for their ethicality. Lastly, removing accessibility barriers to such technologies is primarily a matter of policy-setting, as the uptake of LTC technologies can potentially advance a collective goal of public health, with impacts on the economy and social well-being.
*The author thanks Luke Schwartz, Duke University Sanford School of Public Policy student, for his ideas and for meticulous research assistance.