
Computers are increasingly guiding decisions about elder care – and tracking everything from toilet visits to whether someone has bathed
Kellye Franklin recalls the devastation when her now 81-year-old father, a loyal air force veteran, tried to make his own breakfast one morning. She found seven open boxes of cereal on the living room floor, milk poured directly into every one of them. He would later be diagnosed with moderate to severe dementia.
Yet Franklin, 39, who is her dad’s only child and his primary caregiver, does not worry about that repeating now.
In late 2019, she had motion sensors connected to an artificial intelligence (AI) system installed in the two-floor townhome she and her dad share in Inglewood, in Los Angeles county. Sensors at the tops of doors and in some rooms monitor movements and learn the pair’s daily activity patterns, sending warning alerts to Franklin’s phone if her dad deviates from his normal behaviour – for instance, if he goes outside and doesn’t return quickly.
“I would have gotten an alert as soon as he went to the kitchen that morning,” she says, because it would have been out of the ordinary for her dad to be in the kitchen at all, especially that early. Franklin says the system helps her “sanity”, taking a little weight off an around-the-clock job.

Donald Franklin, 81, and his daughter Kellye Franklin, 39. Kellye, Donald’s primary caretaker, has an AI surveillance system to help monitor her dad. Photograph: Jessica Pons/The Guardian
Welcome to caregiving in the 2020s: in rich societies, computers are guiding decisions about elder care, driven by a shortage of caregivers, an ageing population and families wanting their seniors to stay in their own homes longer. A plethora of so-called “age tech” companies have sprung up in the last few years, many of them designed to keep tabs on older adults, particularly those with cognitive decline. Their solutions are now beginning to permeate home care, assisted living and nursing facilities.
The technology can free up human caregivers so they can be “as efficient as potentially possible”, sums up Majd Alwan, the executive director of the Center for Aging Services Technologies at LeadingAge, an organization representing non-profit ageing services providers.
But while there are potential benefits of the technology in terms of safety for older people and a reprieve for caregivers, some also worry about its potential harms. They raise questions around the accuracy of the systems, as well as about privacy, consent and the kind of world we want for our elders. “We’re introducing these products based on this enthusiasm that they’re better than what we have – and I think that’s an assumption,” says Alisa Grigorovich, a gerontologist who has been studying the technology at the KITE-Toronto Rehabilitation Institute, University Health Network, Canada.
Technology to help keep seniors safe has been in use for a long time – think life alert pendants and so-called “nanny cams” set up by families fearful their loved ones could be mistreated. But incorporating systems that use data to make decisions – what we now call AI – is new. Increasingly cheap sensors collect many terabytes of data, which are then analyzed by computer scripts known as algorithms to infer actions or patterns in activities of daily living and detect when things might be off.
A fall, “wandering behavior”, or a change in the number or duration of bathroom visits that might signal a health condition such as a urinary tract infection or dehydration are just some of the things that trigger alerts to carers. Some systems monitor spaces, using everything from motion sensors to cameras to lidar, a type of laser scanning used by self-driving cars. Others monitor individuals via wearables.
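The underlying mechanics are simpler than the marketing suggests: learn what normal looks like, then flag departures from it. Below is a minimal sketch in Python of that baseline-and-deviation idea – purely illustrative, not any vendor’s actual algorithm; the event format, the threshold and the function names are all assumptions made for the example.

```python
from collections import defaultdict
from datetime import datetime

# Illustrative sketch only, not any vendor's actual algorithm.
# Events are assumed to be (room, ISO timestamp) pairs from motion sensors.

def hourly_baseline(events):
    """Average motion events per (room, hour of day) over the training period."""
    counts = defaultdict(int)
    days = set()
    for room, timestamp in events:
        t = datetime.fromisoformat(timestamp)
        counts[(room, t.hour)] += 1
        days.add(t.date())
    return {key: n / len(days) for key, n in counts.items()}

def flag_deviation(baseline, room, timestamp, threshold=0.1):
    """Alert if motion occurs somewhere it is historically rare at that hour."""
    t = datetime.fromisoformat(timestamp)
    expected = baseline.get((room, t.hour), 0.0)
    if expected < threshold:
        return f"Unusual: motion in {room} at {t:%H:%M} (expected ~{expected:.2f}/day)"
    return None

# In practice, weeks of events would train the baseline; here, a few suffice.
baseline = hourly_baseline([
    ("bedroom", "2021-05-01T23:10:00"),
    ("bathroom", "2021-05-02T07:05:00"),
    ("kitchen", "2021-05-02T12:30:00"),
])

# A 4am kitchen visit, absent from the baseline, triggers an alert -
# the kind of early-morning anomaly Franklin says she would now be warned about.
print(flag_deviation(baseline, "kitchen", "2021-05-03T04:12:00"))
```

Commercial systems replace this crude threshold with statistical or machine-learned models, but the shape is the same: build a baseline, compare new events against it, alert on deviation.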
CarePredict, a watch-like device worn on the dominant arm, infers the specific activity a person is likely to be engaged in from patterns in their gestures, among other data. If repetitive eating motions aren’t detected as expected, a carer is alerted. If the system identifies someone as being in the bathroom and detects a sitting posture, it can infer that the person “is using the toilet”, notes one of its patents.
The system in use in the Franklins’ home is called People Power Family. An addition to it, targeted at care agencies, includes daily reports tracking when someone fell asleep, whether they bathed, and bathroom visits. “You can manage more clients with fewer caregivers,” says the promotional video.

Kellye Franklin, 39, has an AI surveillance system installed in her house to help monitor her dad, Donald Franklin, who has dementia. The system is connected to her iPad and smartphone. Photograph: Jessica Pons/The Guardian
Large blue warning signs on the third-floor dementia care unit of the Trousdale – a private-pay senior living community in Silicon Valley where a studio starts at about $7,000 a month – read: “Video recording for fall detection and prevention”.
In late 2019, AI-based fall detection technology from a Bay Area startup, SafelyYou, was installed to monitor its 23 apartments (it is turned on in all but one apartment, where the family didn’t consent). A single camera unobtrusively positioned high on each bedroom wall continuously monitors the scene.
If the system, which has been trained on SafelyYou’s ever-expanding library of falls, detects a fall, staff are alerted. The footage, which is kept only if an event triggers the system, can then be viewed in the Trousdale’s control room by paramedics to help decide whether someone needs to go to hospital – did they hit their head? – and by designated staff to analyze what changes could prevent the person from falling again.
“We’ve probably reduced our hospital trips by 80%,” says Sylvia Chu, the facility’s executive director. The system has captured every fall she knows of, though she adds that sometimes it turns out the person is on the ground intentionally, for example to find something that has fallen on the floor. “I don’t want to say it is a false alarm … but it isn’t a fall per se,” she says. And she stresses it is not a problem – often the resident still needs help to get back up and staff are happy to oblige.
“We’re still just scratching the surface” when it comes to accuracy, says George Netscher, SafelyYou’s founder and CEO. Non-falls – which the company refers to as “on-the-ground events” – in fact trigger the system about 40% of the time, he says, citing someone kneeling on the ground to pray as an example. Netscher says that while he wants to get the error rate down, it is better to be safe than sorry.
Companies must also think about bias. AI models are often trained on databases of previous subjects’ behavior, which might not represent all people or situations. Problems with gender and racial biases have been well documented in other AI-based technology such as facial recognition, and they could also exist in these types of systems, says Vicente Ordóñez-Roman, a computer vision expert at the University of Virginia.
That includes cultural biases. CarePredict, the wearable which detects eating motions, hasn’t been fine-tuned for people who eat with chopsticks instead of forks – despite recently launching in Japan. It is on the to-do list, says Satish Movva, the company’s founder and CEO.
For Clara Berridge, who studies the implications of digital technologies used in elder care at the University of Washington, privacy intrusion on older adults is one of the most worrying risks. She also fears it could reduce human interaction and hands-on care – already lacking in many places – further still, worsening social isolation for older people.
In 2014, Berridge interviewed 20 older residents without cognitive impairment in a low-income independent living apartment building that used an AI-based monitoring system called QuietCare, based on motion detection. It triggered an operator call to residents – escalating to family members if necessary – in cases such as a possible bathroom fall, not leaving the bedroom, a significant drop in overall activity or a significant change in nighttime bathroom use.

Kellye Franklin’s AI surveillance system. Photograph: Jessica Pons/The Guardian
What she found was damning. The expectation of routine built into the system disrupted the elders’ activities and caused them to change their behaviour to try to avoid unnecessary alerts that might bother family members. One woman stopped sleeping in her recliner because she was afraid it would show inactivity and trigger an alert. Others rushed their bathroom visits for fear of the consequences if they stayed too long.
Some residents begged for the sensors to be removed – though others were so lonely they tried to game the system so they could chat with the operator.
A spokesperson for PRA Health Sciences, which now makes QuietCare, noted that the configuration studied in the paper was a historical version; the current version of QuietCare is installed only at assisted living facilities, where facility staff, rather than relatives, are notified of changes in patients’ patterns or deviations in trends.
Berridge’s interviews also revealed something else worrying: evidence of benevolent coercion by social workers and family members to get the elders to adopt the technology. There is a “potential for conflict”, says Berridge. Another of her studies has found big differences in enthusiasm for in-home monitoring systems between older people and their adult kids. The latter were gung ho.
Though sometimes the seniors win the day. Startup Cherry Labs is pivoting partially because it ran into problems obtaining seniors’ consent. Its home monitoring system, Cherry Home, features up to six AI cameras with sound recorders to capture concerning behaviour and issue alerts; facial recognition to distinguish others in the space, such as carers, from seniors; and the ability for family members or carers to look in on how the senior is doing in real time.
But Max Goncharov, its co-founder and CEO, notes that business has been tough, not least because adult children couldn’t convince their parents to accept the system. “The seniors were against it,” he says. Cherry Labs now has a different application – targeting its technology at industrial workplaces that want to monitor employee safety.
Franklin, in Inglewood, says the fact her system uses motion sensors rather than cameras is a big deal. She and her dad, Donald, are African American and she just couldn’t imagine her dad being comfortable with a video-based system. “He was born in 1940 in the south and he has seen the evolution and backpedalling on racial issues. He definitely has some scars. There are various parts of our American culture he is distrustful of,” says Franklin.
She has done her best to explain the monitoring system, for which she now pays $40 a month, simply and without sugar-coating. For the most part, he’s all right with it as long as it helps her.
“I never want to be a burden,” he says. But he also wants her to know that he has a plan if they ever decide the technology is too invasive: they can move out of their townhome and rent it to someone else.
“You have to have a trick bag to protect yourself from their trick bag,” he tells her. “I am still your dad no matter how many sensors you got.”
- Automating Care is our new series on the rise of AI in caregiving
By Zoë Corbyn