Hong Kong’s rapidly aging population faces a dementia crisis: more than 150,000 individuals are affected, yet diagnosis is delayed by 18–24 months on average because of overburdened clinics and costly, episodic assessments. This proposal presents a collaborative initiative to develop an AI- and smartphone-based digital intervention system that uses large language models (LLMs) to analyze multimodal sensor data (e.g., motion, location, app usage) collected by smartphones, detecting dementia behavior biomarkers in real time and generating personalized, just-in-time health interventions. Targeting community-dwelling elderly with mild dementia who can use smartphones independently, the solution combines ubiquitous smartphone sensors with LLMs to create a low-cost, scalable tool for proactive care. The system employs LLMs to interpret subtle, longitudinal behavioral changes in the sensor data, such as physical inactivity, sleep fragmentation, and social withdrawal, and generates personalized intervention suggestions accordingly, delivering them at the right time based on run-time contextual data. Unlike conventional episodic assessments, the platform operates passively on the user’s own smartphone, enabling continuous monitoring of behavior and lifestyle changes and delivery of personalized intervention suggestions in daily living environments. Such digital health systems have the potential to slow the progression of dementia and reduce the burden on nurses and healthcare systems.
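The pipeline described above (passive sensing → longitudinal biomarker detection → context-aware intervention prompting) can be illustrated with a minimal sketch. All names, thresholds, and the majority-of-days rule below are illustrative assumptions, not the project's actual method; the LLM call itself is omitted, with only the prompt construction shown.

```python
from dataclasses import dataclass

@dataclass
class DailyFeatures:
    # Hypothetical per-day aggregates derived from passive smartphone sensing.
    step_count: int           # from motion sensors
    sleep_interruptions: int  # overnight screen/motion heuristic
    outgoing_contacts: int    # calls/messages initiated (counts only)

def detect_biomarkers(days: list[DailyFeatures],
                      step_floor: int = 3000,
                      sleep_break_cap: int = 3,
                      contact_floor: int = 1) -> list[str]:
    """Flag sustained behavioral changes over the observation window.

    A biomarker is flagged only when the threshold is crossed on a
    majority of days, so one-off anomalies do not trigger interventions.
    Thresholds here are placeholders, not clinically validated values.
    """
    n = len(days)
    flags = []
    if sum(d.step_count < step_floor for d in days) > n // 2:
        flags.append("physical inactivity")
    if sum(d.sleep_interruptions > sleep_break_cap for d in days) > n // 2:
        flags.append("sleep fragmentation")
    if sum(d.outgoing_contacts < contact_floor for d in days) > n // 2:
        flags.append("social withdrawal")
    return flags

def build_intervention_prompt(flags: list[str], context: str) -> str:
    """Compose the prompt an LLM would receive to draft a just-in-time
    suggestion conditioned on run-time context."""
    return (
        "Observed behavioral changes: " + ", ".join(flags) + ". "
        f"Current context: {context}. "
        "Suggest one brief, supportive activity appropriate right now."
    )

# Example: five low-activity days followed by two normal days.
week = ([DailyFeatures(2100, 5, 0) for _ in range(5)]
        + [DailyFeatures(4200, 1, 2) for _ in range(2)])
flags = detect_biomarkers(week)
prompt = build_intervention_prompt(flags, "weekday afternoon, at home")
```

Keeping detection longitudinal (majority of days rather than a single reading) mirrors the proposal's emphasis on interpreting gradual lifestyle changes rather than isolated sensor events.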
Principal Investigator (PI): Xiaomin Ouyang
Department/Division: Computer Science & Engineering
This project is funded by the HKUST IEMS Research Grants 2025.