Abstract
Atrial fibrillation (AF) is a leading cause of stroke and mortality,
particularly in elderly patients. Wrist-worn photoplethysmography (PPG) enables
non-invasive, continuous rhythm monitoring, yet is highly vulnerable to motion
artifacts and physiological noise. Many existing approaches rely solely on
single-channel PPG and are limited to binary AF detection, often failing to
capture the broader range of arrhythmias encountered in clinical settings. We
introduce RhythmiNet, a residual neural network enhanced with temporal and
channel attention modules that jointly leverage PPG and accelerometer (ACC)
signals. The model performs three-class rhythm classification: AF, sinus rhythm
(SR), and Other. To assess robustness across varying movement conditions, test
data are stratified by accelerometer-based motion-intensity percentiles without
excluding any segments. RhythmiNet achieves a 4.3% improvement in macro-AUC
over the PPG-only baseline and surpasses a logistic regression model based on
handcrafted heart rate variability (HRV) features by 12%, highlighting the
benefit of multimodal fusion and attention-based learning on noisy, real-world
clinical data.
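To make the architecture concrete, below is a minimal PyTorch sketch of the kind of model the abstract describes: a 1-D residual backbone with channel and temporal attention applied to early-fused PPG and ACC channels. The layer sizes, kernel widths, fusion strategy, and class names are illustrative assumptions, not the authors' implementation.

```python
# Sketch only: assumes early fusion of 1 PPG channel + 3 ACC axes into a
# 4-channel 1-D input; all hyperparameters are placeholders.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style gating across feature channels."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):  # x: (batch, channels, time)
        w = self.fc(x.mean(dim=-1))   # global average pool over time
        return x * w.unsqueeze(-1)    # re-weight each channel


class TemporalAttention(nn.Module):
    """Soft attention over time steps, e.g. to down-weight noisy spans."""

    def __init__(self, channels: int):
        super().__init__()
        self.score = nn.Conv1d(channels, 1, kernel_size=1)

    def forward(self, x):  # x: (batch, channels, time)
        w = torch.softmax(self.score(x), dim=-1)  # (batch, 1, time)
        return x * w * x.size(-1)                 # rescale to keep magnitudes stable


class ResidualBlock(nn.Module):
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_ch, out_ch, 7, padding=3),
            nn.BatchNorm1d(out_ch),
            nn.ReLU(),
            nn.Conv1d(out_ch, out_ch, 7, padding=3),
            nn.BatchNorm1d(out_ch),
        )
        self.skip = nn.Conv1d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()

    def forward(self, x):
        return torch.relu(self.conv(x) + self.skip(x))


class RhythmiNetSketch(nn.Module):
    """Early fusion of PPG + 3-axis ACC, three rhythm classes (AF / SR / Other)."""

    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.backbone = nn.Sequential(
            ResidualBlock(4, 32),
            ChannelAttention(32),
            TemporalAttention(32),
            ResidualBlock(32, 64),
        )
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):  # x: (batch, 4, time)
        h = self.backbone(x).mean(dim=-1)  # global average pool over time
        return self.head(h)                # class logits
```

For example, a batch of 30-second windows sampled at an assumed 64 Hz would be passed as `RhythmiNetSketch()(torch.randn(8, 4, 1920))`.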
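The handcrafted-HRV baseline mentioned above can likewise be sketched. The feature set here (SDNN, RMSSD, pNN50) is a common time-domain convention and an assumption, not the authors' exact choice.

```python
# Sketch of a logistic-regression baseline on handcrafted HRV features
# computed from inter-beat intervals (IBIs); feature set is assumed.
import numpy as np
from sklearn.linear_model import LogisticRegression


def hrv_features(ibis_ms: np.ndarray) -> np.ndarray:
    """ibis_ms: 1-D array of inter-beat intervals (ms) for one segment."""
    diffs = np.diff(ibis_ms)
    sdnn = ibis_ms.std()                   # overall IBI variability
    rmssd = np.sqrt(np.mean(diffs ** 2))   # beat-to-beat variability
    pnn50 = np.mean(np.abs(diffs) > 50.0)  # fraction of large successive jumps
    return np.array([sdnn, rmssd, pnn50])


# X: (n_segments, 3) stacked feature vectors; y: labels {AF, SR, Other}
# clf = LogisticRegression().fit(X, y)
```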
Authors (4)
Yangyang Zhao
Matti Kaisti
Olli Lahdenoja
Tero Koivisto
Submitted
November 2, 2025
Key Contributions
Introduces RhythmiNet, a multimodal fusion model using PPG and accelerometer data with temporal and channel attention for robust three-class heart rhythm classification. It demonstrates improved performance over PPG-only baselines, especially under varying motion conditions.
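The motion-stratified evaluation could look like the following sketch: each test segment is scored by an accelerometer-derived motion intensity and binned into percentile groups, with no segment excluded. The intensity metric (standard deviation of the ACC magnitude) and the quartile cut points are assumptions for illustration.

```python
# Sketch of percentile-based motion stratification; metric and cuts assumed.
import numpy as np


def motion_intensity(acc: np.ndarray) -> np.ndarray:
    """acc: (n_segments, 3, time) raw accelerometer axes -> score per segment."""
    magnitude = np.linalg.norm(acc, axis=1)  # (n_segments, time)
    return magnitude.std(axis=-1)            # variability as a motion proxy


def stratify_by_motion(acc: np.ndarray, cuts=(25, 50, 75)) -> np.ndarray:
    """Assign every segment to a percentile bin; nothing is discarded."""
    scores = motion_intensity(acc)
    edges = np.percentile(scores, cuts)
    return np.digitize(scores, edges)  # 0 = stillest ... 3 = most motion


# Example: report per-bin metrics on held-out test segments.
# bins = stratify_by_motion(test_acc)
# for b in range(4):
#     evaluate(preds[bins == b], labels[bins == b])
```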
Business Value
Enables more accurate and reliable continuous monitoring of heart rhythms using wearable devices, potentially leading to earlier detection of arrhythmias like AF and reducing stroke risk.