
REVE: A Foundation Model for EEG -- Adapting to Any Setup with Large-Scale Pretraining on 25,000 Subjects

πŸ“„ Abstract

Foundation models have transformed AI by reducing reliance on task-specific data through large-scale pretraining. While successful in language and vision, their adoption in EEG has lagged due to the heterogeneity of public datasets, which are collected under varying protocols, devices, and electrode configurations. Existing EEG foundation models struggle to generalize across these variations, often restricting pretraining to a single setup, resulting in suboptimal performance, in particular under linear probing. We present REVE (Representation for EEG with Versatile Embeddings), a pretrained model explicitly designed to generalize across diverse EEG signals. REVE introduces a novel 4D positional encoding scheme that enables it to process signals of arbitrary length and electrode arrangement. Using a masked autoencoding objective, we pretrain REVE on over 60,000 hours of EEG data from 92 datasets spanning 25,000 subjects, representing the largest EEG pretraining effort to date. REVE achieves state-of-the-art results on 10 downstream EEG tasks, including motor imagery classification, seizure detection, sleep staging, cognitive load estimation, and emotion recognition. With little to no fine-tuning, it demonstrates strong generalization and nuanced spatio-temporal modeling. We release code, pretrained weights, and tutorials to support standardized EEG research and accelerate progress in clinical neuroscience.
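The masked autoencoding objective mentioned above can be illustrated with a minimal sketch: split a multichannel EEG recording into fixed-length time patches, hide a random subset from the encoder, and compute the reconstruction loss only on the hidden patches. This is a generic MAE-style sketch in NumPy, not REVE's actual implementation; patch length, mask ratio, and the model itself are placeholder choices.

```python
import numpy as np

def mask_patches(eeg, patch_len=200, mask_ratio=0.5, seed=0):
    """Split a (channels, time) EEG array into fixed-length time patches
    and randomly mask a fixed fraction of them, as in masked-autoencoder
    pretraining. Returns (patches, mask, masked_patches); masked patches
    are zeroed out so the encoder never sees them."""
    rng = np.random.default_rng(seed)
    c, t = eeg.shape
    n = t // patch_len
    patches = eeg[:, : n * patch_len].reshape(c, n, patch_len)
    mask = np.zeros(n, dtype=bool)
    mask[rng.choice(n, size=int(n * mask_ratio), replace=False)] = True
    masked = patches.copy()
    masked[:, mask, :] = 0.0
    return patches, mask, masked

def reconstruction_loss(pred, target, mask):
    """MSE computed only on the masked patches (the MAE objective)."""
    return float(np.mean((pred[:, mask, :] - target[:, mask, :]) ** 2))

# Toy example: 19 channels, 10 s at 200 Hz -> 10 patches of 1 s each.
eeg = np.random.default_rng(1).standard_normal((19, 2000))
patches, mask, masked = mask_patches(eeg)
loss = reconstruction_loss(np.zeros_like(patches), patches, mask)
```

In an actual pretraining loop, a transformer encoder would consume the visible patches and a lightweight decoder would predict the masked ones; the zero-prediction baseline here only demonstrates the loss computation.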
Authors (8)
Yassine El Ouahidi
Jonathan Lys
Philipp ThΓΆlke
Nicolas Farrugia
Bastien Pasdeloup
Vincent Gripon
+2 more
Submitted
October 24, 2025
arXiv Category
cs.LG
arXiv PDF

Key Contributions

Presents REVE, a foundation model for EEG pretrained on a massive scale (60,000+ hours, 92 datasets, 25,000 subjects) to address the heterogeneity challenge in EEG data. It introduces a novel 4D positional encoding for handling arbitrary signal lengths and electrode configurations, significantly improving generalization.
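The idea of a positional encoding over four coordinates can be sketched as follows: encode each electrode's 3D position (e.g. normalized montage coordinates) plus the patch's time index with standard sinusoidal features, and concatenate the four per-axis encodings. This is a hypothetical illustration of the general approach; the abstract does not specify REVE's actual scheme, and `sinusoid`, `pos_encode_4d`, and the dimensions are invented here.

```python
import numpy as np

def sinusoid(v, dim):
    """Standard sinusoidal encoding of one scalar coordinate into `dim`
    features (dim/2 sine and dim/2 cosine components)."""
    freqs = 1.0 / (10000.0 ** (np.arange(dim // 2) / (dim // 2)))
    angles = v * freqs
    return np.concatenate([np.sin(angles), np.cos(angles)])

def pos_encode_4d(x, y, z, t, dim_per_axis=16):
    """Concatenate per-axis encodings of an electrode's spatial position
    (x, y, z) and a time index t into one positional vector. Because the
    encoding is a function of coordinates rather than a learned table over
    a fixed channel order, it applies to any electrode layout."""
    return np.concatenate([sinusoid(v, dim_per_axis) for v in (x, y, z, t)])

# Same electrode at two consecutive time patches: spatial part is shared,
# temporal part differs.
enc_a = pos_encode_4d(0.1, -0.3, 0.8, 5)
enc_b = pos_encode_4d(0.1, -0.3, 0.8, 6)
```

A coordinate-based encoding like this is what allows a single pretrained model to accept recordings with arbitrary electrode counts and placements, which is the generalization property highlighted above.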

Business Value

Accelerates research and development of AI applications for EEG analysis, enabling more robust diagnostic tools and brain-computer interfaces across different clinical and research settings.