
MEG-GPT: A transformer-based foundation model for magnetoencephalography data

Abstract

Modelling the complex spatiotemporal patterns of large-scale brain dynamics is crucial for neuroscience, but traditional methods fail to capture the rich structure in modalities such as magnetoencephalography (MEG). Recent advances in deep learning have enabled significant progress in other domains, such as language and vision, by using foundation models at scale. Here, we introduce MEG-GPT, a transformer-based foundation model that uses time-attention and next time-point prediction. To facilitate this, we also introduce a novel data-driven tokeniser for continuous MEG data, which preserves the high temporal resolution of continuous MEG signals without lossy transformations. We trained MEG-GPT on tokenised brain region time-courses extracted from a large-scale MEG dataset (N=612, eyes-closed rest, Cam-CAN data), and show that the learnt model can generate data with realistic spatio-spectral properties, including transient events and population variability. Critically, it performs well in downstream supervised decoding tasks, showing improved zero-shot generalisation across sessions (improving accuracy from 0.54 to 0.59) and across subjects (from 0.41 to 0.49) compared to baseline methods. Furthermore, we show the model can be efficiently fine-tuned on a smaller labelled dataset to boost performance in cross-subject decoding scenarios. This work establishes a powerful foundation model for electrophysiological data, paving the way for applications in computational neuroscience and neural decoding.
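This page gives only the abstract, so the following is a minimal sketch rather than the authors' implementation: a small GPT-style causal transformer trained with next time-point prediction over a single tokenised region time-course. The class name TinyMEGGPT, the vocabulary size, and all dimensions are illustrative assumptions, and the paper's time-attention across multiple brain regions is omitted here for brevity.

import torch
import torch.nn as nn

# Illustrative assumptions, not values from the paper:
VOCAB = 256      # number of discrete tokens the tokeniser emits
D_MODEL = 128    # transformer width
MAX_LEN = 512    # maximum sequence length (time-points)

class TinyMEGGPT(nn.Module):
    """Minimal sketch: causal transformer for next time-point prediction."""
    def __init__(self, n_layers=4, n_heads=4):
        super().__init__()
        self.tok_emb = nn.Embedding(VOCAB, D_MODEL)
        self.pos_emb = nn.Embedding(MAX_LEN, D_MODEL)
        layer = nn.TransformerEncoderLayer(
            d_model=D_MODEL, nhead=n_heads, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(D_MODEL, VOCAB)

    def forward(self, tokens):                      # tokens: (batch, time)
        t = tokens.shape[1]
        pos = torch.arange(t, device=tokens.device)
        x = self.tok_emb(tokens) + self.pos_emb(pos)
        # Causal mask: each time-point attends only to its past
        mask = nn.Transformer.generate_square_subsequent_mask(t).to(tokens.device)
        x = self.blocks(x, mask=mask)
        return self.head(x)                         # (batch, time, VOCAB)

model = TinyMEGGPT()
tokens = torch.randint(0, VOCAB, (8, 256))          # fake tokenised MEG stream
logits = model(tokens[:, :-1])                      # predict token t+1 from <= t
loss = nn.functional.cross_entropy(
    logits.reshape(-1, VOCAB), tokens[:, 1:].reshape(-1))
loss.backward()

Under this framing, generation and downstream decoding follow the usual GPT recipe: sample autoregressively from the head for generation, or replace the head with a task-specific classifier for fine-tuning.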
Authors (5): Rukuang Huang, Sungjun Cho, Chetan Gohil, Oiwi Parker Jones, Mark Woolrich
Submitted: October 20, 2025
arXiv Category: cs.LG

Key Contributions

Introduces MEG-GPT, a transformer-based foundation model for magnetoencephalography (MEG) data that uses time-attention and next time-point prediction. It also proposes a novel data-driven tokeniser for continuous MEG signals, enabling the trained model to generate data with realistic spatio-spectral properties and to transfer to downstream decoding tasks.
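The tokeniser itself is not described on this page. As an illustrative stand-in only, the snippet below discretises a continuous region time-course with quantile binning, which keeps one token per time-point and so preserves temporal resolution; unlike the paper's stated lossless, data-driven tokeniser, this simple scheme is lossy in amplitude.

import numpy as np

def fit_quantile_bins(x, n_tokens=256):
    """Fit bin edges on training data; x: (time,) continuous signal."""
    qs = np.linspace(0.0, 1.0, n_tokens + 1)[1:-1]
    return np.quantile(x, qs)              # n_tokens - 1 interior edges

def tokenise(x, edges):
    """Map each time-point to a token id in [0, n_tokens)."""
    return np.searchsorted(edges, x)

rng = np.random.default_rng(0)
signal = rng.standard_normal(10_000)        # fake region time-course
edges = fit_quantile_bins(signal)
tokens = tokenise(signal, edges)            # one token per time-point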

Business Value

Enables deeper understanding of brain function and dysfunction, potentially leading to new diagnostic tools, therapeutic interventions, and advanced brain-computer interfaces.