
Neuro-BERT: rethinking masked autoencoding for self-supervised neurological pretraining

Di Wu, Siyuan Li, Jie Yang and Mohamad Sawan

Journal article (2024)

Open-access document at the official publisher

Abstract

Deep learning on neurological signals is poised to drive major advances in fields such as medical diagnostics, neurorehabilitation, and brain-computer interfaces. The challenge in harnessing the full potential of these signals lies in the dependency on extensive, high-quality annotated data, which is scarce and expensive to acquire, requiring specialized infrastructure and domain expertise. To address the appetite for data in deep learning, we present Neuro-BERT, a self-supervised pre-training framework for neurological signals based on masked autoencoding in the Fourier domain. The intuition behind our approach is simple: the frequency and phase distributions of neurological signals can reveal intricate neurological activities. We propose a novel pre-training task dubbed Fourier Inversion Prediction (FIP), which randomly masks out a portion of the input signal and then predicts the missing information using the Fourier inversion theorem. Pre-trained models can then be used for various downstream tasks such as sleep stage classification and gesture recognition. Unlike contrastive methods, which rely heavily on carefully hand-crafted augmentations and a Siamese structure, our approach works well with a simple transformer encoder and no augmentation requirements. Evaluating our method on several benchmark datasets, we show that Neuro-BERT improves downstream neurological tasks by a large margin.
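The abstract describes two ingredients: random masking of a portion of the input signal, and recovery of the missing information via the Fourier inversion theorem. The following NumPy sketch illustrates both in isolation; it is not the paper's implementation, and all names (`random_mask`, the patch length, the mask ratio, the toy two-sinusoid signal) are illustrative assumptions, not taken from Neuro-BERT.

```python
import numpy as np

def random_mask(signal, mask_ratio=0.5, patch_len=16, rng=None):
    """Zero out a random subset of fixed-length patches (illustrative masking)."""
    rng = np.random.default_rng(rng)
    n_patches = len(signal) // patch_len
    n_masked = int(mask_ratio * n_patches)
    masked_idx = rng.choice(n_patches, size=n_masked, replace=False)
    masked = signal.copy()
    for i in masked_idx:
        masked[i * patch_len:(i + 1) * patch_len] = 0.0
    return masked, masked_idx

# Toy signal: a sum of two sinusoids, standing in for a neurological recording.
t = np.linspace(0, 1, 256, endpoint=False)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

masked_x, idx = random_mask(x, mask_ratio=0.5, patch_len=16, rng=0)

# Fourier inversion theorem: the full signal is exactly recoverable from its
# spectrum, so amplitude/phase targets carry all of the masked information.
spectrum = np.fft.fft(x)
recovered = np.fft.ifft(spectrum).real
print(np.allclose(recovered, x))  # True
```

In a pre-training setup along these lines, a model would see `masked_x` and be trained to predict spectral quantities of the unmasked signal, with the inversion theorem guaranteeing those targets determine the missing samples; the exact prediction targets and losses are defined in the paper itself.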

Keywords

Department: Department of Electrical Engineering
Funding agencies: STI2030-Major Projects, National Natural Science Foundation of China, "Pioneer and Leading Goose" R&D Program of Zhejiang, Key Project of Westlake Institute for Optoelectronics
Grant numbers: 2022ZD0208805, 623B2085, 2024C03002, 2023GD004
PolyPublie URL: https://publications.polymtl.ca/65420/
Journal title: IEEE Journal of Biomedical and Health Informatics
Publisher: IEEE
DOI: 10.1109/jbhi.2024.3415959
Official URL: https://doi.org/10.1109/jbhi.2024.3415959
Deposited: May 7, 2025, 15:53
Last modified: November 25, 2025, 11:10
Cite in APA 7: Wu, D., Li, S., Yang, J., & Sawan, M. (2024). Neuro-BERT: rethinking masked autoencoding for self-supervised neurological pretraining. IEEE Journal of Biomedical and Health Informatics, 3415959 (11 pages). https://doi.org/10.1109/jbhi.2024.3415959
