Preprint Article, Version 1. This version is not peer-reviewed.

EEG-TCNTransformer: A Temporal Convolutional Transformer for Motor Imagery Brain-Computer Interfaces

Version 1 : Received: 8 August 2024 / Approved: 9 August 2024 / Online: 9 August 2024 (12:15:45 CEST)

How to cite: Nguyen, A. H. P.; Oyefisayo, O.; Pfeffer, M. A.; Ling, S. H. EEG-TCNTransformer: A Temporal Convolutional Transformer for Motor Imagery Brain-Computer Interfaces. Preprints 2024, 2024080676. https://doi.org/10.20944/preprints202408.0676.v1

Abstract

In Brain-Computer Interface Motor Imagery (BCI-MI) systems, Convolutional Neural Networks (CNNs) have traditionally been the deep learning method of choice, driving significant advancements in state-of-the-art studies. Recently, Transformer models with attention mechanisms have emerged as a powerful alternative, better capturing long-term dependencies and intricate feature relationships in BCI-MI signals. This research investigates the performance of the EEG-TCNet and EEG-Conformer models, trained and validated under various hyperparameters and bandpass-filter settings during preprocessing, to assess their effect on model accuracy. Additionally, this study introduces EEG-TCNTransformer, a novel model that integrates the convolutional architecture of EEG-TCNet with a series of multi-head self-attention blocks. EEG-TCNTransformer achieves an accuracy of 82.97% without the application of bandpass filtering. The source code of EEG-TCNTransformer is available on GitHub.
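
The combination described above, a temporal-convolutional front end feeding a stack of multi-head self-attention blocks, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the `TCNTransformerSketch` name, layer sizes, block counts, and input shapes (e.g., 22 EEG channels and 4 classes, typical of motor-imagery benchmarks) are all assumptions chosen for the example.

```python
# Illustrative sketch only: temporal convolutions followed by multi-head
# self-attention, in the spirit of the EEG-TCNTransformer described above.
# All layer sizes and names are assumptions, not the authors' code.
import torch
import torch.nn as nn


class TemporalConvBlock(nn.Module):
    """Dilated causal temporal convolution with a residual connection (TCN-style)."""

    def __init__(self, channels: int, kernel_size: int = 4, dilation: int = 1):
        super().__init__()
        padding = (kernel_size - 1) * dilation  # pad, then trim, to stay causal
        self.conv = nn.Conv1d(channels, channels, kernel_size,
                              padding=padding, dilation=dilation)
        self.activation = nn.ELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time); trim the right side to preserve length
        out = self.conv(x)[..., : x.shape[-1]]
        return self.activation(out) + x


class TCNTransformerSketch(nn.Module):
    """Convolutional feature extractor followed by stacked self-attention blocks."""

    def __init__(self, eeg_channels: int = 22, embed_dim: int = 32,
                 num_heads: int = 4, num_attn_blocks: int = 3,
                 num_classes: int = 4):
        super().__init__()
        self.embed = nn.Conv1d(eeg_channels, embed_dim, kernel_size=1)
        self.tcn = nn.Sequential(
            TemporalConvBlock(embed_dim, dilation=1),
            TemporalConvBlock(embed_dim, dilation=2),
        )
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=num_heads, batch_first=True)
        self.attention = nn.TransformerEncoder(encoder_layer, num_attn_blocks)
        self.classify = nn.Linear(embed_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, eeg_channels, time) raw EEG; no bandpass filter assumed
        feats = self.tcn(self.embed(x))              # (batch, embed_dim, time)
        tokens = feats.transpose(1, 2)               # (batch, time, embed_dim)
        attended = self.attention(tokens)            # multi-head self-attention
        return self.classify(attended.mean(dim=1))   # pool over time, classify


model = TCNTransformerSketch()
logits = model(torch.randn(8, 22, 1000))  # 8 trials, 22 channels, 1000 samples
print(logits.shape)  # torch.Size([8, 4])
```

In this sketch, the convolutional stage plays the role that EEG-TCNet's temporal convolutions play in the paper (extracting local temporal features), while the attention stack models the longer-range dependencies the abstract attributes to the Transformer component.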

Keywords

Brain-computer interface; motor imagery; electroencephalography; convolutional neural network; transformer; self-attention; bandpass filter

Subject

Engineering, Bioengineering
