Identifying motor and mental imagery electroencephalography (EEG) signals is imperative to realizing automated, robust brain-computer interface (BCI) systems. In the present study, we propose a new automated framework based on pretrained convolutional neural networks (CNNs) that is feasible for robust BCI systems trained on both small and ample samples of motor and mental imagery EEG data. The framework is explored by investigating the implications of different limiting factors, such as learning rates and optimizers, processed versus unprocessed scalograms, and features derived from untuned pretrained models, across small, medium, and large pretrained CNN models. The experiments were performed on three public datasets obtained from BCI Competition III. The datasets were denoised with multiscale principal component analysis, and time-frequency scalograms were obtained by applying a continuous wavelet transform. The scalograms were fed into several variants of ten pretrained models for feature extraction and identification of the different EEG tasks. The experimental results showed that ShuffleNet yielded the maximum average classification accuracy of 99.52% using the RMSProp optimizer with a learning rate of 0.0001. Low learning rates were observed to converge to better performance than high learning rates. Moreover, noisy scalograms and features extracted from untuned networks resulted in slightly lower performance than denoised scalograms and fine-tuned networks, respectively. Overall, the results suggest that pretrained models are robust for identifying EEG signals because they preserve the time-frequency structure of the signals and yield promising classification outcomes.
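The core preprocessing step described in the abstract, turning a denoised EEG trace into a time-frequency scalogram via a continuous wavelet transform, can be sketched as follows. This is an illustrative NumPy implementation with a complex Morlet wavelet on a synthetic signal; the wavelet choice, scale range, and sampling rate are assumptions for demonstration, not the paper's exact settings.

```python
import numpy as np

def morlet_cwt(signal, scales, w0=6.0):
    """Continuous wavelet transform with a complex Morlet wavelet.

    Returns a (len(scales), len(signal)) magnitude scalogram -- the kind
    of 2-D time-frequency image that would be fed to a pretrained CNN.
    """
    n = len(signal)
    out = np.empty((len(scales), n))
    t = np.arange(n) - n // 2  # wavelet support centred on zero
    for i, s in enumerate(scales):
        # Morlet wavelet sampled at this scale, amplitude-normalised by 1/sqrt(s)
        wavelet = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2) / np.sqrt(s)
        # Cross-correlate the signal with the wavelet and keep the magnitude
        out[i] = np.abs(np.convolve(signal, np.conj(wavelet)[::-1], mode="same"))
    return out

# Synthetic 1-second "EEG" trace at 250 Hz: a 10 Hz mu-band rhythm plus noise
fs = 250
t = np.arange(fs) / fs
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.default_rng(0).standard_normal(fs)

scales = np.arange(2, 64)  # illustrative scale range
scalogram = morlet_cwt(eeg, scales)
print(scalogram.shape)  # (62, 250)
```

In the paper's pipeline such scalograms, after denoising, serve as image inputs for transfer learning with the ten pretrained CNN variants.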
Journal: Computers in Biology and Medicine
Publication status: Published - 25 Jan 2022
Bibliographical note: Funding Information:
This work was supported by King Saud University, Riyadh, Saudi Arabia, through Researchers Supporting Project number RSP-2021/184.
© 2022 Elsevier Ltd
- Brain-computer interface
- Convolutional neural network
- Learning rates
- Transfer learning