Instant classification for the spatially-coded BCI
- PMID: 35482705
- PMCID: PMC9049359
- DOI: 10.1371/journal.pone.0267548
Abstract
The spatially-coded SSVEP BCI exploits changes in the topography of the steady-state visual evoked response to visual flicker stimulation in the extrafoveal field of view. In contrast to frequency-coded SSVEP BCIs, the operator does not gaze into any flickering lights; therefore, this paradigm can reduce visual fatigue. Other advantages include high classification accuracies and a simplified stimulation setup. Previous studies of the paradigm used stimulation intervals of a fixed duration. For frequency-coded SSVEP BCIs, it has been shown that dynamically adjusting the trial duration can increase the system's information transfer rate (ITR). We therefore investigated whether a similar increase could be achieved for spatially-coded BCIs by applying dynamic stopping methods. To this end we introduced a new stopping criterion which combines the likelihood of the classification result and its stability across larger data windows. Whereas the BCI achieved an average ITR of 28.4±6.4 bits/min with fixed intervals, dynamic intervals increased the performance to 81.1±44.4 bits/min. Users were able to maintain performance over up to 60 minutes of continuous operation. We suggest that the dynamic response time might have worked as a kind of temporal feedback which allowed operators to optimize their brain signals and compensate for fatigue.
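The abstract reports ITR in bits/min and describes a stopping rule that combines the classification likelihood with the stability of the decision across growing data windows. The sketch below illustrates both ideas; the standard Wolpaw ITR formula is assumed, and `dynamic_stop` with its thresholds (`likelihood_threshold`, `stability_count`) is a hypothetical illustration, not the paper's exact implementation.

```python
import math

def wolpaw_itr(n_classes, accuracy, trial_seconds):
    """Wolpaw information transfer rate in bits/min (standard formula,
    assumed here; the paper's exact ITR computation may differ)."""
    p, n = accuracy, n_classes
    if p <= 1.0 / n:
        return 0.0  # at or below chance level, no information transferred
    bits = math.log2(n) + p * math.log2(p)
    if p < 1.0:
        bits += (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * (60.0 / trial_seconds)

def dynamic_stop(predictions, likelihoods,
                 likelihood_threshold=0.95, stability_count=3):
    """Hypothetical dynamic-stopping rule: end the trial once the latest
    window's classification likelihood exceeds a threshold AND the last
    few windows (of increasing length) agree on the same class."""
    if len(predictions) < stability_count:
        return False
    if likelihoods[-1] < likelihood_threshold:
        return False
    last = predictions[-stability_count:]
    return all(p == last[0] for p in last)
```

For example, a 4-class BCI decoded perfectly in 2 s trials yields `wolpaw_itr(4, 1.0, 2.0)` = 60 bits/min; shortening trials via dynamic stopping raises ITR even at the same accuracy, which is the effect the study exploits.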
Conflict of interest statement
The authors have declared that no competing interests exist.
Figures
Figs 1–10 are available with the full-text article at PMC (PMC9049359).
Similar articles
- Training the spatially-coded SSVEP BCI on the fly. J Neurosci Methods. 2022 Aug 1;378:109652. doi: 10.1016/j.jneumeth.2022.109652. Epub 2022 Jun 15. PMID: 35716819
- Application of a single-flicker online SSVEP BCI for spatial navigation. PLoS One. 2017 May 31;12(5):e0178385. doi: 10.1371/journal.pone.0178385. eCollection 2017. PMID: 28562624. Free PMC article.
- A sub-region combination scheme for spatial coding in a high-frequency SSVEP-based BCI. J Neural Eng. 2023 Jul 27;20(4). doi: 10.1088/1741-2552/ace8bd. PMID: 37467742
- Incorporation of dynamic stopping strategy into the high-speed SSVEP-based BCIs. J Neural Eng. 2018 Aug;15(4):046025. doi: 10.1088/1741-2552/aac605. Epub 2018 May 18. PMID: 29774867
- The effect of stimulus number on the recognition accuracy and information transfer rate of SSVEP-BCI in augmented reality. J Neural Eng. 2022 May 13;19(3). doi: 10.1088/1741-2552/ac6ae5. PMID: 35477130
Cited by
- Target of selective auditory attention can be robustly followed with MEG. Sci Rep. 2023 Jul 6;13(1):10959. doi: 10.1038/s41598-023-37959-4. PMID: 37414861. Free PMC article.