De Venuto, Daniela; Annese, Valerio; Mezzina, Giovanni. Real-Time P300-based BCI in Mechatronic Control by using Multidimensional Approach. IET Software, vol. 12, no. 5, 2018, pp. 418-424. ISSN 1751-8806. DOI: 10.1049/iet-sen.2017.0340
Real-Time P300-based BCI in Mechatronic Control by using Multidimensional Approach
De Venuto, Daniela; Mezzina, Giovanni
2018
Abstract
This paper presents a P300-based Brain-Computer Interface (BCI) for driving a mechatronic device, i.e., without the need for any physical control. The technique is based on a machine learning algorithm that exploits a spatio-temporal characterization of the P300, analyses all the binary discrimination scenarios and pipes them into a multiclass classification problem. The BCI architecture is made up of (i) the acquisition unit, (ii) the processing unit and (iii) the navigation unit. The acquisition unit is a wireless 32-channel EEG headset collecting data from 6 electrodes (parietal-cortex area). The processing unit is a dedicated µPC performing stimuli delivery, data gathering, Machine Learning (ML) and real-time multidimensional classification, leading to the interpretation of the user's intention. The ML stage is based on a custom algorithm (t-RIDE), which trains the subsequent classification stage on user-tuned P300 reference features. The extracted features undergo a dimensionality reduction step and are then used to define proper decision boundaries for the real-time classification. In this way, the real-time classification adopts a functional approach to time-domain feature extraction, reducing the amount of data to be analysed and speeding up the response. The Raspberry Pi-based navigation unit actuates the received commands and supports the wheelchair motion using peripheral sensors. The experimental results, based on a dataset of 7 subjects, demonstrate that the complete classification chain is performed, on average, in 8.16 ms, allowing real-time control of the wheelchair with a classification accuracy of 84.28 ± 0.87%.
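The abstract outlines a pipeline in which spatio-temporal P300 features are reduced in dimensionality and then classified by combining pairwise binary discriminations into a multiclass decision. The paper's own t-RIDE training and custom decision boundaries are not reproduced in this record; as an illustration only, a minimal Python sketch of that general pattern, using scikit-learn's PCA and a one-vs-one linear SVM as generic stand-ins and purely synthetic data, might look like:

# Illustrative sketch only: dimensionality reduction followed by a
# one-vs-one (pairwise binary) multiclass classifier, mirroring the
# pipeline structure described in the abstract. The paper's actual
# t-RIDE training and decision-boundary definition are NOT shown here;
# PCA and a linear SVM are placeholders, and the data below is synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.multiclass import OneVsOneClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical feature matrix: one row per stimulus epoch, columns are
# time-domain features extracted from the 6 parietal EEG channels.
X_train = rng.normal(size=(200, 60))
y_train = rng.integers(0, 4, size=200)   # e.g. 4 navigation commands

clf = make_pipeline(
    PCA(n_components=10),                      # dimensionality reduction step
    OneVsOneClassifier(SVC(kernel="linear")),  # pairwise binary discriminations
)
clf.fit(X_train, y_train)

# Real-time use: classify the feature vector of a single new epoch.
x_new = rng.normal(size=(1, 60))
print(clf.predict(x_new))

The one-vs-one wrapper trains one binary classifier per pair of commands and merges their votes into a single multiclass decision, which is the structural idea the abstract refers to when it says the binary discrimination scenarios are "piped into" a multiclass classification problem.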