An Embedded System Remotely Driving Mechanical Devices by P300 Brain Activity / DE VENUTO, Daniela; Annese, Valerio Francesco; Mezzina, Giovanni. - ELECTRONIC. - (2017), pp. 1014-1019. (Paper presented at the Design, Automation & Test in Europe Conference & Exhibition, DATE 2017, held in Lausanne, Switzerland, March 27-31, 2017) [10.23919/DATE.2017.7927139].
An Embedded System Remotely Driving Mechanical Devices by P300 Brain Activity
DE VENUTO, Daniela
Mezzina, Giovanni
2017-01-01
Abstract
In this paper we present a P300-based Brain Computer Interface (BCI) for the remote control of a mechatronic actuator, such as a wheelchair or a car, driven by EEG signals. The system is intended for tetraplegic and paralytic users or, in the case of a car, simply for safer driving. The P300 signal, an Event-Related Potential (ERP) associated with cognitive brain activity, is deliberately elicited by visual stimulation. The EEG data are collected by 6 smart wireless electrodes placed over the parietal cortex and classified online by a linear threshold classifier, based on a preliminary Machine Learning (ML) stage. The ML stage is implemented on a μPC dedicated to the system, which also performs data acquisition and processing. The main improvement over previous EEG-based remote car driving concerns the approach used for intention recognition: in this work, the classification is based on the P300 itself rather than on the average of several poorly identified potentials. This approach reduces the number of electrodes on the EEG helmet. The ML stage is based on a custom algorithm (t-RIDE) that tunes the subsequent classification stage to the user's "cognitive chronometry". The ML algorithm starts with a fast calibration phase (only ~190 s for the first learning). Furthermore, the BCI adopts a functional approach to time-domain feature extraction, which reduces the amount of data to be analyzed and hence the system response time. In this paper, a proof of concept of the proposed BCI is demonstrated on a prototype car, tested on 5 subjects (aged 26 ± 3). The experimental results show that the novel ML approach achieves a complete P300 spatio-temporal characterization in 1.95 s using 38 target visual stimuli (for each direction of the car path). In free-drive mode, the BCI reaches a single-trial detection accuracy of 80.5 ± 4.1%, while the worst-case computation time is 19.65 ± 10.1 ms.
The BCI system described here can also be used with other mechatronic actuators, such as robots.
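To make the pipeline in the abstract concrete, the sketch below illustrates the general kind of processing it describes: time-domain feature extraction on a post-stimulus EEG epoch followed by a linear threshold decision. This is a minimal illustration, not the paper's t-RIDE algorithm; the sampling rate, latency window, threshold value, and all function names are assumptions.

```python
import numpy as np

FS = 256                    # assumed sampling rate (Hz)
P300_WINDOW = (0.25, 0.50)  # assumed latency window for the P300 peak (s)

def extract_features(epoch):
    """Reduce a 1-D post-stimulus EEG epoch to two time-domain features:
    peak amplitude and peak latency inside the assumed P300 window."""
    start = int(P300_WINDOW[0] * FS)
    stop = int(P300_WINDOW[1] * FS)
    window = epoch[start:stop]
    peak_idx = int(np.argmax(window))
    amplitude = float(window[peak_idx])
    latency = (start + peak_idx) / FS
    return amplitude, latency

def classify(epoch, threshold=3.0):
    """Linear threshold decision: flag a target stimulus when the peak
    amplitude in the P300 window exceeds a learned threshold (hypothetical
    value, in μV)."""
    amplitude, _ = extract_features(epoch)
    return amplitude > threshold

# Synthetic example: background rhythm vs. the same rhythm plus a
# P300-like bump centered at ~350 ms after the stimulus.
t = np.arange(0.0, 0.8, 1.0 / FS)
non_target = 0.5 * np.sin(2 * np.pi * 10 * t)
target = non_target + 5.0 * np.exp(-((t - 0.35) ** 2) / 0.002)

print(classify(non_target))  # False
print(classify(target))      # True
```

In a real system the threshold would be learned per user during the calibration phase, since P300 amplitude and latency (the "cognitive chronometry" the abstract mentions) vary between subjects.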