Colafiglio, Tommaso; Lofu, Domenico; Sorino, Paolo; Lombardi, Angela; Narducci, Fedelucio; Di Noia, Tommaso. "Neural Musical Instruments through Brain-Computer Interface and Biofeedback." In Proceedings of the 33rd Conference on User Modeling, Adaptation and Personalization (UMAP 2025), New York City, June 16-19, 2025, pp. 489-494. DOI: 10.1145/3708319.3733644.
Neural Musical Instruments through Brain-Computer Interface and Biofeedback
Tommaso Colafiglio;Domenico Lofu;Paolo Sorino;Angela Lombardi;Fedelucio Narducci;Tommaso Di Noia
2025
Abstract
In the context of electronic musical instruments, the prevailing paradigm for modifying sound during live performance relies on external control mechanisms that adjust sound configurations predefined by the performer. This approach, however, introduces latencies during transitions between sound configurations. To overcome this limitation, this study presents a novel application of Brain-Computer Interface (BCI) technology as a control system for musical instruments in live performance. The proposed system classifies mental states of activation and relaxation with a Machine Learning (ML) pipeline that achieves an average accuracy of 0.92. Using the Beta Protocol, the system dynamically modulates sound according to the performer's mental state. Finally, an explainability analysis clarifies the impact of specific features on the prediction process.
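The control loop summarized in the abstract — classify an EEG window as activation or relaxation, then map that state to a sound parameter — can be caricatured as follows. This is a minimal sketch, not the authors' system: the sampling rate, band edges, threshold rule, and the cutoff mapping are all illustrative assumptions (the paper reports a trained ML classifier with 0.92 average accuracy, not a fixed band-power threshold).

```python
import numpy as np

FS = 256  # assumed sampling rate in Hz

def band_power(signal, fs, lo, hi):
    """Mean power in the [lo, hi) Hz band via a simple periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()

def classify_state(signal, fs=FS, threshold=1.0):
    """Label a window 'activation' if beta power dominates alpha power.

    Hypothetical stand-in for the paper's trained ML classifier.
    """
    beta = band_power(signal, fs, 13, 30)   # beta band: 13-30 Hz
    alpha = band_power(signal, fs, 8, 13)   # alpha band: 8-13 Hz
    return "activation" if beta / alpha > threshold else "relaxation"

def cutoff_from_state(state, low_hz=400.0, high_hz=4000.0):
    """Map the predicted mental state to a synth filter cutoff (illustrative)."""
    return high_hz if state == "activation" else low_hz

# Synthetic one-second windows: a dominant alpha tone (relaxed)
# vs. a dominant beta tone (active), plus a little noise.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
relaxed = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS)
active = np.sin(2 * np.pi * 20 * t) + 0.1 * rng.standard_normal(FS)
```

In a live-performance setting the classifier output would be recomputed per window and sent to the instrument (e.g., as a control message), which is where the latency advantage over manual preset switching would come from.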
| File | Size | Format |
|---|---|---|
| 2025_Neural_Musical_Instruments_through_Brain-Computer_Interface_and_Biofeedback_pdfeditoriale.pdf (open access; publisher's version; Creative Commons license) | 1.27 MB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

