An undercomplete autoencoder to extract muscle synergies for motor intention detection

Buongiorno, Domenico; Cascarano, Giacomo Donato; Bevilacqua, Vitoantonio
2019-01-01

Abstract

The growing interest in wearable robots for assistance and rehabilitation purposes opens the challenge of developing intuitive and natural control strategies. Among the several human-machine interaction approaches, myoelectric control consists of decoding the motor intention from muscular activity (EMG signals) in order to move the assistive robotic device accordingly, thus establishing an intimate human-machine connection. In this scenario, bio-inspired approaches, e.g. synergy-based controllers, are proving to be the most robust. In this work, the authors present an undercomplete autoencoder (AE) to extract muscle synergies for motion intention detection. The proposed AE topology has been validated with EMG signals acquired from the main upper limb muscles during planar isometric reaching tasks performed in a virtual environment while wearing an exoskeleton. The presented AE has shown promising results in muscle synergy extraction when its performance is compared with that of the Non-Negative Matrix Factorization (NMF) algorithm, i.e. the most widely used approach in the literature. The synergy activations extracted with the AE have then been used to estimate the moments applied at the shoulder and elbow joints. Comparing this estimation with the results of other synergy-based techniques already proposed in the literature, the proposed method achieves comparable performance.
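To make the idea concrete, the following is a minimal sketch, not the authors' implementation, of an undercomplete autoencoder that compresses EMG envelopes into a small set of synergy activations, with scikit-learn's NMF shown as the comparison baseline mentioned in the abstract. The muscle count, number of synergies, sample size, training settings and the random stand-in data are illustrative assumptions only.

# Illustrative sketch (assumed settings, synthetic data in place of real EMG envelopes).
import numpy as np
import torch
import torch.nn as nn
from sklearn.decomposition import NMF

n_muscles, n_synergies, n_samples = 8, 4, 2000          # assumed dimensions
emg = np.abs(np.random.randn(n_samples, n_muscles)).astype(np.float32)  # stand-in EMG envelopes

class UndercompleteAE(nn.Module):
    """Linear undercomplete autoencoder: the bottleneck carries the synergy
    activations, while the decoder weights act as the muscle-weight (synergy) matrix."""
    def __init__(self, n_in, n_latent):
        super().__init__()
        self.encoder = nn.Linear(n_in, n_latent)
        self.decoder = nn.Linear(n_latent, n_in)

    def forward(self, x):
        z = torch.relu(self.encoder(x))   # non-negative synergy activations
        return self.decoder(z), z

model = UndercompleteAE(n_muscles, n_synergies)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.from_numpy(emg)
for _ in range(500):                      # short reconstruction-loss training loop
    opt.zero_grad()
    recon, _ = model(x)
    loss = nn.functional.mse_loss(recon, x)
    loss.backward()
    opt.step()

with torch.no_grad():
    _, activations_ae = model(x)          # synergy activations from the AE bottleneck

# Baseline for comparison: Non-Negative Matrix Factorization.
nmf = NMF(n_components=n_synergies, init="nndsvda", max_iter=500)
activations_nmf = nmf.fit_transform(emg)  # nmf.components_ holds the synergy vectors

In such a comparison, the AE bottleneck activations play the same role as the NMF activation matrix, and reconstruction quality of the EMG envelopes is a natural common metric.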
2019
International Joint Conference on Neural Networks, IJCNN 2019
978-1-7281-1985-4
An undercomplete autoencoder to extract muscle synergies for motor intention detection / Buongiorno, Domenico; Camardella, Cristian; Cascarano, Giacomo Donato; Pelaez Murciego, Luis; Barsotti, Michele; De Feudis, Irio; Frisoli, Antonio; Bevilacqua, Vitoantonio. - ELECTRONIC. - (2019). (Paper presented at the International Joint Conference on Neural Networks, IJCNN 2019, held in Budapest, Hungary, July 14-19, 2019) [10.1109/IJCNN.2019.8851975].
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11589/183510
Citations
  • Scopus 10
  • Web of Science (ISI) 1