Conference paper, 2019

Generating Shared Latent Variables for Robots to Imitate Human Movements and Understand their Physical Limitations

Maxime Devanne
Sao Mai Nguyen

Abstract

Assistive robotics, and particularly robot coaches, may be very helpful for rehabilitation healthcare. In this context, we propose a method based on the Gaussian Process Latent Variable Model (GP-LVM) to transfer knowledge between a physiotherapist, a robot coach and a patient. Our model maps visual human body features to robot data in order to facilitate robot learning and imitation. In addition, we extend the model to adapt the robot's understanding to patients' physical limitations during the assessment of rehabilitation exercises. Experimental evaluation demonstrates promising results for both robot imitation and model adaptation according to the patients' limitations.
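For illustration, the sketch below implements the shared-latent-space idea in a minimal, self-contained form: two observation views (human visual features and robot data) are modelled as GP outputs of a single latent space X, fitted by MAP estimation, and a new human observation is imitated by embedding it in X and decoding it through the robot view. This is not the authors' implementation; the toy data, dimensions, kernel settings and variable names are illustrative assumptions, and the adaptation to patient limitations described in the abstract is not sketched here.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
N, Q = 30, 2            # paired frames, latent dimensionality (assumed)
D_h, D_r = 6, 4         # human feature dim, robot joint dim (assumed)

# Toy paired trajectories standing in for captured human/robot demonstrations.
t = np.linspace(0, 2 * np.pi, N)[:, None]
Y_h = np.hstack([np.sin(k * t) for k in range(1, D_h + 1)]) + 0.05 * rng.standard_normal((N, D_h))
Y_r = np.hstack([np.cos(k * t) for k in range(1, D_r + 1)]) + 0.05 * rng.standard_normal((N, D_r))

def rbf(X1, X2, var=1.0, length=1.0):
    """Squared-exponential kernel."""
    return var * np.exp(-0.5 * cdist(X1, X2, "sqeuclidean") / length ** 2)

def view_nll(X, Y, noise=1e-2):
    """Negative log marginal likelihood of one view given latent coordinates X."""
    K = rbf(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))
    return 0.5 * np.sum(Y * alpha) + Y.shape[1] * np.sum(np.log(np.diag(L)))

def objective(x_flat):
    X = x_flat.reshape(N, Q)
    # Both views share the same latent coordinates X (the shared latent space),
    # plus a unit Gaussian prior on X (MAP estimate).
    return view_nll(X, Y_h) + view_nll(X, Y_r) + 0.5 * np.sum(X ** 2)

# Initialise the shared latent space with PCA of the concatenated views.
Y_all = np.hstack([Y_h, Y_r])
U, S, _ = np.linalg.svd(Y_all - Y_all.mean(0), full_matrices=False)
X0 = U[:, :Q] * S[:Q]

X = minimize(objective, X0.ravel(), method="L-BFGS-B").x.reshape(N, Q)

def gp_mean(X_train, Y_train, X_star, noise=1e-2):
    """GP posterior mean prediction of a view at latent points X_star."""
    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
    return rbf(X_star, X_train) @ np.linalg.solve(K, Y_train)

# Imitation: embed a new human observation in the latent space, then decode
# it through the robot view to obtain the corresponding robot configuration.
y_h_new = Y_h[10] + 0.05 * rng.standard_normal(D_h)
x_star = minimize(lambda x: np.sum((gp_mean(X, Y_h, x[None]) - y_h_new) ** 2), X[10]).x[None]
y_r_new = gp_mean(X, Y_r, x_star)
print(y_r_new.shape)    # (1, D_r): predicted robot data for the new human frame
```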
Main file: main.pdf (897.5 KB). Origin: files produced by the author(s).

Dates and versions

hal-01891414 , version 1 (09-10-2018)
hal-01891414 , version 2 (14-02-2019)
hal-01891414 , version 3 (02-07-2019)
hal-01891414 , version 4 (03-11-2021)

Identifiers

HAL Id: hal-01891414
DOI: 10.1007/978-3-030-11012-3_15

Cite

Maxime Devanne, Sao Mai Nguyen. Generating Shared Latent Variables for Robots to Imitate Human Movements and Understand their Physical Limitations. IEEE ECCV Workshop on Transferring and Adapting Source Knowledge in Computer Vision (TASK-CV), Sep 2018, Munich, Germany. pp.190-197, ⟨10.1007/978-3-030-11012-3_15⟩. ⟨hal-01891414v4⟩
387 views, 246 downloads
