Unravelling individual rhythmic abilities using machine learning
Abstract
Humans can easily extract the rhythm of a complex sound, such as music, and move to its regular beat, for example in dance. These abilities are modulated by musical training and vary significantly among untrained individuals. The causes of this variability are multidimensional and typically hard to capture with single tasks. To date, we lack a comprehensive model of the rhythmic fingerprints of both musicians and non-musicians. Here we harnessed machine learning to extract a parsimonious model of rhythmic abilities, based on behavioral testing (with perceptual and motor tasks) of individuals with and without formal musical training (n = 79). We demonstrate that the variability of rhythmic abilities, and their link with formal and informal music experience, can be successfully captured by profiles comprising a minimal set of behavioral measures. These profiles can shed light on individual variability in healthy and clinical populations, and provide guidelines for personalizing rhythm-based interventions.
Domains
Neuroscience