DeepPerformanceDiff

Authors: Amélie Anglade, Joachim Ganseman, Vladimir Viro
Affiliation: Various
Code: GitHub link

DeepPerformanceDiff allows you to compare renditions of the same piece (whose score is provided by Peachnote through IMSLP) by different performers, as found on YouTube. Differences in tempo and loudness are visualised in a graph that makes it easy to spot sections that a performer plays markedly differently from the others.
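
To make the comparison idea concrete, here is a minimal sketch (not the project's actual code) of the underlying principle: given several performances whose tempo curves are aligned to the same score positions, the deviation of each performance from the average curve highlights the sections played markedly differently. The toy data below is purely illustrative.

  import numpy as np

  def deviation_from_mean(curves):
      """curves: one row per performance, columns = aligned score positions."""
      curves = np.asarray(curves, dtype=float)
      return np.abs(curves - curves.mean(axis=0))

  # Illustrative toy data: three performances, ten aligned positions (BPM).
  tempi = [
      [100, 102, 101,  99, 100,  98, 100, 101, 100,  99],
      [101, 100, 102, 100,  99, 100, 101, 100,  98, 100],
      [100, 101, 100, 100, 130, 128, 100, 100, 101, 100],  # much faster middle section
  ]
  print(deviation_from_mean(tempi).argmax(axis=1))  # most divergent position per performance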

Origins

This project originates from an idea that is part of the Music Connection Machine brainstorm taking place at Peachnote. A minimal mockup version had already been coded. During the HAMR@ISMIR hackday, this mockup was extended into a fully functional prototype that includes more audio features (notably loudness), better search functionality, and a more flexible layout.

Used Technologies and Data

  • Most of the code has been written in JavaScript.
  • Calculation of loudness was done in Python using Essentia (a minimal sketch follows this list).
  • The web interface was wrapped in Bootstrap.
  • IMSLP scores are displayed using the Peachnote API.
  • Videos are gathered from YouTube.
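
As an illustration of the loudness step, the following is a minimal sketch assuming Essentia's Python bindings are installed; the file name and frame sizes are illustrative and not taken from the project code.

  import essentia.standard as es

  # Load a (downloaded) performance recording as a mono signal.
  audio = es.MonoLoader(filename='performance.wav', sampleRate=44100)()

  # Compute a per-frame loudness curve with Essentia's Loudness algorithm.
  loudness = es.Loudness()
  curve = [loudness(frame)
           for frame in es.FrameGenerator(audio, frameSize=2048, hopSize=1024)]

  # 'curve' can then be aligned to the score and plotted next to other performances.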

Relation to MIR Research and Potential Impact

The study of performance practice, and especially of differences between performers, is a research topic that has already garnered some interest in the past 1) 2) 3). Note that many of these papers rely on (weighted combinations of) tempo and loudness to differentiate between performers. DeepPerformanceDiff visualizes these features in a single graph.

This intuitive visualization not only creates possibilities for data exploration, but is also of practical interest. We are thinking specifically of performance practice education, where more and more people upload recordings of themselves playing to YouTube. Anyone can now compare themselves to others, to professional performers, or to earlier recordings of themselves, which also makes it possible to track one's own progress.

Try it out!

The code is available for download on GitHub. Fork or clone the repository and point a webserver to the project directory. Additionally, you need a few precomputed files; contact Peachnote to obtain them (a minimal loading sketch follows the list below):

  • Alignment definitions (JSON) between IMSLP scores and YouTube videos
  • Alignment quality definitions (JSON) between IMSLP scores and YouTube videos
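
As a small convenience, here is a hedged sketch for checking that the precomputed files are in place before serving the site. The file names are hypothetical; use whatever names the files you receive from Peachnote carry.

  import json
  from pathlib import Path

  # Hypothetical file names, for illustration only.
  required = ['alignments.json', 'alignment_quality.json']

  for name in required:
      path = Path(name)
      if not path.exists():
          raise SystemExit('Missing precomputed file: ' + name + ' (request it from Peachnote)')
      with path.open() as f:
          data = json.load(f)
      print(name, '->', len(data), 'top-level entries')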

The Bootstrap-wrapped version is in a separate branch in the repository called 'bootstrap'.

Future Work

For the study of performance, an obvious next step is clustering the tempo/loudness profiles. On such a large body of data, this could give us information on stylistic relationships between performers, perhaps allowing us to detect schools of performance, teacher-student relationships, or regional differences in interpretation.
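
A minimal sketch of what such a clustering could look like, assuming per-performance tempo and loudness curves are available; this is not part of the current code base, and scikit-learn and SciPy are assumed as dependencies purely for illustration.

  import numpy as np
  from scipy.signal import resample
  from sklearn.cluster import AgglomerativeClustering

  def to_profile(tempo_curve, loudness_curve, length=200):
      """Resample both curves to a fixed length, z-normalise, and stack them."""
      t = resample(np.asarray(tempo_curve, dtype=float), length)
      l = resample(np.asarray(loudness_curve, dtype=float), length)
      t = (t - t.mean()) / (t.std() + 1e-9)   # compare shape, not absolute tempo
      l = (l - l.mean()) / (l.std() + 1e-9)   # compare shape, not absolute loudness
      return np.concatenate([t, l])

  # Toy data standing in for real performances: list of (tempo, loudness) curve pairs.
  rng = np.random.default_rng(0)
  performances = [(rng.normal(120, 10, 500), rng.normal(0.5, 0.1, 500)) for _ in range(8)]

  profiles = np.vstack([to_profile(t, l) for t, l in performances])
  labels = AgglomerativeClustering(n_clusters=3).fit_predict(profiles)
  print(labels)  # cluster assignment per performance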

Music education would also be well served by a tool that visualizes the progress of students. Professional performances already appear on YouTube in large numbers, and given that we live in the age of Instagram and Vine, we foresee no lack of uploads of amateur performances in the near future either. Comparisons between performances by people at different levels of ability may provide valuable pedagogical insights for tutors.
