====== DeepPerformanceDiff ======

| Authors | Amélie Anglade, Joachim Ganseman, Vladimir Viro |
| Affiliation | Various |
| Code | [[https://github.com/peachnote/sync-visualization|Github Link]] |

*DeepPerformanceDiff* allows you to compare renditions of the same piece (whose score is provided by PeachNote through IMSLP) by different performers (recordings provided by YouTube). Differences in tempo and loudness are visualised in a graph that makes it easy to spot passages that are played markedly differently from others.

===== Origins =====

This project originates from an idea that is part of the [[http://music-connection-machine.peachnote.com/|Music Connection Machine]] brainstorm taking place at [[http://www.peachnote.com|PeachNote]]. A minimal mockup version had already been coded. During the HAMR@ISMIR hack day, this mockup was extended to a fully functional prototype that includes more audio features (notably loudness), better search functionality, and a more flexible layout.

===== Used Technologies and Data =====

  - Most of the code is written in JavaScript.
  - Calculation of loudness features was done in Python.
  - The web interface was wrapped in [[http://getbootstrap.com/|Bootstrap]].

===== Research challenges and potential impact =====

The study of performance practice, and especially of differences between performers, is a research topic that has already garnered some interest in the past.((See e.g. [[http://www.sciencedirect.com/science/article/pii/S0004370205000196]], [[http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3842509/]], [[http://www.mirlab.org/conference_papers/International_Conference/ISMIR%202008/papers/ISMIR2008_240.pdf]])) Note that many of these papers focus on (weighted combinations of) tempo and loudness differentiation to distinguish performers - exactly the features that DeepPerformanceDiff visualizes in a single graph. With this project, we are able to make an intuitive visualization of the data.
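The loudness side of such an analysis can be illustrated with a minimal sketch: frame-wise RMS loudness in dB over a mono signal. This is an assumption about the general technique, not the project's actual Python code; the function name and frame parameters are illustrative.

```python
import math

def frame_loudness_db(samples, frame_size=2048, hop=1024):
    """Frame-wise RMS loudness in dBFS for a mono signal in [-1, 1]."""
    eps = 1e-10  # avoid log of zero on silent frames
    curve = []
    for start in range(0, len(samples) - frame_size + 1, hop):
        frame = samples[start:start + frame_size]
        rms = math.sqrt(sum(s * s for s in frame) / frame_size)
        curve.append(20.0 * math.log10(rms + eps))
    return curve

# Two hypothetical renditions of the same passage: identical shape,
# one played at 0.1 amplitude and one at 0.8 amplitude.
sr = 44100
quiet = [0.1 * math.sin(2 * math.pi * 440 * n / sr) for n in range(4096)]
loud = [0.8 * math.sin(2 * math.pi * 440 * n / sr) for n in range(4096)]
diff = max(frame_loudness_db(loud)) - max(frame_loudness_db(quiet))
print(round(diff, 2))  # ≈ 18.06 dB, i.e. 20*log10(0.8/0.1)
```

Plotting two such curves against aligned score positions gives the loudness half of the performance-diff graph.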
This not only creates possibilities for data exploration, but is also of practical interest, particularly for performance practice education: more and more people upload recordings of themselves playing to YouTube. They can now compare themselves to others, to professional recordings, or to earlier versions of themselves, which enables them to track their own progress.

===== Try it yourself! =====
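The tempo side of the comparison can likewise be sketched from beat times: local tempo between consecutive beats, and the bar-by-bar ratio between two performances. The beat times below are made up for illustration; the function names are not from the project's code, and this assumes both performances' beats have already been aligned to the same score positions.

```python
def local_tempo(beat_times):
    """Local tempo in BPM between consecutive beat times (seconds)."""
    return [60.0 / (b - a) for a, b in zip(beat_times, beat_times[1:])]

def tempo_ratio(perf_a, perf_b):
    """Beat-by-beat tempo of performance A relative to performance B.

    Assumes both beat lists mark the same score positions.
    """
    return [ta / tb for ta, tb in zip(local_tempo(perf_a), local_tempo(perf_b))]

# Hypothetical beat times: performer A takes the third beat twice as slowly.
a = [0.0, 0.5, 1.0, 2.0, 2.5]
b = [0.0, 0.5, 1.0, 1.5, 2.0]
print(tempo_ratio(a, b))  # → [1.0, 1.0, 0.5, 1.0]
```

A ratio below 1.0 flags a passage where performer A slows down relative to B, which is exactly the kind of deviation the graph makes easy to spot.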

deepperformancediff.1445784858.txt.gz · Last modified: 2015/10/25 10:54 by jganseman