====== Brain Beats: Tempo Tracking in EEG Data ======

| Authors | Sebastian Stober, Thomas Prätzlich, Meinard Müller |
| Affiliation | University of Western Ontario; International Audio Laboratories Erlangen |
| Code | {{::brainbeats.zip| brainbeats.zip}} |
| Dependencies | [[https://github.com/sstober/openmiir| OpenMIIR Dataset (Github)]] |
| | [[https://github.com/sstober/deepthought| deepthought project (Github)]] for data export |
| | [[http://resources.mpi-inf.mpg.de/MIR/tempogramtoolbox/| Tempogram Toolbox]] |

===== Opening Question =====

Can we track the beat or the tempo of a music piece in brain waves recorded during listening? This is the question we tackled on the hack day at ISMIR 2015.

===== Data =====

The OpenMIIR dataset((Sebastian Stober, Avital Sternin, Adrian M. Owen and Jessica A. Grahn: "Towards Music Imagery Information Retrieval: Introducing the OpenMIIR Dataset of EEG Recordings from Music Perception and Imagination." In: Proceedings of the 16th International Society for Music Information Retrieval Conference (ISMIR'15), 2015. [[http://ismir2015.uma.es/articles/224_Paper.pdf|Paper]] [[http://bib.sebastianstober.de/ismir2015poster.pdf|Poster]])) comprises EEG recordings((Electroencephalography (EEG) is a popular non-invasive neuroimaging technique that uses electrodes placed on the scalp to measure the electrical activity of the brain.)) of people listening to 12 short music pieces. The following figure shows the waveform of the acoustic stimulus (the music recording) and the corresponding EEG signal.

{{:brainbeats_wiki_figure_1.png?400|}}

===== Problem Specification =====

We wanted to know whether the tempo can be tracked in the EEG signal in the same way as it is done for audio data.

{{:brainbeats_wiki_figure_2.png?400|}}

===== Tempogram for Music Recording =====

Using the Tempogram Toolbox((Peter Grosche and Meinard Müller: "Tempogram Toolbox: MATLAB tempo and pulse analysis of music recordings." In: Late-Breaking and Demo Session of the 12th International Society for Music Information Retrieval Conference (ISMIR), 2011. [[https://www.audiolabs-erlangen.de/content/05-fau/professor/00-mueller/03-publications/2011_GroscheMueller_TempogramToolbox_ISMIR-LateBreaking.pdf|Paper]] [[http://resources.mpi-inf.mpg.de/MIR/tempogramtoolbox/|Website]])), we converted the music recording into a time-tempo representation (tempogram). Aggregating the time-tempo information over time, we derived a tempo histogram from the tempogram. As shown in the following figure, the histogram reveals the correct tempo.

{{:brainbeats_wiki_figure_3.png?400|}}

===== Pre-Processing of EEG Data =====

Our idea was to apply similar techniques to the EEG data. However, because the EEG signal is very noisy, some pre-processing was required. First, we aggregated several EEG channels into a single signal. Then, we applied a suitable high-pass filter and normalized the signal by subtracting a moving-average curve.

{{:brainbeats_wiki_figure_4.png?300|}}

===== Tempogram for EEG Data =====

The resulting signal serves as a kind of novelty curve, to which we applied the same tempo estimation techniques as in the music case. The resulting EEG tempogram and EEG tempo histogram are shown in the following figure.

{{:brainbeats_wiki_figure_5.png?400|}}

===== Resources =====

The MATLAB code and the data used for generating the above figures can be found {{::brainbeats.zip| here}}. The algorithmic details are described in the original article by Grosche and Müller((Peter Grosche and Meinard Müller: "Extracting Predominant Local Pulse Information from Music Recordings." IEEE Transactions on Audio, Speech, and Language Processing, 19(6): 1688–1701, 2011.)) and in the textbook Fundamentals of Music Processing((Meinard Müller: "Fundamentals of Music Processing: Audio, Analysis, Algorithms, Applications." Springer, 2015. [[https://www.music-processing.de|Link]])).
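To illustrate the EEG pre-processing steps described above (channel aggregation, high-pass filtering, and subtraction of a moving-average curve): the actual code in the download is MATLAB, but the pipeline can be sketched in Python as follows. All function names, filter orders, and window lengths here are our own illustrative choices, not those of the project code.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_eeg(eeg, fs, highpass_hz=0.5, smooth_sec=1.0):
    """Turn multi-channel EEG into a single cleaned signal.

    eeg: array of shape (n_channels, n_samples); fs: sampling rate in Hz.
    """
    # 1) Aggregate several EEG channels into one signal (simple mean here;
    #    a weighted combination of selected channels is equally plausible).
    x = eeg.mean(axis=0)

    # 2) High-pass filter to remove slow drifts (zero-phase Butterworth).
    b, a = butter(2, highpass_hz / (fs / 2.0), btype="high")
    x = filtfilt(b, a, x)

    # 3) Normalize by subtracting a moving-average curve.
    win = max(1, int(smooth_sec * fs))
    local_avg = np.convolve(x, np.ones(win) / win, mode="same")
    return x - local_avg
```

The zero-phase filtering (''filtfilt'') matters here: a causal filter would shift the signal in time and thereby blur any beat-locked structure relative to the stimulus.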
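The tempogram-and-histogram step can also be sketched for readers without MATLAB. The following Python code is an illustrative re-implementation of the general idea of a Fourier tempogram (correlating windowed segments of a novelty curve with complex sinusoids at candidate tempi), not the Tempogram Toolbox itself; all names and parameter values are our own assumptions.

```python
import numpy as np

def fourier_tempogram(novelty, fs, tempi=None, win_sec=8.0, hop_sec=0.5):
    """Sliding-window Fourier tempogram of a novelty curve.

    Correlates each window of the novelty curve with complex sinusoids
    whose frequencies correspond to candidate tempi (in BPM).
    """
    tempi = np.arange(30, 241) if tempi is None else np.asarray(tempi)
    win = int(win_sec * fs)
    hop = int(hop_sec * fs)
    window = np.hanning(win)
    t = np.arange(win) / fs
    # One windowed complex exponential per candidate tempo (BPM -> Hz: /60).
    kernels = np.exp(-2j * np.pi * (tempi[:, None] / 60.0) * t[None, :]) * window
    frames = [novelty[s:s + win] for s in range(0, len(novelty) - win + 1, hop)]
    tempogram = np.abs(np.stack([kernels @ f for f in frames], axis=1))
    return tempogram, tempi

def tempo_histogram(tempogram, tempi):
    """Aggregate a tempogram over time; return histogram and dominant tempo."""
    hist = tempogram.sum(axis=1)
    return hist, int(tempi[np.argmax(hist)])
```

For example, feeding in a 2 Hz periodic novelty curve yields a histogram whose maximum sits at 120 BPM; in the experiment above, the same two-step procedure is applied once to the audio novelty curve and once to the pre-processed EEG signal.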