====== Brainbeats ======
| Authors | Sebastian Stober, Thomas Prätzlich, Meinard Müller |
| Affiliation | University of Western Ontario; International Audio Laboratories Erlangen |
| Code | {{::brainbeats.zip|brainbeats.zip}} |
| Dependencies | [[https://github.com/sstober/openmiir|OpenMIIR Dataset (Github)]] |
| | [[https://github.com/sstober/deepthought|deepthought project (Github)]] for data export |
  
  
===== Opening Question =====
  
Can we track the beat or the tempo of a music piece in brain waves recorded during listening? This is the question we tackled on the hack day at ISMIR 2015.
===== Data =====
  
The OpenMIIR dataset((Sebastian Stober, Avital Sternin, Adrian M. Owen and Jessica A. Grahn: "Towards Music Imagery Information Retrieval: Introducing the OpenMIIR Dataset of EEG Recordings from Music Perception and Imagination." In: Proceedings of the 16th International Society for Music Information Retrieval Conference (ISMIR'15), 2015. [[http://ismir2015.uma.es/articles/224_Paper.pdf|Paper]] [[http://bib.sebastianstober.de/ismir2015poster.pdf|Poster]])) comprises EEG recordings((Electroencephalography (EEG) is a popular non-invasive neuroimaging technique that relies on electrodes placed on the scalp to measure the electrical activity of the brain.)) of people listening to 12 short music pieces.
The following figure shows the waveform of the acoustic stimulus (a music recording) together with the corresponding EEG signal.
  
{{:brainbeats_wiki_figure_1.png?400|}}
{{:brainbeats_wiki_figure_2.png?400|}}
  
===== Tempogram for Music Recording =====

Using the Tempogram Toolbox((Peter Grosche and Meinard Müller: "Tempogram Toolbox: MATLAB tempo and pulse analysis of music recordings." In: Late-Breaking and Demo Session of the 12th International Conference on Music Information Retrieval (ISMIR), 2011. [[https://www.audiolabs-erlangen.de/content/05-fau/professor/00-mueller/03-publications/2011_GroscheMueller_TempogramToolbox_ISMIR-LateBreaking.pdf|Paper]] [[http://resources.mpi-inf.mpg.de/MIR/tempogramtoolbox/|Website]])), we converted the music recording into a time-tempo representation (tempogram). Aggregating the time-tempo information over time, we derived a tempo histogram from the tempogram. As shown in the following figure, this reveals the correct tempo.
  
{{:brainbeats_wiki_figure_3.png?400|}}
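
To make the processing chain concrete, here is a minimal MATLAB sketch of the idea: compute a spectral-flux novelty curve from the recording, evaluate a local Fourier transform of the novelty curve at tempo frequencies (the tempogram), and aggregate its magnitudes over time into a tempo histogram. This is a simplified stand-in and not the Tempogram Toolbox code itself; the file name ''stimulus.wav'', the window sizes, and the tempo range are assumptions chosen for illustration.

<code matlab>
% Simplified sketch of the tempogram pipeline (not the Tempogram Toolbox code).
[x, fs] = audioread('stimulus.wav');           % assumed file name
x = mean(x, 2);                                % downmix to mono

% --- Spectral-flux novelty curve ---
win = 1024; hop = 512;                         % ~23 ms frames, ~12 ms hop at 44.1 kHz
w = 0.5 - 0.5 * cos(2 * pi * (0:win-1)' / (win - 1));   % Hann window (column)
nFrames = floor((length(x) - win) / hop) + 1;
S = zeros(win/2 + 1, nFrames);
for n = 1:nFrames
    frame = x((n-1)*hop + (1:win)) .* w;
    X = fft(frame);
    S(:, n) = abs(X(1:win/2 + 1));
end
S = log(1 + 100 * S);                          % logarithmic compression
novelty = sum(max(diff(S, 1, 2), 0), 1);       % positive spectral differences
novelty = novelty - mean(novelty);             % crude offset removal
featureRate = fs / hop;                        % novelty samples per second

% --- Fourier tempogram and tempo histogram ---
BPM = 30:240;                                  % tempo axis in beats per minute
tempogram = fourierTempogram(novelty, featureRate, BPM, ...
                             round(8 * featureRate), round(featureRate));
tempoHist = sum(abs(tempogram), 2);            % aggregate magnitudes over time
[~, idx] = max(tempoHist);
fprintf('Dominant tempo estimate: %d BPM\n', BPM(idx));

% (Save fourierTempogram as its own file if your MATLAB version does not
% support local functions in scripts.)
function TG = fourierTempogram(novelty, featureRate, BPM, winLen, hopLen)
% Local DFT of the novelty curve evaluated at the tempo frequencies BPM/60 Hz.
novelty = novelty(:).';                                       % force row orientation
nWin = floor((length(novelty) - winLen) / hopLen) + 1;
TG = zeros(length(BPM), nWin);
w = 0.5 - 0.5 * cos(2 * pi * (0:winLen-1) / (winLen - 1));    % Hann window (row)
t = (0:winLen-1) / featureRate;                               % window time axis in seconds
for k = 1:nWin
    seg = novelty((k-1)*hopLen + (1:winLen)) .* w;
    for b = 1:length(BPM)
        TG(b, k) = sum(seg .* exp(-2i * pi * (BPM(b) / 60) * t));
    end
end
end
</code>

In practice, the Tempogram Toolbox provides ready-made and more refined routines for exactly these steps; the sketch above only spells out the underlying idea.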

===== Pre-Processing of EEG data =====
  
Our idea is to apply similar techniques to the EEG data. However, the raw EEG signal first requires some pre-processing: we detrended
the signal by subtracting a kind of moving average curve.
  
{{:brainbeats_wiki_figure_4.png?300|}}
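
The following minimal sketch illustrates this detrending step by subtracting a moving-average curve from a single EEG channel. The variable names ''eeg'' and ''fs_eeg'', the channel choice, and the window length are assumptions for illustration and do not refer to the actual code in the download below.

<code matlab>
% Sketch: detrend one EEG channel by subtracting a moving-average ("trend") curve.
% Assumed inputs: eeg (samples x channels matrix) and fs_eeg (sampling rate in Hz).
x = eeg(:, 1);                       % pick a single channel (assumption)
winSec = 0.5;                        % averaging window length in seconds (assumption)
L = round(winSec * fs_eeg);
kernel = ones(L, 1) / L;             % boxcar kernel for the moving average
trend = conv(x, kernel, 'same');     % local average ("trend") curve
eegDetrended = x - trend;            % detrended signal, used as novelty curve below
</code>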
  
===== Tempogram for EEG data =====

The resulting signal is then used as a kind of novelty curve. We then applied the same tempo estimation techniques as in the music case. The resulting EEG tempogram and EEG tempo histogram are shown in the following figure.
  
{{:brainbeats_wiki_figure_5.png?400|}}
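
Sketched below under the same assumptions as above, the detrended EEG channel takes over the role of the novelty curve and is passed to the same (hypothetical) ''fourierTempogram'' helper from the music example; aggregating the tempogram magnitudes over time again yields a tempo histogram.

<code matlab>
% Sketch: tempo analysis of the detrended EEG channel from the previous step.
% Reuses the fourierTempogram helper defined in the music example above.
BPM = 30:240;                          % tempo axis in beats per minute
winLen = round(8 * fs_eeg);            % 8 s analysis windows (assumption)
hopLen = round(fs_eeg);                % 1 s hop (assumption)
TG_eeg = fourierTempogram(eegDetrended, fs_eeg, BPM, winLen, hopLen);

% EEG tempo histogram: aggregate tempogram magnitudes over time
tempoHistEEG = sum(abs(TG_eeg), 2);
[~, idx] = max(tempoHistEEG);
fprintf('Dominant EEG tempo estimate: %d BPM\n', BPM(idx));
</code>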
  
===== Resources =====

The MATLAB code and the data used for generating the above figures can be found {{::brainbeats.zip|here}}.
  
The algorithmic details are described in the original article by Grosche and Müller((Peter Grosche and Meinard Müller: "Extracting Predominant Local Pulse Information from Music Recordings." IEEE Transactions on Audio, Speech, and Language Processing, 19(6): 1688-1701, 2011.)) and in the textbook Fundamentals of Music Processing((Meinard Müller: "Fundamentals of Music Processing: Audio, Analysis, Algorithms, Applications." Springer, 2015. [[https://www.music-processing.de|Link]])).