====== Interactive Interface For Auditory Perception Experiments ======

| Author      | James Traer |
| Affiliation | McDermott Lab, Brain and Cognitive Sciences Department, MIT |
| Code        | will be uploaded to www.jamestraer.com when I get the website up and running |

Abstract:

===== - Introduction =====

Although audio processing technology has advanced tremendously in recent years, state-of-the-art machine hearing algorithms still lag far behind the human auditory system when required to extract acoustic information from noisy real-world environments. Exactly which processing techniques the human auditory system uses to perform these tasks is largely unknown, and this is an active area of research. A better understanding of how audio signals are processed in the brain may provide inspiration for improved machine listening algorithms as well as clinical benefits for the hearing impaired.

===== - Yesterday's research methodology =====

One way to study the audio processing of the human brain is to measure the ability of human listeners to perform auditory classification tasks in difficult conditions. Such tasks include word comprehension, spatial localization, melody extraction, texture identification, volume comparisons, etc. In each case the listener must ignore distracting information to extract one particular feature of the acoustic signal. By precisely controlling the acoustic signal heard by the listener and measuring the listener's performance as a function of both the signal and the distracting noise, the performance capabilities of the human auditory system can be inferred.

A comparison task is one commonly used methodology. In this scenario the listener is presented with a number of audio recordings and must make a subjective judgement, e.g. which sound is louder, which has a higher pitch, which is closer (a minimal sketch of such a trial is given at the end of this page). Real-world sources add an extra level of complexity because of their variability, so experiments must be repeated many times to average out variation from other factors. A further difficulty is the large number of parameters that characterize real-world sources: both the informational source and the distractors can be varied in a vast number of ways, and measuring listener performance across a meaningful range of parameter space can be prohibitively time consuming.

===== - Tomorrow's research methodology =====

The hack I implemented embeds scripts that generate synthetic sounds with precisely controlled parameters into a Pure Data patch with an intuitive interface. Volunteer listeners can thus, with no prior training, manipulate the statistical properties of sounds via a MIDI controller. In the proposed experiment the listener hears one sound, the "target", whose parameters are held constant, and varies the statistical parameters of a second, variable sound, with the stated task of matching the two sounds as closely as possible (a sketch of this matching loop is also given below). The subject is thus given the freedom to explore a large region of the parameter space describing all possible sounds. The sensitivity of the human auditory system to each statistical parameter can be assessed by how closely the subject matches the parameters of the variable source to the target.
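
===== - Appendix: code sketches =====

The following is a minimal sketch of the kind of comparison trial described in the //Yesterday's research methodology// section above: two noise bursts differing only in level are played in random order and the listener reports which was louder. It is illustrative only; the stimuli, the base level, the 3 dB difference, the keyboard response, and the numpy/sounddevice dependencies are my own choices, not part of the hack.

<code python>
import numpy as np
import sounddevice as sd

FS = 44100          # sample rate (Hz)
DUR = 0.5           # duration of each noise burst (s)

def noise_burst(level_db):
    """Gaussian noise burst at a given level (dB relative to full scale)."""
    x = np.random.randn(int(DUR * FS))
    x = x / np.max(np.abs(x))
    return x * 10.0 ** (level_db / 20.0)

def trial(level_difference_db, base_level_db=-20.0):
    """Play two bursts in random order; return True if the listener
    correctly reports which interval was louder."""
    louder_first = np.random.rand() < 0.5
    if louder_first:
        levels = (base_level_db + level_difference_db, base_level_db)
    else:
        levels = (base_level_db, base_level_db + level_difference_db)
    for lev in levels:
        sd.play(noise_burst(lev), FS)
        sd.wait()
    answer = input("Which interval was louder (1/2)? ").strip()
    return (answer == "1") == louder_first

# Proportion correct at a single level difference, averaged over trials.
n_trials = 20
n_correct = sum(trial(3.0) for _ in range(n_trials))
print("%d%% correct at a 3 dB level difference" % round(100.0 * n_correct / n_trials))
</code>

Measuring this proportion at several level differences (and in several noise conditions) is what makes the conventional approach so time consuming: each point in parameter space needs many repeated trials.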
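The hack itself is a Pure Data patch driven by a MIDI controller; the sketch below only illustrates the underlying matching experiment in Python. The choice of a single statistical parameter (the spectral tilt of a noise texture), the controller number, the parameter range, and the numpy/sounddevice/mido dependencies are assumptions made for illustration, not the author's implementation.

<code python>
import numpy as np
import sounddevice as sd
import mido

FS = 44100          # sample rate (Hz)
DUR = 1.0           # duration of each sound (s)
KNOB_CC = 1         # MIDI control-change number of the knob (assumption)

def tilted_noise(tilt_db_per_oct, dur=DUR, fs=FS):
    """White noise reshaped in the frequency domain to have a given
    spectral tilt (dB per octave, referenced to 1 kHz)."""
    n = int(dur * fs)
    spec = np.fft.rfft(np.random.randn(n))
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    octaves = np.log2(np.maximum(freqs, 1.0) / 1000.0)
    spec = spec * 10.0 ** (tilt_db_per_oct * octaves / 20.0)
    x = np.fft.irfft(spec, n)
    return 0.1 * x / np.max(np.abs(x))      # keep the playback level modest

target_tilt = np.random.uniform(-6.0, 6.0)  # hidden target parameter
variable_tilt = 0.0                         # parameter under the knob

print("Turn the knob until the two sounds match; press Ctrl-C to finish.")
try:
    with mido.open_input() as port:         # default MIDI input port
        while True:
            # Alternate the fixed target and the adjustable variable sound.
            sd.play(tilted_noise(target_tilt), FS); sd.wait()
            sd.play(tilted_noise(variable_tilt), FS); sd.wait()
            # Map any knob movement onto the -6..+6 dB/octave range.
            for msg in port.iter_pending():
                if msg.type == 'control_change' and msg.control == KNOB_CC:
                    variable_tilt = -6.0 + 12.0 * msg.value / 127.0
except KeyboardInterrupt:
    print("\nmatching error: %.2f dB/octave" % abs(variable_tilt - target_tilt))
</code>

The interface described above presumably exposes several such statistical parameters at once, one knob per parameter; the matching error recorded at the end of a run is the quantity from which the listener's sensitivity to each parameter can be assessed.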
