Egocentric hearing: Study clarifies how we can tell where a sound is coming from

A new UCL and University of Nottingham study has found that most neurons in the brain’s auditory cortex detect where a sound is coming from relative to the head, but some are tuned to a sound source’s actual position in the world.
The study, published in PLOS Biology, looked at whether head movements change the responses of neurons that track sound location.
“Our brains can represent sound location in either an egocentric manner – for example, when I can tell that a phone is ringing to my left – or in an allocentric manner – hearing that the phone is on the table. If I move my head, neurons with an egocentric focus will respond differently, as the phone’s position relative to my ears has changed, while the allocentric neurons will maintain their response,” said the study’s first author, Dr Stephen Town (UCL Ear Institute).
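The distinction Dr Town describes is a simple geometric one: a sound's egocentric angle is its world (allocentric) angle minus the head's orientation. The minimal sketch below, which is illustrative and not from the study, shows why turning the head changes the input an egocentric neuron sees while leaving an allocentric one unchanged (the function name and angles are assumptions for the example):

```python
import math

# Minimal sketch (not the study's code): relating egocentric and
# allocentric sound angles. Angles are in degrees in the horizontal
# plane; head_yaw is the head's orientation in the world.

def egocentric_angle(sound_world_angle, head_yaw):
    """Angle of the sound source relative to the head, wrapped to [-180, 180)."""
    return (sound_world_angle - head_yaw + 180) % 360 - 180

# A phone ringing at a fixed 90 degrees in the world (e.g. on a table):
phone_world = 90.0

for yaw in (0.0, 45.0, 90.0):
    ego = egocentric_angle(phone_world, yaw)
    # The egocentric angle changes as the head turns;
    # the allocentric (world) angle does not.
    print(f"head yaw {yaw:5.1f}  ->  egocentric {ego:6.1f}, allocentric {phone_world:5.1f}")
```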
The researchers monitored ferrets while they moved around a small arena surrounded by speakers that emitted clicking sounds. Electrodes monitored the firing rates of neurons in the ferrets’ auditory cortex, while LEDs were used to track the animals’ movement.
Of the spatially tuned neurons recorded, most were egocentric, tracking where a sound source was relative to the animal’s head; approximately 20% instead tracked the sound source’s actual location in the world, independent of the ferret’s head movements.
The researchers also found that neurons were more sensitive to sound location when the ferret’s head was moving quickly.
“Most previous research into how we determine where a sound is coming from used participants with fixed head positions, which failed to differentiate between egocentric and allocentric tuning. Here we found that both types coexist in the auditory cortex,” said the study’s senior author, Dr Jennifer Bizley (UCL Ear Institute).
The researchers say their findings could be helpful in the design of technologies involving augmented or virtual reality.
“We often hear sounds presented through earphones as being inside our heads, but our findings suggest sound sources could be created to appear externally, in the world, if designers incorporate information about body and head movements,” Dr Town said.
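One way to read this suggestion in engineering terms: to make a headphone sound appear anchored in the world, the renderer can recompute the source's egocentric angle every frame from head-tracking data. The sketch below is a hypothetical illustration of that idea, not an implementation from the study; the function name and the simple sine-based stereo pan are assumptions:

```python
import math

# Hypothetical sketch of world-locked headphone audio: re-render the
# source's egocentric angle each frame so it appears fixed in the world.

def world_locked_pan(source_world_angle, head_yaw):
    """Stereo pan (-1 = hard left, +1 = hard right) that keeps the
    source at a fixed world angle as the head turns."""
    rel = (source_world_angle - head_yaw + 180) % 360 - 180
    return math.sin(math.radians(rel))

# As the listener turns toward a virtual source at 30 degrees in the
# world, the rendered pan drifts back toward center, so the source
# seems to stay put rather than turning with the head.
for yaw in (0, 15, 30, 60):
    print(f"head yaw {yaw:3d}  ->  pan {world_locked_pan(30.0, yaw):+.2f}")
```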
Source & further reading: http://www.ucl.ac.uk/news/news-articles/0617/150617-egocentric-hearing
Journal article: http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.2001878