Headphones

Someone asks: I have a question about headphones –
“What do we really need the ‘L’ and ‘R’ indicators on them for? Is it not true that the vast majority of music will sound fine whichever ear you stuff the things in?”

If the audio is linked with vision then obviously it matters. I wired my projector the wrong way into the stereo and it was quite odd seeing people walking from left to right while hearing their steps pattering from right to left.

Swapping the L&R of a piece of music should not affect its objective integrity. It’s still exactly the same piece of music, after all, with nothing added or subtracted. The trouble is it then has to strike our ears and be processed by the two halves of the brain in order to be heard and enjoyed. If the listener is habituated to certain conventions such as the spatial arrangement of an orchestra, e.g. the trombones parping on the right, then to hear them suddenly on the left may produce a feeling of unease, much as one feels looking at a photo of oneself when used to seeing one’s ugly fizzog in the mirror. This is, I suspect, merely due to conditioning and could presumably be reduced with repetition.
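
For the curious, swapping the channels yourself is easy enough to try. Here’s a rough Python sketch, assuming a stereo WAV file and the soundfile and numpy libraries; the filenames are just placeholders:

```python
# Minimal sketch: swap the left and right channels of a stereo file.
# Assumes the `soundfile` library (with numpy) is installed; filenames are placeholders.
import soundfile as sf

data, samplerate = sf.read("original.wav")   # data shape: (frames, channels)

if data.ndim == 2 and data.shape[1] == 2:
    swapped = data[:, [1, 0]]                # reverse the channel order
    sf.write("swapped.wav", swapped, samplerate)
else:
    print("Not a two-channel file; nothing to swap.")
```

Nothing is added or taken away; the samples are identical, just routed to the opposite ears.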

More interesting, perhaps, are the subtle distinctions between the ways in which the two ears respond to sound and the ways in which sounds are processed by the two brain hemispheres.

“From birth, the ear is structured to distinguish between various types of sounds and to send them to the optimal side in the brain for processing,” explains Yvonne Sininger, Ph.D., visiting professor of head and neck surgery at the David Geffen School of Medicine at UCLA. “Yet no one has looked closely at the role played by the ear in processing auditory signals.” LOL at David Geffen school of medicine. School of rawk, surely?

“The auditory regions of the two halves of the brain sort out sound differently. The left side dominates in deciphering speech and other rapidly changing signals, while the right side leads in processing tones and music. Our findings demonstrate that auditory processing starts in the ear before it is ever seen in the brain […] even at birth, the ear is structured to distinguish between different types of sound and to send it to the right place in the brain.”

So could this mean, for example, that rhythms might be processed more effectively if piped in on one side and complex melodies or harmonies on t’other? And thus swapping over one’s favourite track might mean we process it suboptimally and don’t enjoy it as much as we should? Are sound engineers aware of this? I suspect the brain is so fast it makes little difference, but you never know. Either way, if it’s Oasis blaring out I’ll be using my third ear only, to mark my appreciation.
