“When I was your age …” is something people have been hearing from their parents since the beginning of time. It’s an especially popular phrase in conversations about music. Just about everyone thinks the singers, groups, and listening devices of their childhood were the best there have ever been.
For as long as music fills our speakers, debates will rage. The Beatles vs. The Rolling Stones. Turntables vs. MP3 players. The ’60s vs. the ’90s. Epic live concerts vs. streaming from anywhere, anytime. Early rock ’n’ roll vs. Nirvana, Pearl Jam, and their “grunge” colleagues.
In reality, it’s the mixture of major differences and foundational similarities between eras that makes it such a fun topic for debate. Diversity of music and technology still plays an important role in American culture. Think about TV singing competitions such as The Voice and American Idol, or even the variety of niche satellite channels out there. From electronica and hip hop to country and soul, young people keep pushing forward a wide variety of music genres that speak to their souls.
For those seeking a career in the music or sound industry, it’s particularly interesting. The more that aspiring engineers and producers can do to understand the subtleties within each genre and subgenre, as well as between audio technologies throughout the decades, the better they will be able to communicate with artists and end users for years to come.
Sound experts also must navigate the plethora of ways that people now listen to music. It is more complex than ever before—and it can be incredibly frustrating.
Today most people access their favorite music on their phones, via Bluetooth through car speakers, through Wi-Fi connectivity, and more. Streaming options include SoundCloud, YouTube, Spotify, Pandora—the list goes on and on.
In 2015, it’s fairly standard to drop $200 on a pair of headphones that are supposed to let you hear “all the music.” But that’s only one part of the equation. If consumers don’t learn about and focus on the medium where their music is stored and reproduced, those expensive headphones won’t live up to their potential.
Audio technology professionals hope users will make a conscious effort to think carefully about how they are consuming the media that artists work so hard to create. For instance, one of our students recently expressed frustration with modern-day music trends such as highly compressed, lossy files that take up less space—a key factor on mobile devices—but that discard valuable data, quality, and dynamic range from the original composition.
The general public can enhance their listening experience simply by researching the bit rate and file format of each music streaming service they might use. With this information in hand, users can adjust their player preferences accordingly and hear the music closer to the way it was engineered to be heard.
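The arithmetic behind bit rate is straightforward, and a quick calculation makes the quality gap concrete. The sketch below (in Python, with an illustrative file size and duration not tied to any particular service) estimates a file's average bit rate and compares it with uncompressed CD audio:

```python
def bitrate_kbps(file_size_bytes: int, duration_seconds: float) -> float:
    """Average bit rate in kilobits per second: (bytes * 8 bits) / seconds / 1000."""
    return (file_size_bytes * 8) / duration_seconds / 1000

# A hypothetical 4-minute track stored as a 9.6 MB file:
print(f"{bitrate_kbps(9_600_000, 240):.0f} kbps")  # 320 kbps, a common "high quality" lossy tier

# Uncompressed CD-quality PCM for comparison:
# 44,100 samples/sec * 16 bits/sample * 2 channels
print(f"{44_100 * 16 * 2 / 1000:.1f} kbps")  # 1411.2 kbps
```

Even a top-tier 320 kbps lossy stream carries less than a quarter of the data of a CD, which is why format and bit rate matter as much as the headphones.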
Then there’s Bluetooth technology, which involves a vast list of variables, all of which drastically affect the sound quality coming through those speakers and headphones.
One of our students summed it up very well:
“Simply asking the user to make a conscious effort to recognize how they are consuming their media is a huge step forward.”
Our audio technology graduate students at American University are passionate about music. They’re experts, but they’re also major music enthusiasts, just like everyone else. From the recording studio and the live stage to cell phones and home entertainment systems, they strive to maximize the benefits of their art for all open ears.
Interested in becoming a thought leader and innovator in sound engineering? Learn how a master’s degree in audio technology from American University can help you get there.