Why high bit-rate and HD audio are all about marketing, not music
High bit rates — measured in kilobits per second — and HD audio — 24-bit rather than 16-bit — are getting a lot of attention lately, whether it's streaming services offering higher-kbps tiers, upcoming devices promising higher-fidelity sound, or Apple being rumored to add those features to iOS 8 and the iPhone 6. The truth, as far as it's been explained to me and I'm able to understand it, is that higher bit-rate and higher bit-depth audio is more about marketing than it is about music. Yes, the quality of the mastering matters enormously, as does the quality of the transcoding, but for most audio, with most modern codecs, we're well past the point where the result becomes transparent to the listener. Why is that?
Here's the introduction to the best, most understandable explanation I've found. It's by Chris "Monty" Montgomery, the creator of the Ogg format and the Vorbis codec. And as a good friend of mine said, "he knows his shit." From xiph.org:
Read the rest of Montgomery's lengthy, detailed article to see why he would rather see resources being spent on things like better quality headphones (and I'll add speakers), overcoming the technical hurdles to real, efficient surround sound, lossless formats to eliminate the risk of bad encoding and generation loss, and high-quality masters.
Again, the remastering of the original audio being done ahead of the push to higher bit-rate and higher bit-depth audio will no doubt result in fantastic versions of the music we know and love. It's just that those new remasters would sound every bit as good to human ears at existing bit rates and bit depths.
When that's taken into consideration, the primary advantage of going to higher bit rates and 24-bit becomes clear — marketing escalation. If one music service can say it offers higher-kbps streams, even beyond the point where the difference is audible, it looks more impressive. If a device says it supports 24-bit rather than 16-bit audio, it sounds better on a spec sheet, even if all that extra data does is take up more storage space on the device.
We'll no doubt see many more products and rumors that hawk higher-quality audio as a selling point, and we may even see Apple bullet-point it on a keynote deck to stay competitive in the perception-is-reality space. But when the time comes to pick a streaming service or a device, don't fall victim to the bit race. Pick the one that offers the best-mastered versions of the music you love, in the way that sounds best to you.
Are you interested in higher bit rate or 24-bit audio? If so, what makes it compelling to you?
Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.
Doesn't actually say why. It's just called out as this guy's opinion - and everyone is supposed to agree with it because your friend claims "he knows his shit". How about some details as to *why* it doesn't improve the actual quality or why the lower bit rates actually perform better? I contend they all sound fine in mono through my iPhone speaker in a noisy car... :) But rate escalation is stupid for most users who will never have the quality of equipment to benefit from the sampling. But that's my opinion - I am not being quoted as an authority on the matter. Details count.
2. Most people have no idea what the hell OGG Vorbis is or why that classifies the guy as an expert.
3. Being a douche doesn't help make your point - you just come across as smug because you feel like you know so much more than everyone else. So I will summarize the reason for your readers: human hearing for 99.999% of all people only picks up sound in certain ranges. The "extra" sound provided by higher bit rates is at best lost on our ears or at worst actually interferes with the good sound we need to hear. Here is a far better quote which actually explains why:

"Professionals use 24 bit samples in recording and production for headroom, noise floor, and convenience reasons. 16 bits is enough to span the real hearing range with room to spare. It does not span the entire possible signal range of audio equipment. The primary reason to use 24 bits when recording is to prevent mistakes; rather than being careful to center 16 bit recording -- risking clipping if you guess too high and adding noise if you guess too low -- 24 bits allows an operator to set an approximate level and not worry too much about it. Missing the optimal gain setting by a few bits has no consequences, and effects that dynamically compress the recorded range have a deep floor to work with.

"An engineer also requires more than 16 bits during mixing and mastering. Modern work flows may involve literally thousands of effects and operations. The quantization noise and noise floor of a 16 bit sample may be undetectable during playback, but multiplying that noise by a few thousand times eventually becomes noticeable. 24 bits keeps the accumulated noise at a very low level. Once the music is ready to distribute, there's no reason to keep more than 16 bits."
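The headroom numbers in that quote come down to dynamic range: linear PCM buys roughly 6 dB of dynamic range per bit. A quick sketch of the arithmetic (my own illustration, not from the quoted article):

```python
import math

def dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of linear PCM: 20 * log10(2^bits),
    i.e. about 6.02 dB per bit."""
    return 20 * math.log10(2 ** bits)

print(f"16-bit: {dynamic_range_db(16):.1f} dB")  # about 96 dB
print(f"24-bit: {dynamic_range_db(24):.1f} dB")  # about 144 dB
```

That ~96 dB already spans the gap between a quiet room and painfully loud sound, which is why the quote says 16 bits covers the real hearing range with room to spare; the extra ~48 dB of a 24-bit file is useful as recording and mixing headroom, not as something audible on playback.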
This country is becoming crazy.... Are we supposed to ban people just because they disagree with us ????? SMH !!!!!
I would like your thoughts on the same situation in HD video. 720p seems great on the iPhone. I can't tell the difference at 1080p on my iPad mini. I feel it's the same argument you're making for audio.
"20/20 is the visual acuity needed to discriminate two contours separated by 1 arc minute" - wiki "Visual Acuity" about why "E" is commonly used.
1 arc minute is about 1.8 millimetres @ 6.1 metres (a little over 1/16 of an inch @ 20 feet).
[Note: Not to mention most people are able to read the chart lines below 20/20; "... the average visual acuity with a healthy visual system is typically better."]
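For reference, the linear size subtended by a small angle at a given distance is just distance × tan(angle); a quick sketch of that calculation (my own, not from the comment):

```python
import math

def subtense_mm(arc_minutes: float, distance_m: float) -> float:
    """Linear size subtended by a small angle at a given viewing distance."""
    angle_rad = math.radians(arc_minutes / 60)       # arcmin -> degrees -> radians
    return math.tan(angle_rad) * distance_m * 1000   # metres -> millimetres

print(f"{subtense_mm(1, 6.1):.2f} mm")  # about 1.77 mm (~0.07 inch) at 6.1 m
```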
Your eyes' spatial resolution is high enough to easily discern small features that would be completely lost when a smaller image is stretched onto a bigger frame. This is abundantly clear on, for example, a 55-inch TV with a 720p image and an identical 1080p image at 0 to 6 metres, though unless you have something exceeding a 100-inch projector, who's sitting 10 metres away?
Source: my eyes. I have two identical televisions, side by side in my living room, appropriately distanced from the couch at 1.8 m for gaming. Both are calibrated identically, and both are showing the same re-encoded 4K video, once at 720p and once at 1080p. Even at 6 m, the difference might not stop you watching 720p content (in a world of limited data and storage, it isn't a deal-breaker), but it would shut you up over stupid comments like "the only way you will be able to notice a difference [...] is by sitting inches away." The difference really is akin to Blu-ray vs. DVD.
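Plugging the commenter's setup into the 1-arc-minute acuity figure from above supports this. A rough sketch, assuming a 16:9 panel and ignoring subpixel structure (the 55-inch size is borrowed from the earlier example; the 1.8 m distance is the commenter's):

```python
import math

def pixel_arcmin(diagonal_in: float, horiz_px: int, distance_m: float) -> float:
    """Angle (in arc minutes) subtended by one pixel of a 16:9 panel."""
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)  # panel width in metres
    pixel_m = width_m / horiz_px                             # pixel pitch
    return math.degrees(math.atan(pixel_m / distance_m)) * 60

# 55-inch panel viewed from 1.8 m
print(f"720p:  {pixel_arcmin(55, 1280, 1.8):.2f} arcmin")  # ~1.8 arcmin
print(f"1080p: {pixel_arcmin(55, 1920, 1.8):.2f} arcmin")  # ~1.2 arcmin
```

Both pixel pitches subtend more than 1 arc minute at 1.8 m, so a 20/20 viewer can in principle resolve individual pixels on either screen, and the extra detail in the 1080p image is within reach, consistent with what the commenter reports seeing.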
Fortunately, these recordings from the '50s through the '80s were recorded well. Music today is not only "brickwalled," but the recordings often have clipping, which results in digital distortion. One example is Queensrÿche's recent self-titled album: great music, but it's unlistenable!
Fact: the higher the bit depth, the closer to the original analogue signal you're going to get. Check out this link:
http://tweakheadz.com/16-bit-vs-24-bit-audio/ 24-bit/44kHz is the way to go :) For me, I'd prefer if everything was mixed in 5.1 surround, as it sounds awesome :D
I always assumed the idea of 24 bit was so you could get a greater and more detailed dynamic range, turns out I was so wrong :(