How to determine if your streaming music's audio quality is worth paying for

How to find out what audio quality you're really getting when streaming music over cellular on iPhone and iPad

A lot of music streaming services boast about their bit rates to make their service more appealing than the next; Spotify, Beats Music, and others advertise up to 320 kbps over cellular. That eats a heck of a lot of data, but for audiophiles, good audio quality while streaming music is a must. Unfortunately, in my experience what you actually get can vary greatly from what's advertised. So if you're curious what your service is really streaming over cellular, here's how to find out right from your iPhone!

How to tell what bit rate your iPhone is streaming music at over cellular

Before actually testing the bit rate of your chosen music service, it's important to make sure your results aren't skewed by other apps pulling data. Kill all apps from multitasking, disable WiFi, and turn off cellular data for everything except your streaming app. To do that last part, go to Settings > Cellular and turn cellular data off for everything but the streaming app you want to test.

Doing this ensures that any cellular data you measure comes from your streaming app alone.

Finding a reference point

Once you've done the above, you need a reference song that's available across all the streaming services you'd like to test. I do this by checking my iTunes library for a song I already own and viewing its file info. If possible, look for a track that's at least 256 kbps, or 320 if you have any. All content purchased from iTunes is 256 kbps.

In my example, I'm going to use a track called "Use Somebody" by Kings of Leon that has a bit rate of 320 kbps. It is 3 minutes and 54 seconds long and weighs in at 9 MB. The important number here is the 9 MB. Feel free to use the same reference song as me if you'd like, or pick your own; you just need to know its bit rate and file size.
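
If you want to sanity-check a reference track of your own, the math is just bit rate times duration. Here's a minimal sketch in Python, assuming a constant bit rate and decimal megabytes (the function name is mine, not part of any app):

    def expected_size_mb(bitrate_kbps, seconds):
        """Rough file size for a constant-bit-rate track."""
        # kbps -> bits, then /8 for bytes, /1,000,000 for (decimal) MB
        return bitrate_kbps * 1000 * seconds / 8 / 1_000_000

    # "Use Somebody": 320 kbps, 3:54 = 234 seconds
    print(expected_size_mb(320, 234))  # ~9.4 MB, close to the 9 MB iTunes shows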

Now that we have our reference, we can start our testing.

  1. To start, launch the Settings app on your iPhone or iPad while on cellular data.
  2. Now tap on Cellular.
  3. Scroll all the way to the bottom and tap on Reset Statistics.
  4. On the next screen, just verify you'd like to reset statistics.
  5. Your cellular data usage for the current period should now read 0 bytes.
  6. Now launch your streaming music app and, over cellular, play the track you chose as your reference song. Play it all the way through from beginning to end, then stop it once it's over so it doesn't continue on to another track.
  7. Now go back into your Settings.
  8. Tap on Cellular and view how many MB show up under Current Period. This is how many MB it took to stream that one reference track.
  9. Compare it with your reference point and you'll have a very good idea of the actual bit rate of the streaming music service you tested (see the sketch below for the math).
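
To turn that Current Period reading back into a bit rate, run the math in reverse: megabytes to bits, then divided by the track length in seconds. A hedged sketch continuing the Python above (again, my own naming; real streams carry overhead beyond the audio, so the estimate runs a little high):

    def estimated_kbps(megabytes, seconds):
        """Approximate stream bit rate from measured cellular usage."""
        # MB -> bits, then /1000 for kilobits, /seconds for kbps.
        # Protocol headers, artwork, and any DRM wrapper all get
        # counted as "audio" here, so expect the result to skew high.
        return megabytes * 1_000_000 * 8 / 1000 / seconds

    # Sanity check against the reference track: 9 MB over 234 seconds
    print(estimated_kbps(9, 234))  # ~308 kbps, consistent with 320 kbps audio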

As you can see in my example, 12.8 MB is about right for an MP3-format track around 4 minutes long if the stream truly is around 320 kbps; it's somewhat more than the roughly 9.6 MB of raw audio four minutes at 320 kbps works out to, with the difference plausibly being streaming overhead. Also, if your music streaming app has audio quality options, make sure to set them to the highest possible. You can also tweak them to see the difference between what the service considers low and high quality.
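
For instance, plugging my measured usage into the estimator sketched above (keeping in mind that overhead inflates the raw number):

    print(estimated_kbps(12.8, 234))  # ~438 kbps on the wire; the excess
    # over 320 kbps is presumably overhead rather than extra audio quality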

That's pretty much it! It may not be the most scientific method in the world, but it works and seems to give pretty accurate results. If you try it, let us know in the comments what service you tested, on what network, and what your results were!

Allyson Kazmucha

Editor for iMore, Potter pundit, and the ninja in your iOS


There are 27 comments.

Becjr says:

That's a neat little workaround for something that you'd think Apple would've included somewhere in the statistics.
It's interesting to see your How-tos get a little more nerdalicious with a more involved adventure.
Thanks Ally! :D

Allyson Kazmucha says:

Well, it's more to check streaming services Apple has nothing to do with. All Apple's stuff is 256 kbps.

Becjr says:

Ah. I see your point. :D

reptarwilleatu says:

An easier way would just be to use DataMan Next.

Great lil app

Allyson Kazmucha says:

Sure, but it does the same thing the built-in cellular stats do. I used the built-in option for simplicity.

mssaucedo0301 says:

I am really glad you posted this. I was curious to compare them as well.

Ally, in your opinion, what service would be best in areas of slow data coverage (using lower quality to load faster)?

Allyson Kazmucha says:

Most have an option or auto-adjust based on bandwidth.

heddhunter says:

Allyson, I have been enjoying your articles on streaming music services, but this statement is confusing, bordering on gibberish: "Since iTunes encodes in AAC, which is more compressed than MP3, that would be the base line for 320 kbps. Anything lower would be less." First off, AAC is not "more compressed" than MP3. It's *differently* compressed, and most people think that it compresses better per bit than a comparable mp3 encode. You can have a 320k AAC, but it's pointless since AAC achieves transparency at a lower bitrate than mp3. Most people don't create AAC themselves, but rather purchase AAC files from iTunes. Those are all 256Kbps. If the "Bit Rate" in the iTunes summary pane says 320Kbps then that's probably the bit rate of the file. It is possible for the summary pane to be wrong, if the file was encoded with a weird tool that wrote the wrong info into the header - this won't affect playback though.

Really the important thing is to just measure the cellular usage, like you describe. Comparing to some kind of baseline is not really important.

Unfortunately even that isn't foolproof. It would be quite possible for Spotify to use mp3 (for example) and Rdio to use AAC, or even OGG. Like I said, different encoders give different results at the same bitrate. So just knowing that Spotify sends you a file at 192Kbps doesn't tell you anything unless you also know the encoder that was used. About the only thing you could really measure with this technique is the relative bitrates of "wifi" vs "cellular" streams on ONE service. Comparing it between services could very well be apples to oranges. Without knowing which encoders are used, the raw bitrate number is fairly meaningless.

What would be most useful would be to run an ABX test, but given that the files are locked up in some crazy DRM scheme, that would be pretty hard to achieve.

Allyson Kazmucha says:

The file I checked is one I imported directly from a CD; it is 320 kbps and was not wrong. I'm removing the sentence you referred to, though, as it's confusing. I know a lot about bit rate but not encoding, so the info I found could have been wrong. Thanks for that.

As far as a reference point, I disagree that it isn't important. You have no idea what that number translates to if you don't have a baseline. OK, Beats Music ate 12.8 MB, but what does that mean in terms of bit rate? Nothing if you don't have a baseline. The point was to get an idea of what bit rate you're getting, not how much data it eats up. There's a lot more that goes into bit rate, such as how many channels, etc., but I don't think that's necessarily important for a roundabout estimate.

I don't think it's scientific and it wasn't meant to be but if you want to know if you're getting what you're paying for, it gives you a roundabout idea.

heddhunter says:

Sorry Allyson but I need to correct you again. "There's a lot more that goes into bit rate such as how many channels" is false. Bitrate is bitrate - it's an absolute. Doesn't matter if it's mono, stereo, 7.1 surround.

"Ok, Beats Music ate 12.8 MB, but what does that mean in terms of bit rate? Nothing if you don't have a base line." You don't need a baseline, just a duration. It's simple math: filesize in KB/duration in seconds = KB/sec.

Unfortunately there are factors that can make the filesize unreliable. Does it have embedded artwork? You have to take that into account. mp3 frame headers consume a good chunk of the file too. All the commercial services use DRM - who knows how much overhead that adds? Without knowing exactly what Spotify et al are packing into their streams, there's no way to know how much is actual audio data and how much isn't.

Also, if the services are using variable bitrate encoding (which they should) then the actual music will make a huge difference. One 3 minute song will have a different size from another 3 minute song, even though the duration is the same.

Allyson Kazmucha says:

Bit rate isn't a random number that's just assigned. And yes, actual bit rate does depend on channels.

http://en.wikipedia.org/wiki/Bit_rate#Audio

As for variables, I wasn't trying to get scientific enough that it would be an exact number, hence why I used a baseline of a song I knew. I chose that method because it was a "known".

Connor Mason says:

For some reason I can't respond to Allyson beneath this comment. Unless you do a spectral analysis on the file, for all you know, it could be a bad transcode. Bitrate doesn't mean anything unless you know exactly where the file came from, or you can analyze the file yourself.

For now, we're just going to have to accept lossy transcodes on our mobile devices. Until storage space and mobile data caps are increased, there's no reason to compare quality because we're getting screwed no matter what.

That being said, as long as you're streaming 256kbps+ AAC/MP3 (common standards), you're going to be fine, and any differences will not be noticeable through your little earbuds.

Allyson Kazmucha says:

I already stated above I pulled it directly from a CD. Honestly, I want to know because I don't want to pay for premium and get 64 kbps, which I have seen happen.

Connor Mason says:

Ah, I was unaware you could upload to Beats

Allyson Kazmucha says:

You can't. I think you're confusing two things. I'm talking about a reference point. I know mileage may vary based on what version they're using, but the point was to find that out. That's the whole point of the article: to see what quality tracks they're using and how high quality they're streaming. It's not scientific, it just gives an idea. I used a track from a CD as a baseline so I had some idea how large the file should be at CD quality. Hopefully that clears things up? I think we were talking about two different things.

Stroodle says:

"I used a track as a base line from a CD so I had some idea how large the file should be at cd quality."

Allyson, the codec makes a big difference. If you didn't rip it in a lossless format, it was compressed in a way that literally threw out information from the CD, so it wasn't CD quality. Here's a good write-up on compression/codecs:

http://www.stereophile.com/features/308mp3cd

Now, the codec makes a big difference in the amount of information stored in the file, but none of that matters if you can't reproduce that information accurately. If you're using an iPod/iPhone and you don't have a direct line out into a headphone amp and high-quality headphones or earbuds (well over $100), you're not hearing CD quality anyhow and may as well be listening to an AAC file at 256 kbps.

You mention that "Use Somebody" by Kings of Leon is 3:54 long and at 320 kbps is 9 MB. I don't listen to them, but for comparison, my rip of "Leopard-Skin Pill-Box Hat" by Bob Dylan runs exactly 4:00, is 860 kbps (the Apple Lossless codec uses VBR), is sampled at 44.1 kHz (which is CD quality), and comes out to 24.6 MB.

I mention that only because you used the term "audiophiles" at the beginning of your article and I think you underestimate us. ; )

Galley says:

Rhapsody offers 64Kbps AAC+, with a 192Kbps AAC "High Quality" option. I actually prefer the 192Kbps bitrate because the quality is great and the file sizes are small.

ccppl208 says:

Don't use my data and it kicked TV on

Sent from the iMore App

Spaz888 says:

Pretty damn complicated set of instructions and quite primitive. Why isn't there an app for that?

Allyson Kazmucha says:

Because streaming rates are variable 9 times out of 10.

asuperstarr says:

Thanks great tip!

Sent from the iMore App

stteve says:

I enjoyed this information. I'd be curious to see your results for each carrier. I really like Rdio and its UI, but I'm disappointed that it doesn't publish bit rates.

Allyson Kazmucha says:

I guess they do now. 192 Kbps

pamar says:

Why not simply use some decent headphones or speakers and judge qualitatively? Numbers don't always tell the whole truth. You can always force a song to 320 kbps, but not the quality.

teepeeayy says:

Audio Aly! You GO, girl! I'm totally diggin' reading your posts.

arjunyg says:

This really is no way to measure the quality of the audio. The bit rate of the incoming data steam, sure that can be measured like this, but that may or may not determine what is actually being played back from your phone. It is possible for a streaming service to use lossless compression to lower the data transfer and preserve quality, and it is possible for a service to use lossy compression and destroy your music even more than the bit rate would indicate. A better test would be to hook the AUX output of the phone to spectral analysis software on a computer and examine the graph for compression artifacts, in addition to measuring the bit rate.