Now that Apple has begun to release tracks in DRM-free 256kbps AAC through the iTunes Store, the listening tests are on. MaximumPC gathered 10 people and had each select 10 familiar tracks, which the testers then encoded at both 128kbps AAC (the current iTunes Store offering) and 256kbps (the new DRM-free bitrate). They then asked their ten subjects, in a double-blind experiment, whether they could tell the difference between the two encodings after repeated listens.
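The double-blind setup described above can be sketched in a few lines. This is a hypothetical harness, not MaximumPC's actual procedure – the track labels and the random-guessing "subject" are stand-ins:

```python
import random

def blind_trial(encodings, listener_pick, rng):
    """One double-blind comparison: present the two encodings of a
    track in random order, record whether the subject preferred the
    higher-bitrate version."""
    order = list(encodings)        # e.g. ["128k", "256k"]
    rng.shuffle(order)             # neither tester nor subject knows which is which
    choice = listener_pick(order)  # subject picks slot 0 or slot 1
    return order[choice] == "256k"

# Stand-in subject that just guesses at random:
rng = random.Random(0)
guesser = lambda order: rng.randrange(2)
results = [blind_trial(["128k", "256k"], guesser, rng) for _ in range(10)]
print(sum(results), "of 10 picks favored the 256kbps version")
```

The shuffle is what makes the trial blind: the preference is recorded against the hidden label, not against whatever the subject was told.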
But they also threw a twist into the mix, asking subjects to listen first with a pair of the default Apple earbuds, then with a pair of $400 Shure SE420 earphones. Their theory – that more people would be able to tell the difference between the bitrates with the higher-quality earphones – didn’t quite pan out.
The biggest surprise of the test actually disproved our hypothesis: Eight of the 10 participants expressed a preference for the higher-bitrate songs while listening with the Apple buds, compared to only six who picked the higher-quality track while listening with the Shures. Several of the test subjects went so far as to say they felt more confident expressing a preference while listening with the Apple buds. We theorize that the Apple buds were less capable of reproducing high frequencies and that this weakness amplified the listeners’ perception of aliasing in the compressed audio signal. But that’s just a theory.
Also interesting is that the older subjects (whose hearing is supposedly less acute) were more consistent at telling the tracks apart than the younger participants. Could it be that the younger generation has grown up on compressed music and doesn’t know what to listen for? Or it could simply be an anomalous result, given how small the sample was.
Readers who feel, as MaximumPC did going into the test, that 256kbps is still too low for anything approaching real fidelity, will likely cringe at the results. I’m not cringing exactly, but do wonder why they didn’t bother to give the subjects uncompressed reference tracks to compare against.
Notes:
- Remember that 128kbps AAC is roughly equivalent to 160kbps MP3, since the AAC codec is more efficient.
- There’s apparently some suspicion that the iTunes Store uses a different encoder than the one that ships with iTunes.
- Testing for both bitrate and headphone differences at once throws confounding variables into the mix – it would have been cleaner to give everyone the good phones and focus on the bitrates alone.
- 10 people is a pretty small sample group – not so small as to be meaningless, but not large enough for substantial findings.
- Not that we need MaximumPC or focus groups to tell us how to feel about codecs and bitrates…
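On the sample-size point, it's easy to check how those splits compare to pure guessing. A minimal sketch of the one-sided binomial tail – treating each "preference for the higher bitrate" as a correct identification, which is an assumption on my part, not MaximumPC's analysis:

```python
from math import comb

def binom_tail(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): the chance of at least k
    'correct' picks out of n if every subject were guessing at random."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# 8 of 10 preferring the higher bitrate (Apple buds):
print(round(binom_tail(10, 8), 4))   # ≈ 0.0547
# 6 of 10 (Shure SE420s):
print(round(binom_tail(10, 6), 4))   # ≈ 0.377
```

Eight of ten would happen by chance only about 5% of the time, so the Apple-buds result is at least suggestive; six of ten is well within coin-flip territory – which is roughly what "not large enough for substantial findings" means in numbers.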