Music Listeners Test 128kbps vs. 256kbps AAC
notthatwillsmith writes "Maximum PC did double-blind testing with ten listeners in order to determine whether or not normal people could discern the quality difference between the new 256kbps iTunes Plus files and the old, DRM-laden 128kbps tracks. But wait, there's more! To add an extra twist, they also tested Apple's default iPod earbuds vs. an expensive pair of Shure buds to see how much of an impact earbud quality had on the detection rate."
It's fairly easy to detect the differences (Score:1, Interesting)
Apple earphones != throw aways (Score:4, Interesting)
I'm a musician. I've recorded and released an album [cdbaby.com] (sorry for the shameless plug but it's only to put my post in context - honest). I own expensive studio earphones, have experience mixing and mastering etc.
I don't own a 5th generation iPod but I do own an iPod Shuffle that has since stopped playing MP3s. It still works as a storage device and I still have the headphones. I held on to the headphones because I prefer them over all other ear buds I have. They don't beat the studio headphones, but I would not consider them "throw aways". I found they're pretty good quality and I began using them with all of my portable devices. I would generally agree that most ear buds that come with CD players and probably many other MP3 players are of relatively low quality, but I was very impressed with the ones that came with the iPod Shuffle. I will never throw them away.
Re:Synopsis (Score:5, Interesting)
http://en.wikipedia.org/wiki/ABX_test [wikipedia.org]
This would have been more interesting if they had used a statistically valid sample size and not only compared 128 to 256, but also to lossless.
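The sample-size complaint is easy to make concrete. In an ABX test the null hypothesis is pure guessing (50% per trial), so significance is just a one-sided binomial tail. A minimal sketch (function name and trial counts are my own, for illustration):

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """One-sided binomial p-value: the probability of getting at least
    `correct` answers right out of `trials` ABX trials by pure guessing."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# With only 10 trials, 8 correct still isn't significant at p < 0.05,
# while 9 correct is -- small panels leave almost no room between
# "guessing" and "near-perfect".
print(abx_p_value(8, 10))   # ~0.055
print(abx_p_value(9, 10))   # ~0.011
```

This is why a ten-listener panel with a handful of trials each can't separate "barely audible" from "inaudible".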
treble troubles (Score:4, Interesting)
I found a tonality frequency setting in LAME that seemed to cure this problem, but neither iTunes nor ITMS seems to let you adjust or purchase based on this issue.
Perhaps not everyone is sensitive to this, but maybe there are other settings or aspects of compression that other people are sensitive to which I am not, leading to the possible conclusion that compressed music might sound better if each rip were personalized to the listener's own hearing response rather than compromising on an average human hearing model.
Re:The results... (Score:4, Interesting)
Re:Synopsis (Score:4, Interesting)
This is what the Internet has reduced us to: it does not matter if it is correct, so long as it is delivered quickly.
Age and music choice (Score:4, Interesting)
Close but not quite (Score:2, Interesting)
One of their key ideas was having the participants submit music they were intimately familiar with. Unfortunately, they should have taken the idea to its logical conclusion: having each participant tested only with songs they submit. Also, they could have at least published the statistics on how participants performed on the song they submitted.
I find it easy to tell the difference between, say, lossless or even 320kbps and 128/192kbps when listening to music I'm very familiar with. But give me a set of random songs I've never heard before and I'd have a much harder time. You don't have to be an audiophile - you just have to be paying attention.
My grievance with low bit rates and/or inferior sound equipment is simply that you won't know what you are missing. And I'm not one of those gold-plated cable audiophiles either -- my "serious" listening equipment is the Etymotics ER4s with a headphone amp. Used for lossless songs, of course.
Better for albums (Score:5, Interesting)
Yes, ideally I would rip all my music to a lossless format. And ideally everything would be available on SACD at 2822.4 kHz rather than 44.1 kHz CDs. But that's just not practical with my 500+ album collection. It'd fill up my laptop's hard drive real quick and allow me to put only a fraction onto my iPod.
I'm also disappointed that the article only tested the tracks on iPods with earbuds. Most of my listening is on a decent stereo system fed from my laptop. Ripping is about convenience, not portability. I only use my iPod when riding the Metro or an airplane. With all the outside noise the bitrate doesn't matter.
And being DRM-free isn't just a matter of idealism. I get frustrated when I go to burn an MP3 CD for my car and discover that one of the tracks I selected is DRMed. Sure there are ways to get around it, but it's just not worth the bother.
AlpineR
Re:The results... (Score:5, Interesting)
* sorry, I've no good link- it's in ITU-R BS.1534-1 "Method for the subjective assessment of intermediate quality level of coding systems".
Re:The results... (Score:5, Interesting)
Re:The results... (Score:5, Interesting)
Can anyone explain this to me? I know what aliasing is; basically it's when your top frequencies hit the Nyquist limit and kind of bounce back downward (how's that for scientific?), and I know what it sounds like. However, the last time I checked, you'd remove aliasing by cutting high frequencies out of the final analog wave with a lowpass filter. Unless something's radically changed since then, wouldn't the presumably lower-response Apple buds actually show less aliasing than the expensive ones that can better reproduce the higher (and unwanted) frequencies?
Or have I been trolled into reasoning with audiophiles? If that's the case, let me know so I can pack up and go home.
In my studio... (Score:3, Interesting)
Classical makes it evident (Score:5, Interesting)
Take these subjective tests with a pinch of salt (Score:3, Interesting)
Back when the great audiophile debate was between CD and vinyl, New Scientist magazine put a load of audiophiles to the test by playing them the same piece of music from CD and then from vinyl, and asked them to identify which version was from which medium and describe the differences between them.
What they didn't tell them was that they simply played the same CD track twice so any differences they thought they heard were purely a result of their own perception fantasies; it didn't stop them from describing in some detail how the two tracks varied though.
Not quite... (Score:4, Interesting)
Re:Not quite... (Score:5, Interesting)
Huh, you're right...
http://www.cs.columbia.edu/~hgs/audio/44.1.html [columbia.edu]
I always assumed that 44.1kHz was chosen because they took the necessary (Nyquist) sample rate to be able to record up to 20kHz (40kHz), and added a bit for good measure. There's always been that rumor that the time length of a CD was chosen to be able to fit Beethoven's Ninth Symphony, so I always figured they knew they wanted 16 bit, and a length of about 74 minutes, and just picked the >40kHz sampling rate that would get them there with that fancy new "CD" technology that was being developed. I'm happy to know that we're all using 44.1kHz for an even stupider reason
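The "stupider reason" from the linked page checks out arithmetically: early digital audio was stored on video recorders, and 44.1 kHz is what you get when you pack 3 samples onto each usable video line in both NTSC and PAL timing. A quick sanity check:

```python
# 3 audio samples per usable video line, on the video recorders used
# as early digital audio storage (per the linked Columbia page):
ntsc = 3 * 245 * 60   # 245 usable lines/field x 60 fields/s (NTSC)
pal = 3 * 294 * 50    # 294 usable lines/field x 50 fields/s (PAL)
print(ntsc, pal)      # 44100 44100 -- same rate on both systems

# And it still (barely) satisfies Nyquist for human hearing:
print(44_100 / 2)     # 22050.0, just above the ~20 kHz hearing limit
```

So the rate wasn't padded up from 40 kHz "for good measure" so much as inherited from video tape geometry that happened to land just above the minimum.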