
Master Engineer: Apple's "Mastered For iTunes" No Better Than AAC-Encoded Music

New submitter Stowie101 writes "British master engineer Ian Shepherd is ripping Apple's Mastered for iTunes service, saying it is pure marketing hype and isn't different than a standard AAC file in iTunes. Shepherd compared three digital music files, including a Red Hot Chili Peppers song downloaded in the Mastered for iTunes format with a CD version of the same song, and said there were no differences. Apple or someone else needs to step it up here and offer some true 'CD quality downloads.'"

  • hurp (Score:5, Informative)

    by Anonymous Coward on Tuesday February 28, 2012 @02:04PM (#39187115)

    Summary is incorrect. Article says that there was a significant difference between the Mastered for iTunes and CD version, while there was no difference between Mastered for iTunes and a standard AAC track.

  • Bad summary (Score:5, Informative)

    by tooyoung ( 853621 ) on Tuesday February 28, 2012 @02:06PM (#39187133)
    The summary implies that the CD version was identical to the Mastered for iTunes version.

    Shepherd compared three digital music files, including a Red Hot Chili Peppers song downloaded in the Mastered for iTunes format with a CD version of the same song, and said there were no differences.

    Here is the actual relevant part of the article:

    After his comparison of the three digital music files, Shepherd says there was a sonic difference between the Mastered for iTunes waveform and the CD waveform. He says the Mastered for iTunes and AAC-encoded files didn't reveal any differences

  • by VGPowerlord ( 621254 ) on Tuesday February 28, 2012 @02:06PM (#39187141)

    From the /. summary:

    Shepherd compared three digital music files, including a Red Hot Chili Peppers song downloaded in the Mastered for iTunes format with a CD version of the same song, and said there were no differences.

    That'd be a good thing if there were no differences between the CD version and the AAC versions. However, what the article actually says is:

    After his comparison of the three digital music files, Shepherd says there was a sonic difference between the Mastered for iTunes waveform and the CD waveform. He says the Mastered for iTunes and AAC-encoded files didn't reveal any differences, adding that this proves to him Apple's Mastered for iTunes isn't any different than a standard AAC file from Apple's iTunes store.

    In other words, the Mastered for iTunes version is basically identical to the standard AAC version, and both are different from the CD version.

  • by Mr 44 ( 180750 ) on Tuesday February 28, 2012 @02:14PM (#39187265)

    The summary link just goes to a (slow loading) blog post, the actual article being discussed is at:
    http://productionadvice.co.uk/mastered-for-itunes-cd-comparison/ [productionadvice.co.uk]
    And more specifically, the 11 minute youtube video:
    http://www.youtube.com/watch?v=kGlQs9xM_zI [youtube.com]

  • by Millennium ( 2451 ) on Tuesday February 28, 2012 @02:33PM (#39187509)

    As others point out, it's as good as the source, but only as good as the source. A FLAC file encoded from the original CD track will indeed be 100% CD quality. If you instead encode it from, say, a 96kbps MP3, then it can only be as good as the MP3 was.

    FLAC is very good. It is, however, not magic.
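
    A quick way to see that point in code; a toy numpy sketch (the "lossy" step here is just crude re-quantisation standing in for an MP3 encode, not a real codec):

        import numpy as np

        # Toy illustration of "only as good as the source": once detail is discarded
        # by a lossy step, wrapping the result in a lossless format cannot bring it back.
        rng = np.random.default_rng(3)
        master = rng.standard_normal(44100) * 0.1          # stand-in for the original CD audio

        def fake_lossy(x, bits=6):
            # throw information away, loosely analogous to a low-bitrate encode
            step = 2.0 ** (1 - bits)
            return np.round(x / step) * step

        degraded = fake_lossy(master)
        archived = degraded.copy()                         # "FLAC" of the degraded file: bit-exact

        print("lossless copy matches its source:", np.array_equal(archived, degraded))   # True
        print("but the discarded detail is gone:", np.max(np.abs(archived - master)))    # > 0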

  • by X0563511 ( 793323 ) on Tuesday February 28, 2012 @02:54PM (#39187807) Homepage Journal

    That's exactly how one is supposed to determine whether two signals are identical (flip the phase on one and add them).

    This is coming from an amateur producer/mixer and a radio guy... for what it's worth.
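
    For anyone who wants to try it, a minimal numpy sketch of that null test (synthetic signals, illustrative only):

        import numpy as np

        # Null test: flip the polarity of one signal and add it to the other.
        # If the two signals are identical, the residual is zero everywhere.
        rng = np.random.default_rng(0)
        sr = 44100
        t = np.arange(sr) / sr
        original = 0.5 * np.sin(2 * np.pi * 440 * t)            # one second of a 440 Hz tone
        identical = original.copy()                             # bit-identical copy
        different = original + 1e-4 * rng.standard_normal(sr)   # stand-in for an encoded version

        def null_test_db(a, b):
            residual = a + (-b)                                 # invert one side and sum
            return 20 * np.log10(np.max(np.abs(residual)) + 1e-12)

        print(null_test_db(original, identical))   # about -240 dB: a perfect null
        print(null_test_db(original, different))   # about -70 dB: the two versions differ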

  • by dgatwood ( 11270 ) on Tuesday February 28, 2012 @03:00PM (#39187869) Homepage Journal

    CD quality is probably good enough for the final mix. You should always use 24-bit during tracking, of course, and if you plan on doing any vocoder work (Auto-Tune, Melodyne, etc.), you should generally track at a higher sample rate as well.

    Even if you don't plan on doing pitch correction, it would be nice to have a bit higher sampling rate (say 60 kHz) to ensure that the upper limit of human hearing is completely below the point at which the bandpass filter starts rolling off. Software bandpass limiting during sample rate conversion can generally achieve a much tighter filter with less distortion than analog hardware on the input to an ADC.
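
    To put numbers on that filter-headroom argument, a small back-of-the-envelope sketch (the 20 kHz hearing limit is a nominal assumption):

        # The anti-alias filter's transition band has to fit between the top of
        # human hearing (~20 kHz) and the Nyquist frequency (half the sample rate).
        AUDIBLE_LIMIT_HZ = 20_000

        for sample_rate in (44_100, 48_000, 60_000, 96_000):
            nyquist = sample_rate / 2
            headroom = nyquist - AUDIBLE_LIMIT_HZ
            print(f"{sample_rate:6d} Hz: Nyquist {nyquist:8.0f} Hz, "
                  f"{headroom:8.0f} Hz left for the filter roll-off")
        # 44.1 kHz leaves only about 2 kHz for the roll-off; 60 kHz or 96 kHz leaves
        # 10 kHz or more, so the filter can start above 20 kHz and still be gentle.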

  • by TheTrueScotsman ( 1191887 ) on Tuesday February 28, 2012 @03:01PM (#39187883)

    Absolute nonsense. As long as you dither correctly (which by now should be industry standard), there's no 'distortion' created by 'decimating' to 16-bit sound.

    The main problem with modern mastering is too much dynamic compression (not data compression which Owsinski seems to be confused by in the FTA). Given a light touch on Waves L3 (or whatever rinky-dinky limiter the mastering engineer prefers), there is no difference between 16 and 24-bit to even 'golden ears'.
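
    A small numpy sketch of what "dither correctly" means in practice (TPDF dither before truncating to 16 bits; illustrative values, not a mastering tool):

        import numpy as np

        # Dithered 16-bit reduction: the dither trades a slightly higher noise floor for
        # quantisation error that is noise-like and uncorrelated with the signal, and
        # both cases stay far below audibility (around -96 to -100 dBFS).
        rng = np.random.default_rng(1)
        sr = 44100
        t = np.arange(sr) / sr
        master = 0.25 * np.sin(2 * np.pi * 1000 * t)       # floating-point "24-bit" mix
        LSB = 1.0 / 32768                                  # one 16-bit step, full scale +/-1.0

        def to_16bit(x, dither):
            if dither:
                # TPDF dither: sum of two uniform noises, 2 LSB peak-to-peak
                x = x + (rng.uniform(-0.5, 0.5, x.size) + rng.uniform(-0.5, 0.5, x.size)) * LSB
            return np.round(x / LSB).astype(np.int16)

        for dither in (False, True):
            err = to_16bit(master, dither) * LSB - master
            rms_db = 20 * np.log10(np.sqrt(np.mean(err ** 2)))
            print("dithered" if dither else "plain   ", f"{rms_db:6.1f} dBFS error")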

  • by gnasher719 ( 869701 ) on Tuesday February 28, 2012 @03:06PM (#39187947)

    Let's be honest. The only thing you end up losing when going to 16-bit is lost below the noise floor anyway. You use 24 (or better) in the mixing process because that's when it matters.

    Not true. The AAC encoder tries to reproduce its input as faithfully as possible. If you feed it with 16-bit data, that is, floating-point data plus quantisation noise, then it tries to reproduce floating-point data plus quantisation noise. Reproducing the quantisation noise is not only pointless (because it is just noise), it also takes more bits (because random noise cannot be compressed) or, since the number of bits is fixed, leads to lower quality. If you feed the encoder with floating-point data instead, then it doesn't have to try to encode the noise and has more bits available to encode the actual music.
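
    One way to see the "random noise cannot be compressed" part; a crude sketch using zlib as a stand-in (it is not an AAC encoder, but the bit-budget intuition is the same):

        import zlib
        import numpy as np

        # A general-purpose lossless compressor stands in for the bits an encoder
        # would have to spend on whatever is present in its input.
        rng = np.random.default_rng(2)
        tone = (1000 * np.sin(2 * np.pi * np.arange(65536) / 64)).astype(np.int16)   # periodic, predictable
        noise = rng.integers(-32768, 32768, 65536).astype(np.int16)                  # random samples

        for name, data in (("periodic tone", tone), ("random noise ", noise)):
            raw = data.tobytes()
            print(name, len(raw), "->", len(zlib.compress(raw, 9)), "bytes")
        # The tone shrinks dramatically; the random samples don't shrink at all (they may
        # even grow slightly), so noise in the encoder's input eats into its fixed bit budget.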

  • by Anonymous Coward on Tuesday February 28, 2012 @03:59PM (#39188523)

    "Every lossless decoder drops the phase information"

    Ahh, no. If the decoder drops the phase information, then it is not lossless. In a lossless encoder/decoder, if you take an input, compress it, then decompress it and subtract it from the source file, the difference is zero. Phase and magnitude are "exactly" the same, hence the name lossless.

    "drops the phase information, because the ear cannot hear it" Wrong again. Humans can hear phase differences. It just a frequency dependent quantity. As frequency rises relative phase information becomes less important.


  • by artor3 ( 1344997 ) on Tuesday February 28, 2012 @04:05PM (#39188591)

    Not in a meaningful way. You'd need a bit rate a few orders of magnitude above 2e43 bps (based on the Planck time) to fully represent a real-world signal. We call that "analog". Only after compressing that from 10000000000000000000000000000000000000000000 kbps down to 1000 kbps do we call it "digital". If you call them both digital, then the term loses all meaning.

    Of course, our brains can't pick up the difference between the two, but that's not because "the universe is digital". It's because by the time you get to the trillionth decimal point, the noise has long since swamped out the signal.
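
    Where that 2e43 figure comes from, roughly (a one-liner's worth of arithmetic, using the approximate Planck time and assuming one bit per Planck time):

        PLANCK_TIME_S = 5.39e-44                      # seconds, approximately
        bits_per_second = 1 / PLANCK_TIME_S           # ~1.9e43 bps
        print(f"{bits_per_second:.2e} bps, vs a 1,000 kbps file: "
              f"{bits_per_second / 1e6:.1e} times the bit rate")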

  • Duh. (Score:5, Informative)

    by elistan ( 578864 ) on Tuesday February 28, 2012 @04:07PM (#39188627)
    No surprise. And it's a misunderstanding on the author's part, not a misrepresentation on Apple's part.

    Apple's "Mastered for iTunes" is a set of guidelines about how to turn a master recording into an iTunes-optimized digital file. The author of TFA, however, is talking about taking a CD track and making a compressed version that's as close as possible to the CD track. A CD track is NOT a master file. (We don't want a track that's merely a CD representation - we've heard plenty on /. about how a lot of CD tracks just suck.) "Mastered for iTunes" talks about taking a high-resolution digitial file, like 96/24 or 192/24, and then producing the best possible iTunes Plus file (256 kbps VBR AAC.)

    So of course if you make an iTunes track from a CD track via the "Mastered for iTunes" process, you'll get a 256 kbps VBR AAC that's identical to ripping a CD track to a 256 kbps VBR AAC. However, if you follow Apple's recommendations, quoted here:

    To take best advantage of our latest encoders send us the highest resolution master file possible, appropriate to the medium and the project.
    An ideal master will have 24-bit 96kHz resolution. These files contain more detail from which our encoders can create more accurate encodes. However, any resolution above 16-bit 44.1kHz, including sample rates of 48kHz, 88.2kHz, 96kHz, and 192kHz, will benefit from our encoding process.

    you'll probably get something different, perhaps better, than a CD track ripped to AAC.

    Apple is providing the tools they use to convert to AAC so that sound engineers can preview the product before it goes on sale, but they appear to be the same tools they've been using all along. As I said before, "Mastered for iTunes" isn't a new encoding tool - it's a process workflow. Other recommendations:

    - Apple recommends listening to your masters on the devices your audience will be using
    - Be Aware of Dynamic Range and Clipping
    - Master for Sound Check and Other Volume Controlling Technology
    - Remaster for iTunes [That is, they suggest starting over from the original recordings rather than sending in a file that was mastered with CDs in mind.]
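
    The two pipelines being contrasted, as a hedged sketch (ffmpeg's stock AAC encoder is only a stand-in for Apple's own tools, and the file names are hypothetical):

        import subprocess

        # Illustrative only: the point of the workflow is the input.  Encode straight
        # from the high-resolution master rather than from the already-downconverted
        # 16-bit/44.1 kHz CD version.

        def encode_aac(source, output, bitrate="256k"):
            subprocess.run(
                ["ffmpeg", "-y", "-i", source, "-c:a", "aac", "-b:a", bitrate, output],
                check=True,
            )

        # "Mastered for iTunes"-style path: feed the 24-bit/96 kHz master to the encoder.
        encode_aac("album_master_24bit_96k.wav", "track_from_master.m4a")

        # What the author of TFA effectively tested: encode the CD version instead.
        encode_aac("cd_rip_16bit_44k.wav", "track_from_cd.m4a")
        # Both outputs are 256 kbps AAC; any quality difference comes from the source material.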

  • by elistan ( 578864 ) on Tuesday February 28, 2012 @06:05PM (#39190159)
    From what I understand, "to master" is a verb in the sound engineering realm meaning to produce a product, after recording, mixing, and other sound engineering tweaks, for a particular purpose.

    One can "master" for iTunes, "master" for CD, "master" for live DJ performance, "master" for 64kbps online streaming, "master" for FM radio, etc. "Master" does not imply a particular level of audio fidelity, although it has been misused and misundersood as such. Apple uses the term correctly in their "Mastered for iTunes" guidelines. They're a set of suggestions on what to do to produce the highest quality iTunes Plus 256 kbps variable-bit-rate AAC files. The GIGO principle applies here. Simply running a loudness war victim 44/16 CD track through Apple's "Mastered for iTunes" tools will simply produce a normal AAC. The magic is in providing to Apple a high-quality 196/24 file, with targeted audience specific tweaks, to begin with. There's no hype from Apple going on - just a lot of misunderstanding from other folks.
  • by idontgno ( 624372 ) on Tuesday February 28, 2012 @06:16PM (#39190313) Journal

    The GIGO principle applies here. Simply running a loudness-war victim 44/16 CD track through Apple's "Mastered for iTunes" tools will just produce a normal AAC. The magic is in providing Apple with a high-quality 192/24 file, with targeted, audience-specific tweaks, to begin with.

    Actually, the GIGO principle doesn't apply here. Garbage in, Garbage labeled with a shiny faux-significant marketing label "Mastered for iTunes" (and thus ennobled beyond its humble origins) Out.

    Or, to put it more simply, it's less effective than Autotune.

    If "Mastered for iTunes" is intended to be a mark of superior quality, it needs to actually start enforcing superior quality on some objective basis. Otherwise, it's just another worthless and misleading label.
