
Mastering Engineer: Apple's "Mastered For iTunes" No Better Than AAC-Encoded Music

New submitter Stowie101 writes "British mastering engineer Ian Shepherd is ripping Apple's Mastered for iTunes service, saying it is pure marketing hype and isn't different than a standard AAC file in iTunes. Shepherd compared three digital music files, including a Red Hot Chili Peppers song downloaded in the Mastered for iTunes format with a CD version of the same song, and said there were no differences. Apple or someone else needs to step it up here and offer some true 'CD quality downloads.'"
  • by Anonymous Coward on Tuesday February 28, 2012 @12:59PM (#39187049)

    You want CD quality downloads? Yeah, magic keyword "FLAC".

    Piracy: giving you for free what the market won't since the first bestiality video was filmed.

    • Re: (Score:2, Insightful)

      by MightyYar ( 622222 )

      Um, all "Mastered for iTunes" does is allow producers to preview how the final file will sound when placed on iTunes, so that they can make changes to the master file. Not sure what the point of the story is, and it definitely has nothing to do with CDs or FLAC.

      • by ackthpt ( 218170 ) on Tuesday February 28, 2012 @01:32PM (#39187489) Homepage Journal

        Um, all "Mastered for iTunes" does is allow producers to preview how the final file will sound when placed on iTunes, so that they can make changes to the master file. Not sure what the point of the story is, and it definitely has nothing to do with CDs or FLAC.

        If you are selling it as "Mastered" for a purpose and the quality is identical, then it is only "Mastered" for hype and profit.

        I've got some LP singles from back in the day, intended for radio play, which are an improvement over the usually horrible mass-produced 45 RPM pressings, and possibly better than the mass-produced LP versions as well. But consider that Apple's source is unlikely in most cases to be original mastering materials (who in their right mind would turn over digital originals to Apple?) for them to manipulate for their product (iTunes). Odds are, 95% of their market can't tell anyway, because they're hardly audiophiles and are listening through headphones with absurdly limited range and reproduction quality.

        • by vought ( 160908 ) on Tuesday February 28, 2012 @01:39PM (#39187595)

          But consider that Apple's source is unlikely in most cases to be original mastering materials (who in their right mind would turn over digital originals to Apple?)

          Your values are not the same as those of people looking to make money by reselling audio content. I can assure you that various music distributors would have no problem at all working in the studio with their own or third-party engineers to produce "Mastered for iTunes" versions of a catalog if that's what they think will lure more buyers, whether or not "Mastered for iTunes" involves a substantively changed version (for example, one engineered toward the smaller drivers with more bass cutover that are increasingly popular these days).

          Regardless of your opinion about how something should work, this kind of collaboration is an everyday occurrence in the industry and never relies on "turning over" anything to Apple.

        • by elistan ( 578864 ) on Tuesday February 28, 2012 @05:05PM (#39190159)
          From what I understand, "to master" is a verb in the sound engineering realm meaning to produce a product, after recording, mixing and other sound engineering tweaks, for a particular purpose.

          One can "master" for iTunes, "master" for CD, "master" for live DJ performance, "master" for 64kbps online streaming, "master" for FM radio, etc. "Master" does not imply a particular level of audio fidelity, although it has been misused and misundersood as such. Apple uses the term correctly in their "Mastered for iTunes" guidelines. They're a set of suggestions on what to do to produce the highest quality iTunes Plus 256 kbps variable-bit-rate AAC files. The GIGO principle applies here. Simply running a loudness war victim 44/16 CD track through Apple's "Mastered for iTunes" tools will simply produce a normal AAC. The magic is in providing to Apple a high-quality 196/24 file, with targeted audience specific tweaks, to begin with. There's no hype from Apple going on - just a lot of misunderstanding from other folks.
          • by idontgno ( 624372 ) on Tuesday February 28, 2012 @05:16PM (#39190313) Journal

            The GIGO principle applies here. Running a loudness-war-victim 44/16 CD track through Apple's "Mastered for iTunes" tools will simply produce a normal AAC. The magic is in providing to Apple a high-quality 96/24 file, with audience-specific tweaks, to begin with.

            Actually, the GIGO principle doesn't apply here. Garbage in, Garbage labeled with a shiny faux-significant marketing label "Mastered for iTunes" (and thus ennobled beyond its humble origins) Out.

            Or, to put it more simply, it's less effective than Autotune.

            If "Mastered for iTunes" is intended to be a mark of superior quality, it needs to actually start enforcing superior quality on some objective basis. Otherwise, it's just another worthless and misleading label.

      • by Anonymous Coward on Tuesday February 28, 2012 @03:37PM (#39189031)

        The claim in the article is that there is "no difference". This claim can be validated quite easily by simply taking the two sources, with normalized amplitude, inverting the phase of one signal and then summing. What remains of the signal is the difference, or the lack thereof, between the two sources. With digital sources, anything other than a null result is considered "coloration" and we are into subjective territory. The questions then begin with "is the color within the potential threshold of human perception?" And if the answer is "yes", then you cannot rely on a single person's opinion to make a determination about the character of the coloration.
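        For anyone who wants to try it, here is a minimal sketch of that null test in Python with NumPy/SciPy. The file names are hypothetical, and decoded AAC typically carries encoder-delay padding, so the two signals must be time-aligned before the subtraction means anything:

        # Null test sketch: normalize two decoded tracks, invert one, and sum.
        # A (near-)zero residual means the sources are effectively identical.
        import numpy as np
        from scipy.io import wavfile

        rate_a, a = wavfile.read("track_from_cd.wav")      # hypothetical names
        rate_b, b = wavfile.read("track_aac_decoded.wav")
        assert rate_a == rate_b, "sample rates must match"

        n = min(len(a), len(b))                            # trim to common length
        a = a[:n].astype(np.float64)
        b = b[:n].astype(np.float64)

        a /= np.max(np.abs(a))                             # normalize amplitudes
        b /= np.max(np.abs(b))

        residual = a - b                                   # sum with inverted phase
        peak = np.max(np.abs(residual))
        print(f"peak residual: {20 * np.log10(peak + 1e-12):.1f} dBFS")

        A residual well below the threshold of hearing supports a "no audible difference" verdict; anything above it puts you in exactly the subjective territory described above.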

    • Re: (Score:2, Insightful)

      by GWBasic ( 900357 )
      Mastered for iTunes is better than CD quality, assuming the producer encodes directly from the 24-bit master. AAC is totally floating-point; its compression process arguably creates a more accurate sound than decimating 24-bit to 16-bit.

      If you're going to ask for FLAC, at least make sure it's 24-bit. Otherwise, you're just wasting space to carry around the distortion created when decimating to 16-bit sound.

      • by X0563511 ( 793323 ) on Tuesday February 28, 2012 @01:45PM (#39187665) Homepage Journal

        Let's be honest. The only thing you end up losing when going to 16-bit is lost below the noise floor anyway. You use 24 (or better) in the mixing process because that's when it matters.

        • by gnasher719 ( 869701 ) on Tuesday February 28, 2012 @02:06PM (#39187947)

          Let's be honest. The only thing you end up losing when going to 16-bit is lost below the noise floor anyway. You use 24 (or better) in the mixing process because that's when it matters.

          Not true. The AAC encoder tries to reproduce its input as faithfully as possible. If you feed it with 16-bit data, that is, floating-point data plus quantisation noise, then it tries to reproduce floating-point data plus quantisation noise. Reproducing the quantisation noise is not only pointless (because it is just noise) but also takes more bits (because random noise cannot be compressed), or, since the number of bits is fixed, leads to lower quality. If you feed the encoder with floating-point data instead, then it doesn't have to try to encode the noise and has more bits available to encode the actual music.
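          To illustrate the quantisation noise being described, here is a small self-contained sketch (Python/NumPy; no dither, purely illustrative):

          # Quantize a floating-point tone to 16-bit PCM and measure the noise.
          import numpy as np

          rate = 44100
          t = np.arange(rate) / rate
          signal = 0.5 * np.sin(2 * np.pi * 1000 * t)   # 1 kHz tone, half scale

          quantized = np.round(signal * 32767) / 32767  # simulate 16-bit PCM
          noise = quantized - signal                    # what the encoder would
                                                        # waste bits reproducing
          snr = 10 * np.log10(np.sum(signal**2) / np.sum(noise**2))
          print(f"SNR after 16-bit quantization: {snr:.1f} dB")  # ~92 dB here

          That noise floor is inaudible on playback, but a lossy encoder fed the quantized version will spend part of its fixed bit budget trying to reproduce it.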

      • by TheTrueScotsman ( 1191887 ) on Tuesday February 28, 2012 @02:01PM (#39187883)

        Absolute nonsense. As long as you dither correctly (which by now should be industry standard), there's no 'distortion' created by 'decimating' to 16-bit sound.

        The main problem with modern mastering is too much dynamic compression (not data compression, which Owsinski seems to be confused by in TFA). Given a light touch on Waves L3 (or whatever rinky-dinky limiter the mastering engineer prefers), there is no difference between 16 and 24-bit even to 'golden ears'.

    • by Overzeetop ( 214511 ) on Tuesday February 28, 2012 @01:35PM (#39187531) Journal

      I know many friends who have used higher compression on their FLAC files and, with my gear, I can clearly hear the artifacts. I realize most people won't, but I've got mostly high-end stuff, and I always burn in both my audio and network cables before using them and mark them with directional arrows (only with PVC-free tape and audio-grade markers) so that they don't get installed backwards after they've been burned in.

      I'm amazed at how many people can't seem to grasp the fine points of lossless compression for audio work. I find most non-audiophiles expect that lossless means that what you put in exactly matches what you get out. I can tell you first hand, though, that when you spend as much money on gear as I have, you recognize that perfection comes from not just the bits, but the purity with which the bits are delivered. They may be the same ones and zeros, but a discerning ear can always tell the difference in the various lossless formats when listening to the color and soundstage of the reproduced performance.

      • Hahah! Beware of Poe's law...
      • Re: (Score:2, Funny)

        by Anonymous Coward

        You sir, are an idiot. You claim to be an audiophile, and granted you do some things correctly, yet you make no mention of how you condition your power source. If you are not using a Powerflux Power Cord with at least the following, all the warmth in your pure bits is leaking out:

        Powerflux conductors are 68-strand (Alpha) OCC twisted around –conductor strands with a special-grade PE insulation or dielectric. (Alpha conductors are fine OCC wire treated with Furutech’s Alpha Cryogenic and Demagn

    • by smcdow ( 114828 ) on Tuesday February 28, 2012 @01:35PM (#39187535) Homepage

      If you want to put those FLAC downloads on your iOS device, keep in mind that FLAC to ALAC is easy-peasy using ffmpeg [wikipedia.org]. It even preserves the tags.
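      For example, a minimal batch-conversion sketch in Python driving ffmpeg (the music directory is a placeholder; ffmpeg's "-c:a alac" selects its built-in ALAC encoder and carries the tags across by default):

      # Convert every FLAC under a directory to ALAC via ffmpeg.
      import pathlib
      import subprocess

      for flac in pathlib.Path("Music").rglob("*.flac"):  # placeholder path
          alac = flac.with_suffix(".m4a")
          subprocess.run(["ffmpeg", "-i", str(flac), "-c:a", "alac", str(alac)],
                         check=True)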

  • by Kenja ( 541830 ) on Tuesday February 28, 2012 @01:02PM (#39187091)
    While I agree that it's all bunk, I would be interested in knowing whether the two files were bit-for-bit the same or just sound the same to the listener.
    • by vlm ( 69642 )

      Somewhere in between. From the fine article, he reversed the phase on one, added them together, and listened to what fell out, which wasn't much. Essentially a lot of complicated analog foolishness to figure out the delta between two files. It would seem you could do a much simpler version of this digitally: decode both into raw/wav files, then calculate the diff between the two raw files.

      • by Anonymous Coward on Tuesday February 28, 2012 @01:11PM (#39187223)

        The problem with computing the digital difference between two files is that sound, and especially music, is an inherently analog experience. All the digital douchery in the world won't change the fact that your ears are not made of robot.

        • All the digital douchery in the world won't change the fact that your ears are not made of robot.

          Neither, however, does all the analog sweetness in the world change the fact that your ears are not made of god. "Digital douchery" for things like this does not have to be perfect, as long as it can outstrip the limitations of human hearing.

      • by X0563511 ( 793323 ) on Tuesday February 28, 2012 @01:54PM (#39187807) Homepage Journal

        That's exactly how one is supposed to determine if a signal is identical (flip the phase on one and add them).

        This is coming from an amateur producer/mixer and a radio guy... for what it's worth.

      • Re: (Score:3, Interesting)

        by gnasher719 ( 869701 )

        Somewhere in between. From the fine article, he reversed the phase on one, added them together, and listened to what fell out, which wasn't much. Essentially a lot of complicated analog foolishness to figure out the delta between two files. It would seem you could do a much simpler version of this digitally: decode both into raw/wav files, then calculate the diff between the two raw files.

        Every lossy encoder drops the phase information, because the ear cannot hear it. That's half the data dropped without any loss in sound quality. So if you convert AAC back to uncompressed, the individual values have no similarity with the original at all.

        Imagine recording the same music with microphones that are one meter apart. The sound is the same to the human ear. But one meter is about 3 milliseconds of delay, so a 500 Hz tone (a delay of exactly 1.5 cycles) will have exactly the opposite amplitude on the two microphones.

        So what h
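        A toy numerical version of the two-microphone point above (Python/NumPy; the numbers are illustrative):

        # Delay a 500 Hz tone by ~3 ms (about one metre of extra path at the
        # speed of sound). The delayed copy comes out phase-inverted, so a
        # naive sample-by-sample diff is huge even though the two signals
        # sound identical.
        import numpy as np

        rate = 48000
        delay = int(0.003 * rate)             # ~3 ms = 144 samples
        t = np.arange(rate) / rate
        tone = np.sin(2 * np.pi * 500 * t)    # 500 Hz: the delay is 1.5 cycles

        a, b = tone[delay:], tone[:-delay]    # same tone, one copy delayed

        print(np.max(np.abs(a - b)))          # ~2.0: raw diff says "different"
        print(np.max(np.abs(a + b)))          # ~0.0: they cancel (opposite phase)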

    • by ackthpt ( 218170 )

      While I agree that it's all bunk, I would be interested in knowing whether the two files were bit-for-bit the same or just sound the same to the listener.

      Probably not. Expect some renoberation and bit twiddling to have taken place in the "Master" process. Perhaps they did something like Dolby noise suppression or changed equalizer settings.

    • by phayes ( 202222 ) on Tuesday February 28, 2012 @01:37PM (#39187551) Homepage

      The sample size is ridiculous: one song was compared between CD/AAC/AAC (Mastered for iTunes). Not even one album, just one song!

      This may be just another tempest in a teacup because somebody uploaded the wrong file as the Mastered for iTunes version & people are making wildly erroneous extrapolations from it.

    • by icebike ( 68054 ) * on Tuesday February 28, 2012 @02:39PM (#39188317)

      While I agree that it's all bunk, I would be interested in knowing whether the two files were bit-for-bit the same or just sound the same to the listener.

      The summary above is sort of confusing. You have to RTFA.

      Quoting TFA:

      The British mastering engineer Ian Shepherd goes deeper in his analysis of Mastered for iTunes by using a music engineering tool called a null test. Shepherd explains this procedure as a method of reversing the phase of a song’s waveform so that after a song’s waveforms and volumes are matched in software a mixing engineer can play them back to see if the song’s out of phase waveform cancels or nulls out the normal version of the song.

      After his comparison of the three digital music files, Shepherd says there was a sonic difference between the Mastered for iTunes waveform and the CD waveform. He says the Mastered for iTunes and AAC-encoded files didn’t reveal any differences,

      So the answer is that there is no reason to believe the files were bit-for-bit the same (that would be impossible with a lossy encoding), and they didn't necessarily sound the same either. He had to use digital methods to discover the differences.

      And he was comparing the standard AAC against the CD, the Mastered for iTunes version against the CD, and the standard AAC against the MfiT encoding.

      And in both cases there were differences between the AAC versions and the CD, but none between the two encoded versions.

      He did not say he could hear the differences without technical means. Usually if the engineer has to go to these lengths to discern any differences it means he couldn't tell them by ear alone.

      And if he can't tell by ear alone, then A) it doesn't matter, or B) he has geezer ears.
      "Mastering Engineer" status is a short lived career. By the time you get there, your ears are no longer qualified for the work. Technical means to the rescue.

  • hurp (Score:5, Informative)

    by Anonymous Coward on Tuesday February 28, 2012 @01:04PM (#39187115)

    Summary is incorrect. Article says that there was a significant difference between the Mastered for iTunes and CD version, while there was no difference between Mastered for iTunes and a standard AAC track.

    • Exactly. All "Mastered for iTunes" does is provide the supplier of the music with a PDF document describing best practices and an AAC encoding tool so that they can preview how the file will sound when available on iTunes. A supplier may already be using best-practices, or they may sign up for the program but ignore the PDF. Apparently this is the case with the example track he uses (Red Hot Chili Peppers).

      • by adisakp ( 705706 )
        AAC is a lossy format. That said, it's much better than MP3 for the same data rates if you use a good encoder -- and if you start with a high-bitrate high-quality source, it can sound as good as a CD to 99% of listeners.

        You do have to follow a number of guidelines, or experiment with variables, to get the best results, from what you use as a source to how you compress things (two-pass / filtering / VBR / CBR / etc.).

        I've done objective testing using the "best" recommended settings with professional audio guys
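        As a hypothetical sketch of that kind of experimentation (Python driving ffmpeg's built-in AAC encoder as a stand-in; the file name and settings are placeholders, not the actual test rig):

        # Encode the same master with different AAC settings for A/B listening.
        import subprocess

        SRC = "master.wav"                    # placeholder source file
        variants = {
            "cbr_256k": ["-b:a", "256k"],     # constant bitrate
            "vbr_q2":   ["-q:a", "2"],        # variable bitrate, quality mode
        }
        for name, opts in variants.items():
            subprocess.run(["ffmpeg", "-i", SRC, "-c:a", "aac", *opts,
                            f"{name}.m4a"], check=True)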
  • RHCP? C'mon! (Score:2, Insightful)

    by arth1 ( 260657 )

    To test with Red Hot Chili Peppers is rather pointless, I would think - they're one of the most compressed bands there is, probably not using more than the top 4-5 bits out of 16. So yes, it's going to be fairly similar no matter what the format, unless you can get ahold of the sources to the original masters.

    • Or get the LP version [hometheaterforum.com] of the album :)

      • by arth1 ( 260657 )

        Or, for one of the albums, you can get the MFSL version [discogs.com].
        The gold-plated CDs and ultra-heavy LPs may be a gimmick, but they do know how to mix masters.

    • While you are probably right about the RHCP (as well as any other CD released lately) and the gain compression, there is no reason why you still can't do quantitative analysis to see differences between the two different compression schemes when they are decompressed. It would just be a matter of comparing the bytes of the decompressed files to the original, never-compressed (i.e. from the CD) data. Compute the variance of the errors in the two different schemes referenced to the CD and then check to see if they
      • by arth1 ( 260657 )

        That being said, you would probably have a better indicator of errors if you did use a source that was not heavily gain compressed before data compression, but that is another debate.

        No, that was actually my point. If you use the CD as a master, and the CD is heavily compressed (as is the case here), there will be far less difference between different compressed versions. You're not going to get anything that sounds better than the master you use, because the bits that are gone are gone. And in the case of RHCP CDs, that's unfortunately most of the bits.

        tl;dr: GIGO

  • Bad summary (Score:5, Informative)

    by tooyoung ( 853621 ) on Tuesday February 28, 2012 @01:06PM (#39187133)
    The summary implies that the CD version was identical to the Mastered for iTunes version.

    Shepherd compared three digital music files, including a Red Hot Chili Peppers song downloaded in the Mastered for iTunes format with a CD version of the same song, and said there were no differences.

    Here is the actual relevant part of the article:

    After his comparison of the three digital music files, Shepherd says there was a sonic difference between the Mastered for iTunes waveform and the CD waveform. He says the Mastered for iTunes and AAC-encoded files didn't reveal any differences

    • Mod parent up, summary makes a completely incorrect statement. Additionally:

      Apple or someone else needs to step it up here and offer some true 'CD quality downloads.'

      I think what we have here is TFS contradicting itself when it contradicts TFA. Presumably he wants "better than CD quality downloads."

  • I am not an Apple fan; actually, I dislike the company. However, I wonder if they are truly making claims that are not true, or if their claims are simply carefully worded to convey, well, nothing. Apple seems too big and way too self-important to risk the scandal of an outright lie. It reminds me of how they handled the antenna problem with the iPhone. It would be interesting to hear Apple's response, but my guess is that they will simply not respond and their fans will be fine with that.
  • by VGPowerlord ( 621254 ) on Tuesday February 28, 2012 @01:06PM (#39187141)

    From the /. summary:

    Shepherd compared three digital music files, including a Red Hot Chili Peppers song downloaded in the Mastered for iTunes format with a CD version of the same song, and said there were no differences.

    That'd be a good thing if there were no differences between the CD version and the AAC versions. However, what the article actually says is:

    After his comparison of the three digital music files, Shepherd says there was a sonic difference between the Mastered for iTunes waveform and the CD waveform. He says the Mastered for iTunes and AAC-encoded files didn't reveal any differences, adding that this proves to him Apple's Mastered for iTunes isn't any different than a standard AAC file from Apple's iTunes store.

    In other words, the Mastered for iTunes version is basically identical to the standard AAC version, and both are different from the CD version.

    • by dbet ( 1607261 )
      HOW different though? I am willing to bet that people tested on very good stereo equipment can't tell the difference between a 320k MP3 and Red Book audio. With their ears - not with waveform equipment.

      And I mean a real blind test, not just playing both and having them claim that one sounds better.
  • by ackthpt ( 218170 ) on Tuesday February 28, 2012 @01:07PM (#39187161) Homepage Journal

    Those wonderful color screens people could put on their TVs to improve the picture -- you can't get more out of something than you put into it. If the lossy music process has lost data, you can't put it back (but you can always convince the gullible that you can!)

    Now, buy my Slashdot Post Converter, which placed on your screen turns each of my posts into a fantastic media experience! Zowie!

  • by Jah-Wren Ryel ( 80510 ) on Tuesday February 28, 2012 @01:13PM (#39187251)

    This "mastered for itunes" stuff is pointless crap as long as we are still fighting the Loudness War. [wikipedia.org]

    The Red Hot Chili Peppers are a particularly bad test case because all of their albums have massive loudness-compression. And the same guy responsible for that travesty has started to do the mastering on recent Metallica albums [youtube.com] so their stuff is going to be all suck too.

  • first:
      "Mastered for iTunes format with a CD version of the same song, and said there were no differences. "

    then
    " Apple or someone else needs to step it up here and offer some true 'CD quality downloads."

    Isn't "no different than the CD version" CD quality?

  • by Mr 44 ( 180750 ) on Tuesday February 28, 2012 @01:14PM (#39187265)

    The summary link just goes to a (slow-loading) blog post; the actual article being discussed is at:
    http://productionadvice.co.uk/mastered-for-itunes-cd-comparison/ [productionadvice.co.uk]
    And more specifically, the 11 minute youtube video:
    http://www.youtube.com/watch?v=kGlQs9xM_zI [youtube.com]

  • by rograndom ( 112079 ) on Tuesday February 28, 2012 @01:19PM (#39187329) Homepage
    I can't speak for the RHCP tracks, but on the day of the announcement I downloaded a dozen or so tracks I already have on CD, and which exist as both FLAC and LAME MP3s on my computer, to see what the difference is. I could immediately tell a difference with the Mastered for iTunes tracks; better or worse, I'm not sure yet. They are easy to pick out in A/B testing: the most glaring difference is in the mid-bass area, where 80-120 Hz is noticeably boosted in the rock tracks I downloaded.
  • by milbournosphere ( 1273186 ) on Tuesday February 28, 2012 @01:20PM (#39187337)
    Here's Apple's mixing guide for sound engineers. It contains some more technical guidelines and specs: http://images.apple.com/itunes/mastered-for-itunes/docs/mastered_for_itunes.pdf [apple.com]

    It's interesting that this sudden focus on compressed music (iTunes Plus) as opposed to uncompressed has cropped up so soon after Steve's demise. IIRC, Steve was a music nut and was always pushing for DRM-free, higher-fidelity digital downloads through iTunes. My foil-hat says that this might be an attempt to sell shitty-quality music at a higher price. However, it could also ease network burden when streaming audio on the go. That said, one should still have access to high-quality, uncompressed music for when you want to pump up the volume on your home system.

    • Specifically, under "Best Practices", the guide says, "An ideal master will have 24-bit 96kHz resolution. These files contain more detail from which our encoders can create more accurate encodes. However, any resolution above 16-bit 44.1kHz, ... will benefit from our encoding process."

      Which implies that encoding from a CD, which is only 16-bit/44.1kHz, will NOT benefit from the MfiT encoding/compression technique.

      If the RHCP tracks were encoded for iTunes from a 16-bit, 44.1kHz source (which they probably

  • Is CD quality really the holy grail of audio quality? I thought DVD Audio [wikipedia.org] with up to 24-bit depth and a 192 kHz sampling rate was supposed to be the best in audio quality - far beyond the human ear's ability to hear.

    Or is CD Quality "good enough", even for audio engineers?

    • by dgatwood ( 11270 ) on Tuesday February 28, 2012 @02:00PM (#39187869) Homepage Journal

      CD quality is probably good enough for the final mix. You should always use 24-bit during tracking, of course, and if you plan on doing any vocoder work (Auto-Tune, Melodyne, etc.), you should generally track at a higher sample rate as well.

      Even if you don't plan on doing pitch correction, it would be nice to have a bit higher sampling rate (say 60 kHz) to ensure that the upper limit of human hearing is completely below the point at which the bandpass filter starts rolling off. Software bandpass limiting during sample rate conversion can generally achieve a much tighter filter with less distortion than analog hardware on the input to an ADC.
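      To put numbers on that (a back-of-the-envelope sketch, assuming hearing tops out near 20 kHz):

      # Room left for the anti-alias filter between 20 kHz and Nyquist.
      for rate in (44100, 48000, 60000, 96000):
          nyquist = rate / 2
          print(f"{rate} Hz sampling: Nyquist {nyquist:.0f} Hz, "
                f"transition band {nyquist - 20000:.0f} Hz")

      At 44.1 kHz the filter has only about 2 kHz to roll off completely; at 60 kHz it gets 10 kHz, which is the headroom being described above.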

  • It's just guidelines (Score:2, Interesting)

    by GWBasic ( 900357 )

    "Mastered for iTunes" is just a set of guidelines that ensure that the resulting AAC file is the highest quality possible when encoded directly from a 24-bit master. It's higher quality then most FLACs because they are usually 16-bit, whereas AAC is essentially 24-bit when the source material is 24-bit. In essence, compressing 24-bit audio to 256kbps AAC sounds better then going to 16-bit uncompressed audio.

    If you're going to go FLAC, at least make sure that you're getting 24-bit.

    • by Skapare ( 16644 )

      If you are ripping a CD, then 24-bit gains you nothing. Just be sure to not modify the audio, which FLAC accomplishes just fine. Of course, if your source is 24-bit at a 192k sample rate, you preserve it best by encoding FLAC at the same number of bits and same sample rate.

      BTW, the Nyquist limit says that sampling at twice the bandwidth captures any band-limited signal exactly, not just a single sine wave; a mix of sine waves below half the sample rate is reconstructed just as accurately. And I have not heard any music recently that is made e

    • That is debatable. If you encode AAC from a 24-bit master, you may get higher dynamic range, but you will still get the _artifacts_ from AAC bitrate compression. That is _not_ a good tradeoff, in my opinion. In most cases, I would rather listen to a 16-bit/44.1kHz lossless encoding than a 24-bit lossy encoding, because I tolerate a roughly -90dB noise floor much better than I tolerate _distortion_.

      I agree with you, of course, that 24-bit lossless is better than either of those alternatives.

  • I want Vinyl quality.

    • by Skapare ( 16644 )

      Record-once, play-once technology. After that, the recording is modified (generally for the worse, except in the case of Justin Bieber).

    • I want Vinyl quality.

      No, actually, you don't. You want analogue reel-to-reel tape quality.

    • Seriously, try 24-bit, 96kHz (or better), uncompressed (of course). Vinyl was *never* that good.

      And yes, you can get recordings in that resolution.

  • This completely misrepresents what 'Mastered for iTunes' represents.

    It gives the producer the tools and options to create CD-quality files.

    If a producer is putting a Mastered for iTunes stamp on a song that hasn't been improved beyond the flimsiest technicality, then it's on the producer.

    There are a lot of issues regarding Apple products and how Apple runs its business. Let's not try to make some up, m'kay?

  • Isn't that the point? This is confusing: "no difference from the CD" ... "someone needs to offer CD quality downloads...".

    The point of "Mastered for iTunes" is not to make it different from the CD; it's to make the compressed, lossy AAC file as close to the CD as possible. It sounds like they've done that.
  • by ZaskarX ( 1314327 ) on Tuesday February 28, 2012 @02:14PM (#39188037)

    What good is an optimized (if not lossless) format when played through an iPod with a digital-to-analog converter that costs 50 cents? My $40 Sansa Clip plays FLAC and has a better-quality DAC than a $300 iPod. Why is Apple even bothering?

  • Here's what the article actually says: If you have a sound engineer who creates a recording with material that is badly distorted in the first place, then whatever Apple tries to do with "Mastered for iTunes" is not going to help, and the AAC encoded material sounds the same as AAC encoded material converted from a CD.

    According to the article, the recording itself is not clipped, but it sounds as if clipping has happened at some time earlier in the production. Garbage-in, garbage-out principle.
  • First problem:

    Subtracting one waveform from another to look at the difference doesn't prove there are audible differences in the case of AAC vs CD.

    It just proves that one file is using perceptual encoding, which we already know.

    Perceptual encoding changes the waveform, but that does not prove that the difference is audible, since in the original file it may be masked by louder material. To prove that, you would need double-blind listening tests.

    So that point is a total failure.

    Second problem.

    "Mastered for iTunes"

  • Duh. (Score:5, Informative)

    by elistan ( 578864 ) on Tuesday February 28, 2012 @03:07PM (#39188627)
    No surprise. And it's a misunderstanding on the author's part, not a misrepresentation on Apple's part.

    Apple's "Mastered for iTunes" is a set of guidelines about how to turn a master recording into an iTunes-optimized digital file. The author of TFA, however, is talking about taking a CD track and making a compressed version that's as close as possible to the CD track. A CD track is NOT a master file. (We don't want a track that's merely a CD representation - we've heard plenty on /. about how a lot of CD tracks just suck.) "Mastered for iTunes" talks about taking a high-resolution digitial file, like 96/24 or 192/24, and then producing the best possible iTunes Plus file (256 kbps VBR AAC.)

    So of course if you make an iTunes track from a CD track via the "Mastered for iTunes" process, you'll get a 256 kbps VBR AAC that's identical to ripping a CD track to a 256 kbps VBR AAC. However, if you follow Apple's recommendations, quoted here:

    To take best advantage of our latest encoders send us the highest resolution master file possible, appropriate to the medium and the project.
    An ideal master will have 24-bit 96kHz resolution. These files contain more detail from which our encoders can create more accurate encodes. However, any resolution above 16-bit 44.1kHz, including sample rates of 48kHz, 88.2kHz, 96kHz, and 192kHz, will benefit from our encoding process.

    you'll probably get something different, perhaps better, than a CD track ripped to AAC.

    Apple is providing the tools they use to convert to AAC so that sound engineers can preview the product before it goes on sale, but they appear to be the same tools they've been using all along. As I said before, "Mastered for iTunes" isn't a new encoding tool - it's a process workflow. Other recommendations:

    - Apple recommends listening to your masters on the devices your audience will be using
    - Be Aware of Dynamic Range and Clipping
    - Master for Sound Check and Other Volume Controlling Technology
    - Remaster for iTunes [That is, they suggest starting over from the original recordings, rather than send in a file that was mastered with CDs in mind.]

  • by Pf0tzenpfritz ( 1402005 ) on Tuesday February 28, 2012 @03:42PM (#39189089) Journal
    "Mastered for iTunes" is indeed optimized for iTunes: it's optimized for separating the gullible from their money.
  • by funkboy ( 71672 ) on Tuesday February 28, 2012 @04:07PM (#39189393) Homepage

    Using RHCP records as a basis for comparison is a terrible example; everything they've brought out since One Hot Minute has been overcompressed to death at multiple stages in production (Californication is even cited as a specific example of a crappily mastered record in the Wiki article [wikipedia.org]).

    Shortly after reading this article on ars [arstechnica.com] I went to check it out for myself. Yes, technically they are still "just" 256k VBR AAC files like other stuff in the iTunes Store. But if the engineer doing the mastering has busted his/her ass playing the cat-and-mouse game of re-tweaking the dynamics after listening to the encoded result a few times, the results are extremely surprising.

    If you've got a good stereo or a nice pair of headphones, go listen to a normal CD version of Jimmy Smith's "The Cat" ripped at 256k VBR AAC, and then listen to the "mastered for iTunes" [apple.com] version. I had no idea lossily compressed audio from 40+ year old analog master tapes could sound that good.
