
Sources Say Apple Originally Planned AMD Chip For MacBook Air 197

Posted by timothy
from the llano's-just-a-town-in-texas dept.
Several media sources (here's PC Magazine's version), all seemingly based on an account at SemiAccurate citing (but not naming) "multiple sources," report that Apple originally planned an AMD-chip based MacBook Air, rather than the Intel-based version that emerged later ("Plan B," says the report).
  • In summary (Score:5, Informative)

    by phantomfive (622387) on Sunday November 20, 2011 @02:12PM (#38117956) Journal
    The AMD chips had a significantly better GPU, at the cost of a slightly slower CPU (which is a good tradeoff). Apple didn't go with it because AMD couldn't guarantee the volumes that Apple needed.

    And this is essentially the story of AMD for the last decade.
    • by perpenso (1613749) on Sunday November 20, 2011 @02:28PM (#38118068)

      The AMD chips had a significantly better GPU, at the cost of a slightly slower CPU (which is a good tradeoff).

In the context of something like a MacBook Air, power consumption is a far greater factor than CPU or GPU performance.

      • by phantomfive (622387) on Sunday November 20, 2011 @02:52PM (#38118268) Journal

In the context of something like a MacBook Air, power consumption is a far greater factor than CPU or GPU performance.

I'm not sure why you think this. If they were looking for power consumption, wouldn't they go with the Atom?

        I can tell you at least anecdotally, the last time I was looking at a laptop I really wanted something like an Air because of its nice slender shape, but I decided against it because it is underpowered compared to most other laptops I was considering, and I am ok with a shorter battery life.

        • by allanw (842185) on Sunday November 20, 2011 @03:16PM (#38118434)
          Atoms are friggin slow compared to a regular CPU and should only be used for sub-$400 netbooks, not $1000 laptops. One of the great things about the Air is that it doesn't use some dumbed down CPU, it's just a regular Sandy Bridge clocked down.
          • Not only is Atom slow, it runs hot. Worst of both worlds, no interest whatsoever in going that route again.

        • Re: (Score:2, Troll)

          by gnasher719 (869701)

I'm not sure why you think this. If they were looking for power consumption, wouldn't they go with the Atom?

Only an imbecile would put an Atom processor into a laptop. Performance is about a factor of five less than what is in the slowest current MacBook Air. Atom is only for toy netbooks.

          • Re: (Score:3, Funny)

            by afabbro (33948)

            Atom is only for toy netbooks.

            I guess I'll just power off my Atom-powered toy and stop reading Slashdot. If only I was using a real, manly laptop like gnasher719, sigh...

      • by Telvin_3d (855514)

Yes, but power consumption can be a tricky thing. If enough can be off-loaded to a GPU that is more efficient, it can come out better in the end. That's basically been Apple's strategy with the iPhones and iPads: an OK processor coupled with a GPU that has been customized to fit the device, and software customized to get the most out of the hardware.

      • by Theovon (109752) on Sunday November 20, 2011 @04:33PM (#38119104)

People tend to conflate power with energy, and you may be doing it here. If you're going to be executing a particular job and you want to optimize its efficiency, then it will consume some power over some time period, and power times time is ENERGY. On the other hand, if you're talking about the battery life of your laptop, then the computer is almost completely idle, and what we want to minimize is therefore idle and average power.

        Optimizing just for power isn't sufficient. If something uses half the power but takes 4 times as long, then it's twice as bad. However, we don't typically wake our computers to run compute-intensive jobs, just to put them back to sleep when those are done. We do a lot of screen-staring, which complicates the issue.

        Interestingly, performance per watt IS in the right units. Performance would be something comparable to operations per second, while watts is joules per second. The seconds cancel out, giving you operations per joule, which is the correct efficiency metric.
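To make the arithmetic in the two comments above concrete, here is a minimal sketch with made-up numbers (the chips, wattages, and operation counts are hypothetical, not figures from the thread) showing why "half the power but four times as long" is twice as bad in energy terms, and how the seconds cancel in performance per watt:

```python
# Hypothetical job size: one billion operations (illustrative figure).
JOB_OPS = 1_000_000_000

def energy_joules(power_watts, seconds):
    # Energy = power x time (watts are joules per second).
    return power_watts * seconds

def ops_per_joule(ops, joules):
    # (ops/sec) / (joules/sec) = ops/joule -- the seconds cancel.
    return ops / joules

# Chip A: 10 W, finishes the job in 100 s.
# Chip B: half the power (5 W) but four times as long (400 s).
e_a = energy_joules(10, 100)   # 1000 J
e_b = energy_joules(5, 400)    # 2000 J

print(e_b / e_a)                    # 2.0 -- chip B burns twice the energy
print(ops_per_joule(JOB_OPS, e_a))  # 1000000.0 ops per joule for chip A
print(ops_per_joule(JOB_OPS, e_b))  # 500000.0 ops per joule for chip B
```

So the lower-wattage chip can still be the less efficient one for a fixed job, which is the distinction between optimizing power and optimizing energy.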

    • It was also the story of Motorola back in the early Eighties, when IBM was developing that first Personal Computer: the story I always heard was that IBM chose the Intel line over Motorola's more capable 68K series simply because Intel had secondary sourcing and could guarantee volume, but Motorola was the sole source and couldn't.

      • by rrossman2 (844318)

        Intel's second source for 386, 486, etc... AMD

        • by macraig (621737)

          I think there was yet another source as well, but I have no memory for detail and no time to Google the blanks.

          • by thsths (31372)

            It was a bit later, but NEC produced the V20 and V30, very worthy competitors to the early Intel x86 CPUs.

            • by macraig (621737)

              I know! We used to overclock them in some systems by swapping the clock crystals. :-)

          • by mirix (1649853)

            Harris and Siemens made them too, and several Japanese manufacturers. I don't remember the timeline though, I'm certain some of them didn't start production until the 8088 was no longer bleeding edge tech... I seem to think IBM had rights to make them as well, but don't recall if they ever did. I don't think so, though.

            On a side note Intersil (a portion of Harris' semiconductor business, before) still makes 8088, 8086... (hell, they even make RCA 1802) to this day. Seems this is mostly aimed at military and

      • by sribe (304414)

        It was also the story of Motorola back in the early Eighties, when IBM was developing that first Personal Computer: the story I always heard was that IBM chose the Intel line over Motorola's more capable 68K series simply because Intel had secondary sourcing and could guarantee volume, but Motorola was the sole source and couldn't.

        Actually, I think Moto was just talking about the 68k but hadn't yet managed to ship.

        • by macraig (621737)

          Was the 8088 in use elsewhere before IBM picked it up? I wonder if perhaps both of them weren't quite shipping when IBM made the decision? Got a published timeline from a mag or site article?

          • by sribe (304414)

            Nope, no published timeline, just my ancient memories ;-) I think the gap in shipping was only a few months, but IBM was in a severe rush to get a product released in order to prevent other micros from continuing to establish a foothold in business use.

          • by hedwards (940851)

I don't have a source but I'm pretty sure it was. The reason that IBM chose the components that it did for their computers was largely because they could be put into a workable computer quickly. It's also why the competition was able to create clones so quickly: pretty much all the parts were off the shelf.

      • by jedidiah (1196)

        Are you sure it wasn't because the MC68000 was insanely expensive when compared to the Intel part that IBM eventually chose? The Motorola part was a much more capable bit of tech and it was priced accordingly.

        Volume and secondary sources were likely relatively minor concerns.

      • by warrigal (780670)
        That may have contributed but the main reason was that the PC team had only a year to bring the product to market. They preferred the 32-bit 68000 over the 16-bit 80XX but Motorola's design and dev tools were far inferior to Intel's and the engineers had much more experience working with Intel chips in IBM's Vendor Technology Logic-based products. So, in their rush to market, the team saddled us with all that Expanded/Extended Memory stuff as well as other sins.
        • by macraig (621737)

          ... saddled us with all that Expanded/Extended Memory stuff as well as other sins.

          Yeah, I was thinking of that specifically when I compared the two. Were it not for IBM's choice, though, one of the companies that once employed me, Quarterdeck, might never have even existed. Well, at least its first product never would have.

    • Re:In summary (Score:4, Informative)

      by fuzzyfuzzyfungus (1223518) on Sunday November 20, 2011 @03:06PM (#38118372) Journal
It is slightly more specific than that, in this case:

Apple continued to ship Core 2s in their smaller systems for a surprising length of time after the newer Intel gear became available, because that was the only way they could continue to get Nvidia GPUs in anything too small for a discrete graphics card, and they were just that unimpressed with Intel's offering.

Given that, it seems likely that AMD must have had real, serious, dealbreaker volume issues with their APU parts (not just 'we need our Intel marketing support money' volume issues) for Apple to have dropped that plan.

It would be interesting to know if AMD just can't ship them in quantity at all (which seems modestly unlikely, given the number of cheapie PC laptops where they've popped up, and the fairly low prices they must be selling for), or if Apple required some fancy low-voltage bin that AMD's process just didn't hit regularly enough...
      • If you knew the answer to this, it could make all the difference in whether you should buy AMD stock or short it.
It would be interesting to know if AMD just can't ship them in quantity at all (which seems modestly unlikely, given the number of cheapie PC laptops where they've popped up, and the fairly low prices they must be selling for), or if Apple required some fancy low-voltage bin that AMD's process just didn't hit regularly enough...

Well, considering that Apple is selling about 3M laptops a quarter, they were probably projecting at least 1M per quarter if not more. Unlike with their other suppliers, Apple could not help AMD expand their manufacturing by fronting them capital funds. That kind of expansion would take years, which would be the limiting factor.

    • Re:In summary (Score:5, Interesting)

      by symbolset (646467) * on Sunday November 20, 2011 @03:36PM (#38118662) Journal
      On the other hand for what Apple's paying for Intel chips Apple could just buy AMD and fix their supply chain problems. AMD could be had for about $5 billion today. Apple's moving about 16 million Macs a year. It wouldn't take too long for that to pay off. And 64-core Mac desktops would be pretty neat.
      • I honestly don't know why this hasn't happened... Intel must be dumping mountains of cash on Apple to make this idea look unattractive.
      • by Agripa (139780)

        AMD's x86 IP licensing agreements with Intel are not transferable if AMD is bought. At best, the buyer would end up in a long legal fight with Intel.

    • by washu_k (1628007)

      The AMD chips had a significantly better GPU, at the cost of a slightly slower CPU (which is a good tradeoff). Apple didn't go with it because AMD couldn't guarantee the volumes that Apple needed.

      Umm, no. AMD doesn't have a chip that competes with Intel's ultra low power Sandy Bridge chips like in the Air.

      The AMD Brazos chips compete on power consumption, but they are way slower. They are an Atom competitor, something they do very well but SB chips are in a completely different performance bracket.

The AMD Llano chips would qualify as "significantly better GPU, at the cost of a slightly slower CPU", but at much higher power consumption. Not suitable for the Air either.

    • by hairyfeet (841228)

Probably more likely Intel offered some of their famous kickbacks, since this was before they got caught. Jobs was always a shrewd businessman, and if he got Intel chips 40% cheaper thanks to a little under-the-table kickback from Intel, I doubt he'd pass it up. Hell, everyone else did it, Dell, Gateway, eMachines, why not Apple?

      In the end this just supports something I've been saying for years, which is for the vast majority CPUs are long past good enough and getting into extreme overkill. I've got tons of cu

  • by AmiMoJo (196126) <mojo@NOspAm.world3.net> on Sunday November 20, 2011 @02:21PM (#38118012) Homepage

So Apple were trying to choose between the only two players in the performance x86 world?! They actually stopped to consider the alternative rather than just picking the default when millions of dollars were at stake?

    I'm blown away, like everyone else I thought Steve Jobs just picked names out of a hat.

  • by perpenso (1613749) on Sunday November 20, 2011 @02:24PM (#38118020)
    AMD is always considered before negotiating prices with Intel. Flirting with AMD before choosing Intel is a pretty common practice, even for those who planned on going with Intel all along.
    • by Baloroth (2370816) on Sunday November 20, 2011 @02:57PM (#38118312)

It's one thing to flirt. It is entirely another to be actually planning on using them, which by most accounts Apple was. I don't think this was just a gambit. AMD also would have given them a couple of advantages: a far superior GPU and better power efficiency (so I have heard, anyways), mainly. Probably would have been cheaper too, although that is just a guess.

  • Uh... (Score:3, Funny)

    by AngryDeuce (2205124) on Sunday November 20, 2011 @02:26PM (#38118046)

    Okay.

    So, are we just going to run any old article with Apple in the title now?

    • by Swampash (1131503)

      So, are we just going to run any old article with Apple in the title now?

      Only while Slashdot sells advertising.

  • by ClaraBow (212734) on Sunday November 20, 2011 @02:29PM (#38118088)
This would've been great for the Hackintosh community, as drivers for AMD-based netbooks and laptops would've become available. So I wish AMD had the resources to meet high-volume demands. Maybe next time!
    • Re: (Score:2, Interesting)

      by Anonymous Coward

I was running OS X on my old AMD system quite early on in the osx86 scene...

  • by Doc Ruby (173196) on Sunday November 20, 2011 @02:42PM (#38118192) Homepage Journal

    Years ago when Apple dropped the PowerPC in favor of Intel, Jobs claimed it was because the electrical W:MIPS of PPC was predicted to soon fall short of the performance of x86, with battery, fan and other limits to consider - just as iP* and other mobiles dominated Jobs' vision.

    How has that turned out? Have PPCs really fallen behind, or hit a wall, compared to Intel's CPUs Apple uses? How do the AMD x86es compare to the Intel ones on that criterion?

    • Re: (Score:2, Informative)

      by Anonymous Coward
      I don't know if he ever made that specific claim, but IBM was more interested in the XBox 360 and PS 3 than the Macintosh. Intel was able to provide a roadmap of future plans/processors.
      • by Doc Ruby (173196)

        Jobs most certainly did make that specific claim [everymac.com]:

        When we look at Intel, they've got great performance, yes, but they've got something else that's very important to us. Just as important as performance, is power consumption. And the way we look at it is performance per watt. For one watt of power how much performance do you get? And when we look at the future road maps projected out in mid-2006 and beyond, what we see is the PowerPC gives us sort of 15 units of performance per watt, but the Intel road map in

    • How has that turned out? Have PPCs really fallen behind, or hit a wall, compared to Intel's CPUs Apple uses?

      Does my XBox count?

    • PowerPC is oriented towards embedded (PPC 4xx) and manycore supercomputing (BlueGene) workloads, and isn't really ideal for desktops at this time. POWER7 is IBM's flagship server processor, and outperforms anything x86 by quite a considerable margin (admittedly while drawing 200W.)
It's a common mistake, and one I often make... thinking that the desktop is what drives processor technology... hearing the echoes of fanbois makes me chuckle ("x86 was better than the G4 Macs!"). If Intel could compete with PPC, then IBM would be using Intel, end of story. No one seems to notice, or place any importance on, the fact that Intel's R&D hit the same frequency wall that IBM's PPCs did: I don't see any consumer-level 10GHz Intel processors available, which one might have predicted we'd have by
