Sources Say Apple Originally Planned AMD Chip For MacBook Air 197
Several media sources (here's PC Magazine's version), all seemingly based on an account at SemiAccurate citing (but not naming) "multiple sources," report that Apple originally planned an AMD-based MacBook Air, rather than the Intel-based version that emerged later ("Plan B," says the report).
In summary (Score:5, Informative)
And this is essentially the story of AMD for the last decade.
CPU & GPU performance not relevant (Score:5, Insightful)
The AMD chips had a significantly better GPU, at the cost of a slightly slower CPU (which is a good tradeoff).
In the context of something like a MacBook Air power consumption is a far greater factor than CPU or GPU performance.
Re:CPU & GPU performance not relevant (Score:5, Insightful)
In the context of something like a MacBook Air power consumption is a far greater factor than CPU or GPU performance.
I'm not sure why you think this, if they were looking for power consumption, wouldn't they go with the Atom?
I can tell you, at least anecdotally: the last time I was looking at a laptop, I really wanted something like an Air because of its nice slender shape, but I decided against it because it's underpowered compared to most other laptops I was considering, and I'm OK with a shorter battery life.
Re:CPU & GPU performance not relevant (Score:5, Insightful)
Re: (Score:2)
Not only is Atom slow, it runs hot. Worst of both worlds, no interest whatsoever in going that route again.
Re:CPU & GPU performance not relevant (Score:5, Informative)
Obviously power consumption is important but performance is also very important. An Atom is an extremely cheap CPU that doesn't deserve to go in a $1000 laptop, like I said. Otherwise you can take the argument to silliness by asking why Apple didn't go with ARM or something.
I've found that MacBooks are pretty comparable in price to Windows laptops now, at least the Airs (since we're on that topic). Nothing out there matches a MacBook Air's price, considering that the Air comes with an SSD and a Sandy Bridge CPU.
Re: (Score:2, Informative)
BZZZT, wrong, but thanks for playing. Asus U36SD [bhphotovideo.com].
US$861
Macbook Air 13" [bhphotovideo.com]
US$1250
The Asus has a faster processor, switchable graphics, USB 3, HDMI, VGA, SATA 3, and Gigabit Ethernet, and if you wanted to stick a 128 GB SSD [bhphotovideo.com] into it, you'd still be $150 up on the MacBook. The MacBook also solders the RAM to the mainboard meaning it's
Re: (Score:3)
Re:CPU & GPU performance not relevant (Score:5, Insightful)
Sure, but you also managed to overlook the MacBook Air's benefits compared to your excellent Asus model. First of all, the MacBook Air is just under 3 lbs, while the Asus is 3.7 lbs. That weight difference alone explains some of the hardware differences and design decisions. Additionally, the MacBook Air has a higher-resolution display, which is a huge plus in my eyes: 1440x900 compared to 1366x768. Add the MacBook Air's ultra-light power adapter to the mix and you get a portable system well under 4 lbs.
The second area where I believe the MacBook Air will prevail is heat management. Try using all those goodies loaded into the Asus for an extended period and the laptop becomes unbearably hot, which reduces battery life significantly. The MacBook Air also heats up, but I believe less so because of the lower-powered CPU and lack of a dedicated GPU. Adding a dedicated GPU or more CPU power is less appealing on an ultraportable than on a desktop, and should always be weighed against the downsides it creates.
Nice things the MacBook Air has that are more rarely found in competing products: the MagSafe power port, OS X Lion, a Thunderbolt port, an excellent microphone, and a great webcam.
Re: (Score:2)
I agree with everything you say except maybe display resolution. While it's nice to have more pixels, most people consume media more than they produce it and they will want a panel that matches the media they're watching at least some of the time.
Some Asus laptops are really nice about heat management, and most of the ones that aren't are really fast.
Magsafe is offensive. I have a magnetic power cord from a waffle maker or something right here by my desk. Magsafe should not have been patentable.
Re: (Score:3)
The trouble with Atom is it's really not powerful enough for anything but what the GP said: netbooks, web, light word processing. Yes, it's very low power, but AMD is in its ballpark power-wise, and AMD completely spanks the Atom in processing power. I realize Atom has a huge following, and so does Intel in general, due to their chip fabs being the best, even if they couldn't produce a viable GPU to save their lives. Trouble is, the Atom is actually equivalent in processing power to a PowerPC G4 ... so for Appl
Re: (Score:2)
I have an Atom netbook, and it was way ahead of AMD at the time. But after AMD launched Brazos in January this year, the Atom has looked old: it's still a bit lower power, but for a lot lower performance. Cedar Trail-M is supposed to be out this month, which may breathe a little more life into it, but neither that nor Saltwell next year seems very impressive. The first really major architecture upgrade isn't until Silvermont in 2013; until then, AMD is more than a match for the Atom.
Re: (Score:2)
I have an Atom netbook, and it was way ahead of AMD at the time.
I bought an Atom netbook and an AMD netbook at the same time. The AMD netbook, with a 1.2 GHz 64-bit Athlon chip and a halfway decent GPU (but only halfway), runs rings around the Intel-based system, which has a 400 MHz faster clock. Unfortunately, it also has the AMD R690M chipset and Athlon 64 L110, and AMD really dropped the ball on Linux support. The graphics driver is unusable even with all options turned off (display trashing, then crashing), and the power management doesn't. I had Windows 7 on it for a while
Re: (Score:2)
Re: (Score:2)
Re:CPU & GPU performance not relevant (Score:4, Informative)
ah... I wasn't aware AMD sold off every fab... more fallout from Intel's dirty tricks... fuckers... competition drives technology, and Intel set technology back a bit by doing what they did.
The fallout from the "dirty tricks" was maybe that AMD couldn't gain as much marketshare as they might have during the window of time when AMD's technology was competitive, but that was mostly limited by how fast they could expand production facilities (building chip fabs is very expensive and time consuming). Many other factors went into AMD's fab sell-off, most of them self-inflicted wounds. Off the top of my head:
* The last time AMD executed well on CPU core design was the original Athlon 64/Opteron core. Everything since has been between terrible and mediocre, with the only minor success finally coming this year in the "APU" products (which still only give them a foothold at the low end). In the meantime, Intel hit a home run with Core 2 and kept executing extremely well thereafter.
* AMD had to cancel an entire next-gen CPU architecture, and based on recent events probably should've cancelled Bulldozer too. While waiting for these new architectures they could do little but release minor retreads of the aging K8 (Athlon 64) core, which kept them a year or more behind Intel in performance (especially in the growing laptop segment, where AMD was very weak). Worse, they managed to screw up some of the retread products with serious bugs, hurting their credibility (especially in the server market).
* AMD correctly anticipated the need to acquire GPU technology when GPU + CPU integration was on the horizon. However, after failing to acquire NVidia, they then overpaid for ATI by several billion dollars. After the acquisition, ATI went through a multiyear stretch of disappointing products, so the ATI division kept posting losses. It was so bad that for a year or two AMD had to periodically write off hundreds of millions of dollars of "goodwill" to reflect the declining value of the ATI division relative to what they'd paid.
* Core 2 hurt demand for AMD's CPUs badly enough to drop orders far below AMD's production capacity, at a time when AMD was trying to expand from one to two fabs. They were eventually forced to mothball the new facility partway through completing it, which is very bad news financially (fab equipment is horribly expensive and depreciates quickly, so buying a bunch of it and then having it sit idle means you're losing money at a scary rate). It's also bad financially to not fully utilize a completed fab, for the same reason, but because AMD's process tech was too unique, nobody wanted to build ASICs in AMD's fabs, so they were stuck with just letting it be partially idle.
* Tying into that, the reason for expansion was that during the P4 vs. Athlon64 era, AMD's CEO (Hector Ruiz) had come up with a long term strategy of expanding marketshare to 30%. This required a second fab (AMD had traditionally had just one), and aggressive price competition with Intel to buy marketshare. He stuck with it long after Core 2 changed AMD's competitive position for the worse. This left AMD spending tons of money and pricing its products too low during a time when they should've been putting the market share expansion plan on hold, maximizing profits on the products they had, and focusing on new products to put themselves back in front of Intel.
This all snowballed to the point that AMD was unable to get new loans because they were too much of a credit risk. They were hovering on the edge of bankruptcy, and were having problems with the capital expenditures needed to keep up with Intel on fab technology even after giving up on capacity expansion. It became a death spiral which could only be stopped by selling the fabs. It never would've gotten so bad if AMD hadn't stumbled so badly on execution and made so many tactical and strategic mistakes.
(Many in the industry think AMD hasn't been run well since Jerry Sanders retired in 2
Re: (Score:2)
For most devices these days (even desktops), there's a TDP ceiling for the design. If a chip is a watt over that (or in smartphones, a milliwatt over - no joke), you have no product. Only if you fit within that TDP does performance matter. In that sense, power consumption is a far greater factor. If your power consumption is too high, you don't even get benchmarked. You don't just lose the game, you never even make it on the field.
That said, SemiAccurate and its owner are in general full of crap. I'm thankf
Re: (Score:2, Troll)
I'm not sure why you think this, if they were looking for power consumption, wouldn't they go with the Atom?
Only an imbecile would put an Atom processor into a laptop. Performance is about a factor of five less than what is in the slowest current MacBook Air. Atom is only for toy netbooks.
Re: (Score:3, Funny)
Atom is only for toy netbooks.
I guess I'll just power off my Atom-powered toy and stop reading Slashdot. If only I was using a real, manly laptop like gnasher719, sigh...
Re: (Score:2)
I think the early Atoms might have been poor, and that's how they got the bad rep, but the one in my netbook seems pretty good to me, and I really like the battery life.
Re: (Score:2)
Yes, but power consumption can be a tricky thing. If enough can be off-loaded to a GPU that is more efficient, it can come out better in the end. That's basically been Apple's strategy with the iPhones and iPads: an OK processor coupled with a GPU that has been customized to fit the device, and software customized to get the most out of the hardware.
Re:CPU & GPU performance not relevant (Score:5, Interesting)
People tend to conflate power with energy, and you may be doing it here. If you're going to execute a particular job and you want to optimize its efficiency, then it will consume some power over some time period, which is ENERGY. On the other hand, if you're talking about the battery life of your laptop, then the computer is almost completely idle, and what we want to minimize is idle and average power.
Optimizing just for power isn't sufficient: if something uses half the power but takes four times as long, it consumes twice the energy. However, we don't typically wake our computers to run compute-intensive jobs, just to put them back to sleep when those are done. We do a lot of screen-staring, which complicates the issue.
Interestingly, performance per watt IS in the right units. Performance would be something comparable to operations per second, while watts is joules per second. The seconds cancel out, giving you operations per joule, which is the correct efficiency metric.
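Both points can be sketched with a quick back-of-the-envelope in Python (every wattage and throughput figure below is invented for illustration, not a real chip's spec):

```python
# Energy (joules) = power (watts) x time (seconds).
# Two hypothetical chips finishing the same fixed job:
ops = 1e12                                 # total operations in the job (assumed)

fast_watts, fast_ops_per_s = 20.0, 2e9     # assumed figures
slow_watts, slow_ops_per_s = 10.0, 0.5e9   # half the power, 4x slower

fast_energy = fast_watts * (ops / fast_ops_per_s)   # 20 W * 500 s  = 10000 J
slow_energy = slow_watts * (ops / slow_ops_per_s)   # 10 W * 2000 s = 20000 J
print(fast_energy, slow_energy)  # the "low power" chip burns twice the energy

# "Performance per watt" really does reduce to ops per joule:
# (ops/second) / (joules/second) = ops/joule
print(fast_ops_per_s / fast_watts)   # 1e8 ops per joule
print(slow_ops_per_s / slow_watts)   # 5e7 ops per joule: half as efficient
```

The seconds cancel exactly as described, so the chip with twice the ops/joule finishes the same job on half the energy, even though its instantaneous power draw is higher.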
Repeating history (Score:3)
It was also the story of Motorola back in the early Eighties, when IBM was developing that first Personal Computer: the story I always heard was that IBM chose the Intel line over Motorola's more capable 68K series simply because Intel had secondary sourcing and could guarantee volume, but Motorola was the sole source and couldn't.
Re: (Score:2)
Intel's second source for 386, 486, etc... AMD
Re: (Score:2)
I think there was yet another source as well, but I have no memory for detail and no time to Google the blanks.
Re: (Score:2)
It was a bit later, but NEC produced the V20 and V30, very worthy competitors to the early Intel x86 CPUs.
Re: (Score:2)
I know! We used to overclock them in some systems by swapping the clock crystals. :-)
Re: (Score:2)
Harris and Siemens made them too, and several Japanese manufacturers. I don't remember the timeline, though; I'm certain some of them didn't start production until the 8088 was no longer bleeding-edge tech... I seem to think IBM had rights to make them as well, but don't recall if they ever did. I don't think so, though.
On a side note, Intersil (formerly part of Harris' semiconductor business) still makes the 8088 and 8086... (hell, they even make the RCA 1802) to this day. Seems this is mostly aimed at military and
Re: (Score:2)
Pretty sure not Cyrix. It didn't even exist at that time AFAIK.
Re: (Score:2)
It was also the story of Motorola back in the early Eighties, when IBM was developing that first Personal Computer: the story I always heard was that IBM chose the Intel line over Motorola's more capable 68K series simply because Intel had secondary sourcing and could guarantee volume, but Motorola was the sole source and couldn't.
Actually, I think Moto was just talking about the 68k but hadn't yet managed to ship.
Re: (Score:2)
Was the 8088 in use elsewhere before IBM picked it up? I wonder if perhaps both of them weren't quite shipping when IBM made the decision? Got a published timeline from a mag or site article?
Re: (Score:2)
Nope, no published timeline, just my ancient memories ;-) I think the gap in shipping was only a few months, but IBM was in a severe rush to get a product released in order to prevent other micros from continuing to establish a foothold in business use.
Re: (Score:2)
I don't have a source, but I'm pretty sure it was. The reason IBM chose the components it did was largely that they could be put into a workable computer quickly. It's also why the competition was able to create clones so quickly: pretty much all the parts were off the shelf.
Re: (Score:2)
Are you sure it wasn't because the MC68000 was insanely expensive when compared to the Intel part that IBM eventually chose? The Motorola part was a much more capable bit of tech and it was priced accordingly.
Volume and secondary sources were likely relatively minor concerns.
Re: (Score:2)
Re: (Score:2)
... saddled us with all that Expanded/Extended Memory stuff as well as other sins.
Yeah, I was thinking of that specifically when I compared the two. Were it not for IBM's choice, though, one of the companies that once employed me, Quarterdeck, might never have even existed. Well, at least its first product never would have.
Re:In summary (Score:4, Informative)
Apple continued to ship Core 2s in their smaller systems for a surprising length of time after newer Intel gear became available, because that was the only way they could continue to get Nvidia GPUs into anything too small for a discrete graphics card, and they were just that unimpressed with Intel's offering.
Given that, it seems likely that AMD must have had real, serious, dealbreaker volume issues with their APU parts (not just "we need our Intel marketing support money" volume issues) for Apple to have dropped that plan.
It would be interesting to know if AMD just can't ship them in quantity at all (which seems modestly unlikely, given the number of cheapie PC laptops where they've popped up, and the fairly low prices they must be selling for), or if Apple required some fancy low-voltage bin that AMD's process just didn't hit regularly enough...
Re: (Score:2)
Re: (Score:2)
It would be interesting to know if AMD just can't ship them in quantity at all(which seems modestly unlikely, given the number of cheapie PC laptops where they've popped up, and the fairly low prices they must be selling for), or if Apple required some fancy low voltage bin that AMD's process just didn't hit regularly enough...
Well, considering that Apple sells about 3M laptops a quarter, they were probably projecting at least 1M per quarter, if not more. Unlike their other suppliers, Apple could not help AMD expand their manufacturing by fronting them capital funds. That kind of expansion would take years, which would be the limiting factor.
Re: (Score:2)
Which Apple would've been just fine with...
So.. you're saying that AMD backed out, rather than Apple choosing?
Re:In summary (Score:5, Interesting)
Re: (Score:2)
Re: (Score:2)
AMD's x86 IP licensing agreements with Intel are not transferable if AMD is bought. At best, the buyer would end up in a long legal fight with Intel.
Re: (Score:2)
Who knows? (Score:2)
Whatever it is, AMD is up to something new [mercurynews.com], and will announce in February.
Re: (Score:3)
The AMD chips had a significantly better GPU, at the cost of a slightly slower CPU (which is a good tradeoff). Apple didn't go with it because AMD couldn't guarantee the volumes that Apple needed.
Umm, no. AMD doesn't have a chip that competes with Intel's ultra low power Sandy Bridge chips like in the Air.
The AMD Brazos chips compete on power consumption, but they are way slower. They are an Atom competitor, something they do very well, but SB chips are in a completely different performance bracket.
The AMD Llano chips would qualify as "significantly better GPU, at the cost of a slightly slower CPU," but at much higher power consumption. Not suitable for the Air either.
Re: (Score:3)
*** SHOCK *** (Score:5, Funny)
So Apple was trying to choose between the only two players in the performance x86 world?! They actually stopped to consider the alternative rather than just picking the default when millions of dollars were at stake?
I'm blown away, like everyone else I thought Steve Jobs just picked names out of a hat.
Re:*** SHOCK *** (Score:4, Funny)
Re: (Score:2)
you know that your product is cursed.
Unless it's a mobile phone—then it's the other way around. (Pssh, yeah, sure, Intel. You'll get a slice of that pie someday, I'm sure.)
AMD always considered ... (Score:5, Interesting)
Re:AMD always considered ... (Score:5, Insightful)
It's one thing to flirt. It is entirely another to be actually planning on using them, which by most accounts Apple was. I don't think this was just a gambit. AMD also would have given them a couple of advantages: a far superior GPU and better power efficiency (so I have heard, anyway), mainly. Probably would have been cheaper too, although that is just a guess.
Re: (Score:2)
Indeed, perhaps they would actually flirt with AMD, considering it as an actual option. If AMD is hungry enough, they might go in for razor-thin margins, or customizations.
And by locking up AMD stock for a period, they not only strengthen an Intel competitor, but also temporarily give Intel a headache in the antitrust arena: Intel wants AMD to be small, but not so small in its market that regulators come sniffing around...
Re: (Score:2)
Uh... (Score:3, Funny)
Okay.
So, are we just going to run any old article with Apple in the title now?
Re: (Score:2)
So, are we just going to run any old article with Apple in the title now?
Only while Slashdot sells advertising.
This would have been great for.. (Score:5, Insightful)
Re: (Score:2, Interesting)
I was running os x on my old amd system quite early on in the osx86 scene...
PPC vs Intel vs AMD? (Score:3)
Years ago, when Apple dropped the PowerPC in favor of Intel, Jobs claimed it was because PPC's performance per watt (W:MIPS) was predicted to soon fall short of x86's, with battery, fan, and other limits to consider, just as iP* and other mobiles came to dominate Jobs' vision.
How has that turned out? Have PPCs really fallen behind, or hit a wall, compared to Intel's CPUs Apple uses? How do the AMD x86es compare to the Intel ones on that criterion?
Re: (Score:2, Informative)
Re: (Score:3)
Jobs most certainly did make that specific claim [everymac.com]:
Re: (Score:2)
How has that turned out? Have PPCs really fallen behind, or hit a wall, compared to Intel's CPUs Apple uses?
Does my XBox count?
Re:PPC vs Intel vs AMD? (Score:4, Funny)
Does my XBox count?
It's a computer. What else would it do?
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Uh, duh? Laptop versus desktop.
The G4 wasn't meant to be as powerful. For that, it would've sucked up 300% of the power.
Re: (Score:2)
Pure performance is not what I'm talking about, and besides, your claim is arguable.
Where's a chart of watts per unit of performance, when running apps like Photoshop and Office, for each of PPC, Intel, and AMD? That's the issue I'm talking about, and the one Jobs claimed drove his decision to switch.
Re:PPC vs Intel vs AMD? (Score:4, Informative)
Re: (Score:2)
(Reminds me of a line from Amadeus, a complaint about Mozart's new opera:" Too many notes for the royal ear...")
By that time I'm not sure DEC could have delivered enough volume for Mac desktops, though the Alpha was fabbed by Intel for a while.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Apple going with AMD wouldn't improve the battery life as the OP implies; rather, AMD having better battery life would increase its chances of getting into Apple devices.
Re: (Score:3)
Nope. Intel mobiles perform more processing per watt than AMD, and it's been that way for a few years.
Re:wish they had used AMD chips from the beginning (Score:4, Informative)
Apple going with AMD wouldn't improve the battery life as the OP implies; rather, AMD having better battery life would increase its chances of getting into Apple devices.
making the comment original
Re: (Score:2)
Oh. The OP's message was kind of weird, basically saying he wishes AMD had been the first choice because its battery was worse. Maybe he wanted Apple laptops to suck.
Re: (Score:2)
The issue is that Intel forces "HD3000" graphics on you when you buy the mobile processors.
In my opinion that is a deal-breaker on things like the Air. Even light modern gaming is painful on Intel graphics... Sure, the new Air is "better" than the last one, with 3x the CPU thrown at the problem. Consider the Air with the same CPU but newer Nvidia graphics. At that point an AMD processor that's slower, but with better tightly-bound graphics, is going to be a better experience for the target low-end users more likely to
Re: (Score:2)
I don't think most Mac Air buyers are looking for a gaming laptop.
Re: (Score:2)
The OP isn't talking about a "gaming laptop" but a laptop capable of limited casual gaming. This is perfectly within scope for the target demographic.
I see the total lack of support my i945 Minis have in this regard and wonder how the MBA gets treated. Extant products may simply tell you to take a hike if you try to install them on a MBA.
The Apple netbook should be able to play some five-year-old RTS port.
Re: (Score:2)
Intel mobiles perform more processing per watt than AMD, and it's been that way for a few years.
Performance per watt is only tangentially related to battery life. Most laptop CPUs spend 95% of their hours idle, which means the important figure is idle power draw. The fact that the Intel chip could be doing 60% more calculations if it actually had something to do doesn't make the battery last any longer.
Re: (Score:2)
No... it's entirely normal for an Intel-based laptop to get between 5 and 10 hours, depending on the size of the battery and the speed of the chip. I'm not sure I've seen a single AMD laptop (that isn't based on the E-350) with battery life over 4 hours.
Re: (Score:2)
Apple going with AMD wouldn't improve the battery life as the OP implies; rather, AMD having better battery life would increase its chances of getting into Apple devices.
Re: (Score:2)
I don't see where the OP implies that... To me, he implies that AMD's battery life sucks compared to intel's and that that would be sufficient for apple to tell them to fuck off.
Re: (Score:2)
He wishes Apple had gone with AMD.
Follows up with "AMD would not have a mobile platform practically dead in the water"
implying apple going with AMD would lead to better battery life
Re: (Score:2)
No, implying that AMD wouldn't be stuck with a platform that gets 0 sales and hence 0 investment.
Re: (Score:2)
Re: (Score:2)
No... it's entirely normal for an intel based laptop to get between 5 and 10 hours depending on the size of the battery and the speed of the chip. I'm not sure I've seen a single AMD laptop (that isn't based on the E-350) with battery life over 4 hours.
Considering that the CPU is only a fraction of the power draw of a laptop, a factor of two difference in battery life is almost certainly not attributable to the difference in CPUs.
The primary reason for the battery life difference is probably that Intel chips are sold in higher end laptops that contain higher capacity batteries.
Re: (Score:2)
Actually, the typical draw of a whole laptop is in the region of 50W, and the typical draw of the CPU is in the 30-35W range. Make your CPU 10W less efficient and you shorten battery life by ~20%.
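That claim is easy to check with a quick sketch (the 50 Wh battery and the wattages are the same round, assumed numbers as above, not measurements of any real machine):

```python
# Back-of-the-envelope battery life: hours = capacity (Wh) / draw (W).
# All figures below are illustrative assumptions.
battery_wh = 50.0               # assumed battery capacity

system_draw_w = 50.0            # whole-laptop draw (assumed)
baseline_hours = battery_wh / system_draw_w
print(baseline_hours)           # 1.0 hour

# A CPU that draws 10 W less drops total draw to 40 W:
efficient_hours = battery_wh / (system_draw_w - 10.0)
print(efficient_hours)          # 1.25 hours; equivalently, the 10 W
                                # hungrier CPU costs ~20% of the runtime
```

Note the ~20% figure only holds if the CPU really accounts for that large a share of total draw; if idle CPU power is much lower (as the replies argue), the difference shrinks accordingly.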
Re: (Score:2)
I think you're confusing actual power draw with TDP. So for example, an A8-3500M has a 35W TDP, that's the most it will draw for a sustained period of time. The actual power draw at idle will be more in the neighborhood of maybe 10-20W, something like that. So you take that, you add the screen, the hard drive, memory, wireless, etc. and you get your 50W. The CPU is not the dominant factor. In most cases the screen uses more.
Incidentally, Llano has lower idle power consumption [lmgtfy.com] than Core i3.
So as for this:
I'm not sure I've seen a single AMD laptop (that isn't based on the E-350) with battery life over 4 hours.
He [hp.com]
Re: (Score:3)
Good thing they have an Intel in the iPhone now, right?
Re: (Score:3)
Sayeth the n00b that obviously never owned a Pentium 4.
Re: (Score:2)
But the Pentium 4 would throttle when it got too hot. An Athlon would melt.
Re:Not Sure This is Newsworthy (Score:4, Insightful)
You don't know why people write articles that you admit you find interesting?
You judge Slashdot articles on whether readers already know the computer supply chain logistics stories?
Have another bottle of beer.
Re: (Score:2)
You don't know why people write articles that you admit you find interesting?
I find one speculation in the article interesting, the rest is just remarking on the obvious. I also find it interesting that Isaac Newton stuck a leather awl into his eye, but that doesn't mean it is news.
You judge Slashdot articles on whether computer supply chain logistics readers already know the stories?
I judge articles based on if they present useful information and I judge news articles based upon their presenting non-obvious facts about current events. This provided obvious statements about current events and speculation about a very specific topic that was interesting... but which it had no real evid
Re:Not Sure This is Newsworthy (Score:5, Insightful)
It's been a while since AMD was plan A for a thin-n-light laptop design...
Re:Intel (Score:5, Interesting)
At least AMD doesn't build hardware level backdoors into their CPUs.
That you know of.
Re:Intel (Score:4, Interesting)
Re: (Score:2)
so the next mac pro will have to have on board vid (Score:2)
For it to get Thunderbolt, as it seems unlikely to be part of any add-in video card.
So what will the new Intel motherboards with TB look like? Will the high-end Core i7 and server chips have no hope of TB unless Intel adds video to the CPUs? And even then, how will TB work with an add-in ATI or Nvidia card?
Re: (Score:2)
AFAICT, Thunderbolt is implemented as a separate chip connected via PCIe and DisplayPort, so I don't see any reason why you couldn't have a system with an AMD processor but an Intel Thunderbolt chip.