What is the Intel Switch Costing Apple?
SenseOfHumor writes "A Business Week article says that it costs Apple $898 for an Intel iMac before loading it with software and packaging. From the article: 'But for Apple, the switch to Intel chips is less about saving money in the short term, and more about hitching its wagon to Intel's longer-term product road maps, particularly in the area of notebooks. IBM's chips are power-hungry and generate a lot of heat, and therefore not suitable to notebook computers.'"
Re:Don't We Know this already? (Score:5, Informative)
Really, what this article is saying is that Apple is only making $450 per low-end iMac sold, based on their own estimates, which are most likely wrong.
Why is THAT news? You got me.
Re:When did this change? (Score:2, Informative)
In the mid 1990s, Apple showed the famous picture of a Pentium grilling a hot dog and claimed Intel's chips were power hungry and ran hot compared to the nice cool sleek PowerPC. That was one of the supporting reasons that Apple ostensibly switched, according to all the engineering presentations at WWDC. So when did this change?
Only within the last 12 months has Intel started releasing chips that focus on lower heat and power. The Pentium M chips were a step toward lower power, but the Intel Core Duo that ships in the iMac is the first chip that is really ahead of AMD for mobile systems.
Re:Then what are the savings on battery life? (Score:5, Informative)
The G4 had a great processing/watt ratio -- for its time. So did the G3. So did the 603. However, each new generation of laptop used MORE power to get FAR MORE processing done.
Re:When did this change? (Score:5, Informative)
No one at the time expected the changes in CISC processors. CISC processors still do have a "complex" instruction set, in that they allow multiple addressing modes and variable-length opcodes. Internally, however, these chips have become much more RISC-like. The current generation of Pentiums actually performs an internal dynamic translation from CISC instructions to RISC-like micro-ops (one or more per CISC instruction) and executes those micro-ops using a different, internal instruction set. This internal RISC instruction set is so central to the design that the L1 I-cache is not a verbatim cache of the CISC instructions but a trace cache of the translated micro-ops.
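The translation idea can be sketched with a toy decoder. This is a purely hypothetical encoding, nothing like real x86: the point is just that one variable-length "CISC" instruction with memory operands expands into one or more fixed-format micro-ops that the core actually executes.

```python
# Toy CISC -> micro-op translation (hypothetical encoding, NOT real x86).
# An instruction is (op, dst, src); operands written "[...]" are memory
# references, which each expand into separate load/store micro-ops.

def decode(instr):
    """Translate one hypothetical CISC instruction into RISC-like micro-ops."""
    op, dst, src = instr
    uops = []
    if src.startswith("["):                 # memory source: split out a load
        uops.append(("LOAD", "tmp", src))
        src = "tmp"
    if dst.startswith("["):                 # memory destination: load/op/store
        uops.append(("LOAD", "tmp2", dst))
        uops.append((op, "tmp2", src))
        uops.append(("STORE", dst, "tmp2"))
    else:                                   # register destination: single op
        uops.append((op, dst, src))
    return uops

# Register-to-register ops stay one micro-op; memory forms expand:
print(decode(("ADD", "eax", "ebx")))     # 1 micro-op
print(decode(("ADD", "eax", "[mem]")))   # 2 micro-ops: load, add
print(decode(("ADD", "[mem]", "ebx")))   # 3 micro-ops: load, add, store
```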
Re:what about overhead? (Score:5, Informative)
Re:Don't We Know this already? (Score:2, Informative)
And as it turns out, IBM Microelectronics just had a fantastic financial quarter, having switched volume from money-sucking G5s to money-minting XBoxes.
Re:When did this change? (Score:3, Informative)
Intel noted this issue and produced the Pentium M [wikipedia.org] processor (part of their whole "Centrino" push), which significantly reduced processor power usage on mobile computers. In the meantime, Apple was unable to convince IBM to produce low power G5's as they had gotten Motorola to do in the past. Apple was thus stuck with older processor technology for its laptops.
Does that help explain things?
Re:the real costs (Score:5, Informative)
Now, I'm sure emulators will eventually appear, but this isn't the best example to present to demonstrate Apple backward compatibility =)
I completely disagree (Score:5, Informative)
False. Your statement isn't giving Intel enough credit and is not supported by the numbers. Since the original Banias Pentium M's were released back in March of 2003, Intel's mobile products have had very good performance-per-watt ratios and overall power usage numbers. In fact, of the entire line, the original Pentium M's had the lowest overall power usage. Your statement would be correct if it said this: "...within the last 34 months (i.e. ~3 years) has Intel started releasing chips that focus on lower heat and power."
Data pulled from Intel Product Specifications at http://www.intel.com/ [intel.com]
Banias (the normal-voltage models, i.e. 1.7 GHz, 1.6 GHz, 1.4 GHz, etc.):
Thermal Design Power: 24.5 W (Full speed) / 6 W (Speedstep)
Sleep Power: 1.7 W
Deep Sleep Power: 1.1 W
Deeper Sleep Power: 0.55 W
Dothan (any model #):
Thermal Design Power: 21 W (Full speed) / 7.5 W (Speedstep)
Sleep Power: 3.2 W
Deep Sleep Power: 2.5 W
Deeper Sleep Power: 0.8 W
Core Duo (any standard power model #):
Thermal Design Power: 31 W (Full speed) / 13.1 W (Speedstep)
Sleep Power: 4.7 W
Deep Sleep Power: 3.4 W
Deeper Sleep Power: 2.2 W
The Pentium M chips were a step towards lower power, but the Intel Core Duo that ships in the iMac is the first chip that is really ahead of AMD for mobile systems.
Again, false. The first part of that sentence has already been disproven by the numbers I've posted. The second part of your AMD fanboyism is also incorrect. AMD offers two TDP ranges in their "Lancaster" single-core Turion 64 mobile processors: 25 watts and 35 watts. As you can see from the data presented above, both of these TDPs are larger than Intel's single-core Pentium M offerings, which have been available since March 2003. AMD's Turion didn't even arrive on the scene until 2005, which gives Intel a solid two-year head start. What's even more interesting is that more than half of AMD's entire single-core Turion line consumes more power than Intel's dual-core Core Duo mobile processors. AMD has yet to release their dual-core Turion processors. So your statement that the Intel Core Duo is the "first chip that is really ahead of AMD for mobile systems" is completely wrong. Intel has had AMD beat in the mobile market since March of 2003 and continues to beat it. Please check your facts before posting lies, or put an AMD fanboy disclaimer on your posts.
Note: I didn't bother including Intel's various Low Voltage and Ultra Low Voltage Pentium M, Core Solo and Core Duo processors, which have an even lower TDP than the standard-voltage numbers I posted above. Adding this information would only serve to further prove that your statements are wrong.
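For what it's worth, the comparison reduces to simple arithmetic over the TDP figures quoted above (the two Turion "Lancaster" bins as stated in this thread; Intel full-speed numbers from the tables). Every Turion bin exceeds both single-core Pentium M cores, and the 35 W bin exceeds even the dual-core Core Duo:

```python
# TDP figures (watts, full speed) as quoted in this thread.
intel_tdp = {"Banias Pentium M": 24.5, "Dothan Pentium M": 21.0, "Core Duo": 31.0}
turion_tdp = {"Turion 64, 25 W bin": 25.0, "Turion 64, 35 W bin": 35.0}

# For each Turion bin, list the Intel parts it draws more power than.
for amd_name, amd_w in sorted(turion_tdp.items()):
    beaten = sorted(name for name, w in intel_tdp.items() if amd_w > w)
    print(f"{amd_name}: higher TDP than {', '.join(beaten)}")
```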
Re:Pentium-M (Score:3, Informative)
Lower power consumption yes, Athlon64 performance? Not yet. According to this roadmap [theinquirer.net] the highest clock speed Yonah core is slated for January 2006 release and it's 2.16 GHz. Now, Anandtech did some tests of the 2.0 GHz [anandtech.com] core. This 2.0 GHz core is barely able to reach Athlon64 3800+ X2 performance levels (the "slowest" AMD dual-core CPU, 90nm vs 65nm of Yonah). A 2.16 GHz version should reach the 4200+ X2 and that's about it. Yonah is a nice CPU, but nowhere near top AMD performance. Maybe with higher clock speeds in the future, not today though.
Re:Here's some irony for you to chew on! (Score:3, Informative)
Re:When did this change? (Score:4, Informative)
Re:Brainiac design (Score:3, Informative)
It's been many years since fast "RISC" chips were simple. It's very straightforward to design a RISC CPU that executes one instruction per clock. I once met the design team for a midrange MIPS CPU, and it was about 15 people. The design team for the Pentium Pro (Intel's first superscalar, and the innards of the Pentium II and III) was over 3000 people.
Once Intel could execute more than one instruction per clock, the RISC people had to catch up. And that meant all the complexity of a superscalar CPU. The advantages of RISC then disappeared - it wasn't simpler any more.
PowerPC CPUs have been superscalar all the way back to the PowerPC 601, in 1992.
Re:Pentium-M (Score:4, Informative)
Today's Core Duo is a laptop chip, and it's already competing with AMD desktop chips. Imagine how Intel's high-end desktop chips will perform when released this fall.
Freescale 8641D (Score:4, Informative)
Re:When did this change? (Score:3, Informative)
I agree with you about Intel's future, unless maybe they come up with a decent bus (sometime next year) or just license AMD's :)
Why Intel? Jobs wanted Intel. Why now? OS 9. (Score:5, Informative)
The ISVs, particularly Adobe, plotzed. There was a major row, with threats of abandoning the platform, and Apple backed off, improved Classic, came up with Carbon as a transition API, and brought out OS 9 and eventually OS X.
Steve Jobs reportedly had wanted to go with Intel as soon as possible. He thought Apple had made a mistake switching to the PowerPC while he was away at NeXT. OpenStep ran on Intel, of course, and Apple had versions of Rhapsody that ran on Intel boxes, even on generic clones. OpenStep's fat-binary mechanism supported, by the end, as many as five different processor architectures.
And that's why Intel. Not because IBM screwed up, but because it had been in their long-term roadmap for years.
But obviously... that wouldn't fly if they couldn't even cram classic Mac OS off into the Blue Box.
But they kept their Intel code base alive, and about every other year they tested the waters by trying to stop offering a Mac that could boot into OS 9.
Every time there was a user revolt.
Until late 2004. The last G4 that could boot into OS 9 disappeared from the Apple Store, without any fanfare. And, apparently, there just weren't enough people left dependent on OS 9 to make any noise worth noticing.
A little over 6 months later, they announced the Intel switch.
Rosetta will run all legacy PowerPC applications... well, all legacy Carbon and Cocoa applications that run on OS X. They're not running Classic under Rosetta. Classic is dead.
And nobody's bitching about that, either. Which means they guessed right, and Apple can finally drive a stake into the heart of Classic Mac OS and leave it behind for good.
And that's why they did it now. Because they could.
Re:If they don't know.... (Score:1, Informative)
Vincent: And you know what they call a... a... a Quarter Pounder with Cheese in Paris?
Jules: They don't call it a Quarter Pounder with cheese?
Vincent: No man, they got the metric system. They wouldn't know what the f*** a Quarter Pounder is.
Jules: Then what do they call it?
Vincent: They call it a Royale with cheese.
Jules: A Royale with cheese. What do they call a Big Mac?
Vincent: Well, a Big Mac's a Big Mac, but they call it le Big-Mac.
Jules: Le Big-Mac. Ha ha ha ha. What do they call a Whopper?
Vincent: I dunno, I didn't go into Burger King.