Apple, ARM, and Intel
Hugh Pickens writes "Jean-Louis Gassée says Apple and Samsung are engaged in a knives-out smartphone war. But when it comes to chips, the two companies must pretend to be civil because Samsung is the sole supplier of ARM-based processors for the iPhone. So why hasn't Intel jumped at the chance to become Apple's ARM source? 'The first explanation is architectural disdain,' writes Gassée. 'Intel sees "no future for ARM," it's a culture of x86 true believers. And they have a right to their conviction: With each iteration of its manufacturing technology, Intel has full control over how to improve its processors.' Next is pride. Intel would have to accept Apple's design and 'pour' it into silicon — it would become a lowly merchant foundry. Intel knows how to design and manufacture standard parts, but it has little experience manufacturing other people's custom designs or pricing them. But the most likely answer to the Why-Not-Intel question is money. Intel meticulously tunes the price points for its processors to generate the revenue that will fund development. Intel's published prices range from a 'low' $117 for a Core i3 processor to $999 for a top-of-the-line Core i7 device. Compare this to iSuppli's estimate for the cost of the A6 processor: $17.50. Even if more A6 chips could be produced per wafer — an unproven assumption — Intel's revenue per A6 wafer start would be much lower than with their x86 microprocessors. In Intel's perception of reality, this would destroy the business model. 'For all of Intel's semiconductor design and manufacturing feats, its processors suffer from a genetic handicap: They have to support the legacy x86 instruction set, and thus they're inherently more complicated than legacy-free ARM devices; they require more transistors, more silicon. Intel will argue, rightly, that they'll always be one technological step ahead of the competition, but is one step enough for x86 chips to beat ARM microprocessors?'"
Complicated Story (Score:2, Insightful)
I can't find the angle here.
"Legacy Free" vs "Costs".
"Legacy Free" is a nice sounding term for "won't run $hit". So much for your 1,000 app and app-lets you rely on, Business.
So I give up on this story and will let the rest of y'all thrash it out.
Re:Complicated Story (Score:4, Interesting)
Won't run shit is interesting. With Windows forking into ARM and x86 (or AMD64/IA64, whatever you want to call it) versions, the writing may be on the wall for Intel. If one of the ARM guys can produce chips that serve the $150-200 price bracket as well as Intel chips can on Windows, this becomes a whole other ball game.
I'm not sure we're anywhere near there yet. But with Qualcomm feasting on the remains of AMD, Samsung producing millions of parts a year, and a few others with them, it's entirely possible that within the next 10 years ARM will be a major competitor to x86. Which is why MS is forking - it's going to confuse the hell out of consumers, and from an end user perspective it's a terrible idea to go out and buy a Windows RT anything on Friday (Windows 8 launch day), but MS plans to support their ugly bastard for a long time, so who knows. And in 3 or 4 years, when we see Windows 9 roll around, we may have enough software that has been compiled for, and runs on, both that your 'won't run shit' assertion would no longer apply.
Re: (Score:3)
With Windows forking into ARM and x86 (or AMD64/IA64, whatever you want to call it) versions
Windows dropped IA64 support, like it did PPC, Alpha and MIPS before.
Re: (Score:2)
Right, I confused the Itanium instruction set with whatever Intel brands its 64-bit ISA.
Re: (Score:3, Insightful)
Right, I confused the Itanium instruction set with whatever Intel brands its 64-bit ISA.
Just remember, ia64 == iTanic == shit sandwich, amd64 is where it's at. Which is why it's so heartbreaking to see AMD so far to the rear in terms of performance today.
Re: (Score:2)
If the queen had nuts she'd be king.
Re: (Score:2)
ARM is competing with itself. With all those companies making ARM chips they have significant price competition which will lead to reduced R&D budgets. Meanwhile if you want top performance there is only one game in town and they get to charge top doll
Re: (Score:3)
Re: (Score:3, Interesting)
Re: (Score:2)
Re: (Score:2)
Windows Tablet PRO-- key word
Windows Tablet is gonna be both ARM and X86/X64. Which is gonna cause more confusion when son says "Buy Windows Tablet Pro" and Dad buys "Windows Tablet" (not pro) and it won't do what he wants.
Re: (Score:3)
Oh, I see. So I can run my Win 8 Pro apps on my Win 8 RT tablet? No. Can I run my Win 8 Phone apps on my Win 8 RT tablet? No. In RT, Microsoft has created an environment completely distinct from both their desktop and phone platforms.
Ya lost me, Microsoft. Why would I want three separate platforms between my desktop, mobile, and semi-mobile devices? This doesn't make sense. With either Android or iOS, I can have the same apps on my phone and my tablet. With "real" Windows 8, I can have the same app
Re: (Score:2)
Of course that might not be what you want, but that is what it offers the consumer. You get a truly productive tablet that is cheaper than a Windows 8 one, lighter, and more secure (since it is more locked down, hopefully hacks would occur less and malware shouldn't be as frequent in the Windows
Re: (Score:2)
What does RT provide me as a consumer? As far as I can tell, it just adds complexity.
Well, iOS doesn't run legacy software and yet people seem to enjoy running it. So on day one RT will be dumb. But let's say 3 years from now when there may exist non crappy apps, you'll be able to run the same non crappy RT apps on a desktop as well as a portable device.
Re: (Score:2)
That wooshing sound is my point going over your head. Nothing I said had anything to do with the number of apps available.
iOS = Phone and tablet.
Android = Phone and tablet.
Full Windows 8 = Desktop and tablet.
Windows 8 RT = Tablet only.
With three of those platforms, you get two devices sharing a common base of applications. With one of those platforms, there is no crossover. Windows 8 RT does not share an application base with the phone. It does not share an application base with the desktop. It complim
Re:Complicated Story (Score:5, Insightful)
It doesn't give you a smaller desktop. You have an app for your phone, a different app for your desktop, and yet another app for your tablet. Windows Phone and Windows RT are orphans. Do you see what I'm saying now?
Yes, I think I get what you're saying. Thank you for your clarification; perhaps I can make some of my own.
When iOS was first released (2007) it didn't run any sort of legacy programs from existing touch screen smartphones, and yet people really seemed to enjoy using those devices. When Android was first released, and was still just a phone OS, people seemed to enjoy using those devices. When the iPad first came out it could run existing iPhone apps; but even Apple said that the experience for many apps wasn't good and that app authors needed to (and still do) optimize their apps for the larger screen. There's hardly an article on the web about Android tablets that doesn't mention the paucity of tablet-optimized apps, and that while the tablets can run all Android apps, most of them really suck on the tablet. So I don't think that not having the ability to run legacy programs is a nail in the coffin for any new device/platform.
So right now a Windows RT computer does have limited appeal, due to not having much of an ecosystem. That was kind of my point by saying "3 years from now", the ecosystem may come. The ecosystem may come easily because if developers write apps for the Windows 8 Store (targeting desktop) the apps will also light up on Windows RT; and they will light up without any "tablet optimization" step that iOS and Android apps suffer from. The screen sizes are the same. So I think that there will be crossover soon, at least from the point of view of the end user; because they'll be able to find the same apps on a Windows RT computer as they do on their Win8 laptop/desktop.
If RT could run Phone apps, I wouldn't be typing right now.
Yes, Microsoft doesn't have a runtime that runs across Win32, .Net, WinRT, Phone and Xbox. But they do have portable assemblies [microsoft.com] which do allow for an assembly to run within .Net, WinRT, Phone and Xbox. So someone would still need a separate app/program for the different runtimes, but if the business logic is the same for all of the apps there only needs to be one portable assembly.
Re: (Score:3)
Microsoft's branding around Windows sucks abysmally, but that's no excuse for just getting it wrong. The correct information is easy to find. Please stop muddying the waters with brand names that outright do not exist.
There is no such thing as Windows 8 RT. Windows RT and Windows 8 are very nearly identical, but Windows RT is expressly *not* marketed as Windows 8.
There's no such thing as Windows 8 Phone. Just like there was no such thing as Windows 7 Phone. Windows Phone is the name of the OS family. Window
Re: (Score:2)
Right, I confused the Itanium instruction set with whatever Intel brands its 64-bit ISA.
Duplicate comment because multiple people corrected the same thing in my post...
Re: (Score:2)
"Legacy Free" is a nice sounding term for "won't run $hit". So much for your 1,000 app and app-lets you rely on, Business.
I think that's less of a problem in the cell-phone market than in the desktop market.
In the cell-phone market, increasingly there is an App Store type service that automatically upgrades people's installed applications as necessary, so the onus is no longer on the user to do the work.
Re: (Score:2)
Nice Markov Chain generator. (Score:2, Offtopic)
Wow, that last article looks like a really good Markov Chain [wikipedia.org] generator (or whatever the kids these days are using).
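For anyone curious, a word-level Markov chain text generator really is only a few lines. A minimal Python sketch (all function names here are invented for illustration):

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each word-tuple of length `order` to the words seen to follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=10, seed=0):
    """Random-walk the chain, emitting up to `length` words."""
    rng = random.Random(seed)
    state = rng.choice(list(chain))
    out = list(state)
    for _ in range(length - len(state)):
        followers = chain.get(state)
        if not followers:          # dead end: no word ever followed this state
            break
        word = rng.choice(followers)
        out.append(word)
        state = state[1:] + (word,)
    return " ".join(out)
```

Feed it the article in question and order=1 or 2, and the output is about as coherent as the source.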
The War Between Intel Core and ARM (Score:5, Funny)
The war between CORE and ARM raged across thousands of worlds, ravaging the galaxy. Neither would waver in their belief in their own supremacy. For each side, the only acceptable outcome is the complete elimination of the other.
Re: (Score:2)
And off to the side all but forgotten, APUs bide their time...
Re: (Score:2)
(now you got me thinking, I had 16MB of RAM back then, now I have 16GB... time to see if building a 50,000 unit army would work!)
Re:The War Between Intel Core and ARM (Score:5, Informative)
The two factions of Total Annihilation were the Core and the Arm: http://en.wikipedia.org/wiki/Total_Annihilation [wikipedia.org]
"Genetic Handicap" (Score:3, Interesting)
"For all of Intel's semiconductor design and manufacturing feats, its processors suffer from a genetic handicap: They have to support the legacy x86 instruction set, and thus they're inherently more complicated than legacy-free ARM devices"
Oh shut up. This argument comes up every time there's an ARM vs Intel debate. And you know what? Intel is pushing hard and successfully into ARM's territory, and ARM has yet to hit back with any chip that can compete with Intel in servers or high-end laptops or etc. And that's WITH Intel's huge profit margins. ARM certainly doesn't have the profit margins to spare in any price war. Intel is a huge monster to defeat, and its supposed handicap means far less worry for programmers, unlike the nightmare of trying to support the million-and-growing ARM SoCs out there.
Re: (Score:3)
Are you really so ignorant as to not recognize that ARM isn't in the 'server or high end laptop' world?
They're cheap and low power. Perfect for small mobile devices.
However, I really don't understand why Intel won't play both sides of the fence. Why not build an ARM line/factory to offer both types of chips? Take away business from competitors. Get the past to pay for the future.
Re:"Genetic Handicap" (Score:4, Informative)
I do believe that ARM is not a chip or even a product. It's an architecture that is licensed by others. This means that the company behind ARM makes money on every chip regardless of price. They don't care if it costs $17 or $170 to manufacture and distribute, they have little overhead so it's almost all profit at this point.
Intel OTOH sells chips. They carry much higher manufacturing and sales costs.
Re: (Score:2)
The ARM licensees are killing each other in a blood bath of competition. Meanwhile Intel is absolutely dominating in fabrication. The foundries don't feel a need to compete with Intel on process, because x86 isn't their market.
x86 vs ARM, what about atom (Score:5, Informative)
Intel's published prices range from a 'low' $117 for a Core i3 processor.
What about Atom? You know, the processor produced by Intel specifically for the same markets that ARM is dominating now.
Re:x86 vs ARM, what about atom (Score:5, Informative)
Well, sort of, but not really. (Score:5, Informative)
Intel has made ARM processors in the past (xScale [wikipedia.org]), and, apparently, still retains an ARM license. Intel has manufactured RISC chips, as well (i960, for example). There is absolutely no reason why Intel wouldn't/couldn't produce an ARM chip, if they wanted to. There's just no reason to do so.
Also, using the Core i3 as an example of Intel's "low-end" is not very fair. Intel's low-end chips are the Pentium and Celeron, not the i3. The Atom is the closest thing to a competitor to the ARM chips. Pricing for Atom chips varies extensively, from $20 to $100, depending on features,
Re: (Score:2)
Intel has made ARM processors in the past (xScale [wikipedia.org]), and, apparently, still retains an ARM license.
They were crap though. I have an XScale-based PDA lying around somewhere. They were truly the Netbursts of the ARM world: high clock speed and power consumption but low performance.
Re:Well, sort of, but not really. (Score:4, Interesting)
Compared to the Samsung arm chips at the same time the xScale blew the doors off them in performance clock for clock, and at that time no one did well with power consumption except when asleep.
Re:Well, sort of, but not really. (Score:5, Interesting)
Intel sold the ARM license to Marvell who owns the architectural license to it. Intel does re-license back the Xscale core for some of their networking processors though.
As for Xscale being crap - back in the day, StrongARM and Xscale were the top of the line - the PXA255 being one of the fastest ARM chips around. The next-generation chip was supposed to be even faster, but Intel sold it to Marvell who doesn't seem to have done anything with it.
While StrongARM was pushing 200MHz, other ARMs were barely breaking 133MHz and not very fast at it. When the PXA255 upped it to 400, it was no competition. Then ARM decided they had enough of being outclassed by Intel and designed some decent ARM11 cores and continued onward with the Cortex series.
Ironic (Score:5, Interesting)
If Gassée is right about "architectural disdain" then it's kind of ironic. Intel itself exhibited the same disdain for x86 architecture when they initially refused to make their first 64-bit chip, the Itanium, backward compatible with it. It was only after AMD demonstrated that the architecture still had legs that they brought it to the 64-bit world — after wasting billions on Itanium development.
Those that forget history, yada yada.
Re: (Score:2)
It's ironic that you posted that ironic comment, as it's ironic that Gassée would be right after being so spectacularly wrong about a similar topic.
"I once preached peaceful coexistence with Windows. You may laugh at my expense - I deserve it."
-- Jean-Louis Gassée, CEO Be, Inc.
Re: (Score:2)
It's ironic that you posted that ironic comment
Huh?
Re:Ironic (Score:4, Informative)
You have that completely backwards. The first Itaniums WERE backwards compatible with IA-32 (x86) at the hardware level. It was later Itaniums that ditched backwards compatibility in favour of the software based IA-32 Execution Layer.
what am I missing? (Score:4, Interesting)
Apple's the one currently manufacturing their A6 chips for $17, while the comparable Intel chip retails for much more?
Isn't this more a statement of how well Apple's vertical integration of chip manufacturing went?
Re: (Score:2)
Well for starters, speed, flexibility, speed, speed, flexibility and speed.
DESTROY ALL LEGACY (Score:3)
Wow, ARM people are just like Java cultists... calling everything else legacy.
Re: (Score:2)
Microsoft has long referred to anything not-Microsoft as legacy. It's nothing new to absolutists.
Inflammatory story... (Score:5, Informative)
It's been pretty much proven that the "x86 legacy baggage" or however you want to put it does not seriously affect Intel's Atom for phones.
http://www.anandtech.com/show/6330/the-iphone-5-review/10 [anandtech.com]
The Razr i, which has an Atom processor, beats the A6, the best performer in the ARM field, most of the time in non-GPU tasks (the one area where it is lacking is GPU power), while power consumption is average for a phone. Android adds additional overhead not present in iOS, too.
If anyone can work miracles and cram x86 into a phone, it's Intel. As ARM designs have to start dealing with greater complexity, Intel can apply their immense experience with x86 and improve performance without dramatically increasing power consumption.
With some more work, I can see Atom beating the hell out of any ARM design in the same power envelope. I'll give it one or two generations.
Re: (Score:2)
That's some serious ass whipping. No power numbers though. At $7B per new fab, I'm just wondering who has the muscle to compete with Intel.
Re: (Score:2)
If anyone can work miracles and cram x86 into a phone, it's Intel.
Then why haven't they yet? It's not like they haven't been trying for years. Why did Apple (and Google) have to create the market that MS and Intel now feel they need to invade?
Intel is the challenger here, and ARM has proven itself several billion times over.
Re:Inflammatory story... (Score:5, Interesting)
"Trying" is probably an overstatement in this case. Intel has a well-devised plan to get there, but it's a plan that involves them taking one step at a time. First they needed the Atom CPU design, then they needed to get it integrated into a true SoC, then they need to integrate their own GPU, etc.
Intel Atom roadmap [anandtech.com]
Silvermont is where Intel makes their architectural leap over ARMv7 (Cortex) with the new Atom architecture coupled with Intel's own, higher performance GPUs. Then in 2014 Intel does Airmont, where Atom gets promoted to first-class status in Intel's fabs, jumping to new process nodes at the same time as Core. If all goes to plan, at this point Intel will be roughly a node ahead of the competition with an architecture as good as or better than any planned ARMv7 designs. This is the tick-tock strategy in full swing, the same strategy that is currently bludgeoning AMD to death.
So Intel may be the challenger here, but never underestimate them. Their fabs are unrivaled and they can afford to hire some of the best architects on Earth. If Intel does their homework and doesn't screw up, they're a very dangerous foe. The only place Intel can't (or won't) go is into low-margin products, and as bad as competition from Intel would be, the ARM partners don't want to sacrifice their margins too much just to scare off Intel. It would be a Pyrrhic victory.
Re: (Score:3)
But who is going to buy their chips?
Not Apple. Not Samsung.
Intel doesn't have a major buyer lined up. They'll have to license the architecture out for pennies or make a new market.
Could be a case of too little too late unless they are willing and able to do low margin, high volume.
Genetic disadvantage? Hardly (Score:5, Informative)
Intel and AMD x86 processors moved to using micro-ops and RISC-like operations internally years ago. The only disadvantage nowadays is a small translator that converts x86 machine code into micro-ops. Compared to the actual logic or cache on the CPU, the number of transistors that the translation takes is minimal and not a big deal, especially when you consider the size of CPUs nowadays.
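As a rough sketch of what "cracking" into micro-ops means (the uop names and tuple format here are invented for illustration; real decoders are far more involved): a memory-operand CISC instruction becomes a short RISC-like load/execute/store sequence, while a register-to-register instruction passes through as a single uop.

```python
# Hypothetical sketch of CISC-to-micro-op cracking.
# A read-modify-write memory instruction becomes load + op + store;
# a plain register instruction maps to one micro-op.
def crack(instr):
    op, dst, src = instr
    if dst.startswith("["):                 # memory destination
        return [("load", "tmp", dst),       # fetch operand into a temp
                (op, "tmp", src),           # do the arithmetic in the temp
                ("store", dst, "tmp")]      # write the result back
    return [(op, dst, src)]                 # register form: single uop
```

The point is that this translation layer is a fixed, small front-end cost; everything behind it looks like a RISC machine.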
Re: (Score:2)
EXACTLY. ARM's architecture may provide a slight advantage for low-power use compared to x86. But it's very, very slight. Certainly, Intel's advantage in process technology would outweigh ARM's advantage in architecture. The only real reason x86 hasn't competed with ARM so far in very-low-power is that no one has tried hard enough. There's finally enough demand for higher-end low-power chips that Intel is taking notice. I think Intel is also taking notice because they don't like seeing an ARM-based so
Comment removed (Score:3, Insightful)
Intel should prepare an x86 replacement (Score:4, Interesting)
Here is a $50 ARM general-purpose multicore CPU example for matching the $999 performance of the fastest Intel Core i7 (e.g. i7-3770K, 3.9 GHz peak, 4 cores, 8 threads, 2 SIMD ALUs/core = 8 SIMD ALUs (8 FLOPs each with AVX) = 64 FLOPs/clock -> 3.9*10^9 Hz * 64 FLOPs/clock = 249.6 GFLOPS) [intel.com]:
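The peak-GFLOPS arithmetic above can be checked directly. A small sketch (the 8-FLOPs-per-ALU figure assumes 256-bit AVX on 32-bit floats):

```python
def peak_gflops(ghz, cores, simd_alus_per_core, flops_per_alu):
    """Theoretical peak: clock rate times total ALUs times FLOPs each ALU
    retires per cycle. Ignores memory bandwidth and everything else real."""
    return ghz * cores * simd_alus_per_core * flops_per_alu

# i7-3770K figures quoted above: 4 cores, 2 SIMD ALUs each,
# 8 FLOPs per ALU per clock -> 3.9 * 64 = 249.6 GFLOPS.
print(peak_gflops(3.9, 4, 2, 8))
```

Of course this is a marketing-style peak; sustained throughput is bounded by the memory system long before the ALUs.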
For increasing integer and load/store performance, this could be achieved with pipeline and issue/execution modifications, using more functional units. The limit is to keep the OoOE simple enough to avoid wasting transistors executing tons of instructions unnecessarily.
Explain this one to me... (Score:2)
Intel has some of the best (the best?) fabs in the world, and has chips that use a smaller process than what other companies are pushing out, right? So why can't they make a small, power-efficient chip that can at least meet (if not beat) the offerings from ARM and its licensees?
To put it another way, Wikipedia tells me that ARM Holdings has 2,000 employees and a revenue of about 490 million pounds (in 2011). Intel has 100,000 employees and a 2011 revenue of 54 billion dollars (about 34 billion pounds). How
Re: (Score:2)
Intel has some of the best (the best?) fabs in the world, and has chips that use a smaller process than what other companies are pushing out, right? So why can't they make a small, power-efficient chip that can at least meet (if not beat) the offerings from ARM and its licensees?
From what I've read on AnandTech [anandtech.com], low power Haswell chips might meet your criteria, which are due out the middle of next year. I'd be very surprised if Broadwell (the 14nm die shrink of 22nm Haswell) doesn't.
Re: (Score:2)
So, they promise that the chip they'll release next year can compete with the chips ARM is selling* now? (And, yeah, that would be the first time Intel overpromised on power consumption... Forget about last year, and the year before that, and...)
* Ok, ARM doesn't sell, I know. But their licensees' chips are getting made now, with inferior processes and bigger feature sizes, and are still competitive with the offering Intel has for next year.
that old canard about x86 complexity (Score:4, Insightful)
The old 80386 based on the "complex" x86 instruction set had 275,000 transistors. Intel is now making chips with 2.6 billion transistors and somehow what they once implemented as one functional unit within a budget of 0.000275 billion transistors is holding them back?
Certainly they would rather do a few things differently had they been worried about 2013 back in 1978. Transistor count is the least of the matter. What buggers up x86 is the number of active transistors handling the instruction stream at each instruction cycle. There's no way to align variable-length instructions without active transistors (regardless of whether the transistors involved amount to a wart on a small toe of a juvenile mosquito).
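A toy model of that serial dependency: with a hypothetical two-opcode variable-length ISA (invented here for illustration), you cannot locate instruction N+1 without first decoding the length of instruction N, which is exactly what costs active transistors in a wide parallel decoder.

```python
# Toy illustration of why variable-length instructions resist parallel
# decode: each instruction's start is only known after the previous
# instruction's length has been determined.
def find_boundaries(code, length_of):
    """Walk a byte stream; `length_of(opcode)` gives each instruction's size."""
    starts, pc = [], 0
    while pc < len(code):
        starts.append(pc)
        pc += length_of(code[pc])   # serial dependency: must decode to advance
    return starts

# Hypothetical ISA: opcode 0x01 is 1 byte long, opcode 0x02 is 3 bytes.
lengths = {0x01: 1, 0x02: 3}.get
print(find_boundaries(bytes([0x01, 0x02, 0x00, 0x00, 0x01]), lengths))
```

A fixed-length ISA gets `starts` for free (every 4th byte); x86 has to compute it, in parallel, speculatively, every cycle.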
The x86 story bugs the hell out of me. Considering how well it has actually held up for some 35 years (and counting), it's one of the ugly duckling success stories of all time (hint: it wasn't so ugly after all).
It was also a founding member of the Steve Jobs reality distortion field. I'm concerned his posthumous aura will continue to glow with the uplift of falsehood. He should be credited more for what he accomplished than the lies he polished to get there.
It wasn't just Steve, it was the entire RISC consortium manufacturing an Achilles heel out of whole cloth. Far closer to the truth of the matter is that x86 has a much higher design cost than an orthogonal clean-sheet alternative. The design cost was a small multiple. Intel's resources were a large multiple. It didn't go well for RISC. The much vaunted DEC Alpha had a metal connect layer for single-cycle carry-add propagation that forever segregated it from the mass-consumer price point. It was the instruction set. No, it was the instruction set aided by a titanium stent.
Also, the RISC design advantage does not extend to the memory cache and system bus design. These are a bear to design well for any instruction set. The RISC people moaned about the exceptional Pentium Pro performance level on server workloads (it was the first memory bus from Intel that didn't totally suck). Well, Intel broke into the server market with their crappy old x86 instruction set by grafting it onto a titanium alloy cache hierarchy and bus controller (with multiple dies grafted into the same chip package at enormous expense). Cache latency and branch prediction absolutely dwarf instruction set as the big thing to worry about since around this time. If Steve hadn't grabbed onto the inferiority of CISC around this time, it might have died a timely death.
In low power applications, ARM has a real advantage, enough to win a huge market share at race-to-the-bottom price points. How much does the cost of a CPU influence a handset? How much everything else? I've put $300 Intel CPUs in $2000 boxes. I've put $250 Intel CPUs in $1000 boxes. I've put $60 CPUs in $500 boxes. A $16 CPU in a phone that retails for $600 for just a few months, before landing in the discount bin? I'm sure Intel wants a huge slice of that.
One reason Intel has held their ground is that the Cortex-A15 (out-of-order superscalar multiprocessor) is starting to look a lot like the old Pentium Pro. Sure the instruction set is modern and clean (though it took ARM surprisingly long to come up with the mixed 16/32 bit instruction encoding format due to misguided ideological purity; how many active transistors does it take to determine whether the next 32 bit chunk from the instruction stream is one lump or two? More or less than the number of active transistors in the icache devoted to storing common instructions bloated to 32 bits just because?). But all the rest of the issues are pretty much the same: branch prediction stalls, cache snooping, and memory path latency.
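On the parenthetical question about active transistors: in Thumb-2 the 16-vs-32-bit determination really does come down to a handful of bits. To the best of my reading of the ARM architecture manual, a halfword begins a 32-bit instruction iff its top five bits are 0b11101, 0b11110, or 0b11111:

```python
# Thumb-2 length rule (per the ARM Architecture Reference Manual):
# a halfword is the first half of a 32-bit instruction iff
# bits [15:11] are 0b11101, 0b11110, or 0b11111.
def thumb2_is_32bit(halfword):
    return (halfword >> 11) in (0b11101, 0b11110, 0b11111)

print(thumb2_is_32bit(0x4668))  # 16-bit encoding (a MOV between registers)
print(thumb2_is_32bit(0xF04F))  # first half of a 32-bit MOV.W encoding
```

So the answer to "one lump or two?" is a 5-bit comparison per halfword, which is about as cheap as instruction-length logic gets.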
From Intel's perspective, an ugly instruction set is good for business. (Then they went on a jag thinking that if ugly is good, atrocious is better, and the Itanium was hatched with a jackhammer from a mastodon egg.)
After another three die shrinks, when half the processor implements on-demand power management, and most of the other half provides task-specialized execution units, is the instruction set going to matter a hill of beans for anything other than legacy lock-in?
Re:that old canard about x86 complexity (Score:5, Interesting)
Far closer to the truth of the matter is that x86 has a much higher design cost than an orthogonal clean-sheet alternative.
True. Years ago I went to a talk where the head of the Pentium Pro design team showed a graph of the number of engineers working on the project. It peaked around 3,000. Nobody had ever had a CPU design team that big before.
The variable length instruction alignment problem of x86, although ugly, isn't a huge consumer of transistors. AMD dealt with it by expanding instructions to fixed length when loaded into cache. Intel dealt with it by sometimes starting ambiguous cases in parallel and discarding the bogus results later. The downside of fixed-length instructions, as in RISC machines, is code bloat - PowerPC code is about twice as big as x86 code, which impacts cache miss rate.
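The density trade-off described above is easy to quantify with a toy example (the instruction mix and per-instruction byte costs below are invented purely for illustration, not real x86 encodings):

```python
# Toy code-density comparison: a variable-length encoding spends bytes
# per instruction only as needed, while a fixed-length (RISC-style)
# encoding pays a flat 4 bytes for everything, immediates included.
program = ["inc", "inc", "load_imm32", "add", "call_rel32"]
variable_cost = {"inc": 1, "add": 2, "load_imm32": 5, "call_rel32": 5}

var_size = sum(variable_cost[i] for i in program)
fixed_size = 4 * len(program)
print(var_size, fixed_size)
```

The short common instructions are where variable-length wins; the flip side is that fixed-length ISAs need extra instructions to build large constants, widening the gap further. Smaller code means more of the working set fits in the instruction cache.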
While one instruction per clock RISC CPUs (low-end MIPS and DEC Alpha parts, and the Atmel AVR series are examples) are simple, superscalar machines executing more than one instruction per clock are almost as complex as x86 CPUs. That's why RISC stopped being a win.
Harry Pyle was developing the instruction set [computerhistory.org] for the Datapoint 2200 in his dorm room at Case Tech in Cleveland in the late 1960s. Same building I was in; different floor. That led to the 8008 and the 8080 and the 80286 and the 80386 and ...
It's all about code parallelism (Score:3)
For a long time, single-core applications were the rule, so the CPU MHz race was on. Once that ended around 3 GHz, the pressure was on for programmers to make computer code better at dividing the load between multiple cores.
It turns out that ARM does well at lower frequencies and delivers the best performance-per-watt ratio. Also, it turns out that once all your code is written for 2, 4 or 8+ cores, it doesn't matter much if your CPUs are clocked at 1.3 GHz (A6/Snapdragon) instead of 2.6 GHz (i7 in the 2012 MacBook Pro).
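Under the (idealized) assumption of perfectly parallel code, that clock-vs-cores point reduces to simple arithmetic. A sketch with an invented IPC figure:

```python
# Idealized throughput model: work per second ~ cores * clock * IPC.
# Assumes perfectly parallel code and equal IPC; both are generous
# simplifications, used here only to illustrate the trade-off.
def throughput(cores, ghz, ipc):
    return cores * ghz * ipc  # billions of instructions per second

slow_wide = throughput(cores=4, ghz=1.3, ipc=2)    # many slow cores
fast_narrow = throughput(cores=2, ghz=2.6, ipc=2)  # fewer fast cores
print(round(slow_wide, 1), round(fast_narrow, 1))
```

In practice Amdahl's law and the serial fraction of real code erode this, which is why single-thread performance still matters; but for throughput workloads the slow-and-wide design wins on watts.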
And if you're doing mobile, where battery life is a big factor, you need the ppw ratio more than anything, so you go ARM.
On mobile, Intel is in a similar situation now to the one it faced against AMD back in the AMD64 days. Its current models (Atom) are inferior but competitive. Intel is dominating servers and desktops, which gives it a secure base to experiment from, and I expect its mobile offerings in the next 5 years to bridge the gap with ARM.
Will they win? I have no clue. They might crush ARM or become the premier ARM licensee with the best ARM chips. Either way, Intel is going to lead.
Apple doesn't want to be *more* dependent on Intel (Score:5, Insightful)
Intel wants to be the only company that can meet your needs. That way, they can make you pay premium prices for their chips. This is perfectly understandable; that is what is best for Intel.
Apple wants to be vertically integrated. They want full control over everything they do. Partly this is so they can keep as much as possible of the money they collect; partly this is so that they can guarantee excellent quality and excellent availability. This is what is best for Apple, and it isn't bad for their customers either.
Intel does not want to become just another ARM source, competing on price with all the others. But Apple will never lock themselves in to depending on Intel for mobile chips, when ARM chips have been shown to be more than adequate. And Apple would not be investing in custom ARM chips if it was planning to adopt Intel mobile chips.
People keep pointing out that Intel's mobile x86 chips are competitive with ARM. That won't cut it. Intel's chips would have to be better, and so much better that the risk of depending on Intel is worth it.
That was the case for the PowerPC to x86 transition! Intel's chips were so much better than PowerPC for laptops that it was worth getting into an entangling relationship with Intel. AMD was not able to guarantee delivery of the massive quantities of chips Apple was planning to sell, and Intel was, so AMD wasn't really an option... but at least they served to keep Intel from trying to charge totally outrageous prices for their chips; there was always a credible threat of going to AMD.
Hmm. It's looking like AMD is going to crater in spectacular fashion soon. I wonder if Apple will make a serious attempt to buy what's left of the company. That would enable Apple to make its own x86 chips! Eh, probably not. AMD is behind Intel on process, so switching to AMD chips would mean taking a hit on performance, power use, or both.
The "SemiAccurate" web site thinks that Apple will transition to using ARM chips for laptops [semiaccurate.com], not just for mobile devices, once ARM chips are good enough (which they will be soon). So, transitioning away from x86 and to, say, multi-core 64-bit ARM chips is another way Apple can untangle from Intel.
Apple may not be in a big hurry to actually complete the transition away from Intel chips; just a credible threat of switching to ARM chips might be enough to negotiate good prices on x86 chips. That would leave lower power consumption as the main reason to go to ARM, but a laptop's display is probably the worst power drain, especially with a Retina display.
steveha
Re:Apple doesn't want to be *more* dependent on In (Score:4, Insightful)
Also note that Apple has people paying $2500 and up for the Mac Pro, and $1000 and up for laptops. But mobile devices are closer to $500, and the Android competition is hitting the $200 price point.
There just isn't as much room to pay top dollar prices for Intel parts in the mobile space.
So even if Intel mobile x86 parts are slightly faster than the ARM chips, will Intel be happy selling at prices competitive with ARM prices? History suggests "no". The cheapest Atom chips are around $20, but Intel makes those suck as much as it can get away with.
Intel is the master of segmenting markets. Different chips at different price points have different features enabled. Cheaper chips are as crippled as possible, to encourage you to buy a more expensive chip. For example, Intel doesn't support virtualization features on their less-expensive chips; and Intel mostly reserves support for ECC RAM to only the Xeon processors.
(In contrast, AMD puts full functionality in all their parts; they are #2 and they are trying harder [follisinc.com] to please the customer. That is how you can get an HP Proliant MicroServer with a 1.5 GHz dual-core AMD Turion processor for $320 at Newegg [newegg.com], with full support for virtualization and ECC RAM. I cannot imagine a MicroServer with equal or better Intel parts hitting that price point.)
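That segmentation is visible straight from software. As a rough sketch (Linux-specific, and the helper name is my own invention), you can check whether a chip actually exposes hardware virtualization by looking for the `vmx` (Intel VT-x) or `svm` (AMD-V) flag in /proc/cpuinfo:

```python
def has_hw_virt(cpuinfo_text: str) -> bool:
    """Return True if any 'flags' line in /proc/cpuinfo-style text
    advertises hardware virtualization: 'vmx' = Intel VT-x, 'svm' = AMD-V."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = line.split(":", 1)[1].split()
            if "vmx" in flags or "svm" in flags:
                return True
    return False

# On a Linux box you'd feed it the real file:
#   has_hw_virt(open("/proc/cpuinfo").read())
```

On a segmented low-end part, that flag simply isn't there even though the silicon is largely the same design.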
Intel will try to balance the functionality it allows into the mobile chips against the price it can get. Apple just wants the best chips for the cheapest price. These two goals are not in alignment.
Tis a fool.... (Score:2, Interesting)
Tis a fool who looks for logic in the chambers of the human heart. Or from Cupertino. And that's not a dig, Apple fans, that's just the truth. Apple will dump Intel when they feel like it, for reasons that they alone decide.
Apple is a bit like the interrogator in 1984. They believe they can levitate off the ground and float around the room should they choose to, and what the outside world thinks makes no difference at all.
Re: (Score:2)
Tis a fool who looks for logic in the chambers of the human heart. Or from Cupertino. And that's not a dig, Apple fans, that's just the truth. Apple will dump Intel when they feel like it, for reasons that they alone decide.
Apple is a bit like the interrogator in 1984. They believe they can levitate off the ground and float around the room should they choose to, and what the outside world thinks makes no difference at all.
This.
It's Intel looking at the big picture. Samsung was one of Apple's biggest suppliers; look at what Apple tried to do to them (although it did backfire horribly for Apple, you can't count on that happening every time). Apple is turning out to be a riskier partner than Microsoft was.
If you can figure out ... (Score:2, Insightful)
There is so much sub rosa crap (not all of it ethical or legal) going on between the players we may never know the truth.
TSMC (Score:2)
Don't know what Jean-Louis is talking about, as there were press releases and everything not long ago about Apple ramping up production at TSMC foundries. Don't think they feel they need Samsung or Intel for their ARM production.
Replay of the mainframers (Score:3)
This could be a replay of the old days of mainframes. At more than one company, the engineers came up with mainframes on a desk, but the marketers could not see selling a desktop mainframe at the old 7-digit prices. So they just kept making the big boxes until their eventual death. This happened to CDC, Data General, Digital, and Perkin-Elmer, to name a few. Intel will undoubtedly survive, but it could be a long painful decline or a change of direction. The "new architecture" fanatics there probably don't have much traction after the Itanium disaster.
Re: (Score:2)
For TTL?
Re: (Score:2)
TSMC, Global Foundries ....
Apple is rumoured to be shifting production to TSMC to take advantage of their upcoming 20nm process for its quad-core chips.
http://cens.com/cens/html/en/news/news_inner_41728.html [cens.com]
Re: (Score:2)
And Qualcomm, and Globalfoundries. And I think a few more beyond that too.
Re: (Score:2)
TI has no modern fabs; they decided to get out of the fab business a few years ago.
Re:Long term (Score:5, Interesting)
What a bold prediction. You understand, of course, that Intel has buried every single competing architecture from the past? Intel has a process advantage: even if they have to spend 10% of their die on decoding/rearranging, they still have a significant transistor lead by remaining a process ahead, AND they still use lower power. Not only that, but because x86 is nothing more than an abstraction layer at this point, the internal architecture of their chips is free to move with the winds of computing in whatever direction best balances power use, processing capacity, and weight. They've had almost two decades to improve this abstraction layer to the point of perfection.
People like you forget how long it takes to design and build a microprocessor. From design to hard silicon is almost 5 years, so the designs Intel releases this year were planned out in 2007. Given that ARM didn't start to make an impact (on markets Intel considers itself part of) until 2006-7, we are JUST starting to see an Intel design philosophy that treats power as a critical constraint. Haswell is probably the first chip where Intel hasn't tried to tack power efficiency on as an afterthought at tape-out. I fully expect Intel to demonstrate that x86 under their lead can compete directly with ARM on its best footing: power consumption.
So watch and learn, young padawan. Intel has the best process engineers in the business, and if things in the foundry business keep going like they are (TSMC and GlobalFoundries have both been very, very late moving forward on process while Intel hasn't missed a stride), they are going to be two steps ahead on process in the next year or two. That would be an advantage not even the best ARM design could beat, even if Intel bungles their design. I fully expect that if Intel wants it, they could take the whole ARM chip market. The only reason they haven't up till now is that it would destroy their margins. So we will watch them balance their designs to retain the high-margin products and forgo the cheap. This could ultimately be their undoing, but once power efficiency becomes a priority of their designs, beginning with Haswell, Intel will be in a position to take the ARM chip market any time they want.
Don't ever discount the power of the foundry.
Re:Long term (Score:4, Insightful)
What a bold prediction, you understand of course that Intel has buried every single competing architecture from the past? Intel has a process advantage, even if they have to spend 10% of their die on decoding/rearranging they still have a significant transistor lead by remaining a process ahead AND still use lower power.
We'll see if that really will happen. But it's very true about the process advantage. If Intel chose to build an ARM CPU, it would blow the doors off all the other ARMs, because they can build faster/smaller/lower-leakage transistors. But it might be more expensive, and the ARM market is very price sensitive. I imagine there are Intel people who have considered this carefully and concluded that they can't build an ARM processor that's cheaper AND better than the competitors'.
Re:Long term (Score:4, Insightful)
You are forgetting 2 non technical matters.
A: Why let Intel 'win' so they can turn around and stuff higher prices up your backside? Single-supplier markets are best for the supplier.
B: Anti-trust. Once you start taking over entire markets, all of a sudden you have even more government flashlights up your ass. Only a dumb person wants total control unless he has TOTAL CONTROL; otherwise powers stronger than you take too much interest in your daily work.
Re: (Score:2)
What a bold prediction, you understand of course that Intel has buried every single competing architecture from the past?
Except for ARM. That architecture has been dominating the mobile device space for some time now, despite serious efforts by Intel to displace it.
Re: (Score:3)
I also think that Microsoft had as much or even more to do with this than Intel. Intel had better processor families than the x86 in the past (like the 432), but due to Microsoft's needs they never really had a chance. (Yes, I know Microsoft dabbled in MIPS and Alpha support for a short time, but they were never really serious about it, as it fragmented their code base too much.)
The world is different now, and Microsoft does not dictate like it could before.
Re: (Score:2)
Let's do this again in 10 years and see where the chips fall and who was right.
Re: (Score:2)
What a bold prediction, you understand of course that Intel has buried every single competing architecture from the past?
I forgot to mention that ARM hasn't been 'buried' or we wouldn't be having this discussion.
SPARC isn't gone either. (Basically a modernized MIPS, which was another wonderful architecture from an engineering standpoint.)
Just realized... (Score:4, Interesting)
Intel is not playing the same game. With everyone else (except AMD) making ARM devices on older process nodes, Intel should not make ARM chips because that would create the perception of competition and force TSMC and GF to advance their process. So long as all the foundry customers appear to be competing with each other it looks like a close race and there may be less pressure to advance. The further Intel stays away from their products, the less those guys will feel like they are competing with Intel and they will not worry about the process gap so much - they're still close to their "competitors" capability after all.
Everyone seems to have forgotten what business they're in. Those who can design have gone fabless while those who can fab now have more than enough customers to not care about process advancement so long as they can keep up with their perceived competition. In fact, all those customers probably slow them down with countless designs that each need scheduling and a design tweak or two. Meanwhile Intel turns that crank every 2 years. The longer people forget that they're all in the same business, the wider the gap is going to get.
Re:Long term (Score:5, Interesting)
That 10% becomes a lot more important going forward. The current buzzword in the semiconductor industry is dark silicon. To keep within the same thermal envelope (power dissipation per unit area), you need to have more transistors idle and in a low-power state in every subsequent generation. If you add complex vector instructions, for example, they're great because they give a big speedup when they're in use and draw almost no power when they aren't. The same with things like AES encryption. The instruction decoder, however, is something that you can't ever turn off. Xeons try to: they cache decoded micro-ops in tight loops, but this means they have some extra SRAM for the micro-op cache and a micro-op decoder that must always be powered; between them these take more power than an ARM decoder, and the chip still carries a big fat x86 decoder that must be active all of the time.
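The dark-silicon constraint can be put into a toy model. Assuming, purely for illustration, that transistor count doubles per process node while power per transistor only improves by about 1.4x (both scaling factors are assumptions, not measured numbers), the fraction of the die that must sit dark grows every generation:

```python
def dark_fraction(generations, density_scale=2.0, power_scale=1.4):
    """Fraction of transistors that must be powered off to stay within a
    fixed thermal envelope after `generations` process nodes, if the
    transistor count grows by density_scale per node but power per
    transistor only drops by power_scale per node."""
    active_fraction = (power_scale / density_scale) ** generations
    return 1.0 - active_fraction

# With these assumed factors, ~30% of the die is dark after one node,
# ~51% after two, and the fraction keeps climbing.
for n in range(4):
    print(n, round(dark_fraction(n), 2))
```

Under this model, transistors spent on always-on blocks like the decoder get more expensive every node, which is exactly the point being made above.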
x86 had an advantage over the RISC architectures (and Itanium) in terms of instruction density, which meant those architectures needed to waste a lot more die space on instruction cache than x86 to get the same fetch performance. But ARM is already about as dense as x86, and Thumb-2 is typically 5-10% denser, so Intel is on the losing side of this comparison for the first time.
The process advantage is something that Intel has had over AMD, but they don't have it to the same degree over some of the foundries that produce ARM chips. Intel is on 22nm, and the faster ARM SoCs are made on a 28nm process: that's nowhere near the kind of process advantage Intel is accustomed to. In terms of fab R&D, the industry is almost split into two camps: Intel on one side, and everyone else pooling resources on the other.
Re: (Score:2)
I disagree.
In general, as CPU cores and their associated cache structures get bigger and more powerful, the proportion of transistors devoted to instruction decoding goes down. So the complexity of decoding becomes less important, and the density of the code becomes more important (because it means you can fit more code in cache). AFAICT x86 does pretty well at instruction density. 32-bit x86 is register starved, but 64-bit x86 doubles the register count (making it the same as 32-bit ARM).
Re: (Score:3)
ARM has great instruction density... but it doesn't really matter, because you can power an ARM core plus an instruction decompressor with less power than you need for an x86 core, and compressed instructions have much better density than x86, whatever the source architecture is.
Anyway, the transistors in cache use much less energy than the ones in the core. As cache gets bigger, the core becomes relatively less important, but not as quickly as you imply.
Re:Long term (Score:5, Interesting)
Meanwhile I wonder which ARMs even have instructions like divide or reciprocal square root.
All right, let's compare more (Score:2, Insightful)
If you measure miles per hour, the horse will win. If you measure miles per hour per calorie, the snail will win. Now, which one would you like to pull your next carriage?
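The point of the analogy is that the winner depends entirely on the metric you pick. With made-up numbers for speed and energy burn (pure illustration, not real zoology), the ranking flips between miles-per-hour and miles-per-calorie:

```python
# Entirely hypothetical figures, just to show the ranking flip.
movers = {
    "horse": {"mph": 30.0, "cal_per_hour": 15000.0},
    "snail": {"mph": 0.03, "cal_per_hour": 0.5},
}

def miles_per_hour(m):
    # Raw speed: the "performance" metric.
    return m["mph"]

def miles_per_calorie(m):
    # Distance per unit of energy: the "efficiency" metric.
    return m["mph"] / m["cal_per_hour"]

fastest = max(movers, key=lambda k: miles_per_hour(movers[k]))
thriftiest = max(movers, key=lambda k: miles_per_calorie(movers[k]))
print(fastest, thriftiest)  # different winners for different metrics
```

Swap "horse" and "snail" for x86 and ARM and you have the whole benchmarketing argument in miniature.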
Re: (Score:2)
Or which one would like to be on the menu of an up-scale restaurant?
Re: (Score:3)
In France, both.
Re: (Score:2)
If you measure miles per hour, the horse will win. If you measure miles per hour per calorie, the snail will win. Now, which one would you like to pull your next carriage?
A Nissan 370GT uses 11 L/100 km; a Nissan Micra uses 6.5 L/100 km. Both will do the same job, but the Micra won't go 0-100 in 6 seconds. So it depends on whether I'm racing or saving fuel.
Re: (Score:3)
Also, getting your SoCs from Intel still means you have a single supplier that could squeeze you in the next contract, especially if they find they really like your SoCs. With ARM, you can shop around for suppliers and keep several if your runs are large enough.
Re: (Score:3, Interesting)
Haswell will be available for $20 in its underclocked ultra-low-power configuration?
Re: (Score:2)
As a former EE, I can attest that the design of ARM is far more elegant than the abomination that is the x86 line. The only reason x86 chips are 'faster' is that they throw more transistors (and electrical power) at working around fundamental design flaws.
ARM could easily be scaled up to surpass x86 performance if power were no longer a factor, but it is, so you won't see that happening.
Backwards compatibility with existing 'enterprise' apps, well you do have me there.
Re: (Score:2)
You get into RISC vs CISC arguments here. There are only sample quantities of 64-bit ARM at this juncture, so there's really no good reference model once you start going in that direction with ARM.
Intel isn't paranoid enough to survive, methinks, and will be left behind.
And the scale you'll see is already here; SeaMicro is doing it, and HP will do it if they can get Project Moonshot off the ground.
Transistors are one measure, an important one, but design ease and getting work done is another. Ultimately, power co
Re: (Score:2)
Intel isn't paranoid enough to survive, methinks, and will be left behind.
This is Intel. They are by far and away the most paranoid of all Silicon Valley companies. It's in their DNA. Their unofficial motto is "Only the Paranoid Survive".
Re: (Score:2)
I read the book. I see them not-dominating, not-getting original, making and squandering acquisitions, and generally living off the oil well in the basement called x86.
Tablets, smartphones, device control, embedded systems, graphics subsystems, all go some place called: not Intel. Innovation? Snore.
One. Trick. Pony.
Re: (Score:3, Interesting)
Anon Intel Employee here.
Very. Fucking. Paranoid
Re-look at some of the acquisitions from an "If we buy it, you can't" perspective and see if they make more sense.
Re:Long term (Score:4, Insightful)
Re: (Score:3)
Long Term?
Apple should acquire AMD, and shift them to being primary supplier and ARMs dealer.
They will go for a song, while Apple has the highest market cap in history.
That's kind of stupid (Score:2, Insightful)
That's rather dumb considering AMD doesn't actually manufacture chips any more.
If AMD did make chips, Apple could get nice GPUs and license ARM cores. But then if AMD hadn't dropped the ball on manufacturing they might still be a viable company.
Re: (Score:3)
Why isn't Apple constructing their own chip fab?
That's actually a damn good question.
My only guess is that it still makes more economic sense for Apple to outsource chip manufacturing to a third party. Brand new fabs are about as far from cheap as it gets: on the order of $30,000-$50,000 per square foot of cleanroom for the factory alone, including the actual construction cost and tooling, but not counting the labor and engineering support needed to run the factory, and not counting annual upkeep (which probably approaches 10% of the