
Apple, ARM, and Intel 246

Posted by Soulskill
from the dysfunctional-courtship dept.
Hugh Pickens writes "Jean-Louis Gassée says Apple and Samsung are engaged in a knives-out smartphone war. But when it comes to chips, the two companies must pretend to be civil because Samsung is the sole supplier of ARM-based processors for the iPhone. So why hasn't Intel jumped at the chance to become Apple's ARM source? 'The first explanation is architectural disdain,' writes Gassée. 'Intel sees "no future for ARM," it's a culture of x86 true believers. And they have a right to their conviction: With each iteration of its manufacturing technology, Intel has full control over how to improve its processors.' Next is pride. Intel would have to accept Apple's design and 'pour' it into silicon — it would become a lowly 'merchant foundry.' Intel knows how to design and manufacture standard parts, but it has little experience manufacturing other people's custom designs or pricing them. But the most likely answer to the Why-Not-Intel question is money. Intel meticulously tunes the price points for its processors to generate the revenue that will fund development. Intel's published prices range from a 'low' $117 for a Core i3 processor to $999 for a top-of-the-line Core i7 device. Compare this to iSuppli's estimate for the cost of the A6 processor: $17.50. Even if more A6 chips could be produced per wafer — an unproven assumption — Intel's revenue per A6 wafer start would be much lower than with their x86 microprocessors. In Intel's perception of reality, this would destroy the business model. 'For all of Intel's semiconductor design and manufacturing feats, its processors suffer from a genetic handicap: They have to support the legacy x86 instruction set, and thus they're inherently more complicated than legacy-free ARM devices, they require more transistors, more silicon. Intel will argue, rightly, that they'll always be one technological step ahead of the competition, but is one step enough for x86 chips to beat ARM microprocessors?'"
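Gassée's revenue-per-wafer point can be made concrete with some back-of-the-envelope arithmetic. The per-chip prices below come from the summary; the wafer size, die areas, and perfect-yield assumption are purely illustrative guesses:

```python
import math

def gross_dies(wafer_diameter_mm, die_area_mm2):
    """Rough gross-dies-per-wafer estimate (ignores edge loss and yield)."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area / die_area_mm2)

WAFER_MM = 300            # standard 300 mm wafer
A6_DIE_MM2 = 97           # assumed A6 die area, roughly ~97 mm^2
CORE_I3_DIE_MM2 = 120     # illustrative x86 die area (a guess)

a6_revenue = gross_dies(WAFER_MM, A6_DIE_MM2) * 17.50     # iSuppli's A6 estimate
i3_revenue = gross_dies(WAFER_MM, CORE_I3_DIE_MM2) * 117  # Intel's low-end list price

print(f"A6 revenue per wafer: ${a6_revenue:,.0f}")
print(f"i3 revenue per wafer: ${i3_revenue:,.0f}")
print(f"ratio: {i3_revenue / a6_revenue:.1f}x")
```

Even granting the A6 more (smaller) dies per wafer, the per-chip price gap dominates, which is the summary's point about wafer-start revenue.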
This discussion has been archived. No new comments can be posted.

  • Complicated Story (Score:2, Insightful)

    by TaoPhoenix (980487) <TaoPhoenix@yahoo.com> on Monday October 22, 2012 @06:19PM (#41734465) Journal

    I can't find the angle here.

    "Legacy Free" vs "Costs".

    "Legacy Free" is a nice sounding term for "won't run $hit". So much for your 1,000 app and app-lets you rely on, Business.

    So I give up on this story and will let the rest of y'all thrash it out.

  • Re:Long term (Score:5, Insightful)

    by Marillion (33728) <ericbardes AT gmail DOT com> on Monday October 22, 2012 @06:36PM (#41734697)
    If you measure operations per second, the x86 chip will win. If you measure operations per second per watt, the ARM chip will win.
  • by yayoubetcha (893774) on Monday October 22, 2012 @06:56PM (#41734861)

    Most people think Intel pumps out chips. While this is true, Intel's real business is mass-producing cash. Their fabs are, in all ways that matter, currency printing factories.

    Say what you want about x86 technology or bash Intel because they're not 'cool', but what they do better than ANYBODY is mass produce chips that have pretty sweet financial margins.

    Intel is also not afraid of making dramatic changes. They actually started the DRAM business by creating the first commercial DRAM chip, the 1103. They made big bucks with this business, but in the mid-'80s Japanese competitors were "dumping" DRAM into the market, causing Intel's margins to evaporate.

    Intel was faced with an enormous business problem. Andy Grove decided, against many voices on the board, to abandon the business that made them a success: DRAM. Instead they focused on another business: the x86 microprocessor market. This was a huge risk, and the outcome was far from certain.

    Many are criticizing Intel for not getting on board the mobile bandwagon in a big way. I know Intel quite well. I think what we are currently witnessing in the microprocessor market is Intel's version of Ali's "rope-a-dope".

    One thing is true, and Intel knows it.... only the paranoid survive.

  • by Chemisor (97276) on Monday October 22, 2012 @07:00PM (#41734901)

    If you measure miles per hour, the horse will win. If you measure miles per hour per calorie, the snail will win. Now, which one would you like to pull your next carriage?

  • by epine (68316) on Monday October 22, 2012 @07:06PM (#41734963)

    The old 80386 based on the "complex" x86 instruction set had 275,000 transistors. Intel is now making chips with 2.6 billion transistors and somehow what they once implemented as one functional unit within a budget of 0.000275 billion transistors is holding them back?

    Certainly they would rather do a few things differently had they been worried about 2013 back in 1978. Transistor count is the least of the matter. What buggers up x86 is the number of active transistors handling the instruction stream at each instruction cycle. There's no way to align variable-length instructions without active transistors (regardless of whether the transistors involved amount to a wart on a small toe of a juvenile mosquito).
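    The serial dependency is the crux of epine's point: with variable-length instructions you cannot know where instruction N+1 starts until you have at least length-decoded instruction N, while a fixed-width ISA knows every boundary up front. A toy sketch (two hypothetical opcodes, vastly simpler than real x86 length decoding):

```python
# Toy ISA: one 1-byte opcode and one 5-byte opcode (opcode + imm32).
# These opcode values are illustrative, not a real x86 subset.
LENGTH = {0x90: 1, 0xB8: 5}

def align_instructions(code: bytes):
    """Return each instruction's start offset -- inherently serial."""
    offsets, pc = [], 0
    while pc < len(code):
        offsets.append(pc)
        pc += LENGTH[code[pc]]   # the next boundary depends on this decode
    return offsets

stream = bytes([0x90, 0xB8, 1, 0, 0, 0, 0x90])
print(align_instructions(stream))   # -> [0, 1, 6]

# A fixed-width (RISC-style) 4-byte ISA finds all boundaries in parallel:
print(list(range(0, len(stream) - len(stream) % 4, 4)))   # -> [0, 4]
```

    In hardware the loop becomes a cascade of length decoders in front of the dispatch stage: the "active transistors" the comment is talking about.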

    The x86 story bugs the hell out of me. Considering how well it has actually held up for some 34 years (and counting), it's one of the ugly duckling success stories of all time (hint: it wasn't so ugly after all).

    It was also a founding member of the Steve Jobs reality distortion field. I'm concerned his posthumous aura will continue to glow with the uplift of falsehood. He should be credited more for what he accomplished than the lies he polished to get there.

    It wasn't just Steve, it was the entire RISC consortium manufacturing an Achilles heel out of whole cloth. Far closer to the truth of the matter is that x86 has a much higher design cost than an orthogonal clean-sheet alternative. The design cost was a small multiple. Intel's resources were a large multiple. It didn't go well for RISC. The much vaunted DEC Alpha had a metal connect layer for single-cycle carry-add propagation that forever segregated it from the mass-consumer price point. It was the instruction set. No, it was the instruction set aided by a titanium stent.

    Also, the RISC design advantage does not extend to the memory cache and system bus design. These are a bear to design well for any instruction set. The RISC people moaned about the exceptional Pentium Pro performance level on server workloads (it was the first memory bus from Intel that didn't totally suck). Well, Intel broke into the server market with their crappy old x86 instruction set by grafting it onto a titanium alloy cache hierarchy and bus controller (with multiple dies grafted into the same chip package at enormous expense). Since around this time, cache latency and branch prediction have absolutely dwarfed the instruction set as the big things to worry about. If Steve hadn't grabbed onto the inferiority of CISC around this time, it might have died a timely death.

    In low power applications, ARM has a real advantage, enough to win a huge market share at race-to-the-bottom price points. How much does the cost of a CPU influence a handset? How much everything else? I've put $300 Intel CPUs in $2000 boxes. I've put $250 Intel CPUs in $1000 boxes. I've put $60 CPUs in $500 boxes. A $16 CPU in a phone that retails for $600 for just a few months, before landing in the discount bin? I'm sure Intel wants a huge slice of that.

    One reason Intel has held their ground is that the Cortex-A15 (out-of-order superscalar multiprocessor) is starting to look a lot like the old Pentium Pro. Sure the instruction set is modern and clean (though it took ARM surprisingly long to come up with the mixed 16/32 bit instruction encoding format due to misguided ideological purity; how many active transistors does it take to determine whether the next 32 bit chunk from the instruction stream is one lump or two? More or less than the number of active transistors in the icache devoted to storing common instructions bloated to 32 bits just because?). But all the rest of the issues are pretty much the same: branch prediction stalls, cache snooping, and memory path latency.
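    The "one lump or two?" question has a concrete answer in ARM's mixed encoding: modeled on Thumb-2, the top five bits of the first 16-bit halfword say whether a second halfword belongs to the same instruction. A minimal sketch of that check (the two sample encodings are common Thumb instructions; treat the exact values as assumptions to verify against the architecture manual):

```python
def is_32bit(first_halfword: int) -> bool:
    """Thumb-2 style rule: top 5 bits 0b11101/0b11110/0b11111 => 32-bit."""
    return (first_halfword >> 11) in (0b11101, 0b11110, 0b11111)

print(is_32bit(0x4770))  # BX lr, a 16-bit instruction -> False
print(is_32bit(0xF000))  # first halfword of a 32-bit BL  -> True
```

    A five-bit compare per halfword is cheap, which is the comment's point: the decode cost of a mixed 16/32-bit format is tiny next to the icache footprint saved by not bloating every instruction to 32 bits.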

    From Intel's perspective, an ugly instruction set is good for business. (Then they went on a jag thinking that if ugly is good, atrocious is better, and the Itanium was hatched with a jackhammer from a mastodon egg.)

    After another three die shrinks, when half the processor implements on-demand power management, and most of the other half provides task-specialized execution units, is the instruction set going to matter a hill of beans for anything other than legacy lock-in?

  • by steveha (103154) on Monday October 22, 2012 @07:20PM (#41735071) Homepage

    Intel wants to be the only company that can meet your needs. That way, they can make you pay premium prices for their chips. This is perfectly understandable; that is what is best for Intel.

    Apple wants to be vertically integrated. They want full control over everything they do. Partly this is so they can keep as much as possible of the money they collect; partly this is so that they can guarantee excellent quality and excellent availability. This is what is best for Apple, and it isn't bad for their customers either.

    Intel does not want to become just another ARM source, competing on price with all the others. But Apple will never lock themselves in to depending on Intel for mobile chips, when ARM chips have been shown to be more than adequate. And Apple would not be investing in custom ARM chips if it was planning to adopt Intel mobile chips.

    People keep pointing out that Intel's mobile x86 chips are competitive with ARM. That won't cut it. Intel's chips would have to be better, and so much better that the risk of depending on Intel is worth it.

    That was the case for the PowerPC to x86 transition! Intel's chips were so much better than PowerPC for laptops that it was worth getting into an entangling relationship with Intel. AMD was not able to guarantee delivery of the massive quantities of chips Apple was planning to sell, and Intel was, so AMD wasn't really an option... but at least they served to keep Intel from trying to charge totally outrageous prices for their chips; there was always a credible threat of going to AMD.

    Hmm. It's looking like AMD is going to crater in spectacular fashion soon. I wonder if Apple will make a serious attempt to buy what's left of the company. That would enable Apple to make its own x86 chips! Eh, probably not. AMD is behind Intel on process, so switching to AMD chips would mean taking a hit on performance, power use, or both.

    The "SemiAccurate" web site thinks that Apple will transition to using ARM chips for laptops [semiaccurate.com], not just for mobile devices, once ARM chips are good enough (which they will be soon). So, transitioning away from x86 and to, say, multi-core 64-bit ARM chips is another way Apple can untangle from Intel.

    Apple may not be in a big hurry to actually complete the transition away from Intel chips; just a credible threat of switching to ARM chips might be enough to negotiate good prices on x86 chips. That would leave lower power consumption as the main reason to go to ARM, but a laptop's display is probably the worst power drain, especially with a Retina display.

    steveha

  • by steveha (103154) on Monday October 22, 2012 @07:51PM (#41735335) Homepage

    Also note that Apple has people paying $2500 and up for the Mac Pro, and $1000 and up for laptops. But mobile devices are closer to $500, and the Android competition is hitting the $200 price point.

    There just isn't as much room to pay top dollar prices for Intel parts in the mobile space.

    So even if Intel mobile x86 parts are slightly faster than the ARM chips, will Intel be happy selling at prices competitive with ARM prices? History suggests "no". The cheapest Atom chips are around $20 but Intel makes those suck, just as much as Intel can get away with.

    Intel is the master of segmenting markets. Different chips at different price points have different features enabled. Cheaper chips are as crippled as possible, to encourage you to buy a more expensive chip. For example, Intel doesn't support virtualization features on their less-expensive chips; and Intel mostly reserves support for ECC RAM to only the Xeon processors.
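    On a Linux box you can see exactly which of these segmented-off features a given part exposes by inspecting its CPU flags. A minimal sketch, parsing a hard-coded sample flags line (on a real system you would read /proc/cpuinfo instead):

```python
# Sample "flags" line of the sort /proc/cpuinfo reports (illustrative).
sample_cpuinfo_flags = "fpu vme de pse tsc msr pae mce sse sse2 vmx est ssse3"

flags = set(sample_cpuinfo_flags.split())
print("Intel VT-x present:", "vmx" in flags)  # Intel's virtualization extension
print("AMD-V present:     ", "svm" in flags)  # AMD's equivalent flag
```

    On a crippled low-end part the `vmx` flag simply never appears, even when the silicon is physically identical to a pricier SKU.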

    (In contrast, AMD puts full functionality in all their parts; they are #2 and they are trying harder [follisinc.com] to please the customer. That is how you can get an HP Proliant MicroServer with a 1.5 GHz dual-core AMD Turion processor for $320 at Newegg [newegg.com], with full support for virtualization and ECC RAM. I cannot imagine a MicroServer with equal or better Intel parts hitting that price point.)

    Intel will try to balance the functionality it allows into the mobile chips against the price it can get. Apple just wants the best chips for the cheapest price. These two goals are not in alignment.

  • by drinkypoo (153816) <martin.espinoza@gmail.com> on Monday October 22, 2012 @08:29PM (#41735669) Homepage Journal

    Right, I confused the Itanium instruction set with whatever Intel brands its 64-bit ISA.

    Just remember, ia64 == iTanic == shit sandwich, amd64 is where it's at. Which is why it's so heartbreaking to see AMD so far to the rear in terms of performance today.

  • by Anonymous Coward on Monday October 22, 2012 @08:40PM (#41735755)

    Apple should acquire AMD, and shift them to being primary supplier and ARMs dealer.

    That's rather dumb considering AMD doesn't actually manufacture chips any more.

    If AMD did make chips, Apple could get nice GPUs and license ARM cores. But then if AMD hadn't dropped the ball on manufacturing they might still be a viable company.

  • Re:Long term (Score:4, Insightful)

    by Shavano (2541114) on Monday October 22, 2012 @08:44PM (#41735791)

    What a bold prediction; you understand, of course, that Intel has buried every single competing architecture from the past? Intel has a process advantage: even if they have to spend 10% of their die on decoding/rearranging, they still have a significant transistor lead by remaining a process ahead AND still use less power.

    We'll see if that really will happen. But it's very true about the process advantage. If Intel chose to build an ARM CPU, it would blow the doors off all the other ARMs because they can build faster/smaller/lower-leakage transistors. But it might be more expensive, and the ARM market is very price sensitive. I imagine there are Intel people who have considered this carefully and concluded that they can't build an ARM processor that's cheaper AND better than the competitors.

  • Re:Long term (Score:4, Insightful)

    by PlusFiveTroll (754249) on Monday October 22, 2012 @08:48PM (#41735827) Homepage

    You are forgetting two non-technical matters.

    A: Why let Intel 'win' so they can turn around and stuff higher prices up your backside? Single-supplier markets are best for the supplier.

    B: Anti-trust. Once you start taking over entire markets, all of a sudden you have even more government flashlights up your ass. Only a dumb person wants total control unless he has TOTAL CONTROL; otherwise powers stronger than you take too much interest in your daily work.

  • by PPH (736903) on Monday October 22, 2012 @09:58PM (#41736435)

    ... the politics between Apple, Samsung, Microsoft, Intel, ARM and others, you should be working at the United Nations on a solution for eternal world peace.

    There is so much sub rosa crap (not all of it ethical or legal) going on between the players we may never know the truth.

  • by jader3rd (2222716) on Tuesday October 23, 2012 @01:26AM (#41737823)

    It doesn't give you a smaller desktop. You have an app for your phone, a different app for your desktop, and yet another app for your tablet. Windows Phone and Windows RT are orphans. Do you see what I'm saying now?

    Yes, I think I get what you're saying. Thank you for your clarification; perhaps I can make some of my own.
    When iOS was first released (2007) it didn't run any sort of legacy programs from existing touch screen smartphones, and yet people really seemed to enjoy using those devices. When Android was first released, and was still just a phone OS, people seemed to enjoy using those devices. When the iPad first came out it could run existing iPhone apps; but even Apple said that the experience for many apps wasn't good and that app authors needed to (and still do) optimize their apps for the larger screen. There's hardly an article on the web about Android tablets that doesn't mention how there's a paucity of tablet-optimized apps, and that while the tablets can run all Android apps, most of them really suck on the tablet. So I don't think that not having the ability to run legacy programs is a nail in the coffin for any new device/platform.

    So right now a Windows RT computer does have limited appeal, due to not having much of an ecosystem. That was kind of my point by saying "3 years from now", the ecosystem may come. The ecosystem may come easily because if developers write apps for the Windows 8 Store (targeting desktop) the apps will also light up on Windows RT; and they will light up without any "tablet optimization" step that iOS and Android apps suffer from. The screen sizes are the same. So I think that there will be crossover soon, at least from the point of view of the end user; because they'll be able to find the same apps on a Windows RT computer as they do on their Win8 laptop/desktop.

    If RT could run Phone apps, I wouldn't be typing right now.

    Yes, Microsoft doesn't have a runtime that runs across Win32, .Net, WinRT, Phone and Xbox. But they do have portable assemblies [microsoft.com] which do allow for an assembly to run within .Net, WinRT, Phone and Xbox. So someone would still need a separate app/program for the different runtimes, but if the business logic is the same for all of the apps there only needs to be one portable assembly.

  • Re:Long term (Score:4, Insightful)

    by slashping (2674483) on Tuesday October 23, 2012 @05:32AM (#41739033)
    Except that for ARM to scale up to surpass x86 performance, it would need out-of-order scheduling, superscalar execution, instruction pre-decoding, hardware loop unrolling, speculative execution, memory/register renaming, multiprocessor cache coherency and all that fun stuff that Intel has. After that has been implemented, let's see if the design is still far more elegant. I'll bet it's going to look pretty similar, with x86 instruction decoding becoming an almost irrelevant issue compared to all the other things. And, by the way, have you taken a look at the Cortex architecture? It's getting less and less elegant. You are right about the compatibility, though. ARM is a mess, with a dozen different, incompatible architectures in the core alone.

"Our vision is to speed up time, eventually eliminating it." -- Alex Schure

Working...