
How a Decision by Apple 15 Years Ago Hurts Intel Now (scmp.com)

Last month Intel's stock lost $50 billion in valuation — while the valuation for Taiwan-based TSMC jumped by over 50%.

The former chief of staff to Intel CEO Andrew Grove (and later general manager of Intel China) explains why this moment was 15 years in the making: Learning curve theory says that the cost of manufacturing a product declines as the volume increases. Manufacturing chips for the whole computer industry gave Intel huge advantages of scale over any other semiconductor manufacturer and resulted in the company becoming the world's largest chip manufacturer with enviable profit margins.
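The learning-curve effect described here is often modeled as Wright's law: unit cost falls by a fixed percentage each time cumulative volume doubles. A minimal sketch of that relationship (the 20% learning rate and $100 starting cost are illustrative assumptions, not figures from the article):

```python
import math

def unit_cost(cumulative_volume, first_unit_cost=100.0, learning_rate=0.20):
    """Wright's law: cost falls by `learning_rate` with each doubling of
    cumulative volume. The exponent is b = log2(1 - learning_rate)."""
    b = math.log2(1 - learning_rate)
    return first_unit_cost * cumulative_volume ** b

# A manufacturer shipping 10x or 100x the volume of a rival ends up
# meaningfully cheaper per unit, all else being equal.
for vol in (1, 10, 100, 1000):
    print(f"{vol:>5} units -> ${unit_cost(vol):.2f}/unit")
```

Under these assumptions, each 10x increase in cumulative volume roughly halves unit cost, which is the scale advantage the article says Intel enjoyed and TSMC later captured.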

Chaos theory says that a small change in one state of a system can cause a large change in a later stage. In Intel's case, this was not getting selected by Apple for its iPhones. Each successive era of computing was 10x the size of the previous era, so while Intel produced hundreds of millions of microprocessors per year, the mobile phone industry sells billions of units per year. Apple's decision in 2005 to use the ARM architecture instead of Intel's gave Taiwan-based TSMC, the foundry chosen to manufacture the processor chips for the iPhone, the learning curve advantage which over time enabled it to pull ahead of Intel in manufacturing process technology.

Intel's integrated model, its competitive advantage for decades, became its vulnerability. TSMC and ARM created a tectonic shift in the semiconductor industry by enabling a large number of "fabless" chip companies such as Apple, AMD, Nvidia and Qualcomm, to name a few. These fabless companies began to out-design Intel in the mobile phone industry and accelerated TSMC's lead over Intel in high volume manufacturing of the most advanced chips. Samsung, which also operates a foundry business, has been another beneficiary of this trend.

  • by rahvin112 ( 446269 ) on Sunday August 23, 2020 @02:54PM (#60432895)

    That is the silliest premise I've read in years with no connection to reality at all.

    Intel not getting to supply Apple's cellphone chips has nothing to do with their stock price today or 15 years ago. The entire premise is based around a supposition with no evidence to support it.

    This is not evidence based, it's a made up fiction.

    • by knarf ( 34928 ) on Sunday August 23, 2020 @03:05PM (#60432929)

      There is a very simple and concise explanation for this parable: those of the true faith have a fiduciary duty to explain current and past events in ways which lead to the true faith. No more, no less.

    • The worst smartphone I ever owned was the Asus ZenFone. It had an Intel processor. It was unreliable and died faster than the typical disposable technology. I think this article just highlights the fact that Intel missed the boat on producing cheap, quality, underpowered processors for the designed-obsolescence smartphone market.
      • That's nothing. I had a Microsoft PDA phone running some crappy Windows CE, and it needed a stylus! Lol, the UI was basically Windows with a Start menu, not designed for mobile use at all. I think it was the Samsung i730 or something. Took a walk on the beach with it, and she never turned back on. I guess the salt air killed it. That was a good $700 down the drain.
    • by UnknowingFool ( 672806 ) on Sunday August 23, 2020 @03:20PM (#60432981)
      It also wasn't an Earth-shattering decision by Apple to use ARM-based processors for their portable, battery-operated devices. Intel x86 chips, while more power efficient today, are still not close to being as power efficient as ARM chips. 15 years ago the power problem was worse. There are rumors that the first Apple iPad prototypes were Intel Atom based but were considered too power hungry by Apple.
      • by K. S. Kyosuke ( 729550 ) on Sunday August 23, 2020 @03:44PM (#60433057)
        "15 years ago"? Apple decided to use ARM in mobile devices thirty years ago.
        • Yuppers... (Score:3, Informative)

          by Anonymous Coward

          From Wikipedia https://en.wikipedia.org/wiki/... [wikipedia.org]

          "Apple first utilized the ARM architecture in 1993 in its Newton personal digital assistant"

        • I never said the first Apple device to use ARM appeared 15 years ago. The development of the iPhone and iPad specifically started 15 years ago, and Apple had to decide whether to use ARM or Intel.
          • You literally said "decision by Apple to use ARM based processors for their portable, battery operated devices", which is a decision that took place around 1990, which is around 30 years ago.
            • Did you read the title of this thread, which literally says "How a Decision by Apple 15 Years Ago Hurts Intel Now"? How about the context of my reply, where I literally said "15 years ago the power problem was worse. . ."? Did you?
              • I wasn't the one who wrote the erroneous title.
                • No, you are the one that can't be bothered to read past one sentence or to understand context. This is the full text:

                  It also wasn't an Earth-shattering decision by Apple to use ARM-based processors for their portable, battery-operated devices. Intel x86 chips, while more power efficient today, are still not close to being as power efficient as ARM chips. 15 years ago the power problem was worse. There are rumors that the first Apple iPad prototypes were Intel Atom based but were considered too power hungry by Apple.

                  I didn't write a 7,000-word treatise. It was one paragraph. You couldn't be bothered to read one paragraph before you went off half-cocked. You couldn't be bothered to read the summary.

                  • I have read both, and history still shows that the writers of both missed the point completely. Someone else already explained here that ARM wouldn't be the company we know today if it hadn't been for Apple 30 years ago.
      • by AmiMoJo ( 196126 )

        It's not like Intel could just have pulled a competitive design out of its arse either; those things take many, many years to develop, and even current Intel stuff is way off the pace compared to ARM. Their only real selling point is that they are x86.

    • by phalse phace ( 454635 ) on Sunday August 23, 2020 @03:30PM (#60433011)

      That is the silliest premise I've read in years with no connection to reality at all.

      Intel not getting to supply Apple's cellphone chips has nothing to do with their stock price today or 15 years ago. The entire premise is based around a supposition with no evidence to support it.

      This is not evidence based, it's a made up fiction.

      If Intel had been successful in the mobile chip area, they would have been able to diversify their revenue stream away from the microprocessors that power personal computers... where they make a lot of their money.

      Client Computing Group (CCG)

      2014 = $33.210 billion
      2015 = $30.654 billion
      2016 = $32.2 billion
      2017 = $34 billion
      2018 = $37 billion
      2019 = $37.1 billion

      Their CCG group is hardly growing, and their Data Center Group (DCG) isn't doing all that great either.

      Total DCG revenue

      2014 = $14.387 billion
      2015 = $15.997 billion
      2016 = $15.981 billion
      2017 = $19.1 billion
      2018 = $23.0 billion
      2019 = $23.5 billion

      Intel's stock price is where it's at because there's hardly any growth.

      Processors and modems for smartphones are a multi-billion-dollar-a-year business which Intel handed over to Arm Holdings and Qualcomm. Remember that Intel sold their modem business to Apple for $1 billion and took a multi-billion-dollar loss on it. Intel spent billions on R&D for years with nothing to show for it.

      Intel Lost Billions Selling Its Modem Business to Apple [fool.com]

      "In July 2019, Intel sold most of its modem business to Apple at a multi-billion dollar loss," according to Intel. Chipzilla had been investing as much as $4 billion per year on mobile research and development, while the mobile business lost an estimated $16 billion between 2011 and 2018, according to Strategy Analytics.

      How many billions in potential sales did Intel miss out on?

      • by dgatwood ( 11270 ) on Sunday August 23, 2020 @03:40PM (#60433041) Homepage Journal

        Processors and modems for smartphones are a multi-billion-dollar-a-year business which Intel handed over to Arm Holdings and Qualcomm. Remember that Intel sold their modem business to Apple for $1 billion and took a multi-billion-dollar loss on it. Intel spent billions on R&D for years with nothing to show for it.

        Intel Lost Billions Selling Its Modem Business to Apple [fool.com]

        "In July 2019, Intel sold most of its modem business to Apple at a multi-billion dollar loss," according to Intel. Chipzilla had been investing as much as $4 billion per year on mobile research and development, while the mobile business lost an estimated $16 billion between 2011 and 2018, according to Strategy Analytics.

        How many billions in potential sales did Intel miss out on?

        Don't forget about Intel selling off XScale (its ARM chip line) to Marvell in 2006. Sure, Apple chose Samsung's chips initially, but Intel could have competed with them to try to build better ARM chips and get into a future generation of Apple products. Instead, they sold off their ARM manufacturing in 2006. Apple didn't buy PA Semi, which presumably marked the point where they started developing their own chips, until two years later, in 2008, and didn't start actually shipping any chips built for them by TSMC until 2010.

        So no, it was not Apple's decision in 2005 to use a Samsung ARM chip for the iPhone CPU that screwed up Intel's ability to compete, but rather Intel's own failure to take the market seriously.

        • by Pieroxy ( 222434 )

          So no, it was not Apple's decision in 2005 to use a Samsung ARM chip for the iPhone CPU that screwed up Intel's ability to compete, but rather Intel's own failure to take the market seriously.

          Totally. And now, if Apple's claims (ARM is powerful enough for desktop + performance per watt is much better than x86) are true - and so far nothing points in the other direction - ARM might make a sizeable dent in Intel's territory. Apple is 15% of all desktops/laptops sold; that's already quite a dent.

          And with so many ARM chips in the wild, it's only a matter of time before it happens on the servers market.

          • by dgatwood ( 11270 )

            It's almost a given that ARM will eventually eat Intel's lunch. Intel dragged out the RISC-versus-CISC battle far longer than anybody expected by basically building a RISC core and wrapping it in CISC instruction cracking logic, but it was always only a matter of time before RISC won.

            The only thing disappointing is that these days, the ARM cores are all running little endian. :-)

            That said, Apple has a really long way to go before their ARM hardware will be a serious contender. The ARM Mac Mini hardware i

            • by Pieroxy ( 222434 )

              They seem to be saying that the current Mac Mini they gave devs to play around with is not even close to what they are preparing in terms of performance.

              • by dgatwood ( 11270 )

                Sure. The real hardware will have to have per-core speeds at least half again higher just to match the production Mac Mini, and it would need half again more cores. I totally expect Apple to build a 6-core design that would be adequate to serve the needs of 90% of their Mac line, and maybe even an 8-core design that will take care of the next 9.9% of their Mac line. Heck, the rumors of a 12-core chip might even be right. Maybe. But that's seriously pushing it. The more cores, the more complex

                • by dgatwood ( 11270 )

                  And just as a data point, multicore benchmarks on the 80-core Ampere Altra reportedly show it being about 2.3x as fast (at least in integer performance) as that top-end 28-core Xeon in the current Mac Pro. Presumably the 128-core Altra stomps it into the ground.

                  It's unclear how the floating-point performance compares. Then again, Apple took such a huge step backwards in floating-point performance when they switched to Intel that it took a full seven years before they were building something that could ac

      • But it wasn't a decision by Apple, per se, it was a decision by (almost) everyone who manufactured mobile devices to not use the crappy mobile chips that Intel put out.

        Also, I think this is a buzz-compliant misapplication of chaos theory, in that it is entirely foreseeable (really, almost predetermined) that failing to get in near the ground-floor of a growing segment makes it more difficult to break into the maturing market.

        The issue isn't learning-curve theory or chaos-theory, it's Intel's view of their b

      • Intel lost that market because they didn't have power-efficient chips. It wasn't a "decision by Apple". A dozen other phone manufacturers looked at the same shitty chips and came to the same conclusion. The decision that cost Intel was Intel's.

      • Your numbers aren't apples to apples, if you will. Gross margin and margin percentage are what drive the semiconductor industry. Moving into cell phone chips would have eroded both and likely resulted in lower stock prices overall.

        There is a good reason Intel never took these markets seriously, and it's that doing so would have eroded their margins from nearly 50% to much closer to the around 10% you see on mobile chips. Take that into account with much lower selling prices and you erode both profit and margi

      • Those numbers mean you should be on the board of Intel, and the old guard removed. Goldman Sachs should now place a sell recommendation. To be fair, Intel has a lot of money squirreled away in tax-free havens and enjoyed Trump's buybacks - meaning going forward will be difficult. The mystery is how TSMC came to be #1. I thought most of the equipment came from Japan, which may explain Intel's assumption that internal pure organic R&D progress was unlikely - (see Huawei), although Samsung was also mak
      • How many billions in potential sales did Intel miss out on?

        The same amount as I did: None. Because I too don't have a viable product, and didn't 15 years ago either.

    • by mschaffer ( 97223 ) on Sunday August 23, 2020 @03:37PM (#60433029)

      Indeed a silly article. Especially considering how ARM was a joint venture between Acorn, Apple, and VLSI formed in 1990 - fifteen years before the scope of the article.

      • It's older than that, even. Before ARM existed as a separate company, Acorn had the ARM architecture working in VLSI Tech-produced silicon back in 1985, and in an actual product (the Acorn Archimedes) two years later.

        • Yes, Acorn's RISC Machine (the ARM architecture) predates that, but Apple wasn't very interested in Acorn's chips until sometime in the Newton development timeframe (1987-1993). The OP was trying to make the case that Apple capriciously decided to use fabless ARM architecture over Intel in 2005 when they were highly instrumental in forming ARM Holdings in 1990.

    • it's a made up fiction.

      As is the entire stock market right now. It is absurd to talk about prices and "valuations" while the basement is being flooded by the federal reserve...

    • Intel's appeal has been that it sells a wide variety of x86 based chips that make development across a broad spectrum of platforms relatively easy. You don't have to re-optimize for different architectures.

      This advantage is lessened when:

      1. Compilers get smart enough that they can optimize well enough for different architectures given a relatively high-level language, like C++. Or, you have optimized virtual machines like Java that mostly do away with the need for heavy user-side code optimization.
      2. CPUs g

    • Apple sells about 215 million iPhones each year.
    I'm curious to hear your explanation of why an order for 200 million CPUs every year isn't a benefit to TSMC in competing with Intel.

    For comparison, about 250 million PCs are sold each year worldwide. Of course half of those are AMD, so the iPhone is more CPUs than all of Intel's PC CPU business.

      Tell me how losing out on over half of the orders doesn't matter.
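Spelling out the arithmetic in this comment as a quick sketch (the figures are the commenter's own claims, which other replies dispute, not verified numbers):

```python
# Figures as claimed in the comment above (annual unit volumes).
iphones = 215_000_000   # claimed iPhones sold per year
pcs = 250_000_000       # claimed PCs sold per year worldwide
amd_share = 0.5         # the commenter's claimed AMD share of PC CPUs

# Under these claims, Intel's PC CPU volume is the non-AMD half...
intel_pc_cpus = pcs * (1 - amd_share)

# ...and iPhone SoC volume alone exceeds it.
print(f"Intel PC CPUs/yr: {intel_pc_cpus:,.0f}")
print(f"iPhone SoCs/yr:   {iphones:,.0f}")
print("iPhone volume exceeds Intel's PC CPU volume:", iphones > intel_pc_cpus)
```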

      • by teg ( 97890 )

        Apple sells about 215 million iPhones each year. I'm curious to hear your explanation of why an order for 200 million CPUs every year isn't a benefit to TSMC in competing with Intel.

        For comparison, about 250 million PCs are sold each year worldwide. Of course half of those are AMD, so the iPhone is more CPUs than all of Intel's PC CPU business.

        Tell me how losing out on over half of the orders doesn't matter.

        AMD claims to have a market share of about 15% [amd.com], not even close to 50%.

        That said, Intel has been very far from successful the last couple of years and has set itself up as a target: they used to be far ahead of the industry in process technology, but their 10 nm transition has just been failure after failure - and they're now behind. Also, they've gotten so used to being dominant that they've put more effort into customer segmentation than actual progress... their product lines are now huge, bloated

      • It's not about the number of actual processors, it's about the number of "wafer starts."

        A 12-inch wafer can contain many more ARM SoCs than a typical x86/x64 CPU or AMD/NVidia GPU.
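As a rough illustration of the wafer-starts point, a common first-order die-per-wafer approximation shows the gap. The die areas below are hypothetical round numbers, not actual product figures:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Common first-order approximation: usable wafer area divided by
    die area, minus an edge-loss term proportional to the circumference."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# Hypothetical sizes: ~100 mm^2 mobile SoC vs ~400 mm^2 desktop CPU.
print("mobile SoC dies/wafer: ", dies_per_wafer(100))   # -> 640
print("desktop CPU dies/wafer:", dies_per_wafer(400))   # -> 143
```

Smaller dies not only fit more per wafer but also lose proportionally less to the wafer edge, so one mobile wafer start yields several times the chips of a desktop-class wafer start.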

      • Because the OP never said it was "good" that Intel isn’t supplying 200 million chips a year. What the OP said is that Intel not being in the cellphone chip industry isn’t the main cause for their current problems.

        • > What the OP said is that Intel not being in the cellphone chip industry isn't the main cause for their current problems.

          What the OP said is:

          >> Intel not getting to supply Apple's cellphone chips has nothing to do with their stock price today or 15 years ago.

          The claim is that missing out on mobile - 80% of processor sales - which funded the growth of TSMC as competition, has "nothing to do" with Intel's position.

          • Again, read carefully: the OP is saying the root cause of their current problems would still have existed even if Apple had decided to use them 15 years ago. Their current problems are issues with chip fabrication on 7nm and 10nm lines. Committing to supply 200 million more chips to Apple would not have solved this current problem of very low yields.

            • > Their current problems are issues with chip fabrication on 7nm and 10nm lines. Committing to supply 200 million more chips to Apple would not have solved these current problem of very low yields.

              First, was that their problem 15 years ago? The statement was that losing several billion dollars in orders 15 years ago didn't affect them 15 years ago - and there is no follow-on effect later.

              The article argues that billions of dollars to TSMC, and the experience they gained making mobile chips, helped TSMC

              • First, was that their problem 15 years ago?

                The rumors and general industry consensus are that Intel x86 was too power hungry for small devices like iPhones. This is still somewhat true today. While Atom is more power efficient than 15 years ago, the ARM architecture is way more power efficient. There were probably secondary reasons, like ARM being much more flexible in configurations, so that Apple can tweak their designs to use slightly different versions to power their iPads and iPhones.

                The statement was that losing several billion dollars in orders 15 years ago didn't affect them 15 years ago - and there is no follow-on effect later.

                First of all, Intel did not "lose" several billion dollars. Intel n

                • >> Is it possible that a couple billion dollars of R&D and additional experience would either a) help Intel to develop better processes,

                  > Again, Intel's problems are not just the design of their processors. Their problem is fabricating processors.

                  I think you read "processors" when I was saying "processes". It seems to me, a few billion dollars would help TSMC develop better fabrication processes. It just seems to me that R&D is kinda important for a fab.

                  > Intel has spent billions in chi

                  • I think you read "processors" when I was saying "processes". It seems to me, a few billion dollars would help TSMC develop better fabrication processes. It just seems to me that R&D is kinda important for a fab.

                    You seem to completely miss the point that whether or not Apple is a customer, TSMC would have still spent billions on 7nm. You seemed to miss the point that every single fab company including Intel has spent billions on R&D.

                    Why the heck would they do that? You just told me that a few billion to spend on R&D doesn't make any difference.

                    What I asked you is how would spending billions more solve their yield issue. You stated that it would. Please provide specifics.

                    If a couple billion of R&D from Apple and other mobiles didn't help TSMC become more competitive, why would Intel waste billions?

                    Again, TSMC's roadmap for 7nm did not include donations from Apple. Again, TSMC was going to build 7nm back in 2014 with or without Apple as TSMC has othe

                • Ps, you keep trying to change what was said. I don't know if you're aware, but on Slashdot you can still see what was posted yesterday. It's not like spoken word where you can say something, then 30 minutes later claim you didn't say it and nobody can tell for sure.

                  > Second, he didn't say there is "no effect".

                  The text is still right up there. What was said is:

                  "Intel not getting to supply Apple's cellphone chips has nothing to do with their stock price today or 15 years ago."

                  "Nothing to do". Missing o

    • THIS IS modern SLASHDOT

      "That is the silliest premise I've read in years with no connection to reality at all. Intel not getting to supply Apple's cellphone chips has nothing to do with their stock price today or 15 years ago. The entire premise is based around a supposition with no evidence to support it. This is not evidence based, it's a made up fiction."
  • by jfdavis668 ( 1414919 ) on Sunday August 23, 2020 @02:59PM (#60432907)
    If Apple hadn't selected it for the Newton, ARM RISC would probably never have gotten anywhere. Though the iPhone made it mainstream, it started with Apple and ARM back in 1990.
    • by fermion ( 181285 )
      This is generally true for most Apple products. Apple does not design to meet a price point; it designs to create products for the end user who has funds to purchase good kit.

      With the Mac, Apple is concerned with cycles per watt. The 6800 was used because it could do stuff, like use a graphics coprocessor, and had a richer instruction set for what the Mac had to do.

      Eventually Motorola could not do what Apple wanted. To get power, it had to build a normal computer, that is a tower, and this has seemed to

      • >the cylinder mac pro
        >a beautiful machine
        Okay, now I know you're trolling.

        • The Cube was useless (underpowered and overpriced) but the Trashcan Pro is actually a pretty neat machine. If you have a traditional three-monitor setup it fits nicely behind/between two of them so you have easy access to all of your peripherals. Any comparable machine has to sit on the floor.

        The 6800 was used because it could do stuff, like use a graphics coprocessor, and had a richer instruction set for what the Mac had to do.

        Apple used the 68000 for the Mac. The 6800 was an 8-bit processor with a 16-bit address bus. It was the first-generation Motorola processor, a peer to Intel's 8080. I'd prefer it to the 8080, but the Z80 is better than either. In that time period Apple was using the 6502, which was anaemic compared to anything else but very cheap.

        But back to the 68000: it was a fully realized 16-bit processor in a high-pin-count package, without the constraints that Intel's 8086 family made to fit in a 40-pin package. The origi

        • I have run NetBSD on an SE/30 with that framebuffer.

          Hello, brother! Mine dual-boots NetBSD and A/UX, internally sports 2x 2GB HDDs (one HDD for each OS), a Daystar 33MHz 68030 processor card, a network card and 128MB RAM. I could never find the greyscale card and a second PDS splitter. I swapped out the B&W vacuum tube for the green tube from an Apple //c monitor.

        • I started coding on the 6809 and loved it compared to the hoops my friends had to jump through on the 6502. I never coded on the Z80; I'm curious what its strengths were.
        • by hawk ( 1151 )

          Also, note that the original Mac design was based on the 6809, not the 68000. They got as far as a prototype and a bouncing ball in QuickDraw.

          They had the bright idea, though, that a computer should have a single bank of memory chips, and the display would have taken over a third of the resulting 64k of memory.

          As I understand it, they switched before hitting problems with performance, which I suspect would also have bitten them.

          (the 6809 was a kind of sort of 16 bit successor to the 6800 but still on an 8 bit b

    • Except that Acorn was already selling ARM-based machines back in 1987.

  • by 93 Escort Wagon ( 326346 ) on Sunday August 23, 2020 @03:03PM (#60432923)

    You'd have to charge them every 6-8 hours. Plus you'd likely need insulated gloves whenever you used the thing.

    Not to mention the IME would almost certainly have an unprotected, always-on, hard-wired connection to your phone's cell modem.

    • You realize the cell basebands of these phones are no better, and can snoop on arbitrary memory locations?

  • by Doub ( 784854 ) on Sunday August 23, 2020 @03:04PM (#60432925)
    They still don't. This is like saying that NASA bankrupted Blockbuster when they didn't pick them for ISS resupply missions.
    • Intel made the StrongARM and XScale processors. Yes, they were ARM based, but many mobile phones used them.
    • by Gravis Zero ( 934156 ) on Sunday August 23, 2020 @03:15PM (#60432973)

      Intel never made mobile chips. They still don't.

      Actually, they [intel.com] did. [anandtech.com] The problem was that they were power hungry beasts. It wasn't their first attempt. Intel has been trying to get their chips into cell phones for nearly two decades. [zdnet.com]

    • They did: StrongARM. As for x86, they tried to position their Atom processors for this market but with little success with smartphone makers.
    • by RazorSharp ( 1418697 ) on Sunday August 23, 2020 @03:45PM (#60433063)

      They still don't. This is like saying that NASA bankrupted Blockbuster when they didn't pick them for ISS resupply missions.

      I think the argument isn't as silly as you make it out to be. By choosing ARM and having TSMC manufacture the chips, TSMC had to scale up production rapidly, and economies of scale began to work in their favor as Apple and other mobile chip designers went to them for manufacturing. AMD then benefited from the superior tooling TSMC was able to achieve thanks to the experience and scale required to serve the mobile market. What's sinking Intel right now isn't mobile ARM chips; it's the fact that AMD is perceived to be better.

      However, I would argue that what wrecked Intel wasn't Apple's choice of ARM/TSMC; it was Intel's inability to make any inroads in the mobile market. They didn't commit the R&D and resources to becoming competitive in the mobile field even though it was obvious that things were headed in that direction. Now they are pretty much locked out of the largest growth market, and when it comes to their bread and butter, AMD is piggybacking off TSMC's improvements driven by mobile to make a better product. My understanding is that fabrication is holding Intel back more than design.

      • No, the argument is still very much silly. Apple had as much ability to choose Intel as NASA did Blockbuster. You said it yourself: "inability to make any inroads in the mobile market". They flat out didn't have a viable product.

    • by c ( 8461 )

      They did make mobile chips. ARM chips, in fact, which Apple used in the Newton and which also got used in a whack of PDAs. The StrongARM and XScale chips were used all over the place.

      Then, right about 15 years ago, they sold off that business. But yeah, it's Apple's fault...

  • 1. AMD wasn't fabless until 2009, when they spun off their fabs as Global Foundries.
    2. As I recall, neither Intel nor AMD offered a low-power phone SoC in 2005. ARM and MIPS were the choices at the time, used in many already-existing smartphones.
    3. Apple only started designing their own chips a couple of years ago, based on the ISA and designs licenced from ARM.

    • I forgot about the DEC-derived ARM-based StrongARM and XScale SoCs... probably because they weren't very good.

      Also, the article forgets to mention the major boost Intel got when Apple switched from PowerPC to Intel CPUs in their Macs.

    • by jabuzz ( 182671 ) on Sunday August 23, 2020 @03:29PM (#60433007) Homepage

      Indeed, Nokia and RIM were using ARM chips in their phones long before Apple even started making phones. However, the rise of the phone, and especially the smartphone, gave the likes of TSMC and Samsung the money to draw close to Intel in fab technology. Then Intel fumbled their 10nm process and they were able to draw level. All Intel ever had going for them was the x86 design (there is a long list of failed non-x86 CPU designs from Intel) and a fab advantage. The latest CPU design failure, Itanium, meant Intel had to license the x86-64 design from AMD. The 10nm process failure was game over. I would note that it was AMD that said you could have a 64-bit x86 design and that the front side bus was duff, and that now has the chiplet idea. Meanwhile, all Intel have managed is to be caught playing fast and loose with security.

      I would note that Intel were recently saying AVX512 is brilliant and all the HPC customers love it. I would at this point note that Intel is getting hammered by AMD in the HPC space right now. I can't think of a recent major HPC procurement announcement that uses Intel in the last 18 months. Hell, the top machine in the world will soon be running the Fujitsu A64FX CPU, which is ARM.

      • Nokia and RIM were using ARM chips in their phones long before Apple even started making phones.

        Don't forget HTC - they were the 800lb gorilla of smartphones long before Apple got involved, and for some time after... Remember that early iOS didn't support cut'n'paste or MMS?

      • Hell, the top machine in the world will soon be running the Fujitsu A64FX CPU, which is ARM.

        That has nothing to do with x86 though. The A64FX replaces Fujitsu's Sparc64 FX chips. x86 was never in the running.

  • Intel's integrated model, its competitive advantage for decades, became its vulnerability.

    Intel had (and still has) the option to offload manufacturing at any point, even if it's only supplementary. The real thing that is killing Intel is (and always has been) their lack of investment in actual architectural design and security. They had every opportunity to make a better microarchitecture, but the one time they tried, all the bad things about Intel sabotaged it. Itanium could have been the modern architecture, but they decided to be very closed and kept compiler developers in the dark.

    • Re:Bullshit. (Score:5, Insightful)

      by Moof123 ( 1292134 ) on Sunday August 23, 2020 @04:46PM (#60433241)

      Intel actually does a lot of design work in TSMC processes for non-processor stuff, and the reasons behind that are more at the root of things.

      Anything that is not a mainline processor is low priority, and the fab will not give those teams any real support. Management similarly throws spaghetti at the wall, and when new initiatives don't rapidly reach parity with the processor business's ROI, they kill them. They also have a culture driven by overworking PhDs who hyper-specialize into tiny niches of the design and have no hope of cross-pollinating to innovate into truly new product lines. All the employees are used to, and expect, being laid off regularly. At the end of a project you are put into the pool, and if your hyper-specialized skills are not needed, you get laid off.

      Being a monopoly lets a lot of bad culture and practices settle in, as for years and years they stayed profitable. Now competition is coming fast and fierce from all directions. If my local property values were not so tightly tied to Intel's fate, I would be relishing it much more.

    • by jmauro ( 32523 )

      Intel had (and still has) the option to offload manufacturing at any point, even if it's only supplementary.

      Why? This is Intel's only advantage. It runs the most efficient fabs in the world. Spinning them off kills Intel because...

      The real thing that is killing Intel is (and always has been) their lack of investment in actual architectural design and security. They had every opportunity to build a better microarchitecture, but the one time they tried, everything bad about Intel sabotaged the effort. Itanium could have been the modern architecture, but they decided to be very closed and kept compiler developers in the dark. This was to give themselves an advantage in selling their own compiler, but then AMD64 dropped, was easier to adapt to, and completely destroyed Itanium. Even after the Itanic sank, they insisted their processors were better and never really improved their microarchitecture, instead bolting on new features. Without seriously considering security, their shortcuts finally caught up with them when the Meltdown flaw was discovered.

      TL;DR: Intel played itself and still is.

      This has been true for the past 40 years. Intel has never had the "best" processor or the most efficient design. It has only been able to make slight modifications to existing processors to improve them, not really go for a revolutionary design change. It's just not its thing. Its thing has been being the cheapest, and the only company that can guarantee volume until r

    • by vbdasc ( 146051 )

      Itanium could have been the modern architecture, but they decided to be very closed and kept compiler developers in the dark. This was to give themselves an advantage in selling their own compiler, but then AMD64 dropped, was easier to adapt to, and completely destroyed Itanium.

      IMHO, Itanium and its VLIW architecture were a huge mistake. They were not practical to implement as a workhorse for general-purpose personal computing, and no, they never could have been the modern architecture Intel hoped to get. We're all speaking in this discussion about x86's failure to penetrate the low-power device business, but when I try to imagine Itanium in a smartphone I can't help but shudder. AMD64 won over Itanium not because Intel decided to be closed. It won because it was the right thing.

    • The Itanium failed because VLIW was never well aligned with mainstream workloads. There were some features that almost let it work, and it worked very well for some workloads. But at the end of the day it wasn't even finished properly and made too many compromises to displace the mainstream.

  • I cite the First Law of Chaos Theory: Anyone citing chaos theory must do so in a way that discomprehends chaos theory.

  • 15 years ago -- let's say 17 years, to allow for the ramp-up that never happened -- the majority were still saying "who would want a PDA and phone in one device? I prefer to keep them separate". Even Nokia was saying at the time that they wouldn't bother engineering phones to allow simultaneous talk and data because they didn't see a market for it. I wonder if there were any engineers at Intel who had the foresight to see beyond such tunnel vision, and whether their voices were ignored.
    • 15 years ago -- let's say 17 years, to allow for the ramp-up that never happened -- the majority were still saying "who would want a PDA and phone in one device? I prefer to keep them separate".

      The first BlackBerry phone was released in 2002. By 2004, BlackBerrys were super popular in the business world, and several smartphones running Windows had been released. By 2005 a lot of people were calling for an "iPod phone", but instead they got the Motorola Rokr. The integration went the other way, too, with more and more PDA features being added to even cheap phones.

      BlackBerry had been planning a smartphone since the '90s, just waiting until the technology progressed to a point where it was possible.

    • by ledow ( 319597 )

      2005? Are ya kidding?

      We were on 3G (HSDPA) by 2002.

      And things like Palms and the early Nokias had been around for ages.

      I had a PCMCIA GPRS card before that. People were literally turning their laptops into phones.

  • TSMC has been around and big for a while, making ASICs, GPUs, and chips for cell phones.

  • The reason Apple iterated so well on their ARM-series chips is that they had a goal in mind: a mobile computer.

    The iPhone is a computer that happens to have a phone, so it needed graphics performance and low power draw. Other vendors at the time saw the mobile phone as a phone with a computer grafted onto it, so chip performance was a secondary (or tertiary) concern behind cost.

  • by presearch ( 214913 ) on Sunday August 23, 2020 @08:45PM (#60433987)

    I was an employee there back when the iPhone came out. Intel's problems are at least twofold. One is old-time managers wishing for one more big windfall like they got back in the Intel Inside days; hence, x86 forever. The other is lower-level department heads chasing whatever trend is getting press. They wanted to compete against the iPhone, Wii motion controllers, smart TVs, wearables, infotainment, digital signage, teleconferencing, RAID arrays... whatever somebody else had already done, they tried them all, at the same time. They burned money and threw away any knowledge gained.

    That's what got them will.i.am and Lady Gaga and Orange County Choppers on board. All a waste. Meanwhile, those $8 billion fabs are only good for maybe 8 years at best. They are just one big machine, and they wear out or print too-little-too-late chips.

    The subsidized lunches and free Starbucks all day were nice, as were the $5000 massage chairs and the game room with a $9000 flat panel on the wall. Can't last forever though...

  • That's what happened to Motorola in the early '80s. Middle management wanted the 68000 to be used in tens of thousands of Unix servers, rather than millions of general-purpose microcomputers or embedded systems. They were too proud of it to let it be used in cheap products. When IBM came around looking for a CPU for their "PC" product, Motorola dragged their feet. (My interpretation of all I've read about this is that IBM wanted the 68008, and Motorola wasn't interested in making it a priority. Supposedl

    • by ledow ( 319597 )

      "Our product is better than letting people like YOU use it" is literally the stupidest thing a company can say to any customer.

  • Intel NEVER had a chance of getting x86 into phones. Their "if only we did this" speculation is their fantasy.

  • "Chaos theory says that a small change in one state of a system can cause a large change in a later stage." False. Well, technically true, but only for a dynamical system governed by ordinary differential equations that happens to have the necessary features for there to be chaos. I'm not entirely sure what ODEs govern Intel's business; no doubt Andy Grove's right hand man would know.

    This idiocy is akin to saying quantum mechanics is the reason we know god exists.
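For anyone who hasn't seen what "sensitive dependence on initial conditions" actually looks like, here is a minimal sketch (my own illustration, not from the article) using the logistic map, a standard discrete-time example of chaos; the function name and parameters are mine:

```python
# Sensitive dependence illustrated with the logistic map
# x_{n+1} = r * x_n * (1 - x_n), with r = 4 (a chaotic regime).
def logistic_orbit(x0, r=4.0, steps=100):
    """Iterate the logistic map and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-10)  # perturb the 10th decimal place

# The tiny perturbation grows roughly like 2^n (the Lyapunov exponent
# at r = 4 is ln 2), so the two orbits decorrelate within ~40 steps.
max_gap = max(abs(x - y) for x, y in zip(a[50:], b[50:]))
print(f"max separation after 50 steps: {max_gap:.3f}")
```

A 1e-10 perturbation ends up producing order-one differences, which is the "small change, large later effect" the summary is gesturing at; whether Intel's business is governed by anything resembling such a dynamical system is, of course, the commenter's point.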

We are each entitled to our own opinion, but no one is entitled to his own facts. -- Patrick Moynihan
