

How Apple Developed an Nvidia Allergy
Apple has long avoided directly purchasing Nvidia's chips and is now developing its own AI server chip with Broadcom, aiming for production by 2026, The Information reported Tuesday, shedding broader light on why the two companies don't get along so well.
The relationship deteriorated after a 2001 meeting where Steve Jobs accused Nvidia of copying technology from Pixar, which he then controlled. Relations worsened in 2008 when Nvidia's faulty graphics chips forced Apple to extend MacBook warranties without full compensation.
Rather than buying Nvidia's dominant AI processors directly like its tech peers, The Information reports, Apple rents them through cloud providers while also using Google's custom chips for training large AI models. The company's new chip project, code-named Baltra, marks its most ambitious effort yet to reduce reliance on external AI processors, despite being one of the largest indirect users of Nvidia chips through cloud services.
Will the server chip have locked-in RAID 0 storage (Score:2)
Will the server chip have locked-in (non-M.2) RAID 0 storage that needs a second system to reload the full OS + firmware after any base storage change?
Will they at least be able to have DFU mode in an IPMI UI?
Will it have lots of PCIe and maybe 2-4 Thunderbolt ports?
Will it have RAM slots?
the information dot com (Score:2)
The linked article is behind a paywall.
Re: (Score:2)
C'mon man, this is Slashdot - no one reads the articles! :)
One can only speculate whether this is for Apple's internal use or whether they plan to release some form of cloud OS X Server to rival AWS.
Re: (Score:2)
How would anyone know now? (Score:2)
If their competitor has it, and customers want it enough (read: Apple profits would drop), they'll try to offer it. Scarier or lower-priority features may take longer, or be broken the first time they try... Patents would be the biggest worry there (are any of those things new, and patented by NVIDIA now?).
NVIDIA has been actively focused on and working toward this for at least a decade, meaning aiming for the data center market and shared GPUs, with those GPUs being designed and optimized for those
Re: (Score:2)
How often are servers upgraded after the original purchase/setup? With planning and enough capital... I'm guessing pretty rarely. Just buy what you actually want/need.
It depends. For production servers, like a web server farm, you only buy new stuff when old stuff breaks. However, for research use cases like scientific computing or developing better LLMs, performance is a key consideration, so overspending on GPUs one year can often be followed by overspending on GPUs the next year. For example, it's not uncommon for a leading-edge supercomputer to rip out and upgrade all its GPUs after just a few years. It is easy to understand why. Just consider the cos
storage can get changed and spinning disks have th (Score:2)
Storage can get changed, and spinning disks still have their place in servers.
can /. get any worse? (Score:2, Troll)
What an inane "article". Who cares what happened over 20 years ago involving a CEO who is long dead. And a decision to rent does not suggest an "allergy".
Re: (Score:2)
Considering Apple effectively, and willingly, pays more money to use it anyway without directly "touching" it, that sounds like how one would treat something they're allergic to. Though it also speaks loudly of Apple's highly toxic NIH syndrome, which is so bad that they're fully willing to compromise the user experience over it.
I broke most of those stories (Score:3, Interesting)
As the person who broke both the Nvidia bad bumps story and their ousting from Apple, I can say with authority that the real reason Nvidia is out is the patent trolling rampage they tried to start. I wrote some of it up, a bit blurred to protect friends, here:
https://www.semiaccurate.com/2... [semiaccurate.com]
The bad bumps were a big blow but that was just money. The patent trolling threats were a deal breaker for Apple and many other silicon vendors. Go look up the Nvidia vs Qualcomm and Samsung suits for more but the company is not wanted anywhere in the ecosystem. Some HAVE to use them but no one wants to.
-Charlie
Re:I broke most of those stories (Score:4, Informative)
That link of yours may have been useful 11 years and 5 months ago, but it says virtually nothing now. And the pdf that article links to has since vanished.
Re: (Score:2)
That link of yours may have been useful 11 years and 5 months ago, but it says virtually nothing now. And the pdf that article links to has since vanished.
Odd, the pdf is not available on the wayback machine, either, so it's not just link rot. I wonder if it was deliberately erased.
Re: (Score:3)
Apple patents everything with near-zero research, specifically in areas where they see competitors doing research ... they are not morally outraged over anti-competitive patent schemes; they would need to have morals for that.
Re: (Score:3)
So they are behaving like Intel used to behave when they were untouchable?
As soon as someone manages to make better and cheaper chips, Nvidia might fall hard.
Re: (Score:2)
As soon as someone manages to make better and cheaper chips, Nvidia might fall hard.
This is a very true statement. However, part of the reason for its truth is that it's universal. "As soon as someone manages to make better and cheaper X, company Y might fall hard." This is one of the truisms in the ideas of disruptive innovation. Apple is no more immune to this than Nvidia.
Re: (Score:2)
Don't forget that Nvidia also managed to spill the beans about new Apple computers about a week before Apple was set to announce them.
The Nvidia keynote for a new GPU accidentally mentioned a new Mac. Apple did not like that - their announcements at the time tended to be pretty secret, so people were paying huge amounts of money to get the latest details on upcoming Apple products.
Steve Jobs basically halted the production line and got a replacement GPU from ATI so when the computer was properly announce
Re: (Score:2)
I always thought that one of the main reasons was that Apple were about to announce a new iMac (maybe the Pixar lamp / flowerpot iMac G4? Or maybe the white polycarbonate iMac?) and a day or two before Apple announced it, NVIDIA leaked it with a press release saying that it had an NVIDIA GPU in it. Steve Jobs was absolutely furious and this was the beginning of the end for NVIDIA in Apple computers.
They just want vertical integration (Score:4, Interesting)
It has nothing to do with these anthropomorphized reasons; vertical integration prevents their investments from being leveraged by competitors. That's all there is to it.
There is only one long term viable consumer electronic ecosystem in existence at the moment, and Apple would like to keep it that way. Currently a lot of the fundamental tech needed to start a competitor is still available on the market; Apple is working hard to remove that. They want NVIDIA dead.
If Intel were remotely competitive with TSMC foundry-wise, Apple would buy Intel's foundry business in a heartbeat for the same reason. They want TSMC dead too.
Re: (Score:2)
There is only one long term viable consumer electronic ecosystem in existence at the moment
WTF are you on about? What does "viable consumer electronic ecosystem" even mean? Are you saying every company should build a vertically integrated stack and call it an 'ecosystem'? I am not understanding what you are saying here.
Re: (Score:2)
A viable consumer electronic ecosystem doesn't need to be vertically integrated, but all consumer electronics will, in the medium term, integrate with an ecosystem. People expect everything to integrate seamlessly with their phones and ecosystem account. Their laptop, their software, their car, their home automation, their TV ... and the ecosystem owner can leverage that, to either charge fees for access to the ecosystem, or to compete in a niche and leverage their inherent advantages to destroy third parties
Why buy a temporary resource? (Score:3)
Apple ... is now developing its own AI server chip with Broadcom, aiming for production by 2026 ... Rather than buying Nvidia's dominant AI processors directly like its tech peers, the Information reports, Apple rents them through cloud providers
Why buy a temporary resource? Apple is building a replacement for the NVIDIA chips, so renting makes sense.
Plus there may be tax advantages.
There is no general GPU service option. (Score:2)
Latency is an obvious problem with assuming all hard rendering will be done remotely. Unless everyone has data centers nearby offering that non-existent GPU service, they're going to have lag between input and results.
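As a rough back-of-the-envelope illustration of that lag (every figure below is an illustrative assumption, not a measurement), propagation delay alone puts a floor under remote rendering, and it compounds with encode/decode and queueing overhead against the ~16.7 ms frame budget of 60 fps:

# Back-of-the-envelope latency sketch; all numbers are assumptions for illustration only.
SPEED_IN_FIBER_KM_S = 200_000      # light in optical fiber travels at roughly 2/3 of c
FRAME_BUDGET_MS = 1000 / 60        # ~16.7 ms per frame at 60 fps

def round_trip_ms(distance_km, overhead_ms=10.0):
    """Propagation delay there and back, plus an assumed encode/decode/queueing overhead."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000 + overhead_ms

for km in (50, 500, 2000):
    rtt = round_trip_ms(km)
    print(f"{km:>5} km to the data center: ~{rtt:.1f} ms round trip "
          f"({rtt / FRAME_BUDGET_MS:.1f}x a 60 fps frame)")

Even with these generous assumptions, anything but a nearby data center costs a frame or more per round trip, which is exactly the input-to-result lag described above.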
And NVIDIA wants every company to buy their hardware, then use it themselves or create a service on top. They don't want one company to become the biggest and monopolize all profits from being a general GPU provider. And NVIDIA doesn't want to become that provider either (setting up then
Re: (Score:2)
What the GPU farm is really for is machine learning: building models, which are again built remotely and downloaded upon completion.
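To make the "built remotely, downloaded upon completion" pattern concrete, here is a minimal sketch, assuming PyTorch; the tiny model, the placeholder training data, and the file name are hypothetical, purely for illustration. The only thing that ever leaves the GPU farm is the finished weights file:

import torch
import torch.nn as nn

# --- On the GPU farm: train on whatever accelerator is available, export only the weights ---
device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1)).to(device)
opt = torch.optim.SGD(model.parameters(), lr=0.01)
x = torch.randn(256, 16, device=device)   # placeholder training data
y = torch.randn(256, 1, device=device)
for _ in range(100):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
torch.save(model.state_dict(), "model_weights.pt")  # this file is what gets downloaded

# --- On the client: no GPU farm needed, just load the downloaded weights and run inference ---
client_model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1))
client_model.load_state_dict(torch.load("model_weights.pt", map_location="cpu"))
client_model.eval()
with torch.no_grad():
    print(client_model(torch.randn(1, 16)))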
It's ironic. (Score:2)
"We don't like businesses that treat us the same way that we treat other businesses! We're Apple, dammit!" - Tim Apple
I'd wondered why they didn't switch GPUs as much. (Score:2)
They kept using AMD (ATI) stuff in their devices. Never made sense to me to only ever use one company's thing over and over, especially if it wasn't the "best".
AMD has rarely had the lead at any time that I can remember. And if they did, it was from much bigger silicon that used way more power, so it was more expensive for them than the slightly worse, cheaper-to-make NVIDIA stuff of the time. And since we've gotten into crazy town for power, the higher power needs (and cooling afterward) mean it's not
Re: (Score:2)
They kept using AMD (ATI) stuff in their devices. Never made sense to me to only ever use one company's thing over and over, especially if it wasn't the "best".
Well, it can make sense. Apple long ago surrendered the gaming market to non-Apple devices. As a result, they don't need high-end graphics, at least not for gaming. For high-end Apple workstations, Apple can still use AMD because customers who buy Apple are not singularly focused on specs, so the cachet of the workstation is the software and the Apple logo.
Re: (Score:1)
Apple did not surrender anything gaming-wise.
Game studios simply go for the low-hanging fruit first. And if those do not sell superbly, they do not port to Macs.
Basically all Macs have high-end CPUs.
And even the laptops with only onboard GPUs are pretty good.
The Friction Went in Both Directions (Score:3)
One perennial sticking point was that Apple forbade NVIDIA from selling Mac-targeted GPU cards independently. If you wanted an NVIDIA-equipped Mac Pro, you had to buy it from Apple, even after the Mac Pro abandoned the proprietary backplane and went fully over to PCI-e. NVIDIA naturally found this frustrating, because it cut off users of the previous generation Mac Pros from upgrading their still-working systems to the latest NVIDIA GPUs.
We also occasionally got into arguments where there were clear inefficiencies in Apple's driver API -- the API that passed us the data we had to work with. Even after dozens of emails, meetings, example code, and benchmarks proving our suggested change would benefit all GPU implementations, Apple would still say, "Nope, we're not changing it."
As for the bump crack issue, there was a lot of backstage finger-pointing, but ultimately Jensen, to his credit, said, "It's our name on the chip," and NVIDIA spent over $750 million remediating broken machines across all vendors, including Apple.
There are other stories (*cough*iMovie*cough*), but I'm already skating perilously close to the edge of my NDA. But since Apple now lives in a literal glass house, they might care to consider more carefully before throwing stones.
Is the Nvidia allergy unique? (Score:2)
Doesn't Apple have an allergy to everything not invented at Apple? Although the sickness is to at least some extent universal, Apple is the ultimate extreme in "not invented here."
As for whether Apple can continue its AI plans with TPUs and eventually transition to just its own chips, we'll see. Tesla had similar plans and finally gave up after many years and now enthusiastically welcomes Nvidia. Apple has also been working for at least the last 5 years on its own phone modem but still hasn't created
Cook sleeping (Score:2)
Since Jobs died... with the exception of the Mx chips, Tim Cook has been sleeping on the job.
He hasn't really come up with any compelling initiative.
Very sleepy CEO.