Apple To Develop Its Own GPU, UK Chip Designer Imagination Reveals In 'Bombshell' PR (anandtech.com) 148
From a report on AnandTech: In a bombshell of a press release issued this morning, Imagination has announced that Apple has informed their long-time GPU partner that they will be winding down their use of Imagination's IP. Specifically, Apple expects that they will no longer be using Imagination's IP in 15 to 24 months. Furthermore, the GPU design that replaces Imagination's designs will be, according to Imagination, "a separate, independent graphics design." In other words, Apple is developing their own GPU, and when that is ready, they will be dropping Imagination's GPU designs entirely. This alone would be big news; however, the story doesn't stop there. As Apple's long-time GPU partner and the provider of the basis for all of Apple's SoCs going back to the very first iPhone, Imagination is also making a case to investors (and the public) that while Apple may be dropping Imagination's GPU designs for a custom design, Apple can't develop a new GPU in isolation -- any GPU developed by the company would still infringe on some of Imagination's IP. As a result, the company is continuing to sit down with Apple and discuss alternative licensing arrangements, with the intent of defending their IP rights.
They were going to buy them... (Score:2, Interesting)
Because they couldn't get around the patents they had. They must have figured out another way to do things if they're just cutting them loose.
Poor guys, the stock was down 63% this morning.
Re: (Score:2)
Re: (Score:1, Offtopic)
Re: (Score:2)
My thoughts exactly. Because Apple represents the lion's share of the company's profits, it makes sense for Apple to say they are building their own GPU, watch the stock drop, and then be in a better position to buy them out.
I doubt that would work, even if it were true. As soon as word got out about talks the price would recover, and the sale is a negotiation, not merely buying all the stock on the market. Imagination would negotiate a fair price if Apple decided it needed to buy it for the IP. The IP might also be worth licensing to other Imagination customers as well, or as one more set of patents to potentially beat someone with if they decide to sue Apple. If Apple is deciding not to use their IP they may be going in a compl
Re: (Score:2)
Re: (Score:2)
It's not a leak. It sounds like something they are required to disclose, as it materially affects the company's value (hence, the immediate dumping of stock).
Re: (Score:3)
Why is that the case? I don't see AMD, Intel, or nVidia among their licensees [imgtec.com], and they make GPUs. Maybe they have a patent for "GPU, but on an Apple product."
And it looks like Imagination's first GPU (by the name PowerVR) came out in 1996. So it seems that the foundational patents would be expired by now.
Re: (Score:2)
Re: (Score:2)
Because they were designing custom stuff for Apple, where Apple didn't own the customized stuff they were buying. That makes it hard for Apple to cut them loose and replace those parts with their own. They would have to be building something significantly different for themselves than the thing they're replacing.
When you buy a GPU from AMD or Intel, you're just buying a pre-made design. Of course you can't copy it, but you don't need to. You're just buying it. But now you ask to have a custom feature added, an
Re: (Score:1)
Yes, because Apple has historically been run by complete morons. Apple is chock full of 1) really smart people and 2) lawyers with IP experience. They know exactly what it means to make their own GPU and the risks of being sued by Imagination and/or pretty much every other GPU manufacturer [everybody will jump on the "I want a cut of Apple's revenue cuz of my wondrous GPU IP" bandwagon]...
Re: (Score:2)
Golly, they know what they're doing, so that guarantees success and forecloses analysis! Wowsers, Batman!
Nobody is ever wrong, nothing is ever contested, and everybody always wins. Why? Because their lawyers had experience. Duh.
roflcopter
Also, Apple never lost a court case, right?
Maybe instead we should just assume that everybody on slashdot knows that Apple spends a lot of money on lawyers, and sometimes they break the law and get in trouble. For example, price fixing in e-books. Other times they get away
Re: (Score:2)
Re: (Score:2)
Patents are a minefield. They are written broadly to cover as much as possible. It's hard for a new company to enter the field without getting sued by those that own the patents. Not sure what Apple's strategy here is but I doubt they can avoid paying royalties to someone for GPU patents. Maybe they think they can bluff their way into the market.
Re: (Score:1)
Still, PowerVR might conceivably have some useful power saving technologies that the others simply don't care about on account of not requiring absolute minimal power levels. This is entirely speculation of course.
Re: (Score:3)
Because they couldn't get around the patents they had. They must have figured out another way to do things if they're just cutting them loose.
Poor guys, the stock was down 63% this morning.
Probably gonna get worse for Imagination. About the only reason they were selling anything to SoC folks is that they could point and say, Apple uses our GPUs and that's why we are going to stay in business (used to be Apple and Intel). Now, not so much, and ARM/Mali is probably gonna come in and eat their lunch. Imagination isn't gonna be much better than Vivante after this.
FWIW, Vivante isn't in much better shape than Imagination, their main customer is Freescale, which was bought by NXP which was recen
Re: (Score:2)
Poor guys, the stock was down 63% this morning.
Things are likely to get a lot more grim in the near future.
The article mentions that Apple's licensing payments account for 69% of Imagination's annual revenue (Imagination even referred to Apple as an "Essential Contract" in its filings). As is to be expected, that amount is larger than the entirety of their profits, meaning that the loss of Apple immediately plunges them into the red. It looks like they'll have 1.5-2 years to figure out how to reduce their R&D costs or increase the payments they receive.
Re: (Score:1)
Surely some percentage of their expenditures are also related to fulfilling their obligations to Apple and their costs go down too ;) No reason at all to presume they'll be in the red, they might just be a lot smaller.
Also, the R&D wouldn't still be getting spent right up to the day Apple stops buying the manufactured chips, that would be silly. The R&D costs would be scaling down right away, while the profit from existing Apple sales would continue for 18-24 months. We not only don't know they'll g
Re: (Score:2)
Saying "immediately in the red" was a poor choice of words on my part. What I meant to convey was that, as things are today and when taken by itself, the loss of Apple would be sufficient to put them into the red. You're quite right that that the loss isn't set to happen immediately and that they are likely to make adjustments in the meantime. Even so, what I was getting at is that I don't know that it will allow them to remain relevant.
Surely some percentage of their expenditures are also related to fulfilling their obligations to Apple and their costs go down too
As the article points out, their costs are almost entirely fixed R&D.
Re: (Score:2)
Re: (Score:2)
None of that means they aren't screwed.
Develop a MOBILE GPU, yes? (Score:3)
Re: (Score:2, Insightful)
There would be no point in telling their supplier of mobile GPUs "oh hey, we're about to drop you" if they were developing a desktop GPU.
Re: (Score:3)
Re:Develop a MOBILE GPU, yes? (Score:5, Insightful)
It's not like Apple really cares about Macs anymore. The last Mac mini update in 2014 was even a downgrade from their 2012 models. The Mac mini slide from the Keynote even implied that SSD was standard, but it's not. Still using 5400 RPM HDDs in their overpriced 2017 computers. Shame on you, Apple.
Re: (Score:2)
MacRumors' buyers guide rates everything but the MacBook Pro as "Don't buy" right now...
https://buyersguide.macrumors.... [macrumors.com]
Re: (Score:1)
MacRumors' buyers guide rates everything but the MacBook Pro as "Don't buy" right now...
https://buyersguide.macrumors.... [macrumors.com]
Yeah, because everyone who's in the know about Apple realizes that a desktop upgrade is imminent. Even I recommended to someone not to upgrade their aging iMac (2007, still going strong, but the display is getting a bit dim), but rather buy an external monitor for it and wait for the next models. So, $250 and he has a nice Dell display that has the same resolution as his 24" iMac, and will eventually serve as the replacement display for his wife's mini, whose display has developed a brightness-difference be
Lack of Imagination (Score:2)
It's not like Apple really cares about Macs anymore.
Agreed - Apple's latest offerings show that they are clearly lacking imagination, and all this announcement does is make that official.
Re: (Score:2)
As others have pointed out, check out this page [macrumors.com].
And they didn't simply "not update the Mac mini" they actually went out of their way to downgrade the machine. A slow 1.4GHz on the low-end model? Soldered RAM so you're forced to pay Apple's RAM prices at the moment of purchase for future-proofing your machine?
I'm guessing Tim Cook really believes his "iPad is better than a laptop" nonsense. All the profits in the world means nothing if they can't make decent Macs anymore. Some of us need computers to work, n
Re: (Score:2)
There are a few things that mobile GPUs do to favour compute over off-chip data transfer because it saves power, but generally phone, tablet, and laptop GPUs are not that different other than in the number of pipelines that they support.
Well, aside from a massive difference in performance level and feature support. There's a reason Intel (despite actually making integrated desktop GPUs) doesn't try to compete with nVidia or AMD for the discrete market: modern desktop GPUs are very nearly as complicated as modern CPUs (in terms of transistor count, actually vastly more so, by a factor of 10-20 or so).
Re: (Score:1)
There are a few things that mobile GPUs do to favour compute over off-chip data transfer because it saves power, but generally phone, tablet, and laptop GPUs are not that different other than in the number of pipelines that they support.
Well, aside from a massive difference in performance level and feature support. There's a reason Intel (despite actually making integrated desktop GPUs) doesn't try to compete with nVidia or AMD for the discrete market: modern desktop GPUs are very nearly as complicated as modern CPUs (in terms of transistor count, actually vastly more so, by a factor of 10-20 or so).
Modern GPUs are nowhere near as complex as a modern CPU.
They have high transistor counts; but they are generally made up of fairly simple computational units. Just LOTS of them.
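To make that "lots of simple units" point concrete, here is a toy Python sketch. It is purely illustrative and not tied to any real GPU architecture; the shade_pixel kernel, its brightness scale, and the framebuffer size are all invented for the example. The per-element work is trivial; a GPU's job is to run that same trivial kernel over a huge number of elements at once, one per lane/thread, which is why simple units in large quantity are enough.

# Illustrative only: a GPU-style workload is the same tiny "kernel"
# applied independently to a very large number of elements.
def shade_pixel(r, g, b):
    """Toy per-pixel kernel: apply a fixed brightness scale and clamp."""
    scale = 1.2  # assumed value, for illustration
    return (min(int(r * scale), 255),
            min(int(g * scale), 255),
            min(int(b * scale), 255))

# On a CPU this runs serially; a GPU would map the same kernel over
# every pixel in parallel across its many simple execution units.
framebuffer = [(100, 150, 200)] * (640 * 480)
shaded = [shade_pixel(r, g, b) for (r, g, b) in framebuffer]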
Re: (Score:1)
The one doesn't preclude the other. There are a few things that mobile GPUs do to favour compute over off-chip data transfer because it saves power, but generally phone, tablet, and laptop GPUs are not that different other than in the number of pipelines that they support. That said, the numbers aren't really there for the larger parts. The iPhone and iPad between them make a sufficiently large chunk of the high-end mobile market that it's worth developing a chip that's used solely by them. The Mac lines are a sufficiently small part of their overall markets that it's difficult to compete with the economies of scale of companies like AMD and nVidia.
You haven't looked at Intel, nVidia and AMD's prices lately, have you?
Apple can put a fair amount of R&D $$$ into walking away from those guys, AND get the ability to move their capabilities at a pace that isn't controlled (hampered) by them, too.
Both of those things are VERY enticing to Apple, I assure you.
Re: (Score:3)
Re: (Score:1)
It's not the desktop PC issue. Apple has the surrounding hardware, OS, CPU, the developers, and a way to pay developers for their software. The GPU is the last part that still has outside considerations. Control over the OS, developer tools, battery usage, resolution and the CPU tasks can allow for an interesting new internal GPU concept.
I think that Apple is getting REALLY tired of having their "roadmap" at the mercy of others, and with the new R&D facilities opening up, is going to go on quite a push to bring all the key silicon designs "in house".
Then the only thing left is fabrication, in which Apple seems totally disinterested. But if they continue to have BEEELIONS burning a hole in their pocket (which it looks like they will), that will eventually come, too...
Re: (Score:3, Insightful)
The summary seems to suggest that but the title is vague. It would arguably be an even bigger bombshell if they were developing a GPU to compete with NVIDIA and ATI on the desktop market.
Let's not get ahead of ourselves here. Apple is not normally in the business of competing in the chip and components market. Apple designs its own motherboards but it does not market them to third parties, and it would surprise me if they did any more with an in-house GPU design than use it in their own devices. If this design turns out to be superior to what you can get from NVIDIA and ATI, limiting its use to their own line of devices would help them sell those devices, which fits their business model. If there is anything to hope for in this context it's mostly for Apple users, who can hope that this will improve Apple devices as a gaming platform and that maybe one of the next couple of iterations of Apple TV will be a truly worthwhile gaming console (not holding my breath though).
Re: (Score:3)
This is almost certainly aimed at improving the GPU in their iOS devices. Desktop (and laptop) GPUs are still an order of magnitude faster than GPUs in mobile devices (and consume an order of magnitude more power). I seriously doubt Apple would be able to leapfrog Nvidia and AMD in GPUs. (Except maybe power efficiency - problem being almost everyone else already beats them at power efficiency. That's why you rarely see Nvidia Tegra SoCs in mobile devices outside of dedicated gaming handhelds like the Nvidia Shield and Nintendo Switch.)
Re: (Score:2)
If this design turns out to be superior to what you can get from NVIDIA and ATI
This is almost certainly aimed at improving the GPU in their iOS devices. Desktop (and laptop) GPUs are still an order of magnitude faster than GPUs in mobile devices (and consume an order of magnitude more power). I seriously doubt Apple would be able to leapfrog Nvidia and AMD in GPUs. (Except maybe power efficiency - problem being almost everyone else already beats them at power efficiency. That's why you rarely see Nvidia Tegra SoCs in mobile devices outside of dedicated gaming handhelds like the Nvidia Shield and Nintendo Switch.)
True, but you don't chop down a couple of giant redwoods like NVIDIA and ATI in a single swing, you do it one blow of your axe at a time. If Apple really was out to compete with NVIDIA and ATI, or more accurately stated was out to make itself self-sufficient in terms of GPU chips for its entire product line, I would expect them to start small and go on from there. It's what they did with the iPhone and iPod; they started with a couple of devices which, into the bargain, were widely lambasted by industry pundits
Re: (Score:2)
This isn't like the A6 SoC Apple designed - where everyone else was licensing and using the same ARM v7 design for their SoCs, and all Apple had to do was tweak it to make the A6 perform better than other ARM SoCs. There's no standard modern GPU hardware architecture for them to license - they'd have to start from scratch.
You do realize, of course, that Apple has an "Architecture"-class license from ARM, meaning they can, and DO, "roll their own" ARM-instruction-set-compatible CPUs. They don't just "tweak" or rearrange the deck-chairs, they actually have their own ARM designs, reflecting the fact that they have more ARM experience than almost anyone else on the planet.
Also, they've been neglecting their Mac line for years now. Many Macs haven't gotten serious refreshes in 2-3 years, while competitors refresh every year.
Unlike most other laptop mfgs., Apple doesn't just throw together "this year's chipset", and call it a "New Design". They refresh stuff when it will actually r
Re: (Score:1)
The summary seems to suggest that but the title is vague. It would arguably be an even bigger bombshell if they were developing a GPU to compete with NVIDIA and ATI on the desktop market.
Let's not get ahead of ourselves here. Apple is not normally in the business of competing in the chip and components market. Apple designs its own motherboards but it does not market them to third parties, and it would surprise me if they did any more with an in-house GPU design than use it in their own devices. If this design turns out to be superior to what you can get from NVIDIA and ATI, limiting its use to their own line of devices would help them sell those devices, which fits their business model. If there is anything to hope for in this context it's mostly for Apple users, who can hope that this will improve Apple devices as a gaming platform and that maybe one of the next couple of iterations of Apple TV will be a truly worthwhile gaming console (not holding my breath though).
Now, please give a cheer for the long line of local slashdot commenters eager to explain to us why Apple is the source of all evil and how this is a part of Apple's nefarious plan to achieve world domination.
I think you are spot-on that Apple has no intentions on selling any GPU, CPU or SoC designs or components outside of Apple.
They have been designing custom silicon since the Apple ][ days (some of which would have been GREAT in the embedded world), and custom ARM designs since at least the Newton's time; and yet NEVER have they sold designs or components outside of their own company.
Re: (Score:1)
Mediocre: Like their laptop that has:
1. The fastest SSD on the market.
2. The most I/O bandwidth on the market.
3. A thermal design that can go full-tilt all day long without thermal throttling of the CPU or GPU, and with which the case temperature never rises above 40 C (skin temp. is about 35 C)
4. The ability to drive FOUR 4k or TWO 5k External Displays (no one else's laptop can do that, either).
Yeah. Mediocre.
Re: (Score:2)
1. Fastest SSD on market - not even close. I've got PCI-E SSDs a year old that are faster than anything in any Apple hardware, period.
2. The most I/O bandwidth on market - not in their gimped as fuck GPUs
3. Thermal design - yea, doesn't go anywhere. I've got a stress-test program that ignores all the safety stuff and does a real stress test, no matter the machine. Every Apple product burns up.
4. Uhh, my Sager notebook has dual GPUs. I can drive EIGHT 4K displays without issue, at the same cost as your shitty craptop.
Re: (Score:3, Informative)
1. Fastest SSD on market - not even close. I've got PCI-E SSDs a year old that are faster than anything in any Apple hardware, period.
2. The most I/O bandwidth on market - not in their gimped as fuck GPUs
3. Thermal design - yea, doesn't go anywhere. I've got a stress-test program that ignores all the safety stuff and does a real stress test, no matter the machine. Every Apple product burns up.
4. Uhh, my Sager notebook has dual GPUs. I can drive EIGHT 4K displays without issue, at the same cost as your shitty craptop.
Mediocre, beyond belief.
1. Fastest SSD. Not my benchmark [9to5mac.com]; but, BTW, where's yours?
2. Most I/O b/w. Four TB 3 ports say 80 Gbps of raw I/O. Sorry.. Dem's da facts.
3. Sorry, the new MBP DOESN'T even GET to the thermal limits. According to multiple reviews [notebookcheck.net], Both the CPU and GPU run flat-out 100% duty cycle 24/7. They really did fix it. Try again, Slashtard.
4. Dual GPUs. And at nearly THIRTEEN POUNDS, (nevermind the power bricks you have to lug around!) that Sager is more properly classified as a "luggable", than a laptop. You can't r
Re: (Score:2)
"1. Fastest SSD. Not my benchmark [9to5mac.com]; but, BTW, where's yours?"
http://i.imgur.com/wZ0cjjt.png [imgur.com] - you dare compare a laptop to anything I have and it will stomp the shit out of your CRAPPLE any day.
And there's still room for expansion in that configuration, too.
"80 Gbps of raw I/O"
Dude, I have THAT MANY LANES OF PCI-E 3.0 per motherboard (of which there are 4 in that config.)
"Sorry, the new MBP DOESN'T even GET to the thermal limits. According to multiple reviews [notebookcheck.net], Both the CPU a
Re: (Score:1)
"1. Fastest SSD. Not my benchmark [9to5mac.com]; but, BTW, where's yours?"
http://i.imgur.com/wZ0cjjt.png [imgur.com] - you dare compare a laptop to anything I have and it will stomp the shit out of your CRAPPLE any day.
And there's still room for expansion in that configuration, too.
Hardly a fair comparison; since that is NOT a laptop, and costs as much as a cheap house! A Cray can outperform any Dell, too; but what's the point? You're just grandstanding. NO Laptop, not even a Sager, has that much crap in it. It simply wouldn't fit. Try again. And if you claim that really IS in your Sager, then I want the model number.
"80 Gbps of raw I/O"
Dude, I have THAT MANY LANES OF PCI-E 3.0 per motherboard (of which there are 4 in that config.)
Again, what's your point? You're comparing a LAPTOP to some sort of monstrosity that dims the lights when you power it on! FFS!!!
"Sorry, the new MBP DOESN'T even GET to the thermal limits. According to multiple reviews [notebookcheck.net], Both the CPU and GPU run flat-out 100% duty cycle 24/7. They really did fix it. Try again, Slashtard."
As I look at three brand-fucking new ones, dead from overheating/deballing of the SoC, which I'm being paid to repair. Try again, oh ye who has no Apple repair certification.
Now I KNOW you're lying! What "SoC" would
Re: (Score:2)
"A Cray can outperform any Dell, too; but what's the point?"
Actually, if you bothered to look, no, it cannot. You are obviously living in old times, here. There's a reason Cray moved to x86 (and a good reason why they're losing right now.)
"Again, what's your point? You're comparing a LAPTOP to some sort of monstrosity that dims the lights when you power it on! FFS!!!"
You mactards think your shit is better than EVERYTHING, especially since you claim shit like fastest SSD on the market which is bullshit when
Re: Develop a MOBILE GPU, yes? (Score:2)
That may have been true for the first two weeks after launch. And then the market caught up. It's just a fancy winter laptop running a weirdo unixy OS, at the end of the day
Re: (Score:1)
That may have been true for the first two weeks after launch. And then the market caught up. It's just a fancy winter laptop running a weirdo unixy OS, at the end of the day
Sounds like sour grapes to me.
Re: (Score:2)
Re: (Score:1)
The summary seems to suggest that but the title is vague. It would arguably be an even bigger bombshell if they were developing a GPU to compete with NVIDIA and ATI on the desktop market.
That's Phase II of the Project...
Then it's the Axx CPU/SoC that can run x86...
Re: (Score:2)
The big bombshell would be that Apple had any interest at all in the desktop market.
Re: Develop a MOBILE GPU, yes? (Score:1)
Given this announcement, I would be surprised if the A11 doesn't include this new GPU. For them to drop the current GPU tech while they are still selling iPhones, they would have to totally stop making chips with that tech. 18 to 24 months would be iPhone "8" and iPhone "8S", for 2 generations as they have been doing. With a possible purchase of Toshiba's NAND business, that would leave the modems, currently supplied by Qualcomm and Intel, and the display tech. Pretty much every other part is designed by Apple,
Re: (Score:2)
Given that we are talking about phones, why not develop a GPU that is an add-on core for the A11 or whatever CPUs they develop?
If I understand your question, you want to know why a GPU isn't an add-on for phones? The first reason would be efficiency and performance. By being on the same SoC, communication between GPU and CPU is orders of magnitude faster than if they were not on the same chip.
As it is, they have to use very compact packaging technologies, such as SoC, PoP, et al
First of all, you don't lose SoC or PoP packaging whether or not the GPU is separated from the CPU. The mobile SoC still needs that technology for cache, memory, and other controllers.
Second, in order to use an add-on, you have to increase the complexity of manufacture
Re: (Score:2)
The Northbridge and PCIe for the GPU are integrated in all desktop/laptop CPUs now (save for the AM3+ FX 6000 and 8000 series that you can still buy).
Of course this doesn't change much at all. PCIe bandwidth, latency, and joules per bit on PCIe are still the same.
What would work better is an MCM, CPU and GPU are packaged close together and use custom or semi-custom interface. It does exist, that's AMD's upcoming Opteron that includes on the same package : two Ryzen dies, one Vega 2048 SP / 32 CU die, HBM2 memory fo
Re: (Score:2)
The point about the Northbridge/Southbridge was to point out to the poster that making the GPU an add-on module introduces multiple layers between the CPU and GPU that it does not currently have. This means that efficiency is being traded for modularity.
There's also Haswell or later at the lowest wattage: on a single package, a CPU (with GPU and shit integrated) and a chipset are connected, closely and low powery. The "MacBook" (thing with only one USB port) and Surface Pro can use that. But it's a bit similar to a phone using a "do everything" SoC and a radio/modem chip on the side.
From what I gather that's not what the poster wants. He/she wants the ability to add on GPUs which are not on the same SoC. This idea comes from the modular phone concept where you can upgrade the camera, GPU, CPU, etc. like Google's Project Ara [youtube.com] which is a c
Re: (Score:2)
(thanks, that's interesting)
Even on-chip buses are as fast as they need to be, or often too slow (typically, two packs of four CPU cores have slowish networking between them and aren't to be used arbitrarily as an eight-core CPU)
You can have a state of the art interconnect between CPU and GPU and memory etc. on a high end chip, sure.
CPU and GPU might be able to share memory addresses, function like AMD's heterogeneous computing promises to do (or does already, just not used very much). Not only AMD, the H
Re: (Score:2)
Apple designing a GPU? (Score:2, Funny)
I bet it will work with quadrangles, because triangles aren't "magical", and will work only with 10bit depth textures.
Re: (Score:3)
Rounded Corner Rectangles
And it would have the other magical incantation that makes things patent worthy . . .
On an iPhone!
Re: (Score:2)
It'll have special technology that detects when an application is trying to draw window borders itself, and changes them to the Apple look and feel.
Or maybe they will just discontinue blue. Nobody wants blue anyway.
Re: (Score:2)
beat me to it. one big caveat is though... if you hold it wrong it renders everything in text only.
Re: (Score:2)
I know you're joking, but Apple's early 3D APIs RAVE and QuickDraw 3D were based on quads, and some early 3D hardware like Nvidia NV1 and Sega Model 1 rendered quads natively.
Quadrangles in action (Score:2)
How does that work? Triangles are guaranteed to be planar (3 points determine a plane); quadrangles are not necessarily planar. Doesn't that screw up a lot of the interpolation and shading and such?
Re: (Score:1)
An "advantage" of the quad approach was that they weren't planar: they could be warped into rounded shapes. But the texture on the quad would tend to look stretched and pixelated, because the quads were more like 2D sprites that were being transformed and warped. You couldn't wrap a texture around a mesh like you can with triangle meshes. Each quad was its own texture.
Re: (Score:2)
The Sega Saturn does not exist, nor does the 3DO. Apple is the first company in the whole world to use quadrangles!
Has to be for mobile GPU (Score:2, Interesting)
So far Apple haven't given a crap about graphics performance. You don't have to be an anti-fanboi to see this, even Apple fanbois admit that the GPU in existing Apple kit, especially the so called 'pro' series, is lacking and the fanboi will say that this is because Apple users have better things to do with their time than play games.
Suddenly Apple cares enough to develop their own GPU? Are they hoping that game developers are going to start targeting the Apple user market which, for so long now, has been m
Re:Has to be for mobile GPU (Score:4, Informative)
Suddenly Apple cares enough to develop their own GPU?
Newsflash 1: Apple have been using their own A-series systems-on-a-chip (including CPU and GPU) in iPhone/iPad/Watch & AppleTV for a few years now. They license IP from various companies (ARM, Imagination and others) and have taken over a few chip designers to achieve this.
Newsflash 2: Apple owns one of the leading gaming platforms on the market: it's called the iPhone.
Apple has drunk deeply of the kool-aid that says that everybody is going to be using phones and tablets for all their computing needs in the next few years.
Macs, meanwhile, are mostly running on Intel integrated graphics or unspectacular AMD mobile graphics chips. Tim Cook recently stood up and re-iterated how important the Mac line is to Apple - and anybody who understands political talk will know that means exactly the opposite of what it says.
Re: (Score:3)
If you're going to replace the Mac with an iOS "Mac Mode" and drive a KVM you're going to need a very efficient GPU and a decent patent portfolio.
Re: (Score:2)
My guess is that the current provider was trying to milk Apple for licensing their GPUs and Apple looked at it and said "we probably can design something as good, let's cut them out".
We will build a garden ... (Score:2, Interesting)
We will make the wall taller and insurmountable
We will grow more stuff inside and import less and less.
By the time the inmates realize the walled garden is a prison, it would be too late. All other gardens would have been starved and withered and desolate.
Then, ... profit?
I remember another company trying to corner the desktop market for themselves.
Actually, one can go back all the way to Morgan trying to corner the silver market.
Well, free market and invisible hand all
Re: (Score:2)
They'll make the Windows users pay for it too.
Re: (Score:2)
Re: (Score:2)
Don't you think GM and Ford would simply love it if SAE ceased to exist and they could make their engines incompatible with 10w-40 and make you buy proprietary oils from the dealer? Or tire rims become non
Re: (Score:2)
The standard interchangeable components and consumables in cars were achieved after a long period of struggle stretching over many decades. Now that most of the population is immunized and polio is something they read about in history books, people become laissez-faire about vaccination.
Your assertion relies on the component being consumable. I don't think a GPU can be considered a consumable.
Back to the 1980's (Score:2)
A tight new GPU design could see the kind of advancements some of the most creative game designers made with GPU support in the 1980s.
Real freedom to be creative on one platform again. Not having to worry about the port, Windows, other devices.
A better in-house GPU to keep developers happy: be less tempted by easy porting and more productive on one OS.
The users then have to buy a hardware product range to play the must have
Re: (Score:2)
Mobile GPUs, not desktop GPUs (Score:2)
Thank you Apple for going into nowhere (Score:1)
Re: (Score:2)
It makes sense why they don't support Vulkan in light of that, which is purely Apple's decision.
From what I remember, Apple released Metal before Vulkan was announced as a spec. That was probably the main reason not to support it.
Re: (Score:2)
Those that cannot innovate... (Score:3)
sue.
Apple to replace Imagination's designs? (Score:2)
Imagination does not acknowledge Apple's claims; in actual fact, Imagination says the exact opposite.
"Apple has not presented any evidence to substantiate its assertion that it will no longer require Imagination’s technology, [imgtec.com] without violating Imagination’s patents, intellectual property and confidential information"
Apple were also one time in talks
Re: (Score:2)
There are other licensed GPU blocks (ARM's Mali comes to mind), along with mobile GPUs from NVIDIA that seem to work without Imagination's IP.
That doesn't mean Apple is building their own GPU from scratch, any more than they build the CPU from scratch. For both the CPU and GPU, they licensed from external companies (ARM & Imagination). There's likely nothing stopping them from licensing the GPU from ARM, NVIDIA, or any other of Imagination's competitors.
How is this news? (Score:3)
Patent age? (Score:2)
Imagination has a significant number of GPU patents (they’ve been at this for over 20 years), so developing a GPU that doesn’t infringe on those patents would be difficult to do, especially in the mobile space. Apple couldn’t implement Imagination’s Tile Based Deferred Rendering technique, for example, which has been the heart and soul of their GPU designs.
Since patents only last for 20 years, and the first Tile based PVR was released in 1996...... Why couldn
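As a rough illustration of what "tile-based" means here, below is a generic binning sketch in Python. It says nothing about how Imagination's patented TBDR is actually implemented, and the 32-pixel tile size is an assumption for the example. The screen is divided into small tiles, each triangle is assigned to the tiles its bounding box overlaps, and each tile can then be shaded in fast on-chip memory and written out once, which is where the power saving comes from.

# Illustrative binning step for a tile-based renderer (not Imagination's design).
TILE = 32  # tile size in pixels (assumed, for illustration)

def bin_triangles(triangles, width, height):
    """Map (tile_x, tile_y) -> list of triangle indices whose bounding box touches that tile."""
    bins = {}
    for idx, tri in enumerate(triangles):
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        tx0, tx1 = int(min(xs)) // TILE, int(max(xs)) // TILE
        ty0, ty1 = int(min(ys)) // TILE, int(max(ys)) // TILE
        for ty in range(max(ty0, 0), min(ty1, (height - 1) // TILE) + 1):
            for tx in range(max(tx0, 0), min(tx1, (width - 1) // TILE) + 1):
                bins.setdefault((tx, ty), []).append(idx)
    return bins

tris = [((5, 5), (60, 10), (20, 50)), ((200, 200), (250, 210), (220, 260))]
print(bin_triangles(tris, 320, 240))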
Re: (Score:2)
probably because the newer patent is on "Tile Based Deferred Rendering" on mobile :)
Re: (Score:2)
Um really? (Score:2)
Look guys- we're watching a dying company. Sure they have a lot of business at the moment. But their tech is limited and specialized. They killed their desktop business. They are losing ground in the tablet and phone market.
Investing in your own GPU is not the thing to do under those conditions. And only for mobile or just Apple products? Even with an assumption that Apple can produce something competitive it just doesn't make sense.
This smells a lot like Newton, John Sculley's pet project. Or CyberDog. Or
Re:Eliminate Moderation (Score:5, Funny)
Moderate the moderators! Yes, there is actually a way to do this. Someone could invent this thing. Call it meta-moderation. And patent it! With rounded corners! It would be the best! Trust me! It would go over bigly! I promise!
I can't tolerate intolerant people! I am totally intolerant of intolerant people!
There are no absolutes! Absolutely no absolutes! And that rule is absolute!
Re: (Score:2)
Actually, the thing that should be banned is ACs, like the GP post.
Re: (Score:2)
Not a whole lot of logic to that one. See: The Federalist Papers
Re: (Score:3)
You'd think, then, if this was indeed the issue and they were unable to compete, they'd use some of the HUMONGOUS profit margin they make on the iPhones and iPads and buy the same (or similar) chips?
It's got nothing to do with "competing", so much as "owning".
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
Hey, 820? Why compare with a more-than-a-year-old SoC? Compare it to the 835 then, moron, and see a very different story. Anyway, we are talking about graphics, fanboi. On a phone half the price of an iPhone 7.
You really are a fanboi, aren't you? The 835 has not been released yet. Therefore it is rather impossible for someone in December 2016 to do a head-to-head comparison. And when someone does an apples-to-apples comparison of two processors which you can get today, and that destroys your assertion, your first instinct is to get defensive.
Re: (Score:2)
Read the news and check your facts. The 835 is on the market. The already released S8 uses it in some countries, so you can buy phones using the SoC right now.
You can pre-order the S8 in some markets. That doesn't mean you can get one today. Certainly in the US, the release date is April 21, 2017. Again it's somewhat impossible for someone in December 2016 to do comparison of two phones where one isn't released until months later.
Also, according to you, you are then comparing a year-old processor (A10) vs a new processor (Snapdragon 835). How fair is that comparison? Or do you always intend your comparisons to be unfair and dishonest?
Anyway, back to the original discussion: iPhone7 graphics are lacking and there's nothing you can do to change it... you, like a good FANBOI tried to justify with some power efficiency bullshit... which is totally bogus since other SoCs manage it very well.
So many strawman arguments. First
Re: (Score:2)
Re: (Score:2)
Samsung uses Snapdragon processors, which are 100% Qualcomm IP.
Which is why Samsung can't just say, "Golly Qualcomm, go away we're not going to pay you anymore. We're going to build our own!"
If it turns out compatible, they'll have a hard time claiming they didn't copy it when they even used to license it!
That is the danger in the first place when you have somebody develop tech to sell to you, instead of developing it yourself. They then own the implementations you're used to using for the features you thought of!
Ultimately, Apple probably figures they'll get a better
Re: (Score:2)
Which is why Samsung can't just say, "Golly Qualcomm, go away we're not going to pay you anymore. We're going to build our own!"
And how much experience has Samsung had with that? None. In their own Exynos chip, they have only designed the core for Exynos 8 and Exynos 9. Prior models used ARM standard cores.
The issue isn't just desire. Samsung doesn't have the experience or the personnel at this point to do it. They barely have the personnel to do it for CPUs, much less handle any IP entanglements that might occur.
Re: (Score:2)
And how much experience has Samsung had with that? None. In their own Exynos chip, they have only designed the core for Exynos 8 and Exynos 9. Prior models used ARM standard cores.
So in your story, Samsung doesn't have experience with chip design because after using other people's designs for a long time, they designed their own, and then the next model they again designed their own. So, you're admitting that you know that they do in fact have experience at doing the thing you claim they are too inexperienced to do.
I'm not going to provide any other analysis of what you said, because you simply contradict yourself in a way that shows you don't even understand what you said. So of course it is unlikely that you even understood what I said.
Re: (Score:2)
So in your story, Samsung doesn't have experience with chip design because after using other people's designs for a long time, they designed their own, and then the next model they again designed their own. So, you're admitting that you know that they do in fact have experience at doing the thing you claim they are too inexperienced to do.
Um, no. Samsung has only been doing their own-architecture ARM CPU work for the last two versions of Exynos. Previously they only used standard ARM cores. My assertion is they have ZERO experience with GPU designs. None. They have yet to design a single GPU, even in Exynos, because they still use GPUs from other manufacturers.
I'm not going to provide any other analysis of what you said, because you simply contradict yourself in a way that shows you don't even understand what you said. So of course it is unlikely that you even understood what I said.
I'm going to let you research the history of Samsung design and how a GPU is not a CPU.
Re: (Score:2)
First rule of karma whoring: never ask about the downvotes.