Apple To Develop Its Own GPU, UK Chip Designer Imagination Reveals In 'Bombshell' PR (anandtech.com)

From a report on AnandTech: In a bombshell of a press release issued this morning, Imagination has announced that Apple has informed their long-time GPU partner that they will be winding down their use of Imagination's IP. Specifically, Apple expects that they will no longer be using Imagination's IP in 15 to 24 months. Furthermore, the GPU design that replaces Imagination's designs will be, according to Imagination, "a separate, independent graphics design." In other words, Apple is developing their own GPU, and when that is ready, they will be dropping Imagination's GPU designs entirely. This alone would be big news; however, the story doesn't stop there. As Apple's long-time GPU partner and the provider of the basis for all of Apple's SoCs going back to the very first iPhone, Imagination is also making a case to investors (and the public) that, while Apple may be dropping Imagination's GPU designs for a custom design, Apple can't develop a new GPU in isolation -- that any GPU developed by the company would still infringe on some of Imagination's IP. As a result, the company is continuing to sit down with Apple to discuss alternative licensing arrangements, with the intent of defending their IP rights.
  • by Anonymous Coward

    Because they couldn't get around the patents they had. They must have figured out another way to do things if they're just cutting them loose.
    Poor guys, the stock was down 63% this morning.

    • Comment removed based on user account deletion
      • Re: (Score:1, Offtopic)

        by DickBreath ( 207180 )
        Apple could get Imagination's stock to drop even further if they can get Trump to tweet something about failing Imagination. Terrible. Sad.
      • My thoughts exactly. Because Apple represents the lion's share of the company's profits, it makes sense for Apple to say they are building their own GPU, watch the stock drop, and then be in a better position to buy them out.

        I doubt that would work, even if it were true. As soon as word got out about talks, the price would recover, and the sale would be a negotiation, not merely buying all the stock on the market. Imagination would negotiate a fair price if Apple decided it needed to buy it for the IP. The IP might also be worth licensing to other Imagination customers as well, or as one more set of patents to potentially beat someone with if they decide to sue Apple. If Apple is deciding not to use their IP they may be going in a compl

    • by msauve ( 701917 )
      "Apple can't develop a new GPU in isolation -- that any GPU developed by the company would still infringe on some of Imagination's IP. "

      Why is that the case? I don't see AMD, Intel, or nVidia among their licensees [imgtec.com], and they make GPUs. Maybe they have a patent for "GPU, but on an Apple product."

      And it looks like Imagination's first GPU (by the name PowerVR) came out in 1996. So it seems that the foundational patents would be expired by now.
      • I suspect that most relevant patents are in the mobile space. PowerVR was the first GPU to use a tile-based model and mobile GPUs use later iterations of that technology - I'd be very surprised if they didn't have patents later than 1996 on something that they've been actively working on for so long. ARM isn't on that list, but I think Mali has enough similarities to the PowerVR designs that they've almost certainly got a cross-licensing deal with them. They may well do with nVidia as well - there were a
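        As an illustrative aside (not part of the comment above): the tile-based approach Imagination pioneered boils down to sorting geometry into small screen-space tiles before shading, so each tile can be rasterised entirely in on-chip memory. A minimal Python sketch of that binning step, with all names hypothetical:

        TILE_SIZE = 32  # pixels per tile edge; real hardware typically uses 16-32

        def bin_triangles(triangles, screen_w, screen_h):
            """Assign each triangle to every screen tile its bounding box touches.

            triangles: list of ((x0, y0), (x1, y1), (x2, y2)) in screen space.
            Returns a dict mapping (tile_x, tile_y) -> list of triangle indices.
            """
            bins = {}
            for idx, tri in enumerate(triangles):
                xs = [p[0] for p in tri]
                ys = [p[1] for p in tri]
                # Clamp the bounding box to the screen, then convert to tile coordinates.
                tx0 = max(0, int(min(xs))) // TILE_SIZE
                ty0 = max(0, int(min(ys))) // TILE_SIZE
                tx1 = min(screen_w - 1, int(max(xs))) // TILE_SIZE
                ty1 = min(screen_h - 1, int(max(ys))) // TILE_SIZE
                for ty in range(ty0, ty1 + 1):
                    for tx in range(tx0, tx1 + 1):
                        bins.setdefault((tx, ty), []).append(idx)
            return bins

        Each tile's triangle list can then be depth-sorted and shaded entirely on-chip, which is why the technique saves so much off-chip memory bandwidth (and therefore power) on mobile parts.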
      • Because they were designing custom stuff for Apple, where Apple didn't own the customized stuff they were buying. That makes it hard for Apple to cut them loose and replace those parts with their own. They would have to be building something significantly different for themselves than the thing they're replacing.

        When you buy a GPU from AMD or Intel, you're just buying a pre-made design. Of course you can't copy it, but you don't need to. You're just buying it. But now you ask to have a custom feature added, an

          • Yes, because Apple has historically been run by complete morons. Apple is chock full of 1) really smart people and 2) lawyers with IP experience. They know exactly what it means to make their own GPU and the risks of being sued by Imagination and/or pretty much every other GPU manufacturer [everybody will jump on the "I want a cut of Apple's revenue cuz of my wondrous GPU IP" bandwagon]...

            • Golly, they know what they're doing, so that guarantees success and forecloses analysis! Wowsers, Batman!

            Nobody is ever wrong, nothing is ever contested, and everybody always wins. Why? Because their lawyers had experience. Duh.

            roflcopter

            Also, Apple never lost a court case, right?

            Maybe instead we should just assume that everybody on slashdot knows that Apple spends a lot of money on lawyers, and sometimes they break the law and get in trouble. For example, price fixing in e-books. Other times they get away

      • Comment removed based on user account deletion
      • by mspohr ( 589790 )

        Patents are a minefield. They are written broadly to cover as much as possible. It's hard for a new company to enter the field without getting sued by those that own the patents. Not sure what Apple's strategy here is but I doubt they can avoid paying royalties to someone for GPU patents. Maybe they think they can bluff their way into the market.

      • You can use a lot of the PowerVR tiling stuff, certainly. And the technology to make a modern GPU is going to be in a pool of licensable IP.

        Still, PowerVR might conceivably have some useful power saving technologies that the others simply don't care about on account of not requiring absolute minimal power levels. This is entirely speculation of course.
    • by slew ( 2918 )

      Because they couldn't get around the patents they had. They must have figured out another way to do things if they're just cutting them loose.
      Poor guys, the stock was down 63% this morning.

      Probably gonna get worse for Imagination. About the only reason they were selling anything to SoC folks is that they could point and say, "Apple uses our GPUs and that's why we are going to stay in business" (it used to be Apple and Intel). Now, not so much, and ARM/Mali is probably gonna come in and eat their lunch. Imagination isn't gonna be much better than Vivante after this.

      FWIW, Vivante isn't in much better shape than Imagination, their main customer is Freescale, which was bought by NXP which was recen

    • Poor guys, the stock was down 63% this morning.

      Things are likely to get a lot more grim in the near future.

      The article mentions that Apple's licensing payments account for 69% of Imagination's annual revenue (Imagination even referred to Apple as an "Essential Contract" in its filings). As is to be expected, that amount is larger than the entirety of their profits, meaning that the loss of Apple immediately plunges them into the red. It looks like they'll have 1.5-2 years to figure out how to reduce their R&D costs or increase the payments they rece

      • Surely some percentage of their expenditures are also related to fulfilling their obligations to Apple and their costs go down too ;) No reason at all to presume they'll be in the red, they might just be a lot smaller.

        Also, the R&D wouldn't still be getting spent right up to the day Apple stops buying the manufactured chips, that would be silly. The R&D costs would be scaling down right away, while the profit from existing Apple sales would continue for 18-24 months. We not only don't know they'll g

          • Saying "immediately in the red" was a poor choice of words on my part. What I meant to convey was that, as things are today and when taken by itself, the loss of Apple would be sufficient to put them into the red. You're quite right that the loss isn't set to happen immediately and that they are likely to make adjustments in the meantime. Even so, what I was getting at is that I don't know that it will allow them to remain relevant.

          Surely some percentage of their expenditures are also related to fulfilling their obligations to Apple and their costs go down too

          As the article points out, their costs are almost entirely fixed R&

    • Comment removed based on user account deletion
  • The summary seems to suggest that but the title is vague. It would arguably be an even bigger bombshell if they were developing a GPU to compete with NVIDIA and ATI on the desktop market.
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      There would be no point in telling their supplier of mobile GPUs "oh hey, we're about to drop you" if they were developing a desktop GPU.

    • The one doesn't preclude the other. There are a few things that mobile GPUs do to favour compute over off-chip data transfer because it saves power, but generally phone, tablet, and laptop GPUs are not that different other than in the number of pipelines that they support. That said, the numbers aren't really there for the larger parts. The iPhone and iPad between them make a sufficiently large chunk of the high-end mobile market that it's worth developing a chip that's used solely by them. The Mac line
      • by DontBeAMoran ( 4843879 ) on Monday April 03, 2017 @10:16AM (#54164003)

        It's not like Apple really cares about Macs anymore. The last Mac mini update in 2014 was even a downgrade from their 2012 models. The Mac mini slide from the Keynote even implied that SSD was standard, but it's not. Still using 5400 RPM HDDs in their overpriced 2017 computers. Shame on you, Apple.

        • by MSG ( 12810 )

          MacRumors' buyers guide rates everything but the MacBook Pro as "Don't buy" right now...

          https://buyersguide.macrumors.... [macrumors.com]

          • MacRumors' buyers guide rates everything but the MacBook Pro as "Don't buy" right now...

            https://buyersguide.macrumors.... [macrumors.com]

            Yeah, because everyone who's in the know about Apple realizes that a desktop upgrade is imminent. Even I recommended to someone not to upgrade their aging iMac (2007, still going strong, but the display is getting a bit dim), but rather buy an external monitor for it and wait for the next models. So, $250 and he has a nice Dell display that has the same resolution as his 24" iMac, and will eventually serve as the replacement display for his wife's mini, whose display has developed a brightness-difference be

        • It's not like Apple really cares about Macs anymore.

          Agreed - Apple's latest offerings show that they are clearly lacking imagination, and all this announcement does is make that official.

      • There are a few things that mobile GPUs do to favour compute over off-chip data transfer because it saves power, but generally phone, tablet, and laptop GPUs are not that different other than in the number of pipelines that they support.

        Well, aside from a massive difference in performance level and feature support. There's a reason Intel (despite actually making integrated desktop GPUs) doesn't try to compete with nVidia or AMD for the discrete market: modern desktop GPUs are very nearly as complicated as modern CPUs (in terms of transistor count, actually vastly more so, by a factor of 10-20 or so).

        • There are a few things that mobile GPUs do to favour compute over off-chip data transfer because it saves power, but generally phone, tablet, and laptop GPUs are not that different other than in the number of pipelines that they support.

          Well, aside from a massive difference in performance level and feature support. There's a reason Intel (despite actually making integrated desktop GPUs) doesn't try to compete with nVidia or AMD for the discrete market: modern desktop GPUs are very nearly as complicated as modern CPUs (in terms of transistor count, actually vastly more so, by a factor of 10-20 or so).

          Modern GPUs are nowhere near as complex as a modern CPU.

          They have high transistor counts; but they are generally made up of fairly simple computational units. Just LOTS of them.
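          (Illustrative aside, not part of the comment above: the "simple units, lots of them" point is essentially the data-parallel programming model. A tiny Python/NumPy sketch, all names hypothetical, contrasting one general core stepping through data with the same trivial operation expressed over a whole array so it can be spread across many simple lanes.)

          import numpy as np

          def shade_scalar(pixels, gain):
              # CPU-style: one fairly general core walks the data element by element.
              out = np.empty_like(pixels)
              for i, p in enumerate(pixels):
                  out[i] = min(p * gain, 1.0)
              return out

          def shade_parallel(pixels, gain):
              # GPU-style: the identical simple operation over the whole array at once,
              # which maps naturally onto thousands of simple execution lanes.
              return np.minimum(pixels * gain, 1.0)

          pixels = np.random.rand(1_000_000)
          assert np.allclose(shade_scalar(pixels, 1.5), shade_parallel(pixels, 1.5))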

      • The one doesn't preclude the other. There are a few things that mobile GPUs do to favour compute over off-chip data transfer because it saves power, but generally phone, tablet, and laptop GPUs are not that different other than in the number of pipelines that they support. That said, the numbers aren't really there for the larger parts. The iPhone and iPad between them make a sufficiently large chunk of the high-end mobile market that it's worth developing a chip that's used solely by them. The Mac lines are a sufficiently small part of their overall markets that it's difficult to compete with the economies of scale of companies like AMD and nVidia.

        You haven't looked at Intel, nVidia and AMD's prices lately, have you?

        Apple can put a fair amount of R&D $$$ into walking away from those guys, AND get the ability to move their capabilities at a pace that isn't controlled (hampered) by them, too.

        Both of those things are VERY enticing to Apple, I assure you.

    • by AHuxley ( 892839 )
      It's not the desktop PC issue. Apple has the surrounding hardware, OS, CPU, the developers, and a way to pay developers for their software. The GPU is the last part that still has outside considerations. Control over the OS, developer tools, battery usage, resolution, and the CPU tasks can allow for an interesting new internal GPU concept.
      • It's not the desktop PC issue. Apple has the surrounding hardware, OS, CPU, the developers, and a way to pay developers for their software. The GPU is the last part that still has outside considerations. Control over the OS, developer tools, battery usage, resolution, and the CPU tasks can allow for an interesting new internal GPU concept.

        I think that Apple is getting REALLY tired of having their "roadmap" at the mercy of others, and with the new R&D facilities opening up, is going to go on quite a push to bring all the key silicon designs "in house".

        Then the only thing left is fabrication, in which Apple seems totally disinterested. But if they continue to have BEEELIONS burning a hole in their pocket (which it looks like they will), that will eventually come, too...

    • Re: (Score:3, Insightful)

      by Freischutz ( 4776131 )

      The summary seems to suggest that but the title is vague. It would arguably be an even bigger bombshell if they were developing a GPU to compete with NVIDIA and ATI on the desktop market.

      Let's not get ahead of ourselves here. Apple is not normally in the business of competing in the chip and components market. Apple designs its own motherboards but it does not market them to third parties and it would surprise me if they did any more with an in-house GPU design than use it in their own devices. If this design turns out to be superior to what you can get from NVIDIA and ATI, limiting its use to their own line of devices would help them sell those devices which fits their business model. If t

      • If this design turns out to be superior to what you can get from NVIDIA and ATI

        This is almost certainly aimed at improving the GPU in their iOS devices. Desktop (and laptop) GPUs are still an order of magnitude faster than GPUs in mobile devices (and consume an order of magnitude more power). I seriously doubt Apple would be able to leapfrog Nvidia and AMD in GPUs. (Except maybe power efficiency - problem being almost everyone else already beats them at power efficiency. That's why you rarel

        • If this design turns out to be superior to what you can get from NVIDIA and ATI

          This is almost certainly aimed at improving the GPU in their iOS devices. Desktop (and laptop) GPUs are still an order of magnitude faster than GPUs in mobile devices (and consume an order of magnitude more power). I seriously doubt Apple would be able to leapfrog Nvidia and AMD in GPUs. (Except maybe power efficiency - problem being almost everyone else already beats them at power efficiency. That's why you rarely see Nvidia Tegra SoCs in mobile devices outside of dedicated gaming handhelds like the Nvidia Shield and Nintendo Switch.)

          True, but you don't chop down a couple of giant redwoods like NVIDIA and ATI in a single swing; you do it one blow of your axe at a time. If Apple really was out to compete with NVIDIA and ATI, or more accurately was out to make itself self-sufficient in terms of GPU chips for its entire product line, I would expect them to start small and go on from there. It's what they did with the iPhone and iPod; they started with a couple of devices which, into the bargain, were widely lambasted by industry pundits

        • This isn't like the A6 SoC Apple designed - where everyone else was licensing and using the same ARM v7 design for their SoCs, and all Apple had to do was tweak it to make the A6 perform better than other ARM SoCs. There's no standard modern GPU hardware architecture for them to license - they'd have to start from scratch.

          You do realize, of course, that Apple has an "Architecture"-class license from ARM, meaning they can, and DO, "roll their own" ARM-instruction-set-compatible CPUs. They don't just "tweak" or rearrange the deck-chairs, they actually have their own ARM designs, reflecting the fact that they have more ARM experience than almost anyone else on the planet.

          Also, they've been neglecting their Mac line for years now. Many Macs aren't getting serious refreshes for 2-3 years, while competitors refresh every year.

          Unlike most other laptop mfgs., Apple doesn't just throw together "this year's chipset", and call it a "New Design". They refresh stuff when it will actually r

      • The summary seems to suggest that but the title is vague. It would arguably be an even bigger bombshell if they were developing a GPU to compete with NVIDIA and ATI on the desktop market.

        Let's not get ahead of ourselves here. Apple is not normally in the business of competing in the chip and components market. Apple designs its own motherboards but it does not market them to third parties and it would surprise me if they did any more with an in-house GPU design than use it in their own devices. If this design turns out to be superior to what you can get from NVIDIA and ATI, limiting its use to their own line of devices would help them sell those devices which fits their business model. If there is anything to hope for in this context it's mostly for Apple users who can hope that this will improve Apple devices as a gaming platform and that maybe one of the next couple of iterations of Apple TV will be a truly worth while gaming console (not holding my breath though).

        Now, please give a cheer for the long line of local slashdot commenters eager to explain to us why Apple is the source of all evil and how this is a part of Apple's nefarious plan to achieve world domination.

        I think you are spot-on that Apple has no intention of selling any GPU, CPU or SoC designs or components outside of Apple.

        They have been designing custom silicon since the Apple ][ days (some of which would have been GREAT in the embedded world), and custom ARM designs since at least the Newton days; and yet NEVER have they sold designs or components outside of their own company.

    • It would be, but I don't see it happening before mobile. Apple has spent most of the last decade designing their mobile CPUs, not their desktop ones. I'm sure Apple is always evaluating whether they could switch their laptops or desktops to their own CPUs, but the main priority would be mobile first.
    • The summary seems to suggest that but the title is vague. It would arguably be an even bigger bombshell if they were developing a GPU to compete with NVIDIA and ATI on the desktop market.

      That's Phase II of the Project...

      Then it's the Axx CPU/SoC that can run x86...

    • The big bombshell would be that Apple had any interest at all in the desktop market.

  • I bet it will work with quadrangles, because triangles aren't "magical", and will work only with 10bit depth textures.

    • The unique patent worthy novelty of an Apple GPU is that it could work with . . .

      Rounded Corner Rectangles

      And it would have the other magical incantation that makes things patent worthy . . .

      On an iPhone!
      • by skids ( 119237 )

        It'll have special technology that detects when an application is trying to draw window
        borders itself, and change them to the Apple look and feel.

        Or maybe they will just discontinue blue. Nobody wants blue anyway.

      • by zlives ( 2009072 )

        beat me to it. one big caveat is though... if you hold it wrong it renders everything in text only.

    • by _merlin ( 160982 )

      I know you're joking, but Apple's early 3D APIs RAVE and QuickDraw 3D were based on quads, and some early 3D hardware like Nvidia NV1 and Sega Model 1 rendered quads natively.

      • How does that work? Triangles are guaranteed to be planar (3 points determine a plane); quadrangles are not necessarily planar. Doesn't that screw up a lot of the interpolation and shading and such?

        • by Anonymous Coward

          An "advantage" of the quad approach was that they weren't planar: they could be warped into rounded shapes. But the texture on the quad would tend to look stretched and pixelated, because the quads were more like 2D sprites that were being transformed and warped. You couldn't wrap a texture around a mesh like you can with triangle meshes. Each quad was its own texture.

  • So far Apple haven't given a crap about graphics performance. You don't have to be an anti-fanboi to see this; even Apple fanbois admit that the GPU in existing Apple kit, especially the so-called 'pro' series, is lacking, and the fanboi will say that this is because Apple users have better things to do with their time than play games.

    Suddenly Apple cares enough to develop their own GPU? Are they hoping that game developers are going to start targeting the Apple user market which, for so long now, has been m

    • by itsdapead ( 734413 ) on Monday April 03, 2017 @10:51AM (#54164143)

      Suddenly Apple cares enough to develop their own GPU?

      Newsflash 1: Apple have been using their own A-series systems-on-a-chip (including CPU and GPU) in iPhone/iPad/Watch & AppleTV for a few years now. They license IP from various companies (ARM, Imagination and others) and have taken over a few chip designers to achieve this.

      Newsflash 2: Apple owns one of the leading gaming platforms on the market: it's called the iPhone.

      Apple has drunk deeply of the kool-aid that says that everybody is going to be using phones and tablets for all their computing needs in the next few years.

      Macs, meanwhile, are mostly running on Intel integrated graphics or unspectacular AMD mobile graphics chips. Tim Cook recently stood up and reiterated how important the Mac line is to Apple - and anybody who understands political talk will know that means exactly the opposite of what it says.

    • If you're going to replace the Mac with an iOS "Mac Mode" and drive a KVM you're going to need a very efficient GPU and a decent patent portfolio.

    • by godrik ( 1287354 )

      My guess is that the current provider was trying to milk Apple for licensing their GPUs and Apple looked at it and said "we probably can design something as good, let's cut them out".

  • Then a wall around that garden.

    We will make the wall taller and insurmountable

    We will grow more stuff inside and import less and less.

    By the time the inmates realize the walled garden is a prison, it will be too late. All the other gardens will have starved, withered, and become desolate.

    Then, ... profit?

    I remember another company trying to corner the desktop market for themselves.

    Actually, one can go back all the way to Morgan trying to corner the silver market.

    Well, free market and invisible hand all

    • They'll make the Windows users pay for it too.

      • Just be sure these new GPUs run on clean coal. Help put the coal miners back to work. Make Apple Great Again!
  • When every brand had their own GPU ideas, CPU ideas and music chip support.
    A tight new GPU design could see the kind of advancements some of the most creative game designers made with GPU support in the 1980s.
    Real freedom to be creative on one platform again. Not having to worry about the port, Windows, other devices.
    A better in-house GPU to keep developers happy. They'd be less tempted by easy porting and more productive on one OS.
    The users then have to buy a hardware product range to play the must have
  • For those of you to whom this isn't completely obvious: this is entirely about mobile GPUs. It has nothing to do with Apple trying to dominate AMD or Nvidia in the desktop space.
  • It makes sense why they don't support Vulkan in light of that, which is purely Apple's decision. So what's gonna happen? They will introduce their own VR, even though VR is already on the market and they are somewhat late to the party. No one will want to invest in such a homogeneous enterprise. So they're gonna try hard, and lose a good chunk of their fan base. Welcome back to Windows, our older, lazier users who need to get shit done; welcome to Linux, those willing to learn.
    • It makes sense why they don't support Vulkan in light of that which is purely Apple decision.

      From what I remember, Apple released Metal before Vulkan was announced as a spec. That was probably the main reason not to support it.

  • by Gravis Zero ( 934156 ) on Monday April 03, 2017 @10:50AM (#54164131)

    sue.

  • 'Furthermore the GPU design that replaces Imagination's designs will be, according to Imagination, "a separate, independent graphics design."'

    Imagination does not acknowledge Apple's claims; in actual fact, Imagination says the exact opposite.

    "Apple has not presented any evidence to substantiate its assertion that it will no longer require Imagination’s technology, [imgtec.com] without violating Imagination’s patents, intellectual property and confidential information"

    Apple were also one time in talks
    • by sl3xd ( 111641 )

      There are other licensed GPU blocks (ARM's Mali comes to mind), along with mobile GPUs from NVIDIA that seem to work without Imagination's IP.

      That doesn't mean Apple is building their own GPU from scratch, any more than they build the CPU from scratch. For both the CPU and GPU, they licensed from external companies (ARM & Imagination). There's likely nothing stopping them from licensing the GPU from ARM, NVIDIA, or any other of Imagination's competitors.

  • by LynnwoodRooster ( 966895 ) on Monday April 03, 2017 @12:11PM (#54164615) Journal
    Apple's been advertising/looking for GPU verification engineers and IC process engineers on LinkedIn and other sites for months. If this was a "secret" it was one of the worst-kept secrets out there...
  • In TFA, it says:

    Imagination has a significant number of GPU patents (they’ve been at this for over 20 years), so developing a GPU that doesn’t infringe on those patents would be difficult to do, especially in the mobile space. Apple couldn’t implement Imagination’s Tile Based Deferred Rendering technique, for example, which has been the heart and soul of their GPU designs.

    Since patents only last for 20 years, and the first tile-based PVR was released in 1996... Why couldn
  • by 2ms ( 232331 )
    Let's just hope they don't do with GPU what they did in CPU. I'm going to be pissed if Apple GPUs are as dominant as their CPUs are.
  • Look guys- we're watching a dying company. Sure they have a lot of business at the moment. But their tech is limited and specialized. They killed their desktop business. They are losing ground in the tablet and phone market.

    Investing in your own GPU is not the thing to do under those conditions. And only for mobile or just Apple products? Even with an assumption that Apple can produce something competitive it just doesn't make sense.

    This smells a lot like Newton, John Sculley's pet project. Or CyberDog. Or
