Apple's New MetalFX Upscaling System Will Compete With AMD FSR, Nvidia DLSS (arstechnica.com)

At this year's WWDC, Apple announced a surprising new system coming to its Metal 3 gaming API that may sound familiar to PC gamers: MetalFX Upscaling. Ars Technica reports: The system will leverage Apple's custom silicon to reconstruct video game graphics using lower-resolution source images so that games can run more efficiently at lower resolutions while looking higher-res. This "temporal reconstruction" system sounds similar to existing offerings from AMD (FidelityFX Super Resolution 2.0) and Nvidia (Deep Learning Super-Sampling), along with an upcoming "XeSS" system from Intel. Based on how the system is described, it will more closely resemble AMD's system, since Apple has yet to announce a way for MetalFX Upscaling to leverage its custom-made "Neural Engine" system.
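For developers wondering what driving such a scaler actually looks like, here is a minimal sketch in Swift against the MetalFX temporal scaler that ships alongside Metal 3. The resolutions, formats, and texture parameters below are hypothetical placeholders; a real integration has to match its own pipeline:

```swift
import Metal
import MetalFX

// One-time setup: describe the scaler (render at 1080p, present at 4K).
// Formats and sizes here are illustrative and must match your pipeline's.
let device = MTLCreateSystemDefaultDevice()!
let desc = MTLFXTemporalScalerDescriptor()
desc.inputWidth = 1920
desc.inputHeight = 1080
desc.outputWidth = 3840
desc.outputHeight = 2160
desc.colorTextureFormat = .rgba16Float
desc.depthTextureFormat = .depth32Float
desc.motionTextureFormat = .rg16Float
desc.outputTextureFormat = .rgba16Float

guard let scaler = desc.makeTemporalScaler(device: device) else {
    fatalError("MetalFX temporal scaling is unsupported on this device")
}

// Per frame: hand the scaler the low-res color, depth, and motion-vector
// textures the renderer produced (all hypothetical here), plus the camera
// jitter used this frame, then encode the upscale on the command buffer.
func encodeUpscale(commandBuffer: MTLCommandBuffer,
                   color: MTLTexture, depth: MTLTexture,
                   motion: MTLTexture, output: MTLTexture,
                   jitter: SIMD2<Float>) {
    scaler.colorTexture = color
    scaler.depthTexture = depth
    scaler.motionTexture = motion
    scaler.outputTexture = output
    scaler.jitterOffsetX = jitter.x
    scaler.jitterOffsetY = jitter.y
    scaler.encode(commandBuffer: commandBuffer)
}
```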

By announcing this functionality for some of the world's most popular processors, Apple is arguably encouraging more game developers to build their games and engines around image reconstruction -- even though MetalFX Upscaling isn't open source, unlike AMD's FSR 2.0 system. Still, these image reconstruction systems typically have temporal anti-aliasing (TAA) in common. So long as game devs keep that kind of anti-aliasing in mind when building their games and engines, those games will be more likely to take advantage of these systems and thus run more efficiently on a wide range of consoles, computers, and smartphones.
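Concretely, the per-frame input these TAA-style systems share is a sub-pixel camera jitter, usually drawn from a low-discrepancy sequence such as Halton(2,3). A small self-contained sketch (the function names are mine, not from any vendor's SDK):

```swift
// Radical inverse of `index` in `base`: the digits mirrored around the point.
func radicalInverse(_ index: Int, base: Int) -> Float {
    var result: Float = 0
    var fraction = 1 / Float(base)
    var i = index
    while i > 0 {
        result += Float(i % base) * fraction
        i /= base
        fraction /= Float(base)
    }
    return result
}

// Sub-pixel camera jitter for a given frame, in the [-0.5, 0.5) pixel range,
// cycling through an 8-sample Halton(2,3) pattern.
func taaJitter(frame: Int, sampleCount: Int = 8) -> SIMD2<Float> {
    let n = (frame % sampleCount) + 1   // the Halton sequence is 1-indexed
    return SIMD2<Float>(radicalInverse(n, base: 2) - 0.5,
                        radicalInverse(n, base: 3) - 0.5)
}
```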
The report notes that Metal 3 also includes "a new 'resource-loading' API designed to streamline asset-loading processes in video games." The same Metal 3 API benefits will also come to iPadOS 16 later this year.
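The shape of that resource-loading API, as it appears in the Metal 3 SDK, is an I/O command queue that streams file contents directly into GPU buffers and textures. A hedged sketch in Swift; the file name and buffer size are made up:

```swift
import Foundation
import Metal

let device = MTLCreateSystemDefaultDevice()!

// An I/O command queue streams bytes from storage straight into GPU
// resources, bypassing the usual CPU staging copy.
let queueDesc = MTLIOCommandQueueDescriptor()
queueDesc.priority = .normal
let ioQueue = try device.makeIOCommandQueue(descriptor: queueDesc)

// "assets.bin" and the 16 MB size are hypothetical placeholders.
let fileHandle = try device.makeIOHandle(url: URL(fileURLWithPath: "assets.bin"))
let meshBuffer = device.makeBuffer(length: 16 << 20,
                                   options: .storageModePrivate)!

let ioCommands = ioQueue.makeCommandBuffer()
ioCommands.load(meshBuffer, offset: 0, size: meshBuffer.length,
                sourceHandle: fileHandle, sourceHandleOffset: 0)
ioCommands.commit()
ioCommands.waitUntilCompleted()   // real code would overlap this with other work
```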
Comments:
  • by Zuriel ( 1760072 ) on Monday June 06, 2022 @10:30PM (#62598890)
    Since MetalFX Upscaling won't exist on the same hardware, OS *or* graphics API as DLSS or FSR, how exactly is it competing? You use the one that's available on the platform you're developing for.
    • Re: (Score:3, Insightful)

      By existing on every M1/M2 or recent A-series chip without adding a new video card? By being on a billion devices that already exist and cost less than a GeForce card by itself?
      • by Anonymous Coward

        By being on a billion devices that already exist and cost less than a GeForce card by itself?

        If only it performed even half as well as a GeForce card!

      • By existing on every M1/M2 or recent A-series chip without adding a new video card? By being on a billion devices that already exist and cost less than a GeForce card by itself?

        And not even that recent.

        iOS 16 (and thus Metal 3) works all the way back to the iPhone 8, which has only an A11 Bionic SoC. That's a good five generations back in the A-series lineage.

    • Re:Competing? (Score:4, Insightful)

      by Luthair ( 847766 ) on Monday June 06, 2022 @10:58PM (#62598930)
      Or on hardware viable for PC gaming.
    • Metal 3 is actually supported on recent* AMD and Intel GPUs as well as Apple's own. There are (currently) no separate requirements listed for MetalFX.

      * "Available on Mac models with Apple silicon, AMD Radeon Pro Vega series, AMD Radeon Pro 5000/6000 series, Intel Iris Plus Graphics series, or Intel UHD Graphics 630."

  • by kriston ( 7886 ) on Monday June 06, 2022 @11:13PM (#62598952) Homepage Journal

    Maybe enable the M1 platform to support two external displays, already?

    My coworkers with M1 MacBooks constantly complain that their devices can't support two external displays.

    That is such a huge oversight on the part of Apple.

    My Windows laptop supports three external displays for 1/3 of the price of those M1 MacBooks. WTF?

    • by AmiMoJo ( 196126 )

      Isn't Apple's usual attitude that to get new features you must buy a new device?

    • Maybe enable the M1 platform to support two external displays, already?

      They did, finally, with the M1 Pro and above. We all seem to have forgotten the 'never buy generation 1' rule...

    • HAH. 3
      I can get 6 if I chain docking stations.
      • HAH. 3

        I can get 6 if I chain docking stations.

        What docks, exactly, are you using for that, and what are the limitations?

        • Dell D6000 docks and a Latitude 5420.
          Chained the docks together thru the USB-C ports on them.
          Didn't think it would work, but it did.
          • Dell D6000 docks and a Latitude 5420.

            Chained the docks together thru the USB-C ports on them.

            Didn't think it would work, but it did.

            Cool, thanks!

          • by kriston ( 7886 )

            The Dell D6000 is a DisplayLink adapter which is like a USB graphics card with slow performance. You can also plug it into your Arm-based MacBook to get multiple monitors.

            My post was about the native graphics supporting more than one external monitor, which Arm MacBooks do not.

  • by rsilvergun ( 571051 ) on Monday June 06, 2022 @11:17PM (#62598958)
    These kinds of features exist to do 4K gaming on GPUs that can't push 4K resolutions (pretty much anything under $800).

    That pretty much means modern AAA gaming, which is a dying breed. Sony and Ubisoft put out a handful of such games every generation. Square puts out one FF title. And Bethesda puts out a Doom game. Oh, and Microsoft puts out a Forza game. If we've been _really_ good children, EA puts out a Need For Speed game and a Star Wars game.

    That's basically that. And I can't see any of that getting Mac ports. Just downscaled iOS ports.

    It's the same problem the Nintendo Wii had: a huge user base that only gamed casually. The majority of hardcore gamers who owned a Wii still had a 360 and/or a PS3 and a gaming PC, meaning sales were always weak on it: why would I buy a cut-down Wii port when the Xbox 360 or PC version was right there?

    The Wii actually had inferior hardware. The Mac hardware might be able to hang (I'd really question that, though, at least without an AMD or Nvidia GPU backing it up), but again, it doesn't matter how good your hardware is if devs don't use it. Ask Sega about the Saturn sometime...
    • In the PC gaming world that is true. But given Apple's lack of GPU power, this has potential at much lower resolutions: they aren't struggling to do 4K, they are struggling to do 1080p or 1440p.
    • It's a chicken-and-egg thing. Making 4K gaming accessible on midrange Macs will attract more developers and players to the platform. At present you need a higher-end Mac to play stuff like Baldur's Gate 3. Apple is a very cashed-up company, so I imagine it will pay or subsidise a few developers to get the ball rolling on porting.
    • by AmiMoJo ( 196126 ) on Tuesday June 07, 2022 @05:43AM (#62599390) Homepage Journal

      The M1 and M2 GPUs are mobile designs. They use tile rendering, which has some big implications for performance. With tile rendering, the GPU has fast tile memory internally, but it only covers a small area of the screen e.g. 64x64 pixels. It renders those pixels and then pushes them out to RAM. Ideally it needs to do all the rendering before pushing, because pulling that data back in to do another rendering pass is quite expensive.

      Of course, most games designed for consoles and PCs aren't built for tile rendering; they assume the GPU has massive memory bandwidth and can do multiple passes over every pixel with little performance penalty. So Mac ports are a problem. Mobile games will come over okay, though.
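      To make that concrete: Metal exposes this tile-memory model directly, letting you mark intermediate attachments as memoryless so they never round-trip through RAM. A sketch, with a hypothetical G-buffer attachment (the storage mode and store action are real Metal API):

      ```swift
      import Metal

      let device = MTLCreateSystemDefaultDevice()!

      // A G-buffer attachment that is produced and consumed within a single
      // render pass never has to leave tile memory. Marking it .memoryless
      // (Apple silicon / iOS GPUs only) means Metal never backs it with RAM,
      // avoiding exactly the round trip described above.
      let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba16Float,
                                                          width: 1920, height: 1080,
                                                          mipmapped: false)
      desc.storageMode = .memoryless
      desc.usage = .renderTarget          // can't be sampled in a later pass

      let gBufferAlbedo = device.makeTexture(descriptor: desc)!

      let pass = MTLRenderPassDescriptor()
      pass.colorAttachments[1].texture = gBufferAlbedo
      pass.colorAttachments[1].loadAction = .clear
      pass.colorAttachments[1].storeAction = .dontCare   // never written out to RAM
      ```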

      This feature allows the M2 GPU to render at a lower resolution, which really helps with this bottleneck.

      The M1 GPU is slower than an RTX 3050, the lowest-end model that Nvidia makes. It only benchmarks well in certain applications that Apple has added dedicated acceleration for, mostly video encoding/decoding.

      • All modern desktop GPUs use tile rendering as well: Nvidia Maxwell and later, and AMD GCN 5.0+.
        • Right, Apple's rendering model isn't radically different. Its design is a lot different, though, and I think it's a better overall tactic. A 64-core M1 Ultra is only a bit slower in benchmarks from actual games than a 3050 with 2,500 CUDA cores. Practically, it's quite a bit slower, mostly due to platform support. So what does this look like if game devs target M1/M2?

          And what's it look like with 128 or 256 GPU cores on M2 pro/ultra/max/whatever?

          MetalFX enables an interesting 'pipeline' for game builds though.

    • "AAA gaming" is not a "dying breed". Titles published in the "AAA" category set a revenue record in 2020, and all signs point at another record-breaker for 2021 (actual data is still very much paywalled).

    • That pretty much means modern AAA gaming, which is a dying breed. Sony and Ubisoft put out a handful of such games every generation. Square puts out one FF title. And Bethesda puts out a Doom game. Oh, and Microsoft puts out a Forza game. If we've been _really_ good children, EA puts out a Need For Speed game and a Star Wars game.

      That's basically that.

      So you basically aren't paying attention to the industry at all, then? Let me clue you in:

      a) The number of games that can't reach 4K resolutions at playable framerates is constantly increasing, largely because of the adoption of raytracing. You don't need to be a AAA developer to hit these limits. Hell, any kid with a demo version of Unreal can produce a game that struggles to run well even at 1440p.

      b) Apple is making a move towards VR. VR *requires* both high resolution and high frame rates to prevent motion sickness.

    • > The Wii actually had inferior hardware.

      I shipped a couple of Wii games. The Wii was literally a ~1.5x overclocked GameCube. The GPU (Hollywood [wikipedia.org]) had the exact same hardware bugs and feature set as the GameCube's GPU (Flipper): no shaders; it had the TEV (Texture Environment Unit) instead.

      Nintendo has never focused on pushing tech as much as Sony and Microsoft do.

  • FSR is GPU-agnostic and can potentially work on any hardware that supports Shader Model 5.0.

    https://www.pcgamesn.com/amd/f... [pcgamesn.com]

    It already supports various NV GPUs. If Apple didn't have its head so far up its own posterior, it'd save itself the trouble and just support FSR.

    • by Ormy ( 1430821 )
      I wasn't aware of FSR being agnostic; I had just assumed it was AMD-only. How does FSR compare to DLSS 2.x on NV 3xxx series hardware, specifically in terms of performance vs. subjective image quality at the balanced & quality settings?
      • by Guspaz ( 556486 )

        DLSS has moderately higher performance and a small increase in image quality (mostly in fine detail and motion handling). Both of these differences can probably be attributed to DLSS's use of the GPU's tensor cores, which both allows it to offload more of the work from the shader cores (leaving more shader resources for actual game rendering) and probably allows it to make better decisions about how to integrate the temporal data. DLSS takes less time to process a frame than FSR 2.0, which has interesting implications.

  • Let me see,

    I could optimize my game for Nvidia's DLSS + raytracing, which has the most features but works only on Nvidia cards in the Super Large Windows ecosystem...
    Or, I could optimize my game for AMD's FSR 2.0, which has fewer features but works on ANY (AMD, Nvidia, Intel, Innosilicon) graphics card in the Super Large Windows ecosystem...
    Or, I could optimize my game for Intel's (upcoming) XeSS, which has fewer features but works on ANY (AMD, Nvidia, Intel, Innosilicon) graphics card in the Super Large Windows ecosystem...
