Apple's New MetalFX Upscaling System Will Compete With AMD FSR, Nvidia DLSS (arstechnica.com)
At this year's WWDC, Apple announced a surprising new system coming to its Metal 3 gaming API that may sound familiar to PC gamers: MetalFX Upscaling. Ars Technica reports: The system will leverage Apple's custom silicon to reconstruct video game graphics using lower-resolution source images so that games can run more efficiently at lower resolutions while looking higher-res. This "temporal reconstruction" system sounds similar to existing offerings from AMD (FidelityFX Super Resolution 2.0) and Nvidia (Deep Learning Super-Sampling), along with an upcoming "XeSS" system from Intel. Based on how the system is described, it will more closely resemble AMD's system, since Apple has yet to announce a way for MetalFX Upscaling to leverage its custom-made "Neural Engine" system.
By announcing this functionality for some of the world's most popular processors, Apple is arguably encouraging more game developers to build their games and engines with image reconstruction in mind -- even though MetalFX Upscaling isn't open source, unlike AMD's FSR 2.0 system. Still, these image reconstruction systems typically have temporal anti-aliasing (TAA) in common, so as long as game devs design their games and engines around that kind of anti-aliasing, those titles will be more likely to take advantage of such systems and thus run more efficiently on a wide range of consoles, computers, and smartphones. The report notes that Metal 3 also includes "a new 'resource-loading' API designed to streamline asset-loading processes in video games." The same Metal 3 API benefits will also come to iPadOS 16 later this year.
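For a concrete sense of what a temporal upscaler like this asks of a renderer, here is a rough Swift sketch of driving MetalFX's temporal scaler: the game renders color, depth, and motion vectors at a low resolution with sub-pixel jitter, and the scaler reconstructs the full-resolution frame. The type and property names follow Apple's MetalFX framework as presented at WWDC, but treat the exact spellings, resolutions, and texture formats here as illustrative assumptions rather than a definitive implementation.

    import Metal
    import MetalFX

    // Build the temporal scaler once, up front: render at 1280x720 and
    // reconstruct a 2560x1440 output (resolutions/formats are illustrative).
    func makeUpscaler(device: MTLDevice) -> MTLFXTemporalScaler? {
        let desc = MTLFXTemporalScalerDescriptor()
        desc.inputWidth   = 1280
        desc.inputHeight  = 720
        desc.outputWidth  = 2560
        desc.outputHeight = 1440
        desc.colorTextureFormat  = .rgba16Float
        desc.depthTextureFormat  = .depth32Float
        desc.motionTextureFormat = .rg16Float
        desc.outputTextureFormat = .rgba16Float
        return desc.makeTemporalScaler(device: device)
    }

    // Each frame: feed it the low-res color, depth, and motion vectors plus
    // the camera jitter used for that frame, then encode it after the main
    // render pass (UI and post that should stay crisp run afterwards at full res).
    func encodeUpscale(_ scaler: MTLFXTemporalScaler,
                       into commandBuffer: MTLCommandBuffer,
                       color: MTLTexture, depth: MTLTexture,
                       motion: MTLTexture, output: MTLTexture,
                       jitter: SIMD2<Float>) {
        scaler.colorTexture  = color
        scaler.depthTexture  = depth
        scaler.motionTexture = motion
        scaler.outputTexture = output
        scaler.jitterOffsetX = jitter.x
        scaler.jitterOffsetY = jitter.y
        scaler.encode(commandBuffer: commandBuffer)
    }

The per-frame inputs (depth, motion vectors, jitter) are exactly the data a TAA pipeline already produces, which is why engines that support TAA tend to pick up systems like this with relatively little extra work.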
Re: can't compete so fake it (Score:3, Insightful)
And why the fuck is slashdot flooded with articles about minor incremental improvements to apple shit, and Apple's new BNPL scheme to milk even more money out of stupid people (which make up the majority of their customers)? Especially considering all of the actual technical features they've added have already been done before by basically all of their competitors?
Re: (Score:2)
So, for example, they could cover the story about Intel's Arc 3 GPU getting benchmarked (finally) and returning a score that pretty much equals the Nvidia 3070 mobile GPU. Big news like that deserves a space but . . . nope.
FYI: https://www.pcworld.com/articl... [pcworld.com]
Competing? (Score:3)
Re: (Score:3, Insightful)
Re: (Score:1)
By being on a billion devices that already exist and cost less than a geforce card itself?
If only it performed even half as well as a geforce card!
Re: (Score:2)
By existing on every existing M1/2 or recent A-series chip without adding a new video card? By being on a billion devices that already exist and cost less than a geforce card itself?
And not even that recent.
iOS 16 (and thus Metal 3) works all the way back to the iPhone 8, which only has an A11 Bionic SoC. That's a good five generations back in the A-series lineage.
Re:Competing? (Score:4, Insightful)
Re: (Score:2)
Metal 3 is actually supported on recent* AMD and Intel GPUs as well as Apple's own. There are (currently) no separate requirements listed for MetalFX.
* "Available on Mac models with Apple silicon, AMD Radeon Pro Vega series, AMD Radeon Pro 5000/6000 series, Intel Iris Plus Graphics series, or Intel UHD Graphics 630."
Maybe enable the M1 platform to support 2 displays (Score:5, Informative)
Maybe enable the M1 platform to support two external displays, already?
My coworkers with M1 MacBooks constantly complain that their devices can't support two external displays.
That is such a huge oversight on the part of Apple.
My Windows laptop supports three external displays for 1/3 of the price of those M1 MacBooks. WTF?
Re: (Score:2)
This was my biggest WTF moment when my wife brought home an M1 MacBook for me as a present. I could not believe that if I kept it I would have to give up multi-monitor, so I got her to return it; at least Apple has an easy return process.
Yes, the GPU capabilities of the base M1 are pretty sad. You can work around the limitation with a DisplayLink-compatible video adapter, albeit without HDCP support.
Re: (Score:2)
Yeah, and DisplayLink is expensive and sluggish. Videos and games get pixelated like you're watching a sketchy MPEG, to say nothing of the heavy CPU load.
Re: (Score:2)
I'd assume they were expecting the low end machine to have all the bells and whistles.
I personally prefer the higher end machines so I can go longer before needing to upgrade. The money I'd save by going with a cheaper machine just isn't a good trade-off for me here.
I'll cheap out on a lot of things, but not my computer.
Re: (Score:2)
They are using MacBook Pros.
Re: (Score:2)
I thought they were using MacBook Airs?
Re: (Score:2)
No, MacBook Pros, but it doesn't matter. None of the Arm-based MacBooks support more than one extra display natively.
Re: (Score:2)
They are using MacBook Pros.
The Arm-based MacBook Pro does not support more than one external display.
Re: (Score:2)
Isn't Apple's usual attitude that to get new features you must buy a new device?
Re: (Score:1)
You're thinking of how Android OS updates work.
Re: (Score:1)
Maybe enable the M1 platform to support two external displays, already?
They did, finally, with the M1 Pro and above. We all seem to have forgotten the 'never buy generation 1' rule...
Re: (Score:2)
I can get 6 if I chain docking stations.
Re: (Score:2)
HAH. 3
I can get 6 if I chain docking stations.
What docks, exactly, are you using for that, and what are the limitations?
Re: (Score:2)
Chained the docks together thru the USB-C ports on them.
Didn't think it would work, but it did.
Re: (Score:2)
Dell D6000 docks and a Latitude 5420.
Chained the docks together thru the USB-C ports on them.
Didn't think it would work, but it did.
Cool, thanks!
Re: (Score:2)
The Dell D6000 is a DisplayLink adapter which is like a USB graphics card with slow performance. You can also plug it into your Arm-based MacBook to get multiple monitors.
My post was about the native graphics supporting more than one external monitor, which Arm MacBooks do not.
I'm not sure I see the point (Score:4, Interesting)
That pretty much means modern AAA gaming, which is a dying breed. Sony and Ubisoft put out a handful of such games every generation. Square puts out one FF title. And Bethesda puts out a Doom game. Oh, and Microsoft puts out a Forza game. If we've been _really_ good children, EA puts out a Need For Speed game and a Star Wars game.
That's basically that. And i can't see any of that getting Mac ports. Just downscaled iOS ports.
It's the same problem the Nintendo Wii had: Huge user base that only gamed casually. The majority of hardcore gamers who owned a Wii still had a 360 and/or a PS3 and gaming PC. Meaning sales were always weak on it because why would I buy a cut down Wii port when the Xbox 360 or PC version was right there?
The Wii actually had inferior hardware. The Mac hardware might be able to hang (though I'd really question that, at least without an AMD or nVidia GPU backing it up), but again, it doesn't matter how good your hardware is if devs don't use it. Ask Sega about the Saturn sometime...
Re: (Score:2)
Chicken and egg (Score:2)
Re:I'm not sure I see the point (Score:5, Informative)
The M1 and M2 GPUs are mobile designs. They use tile rendering, which has some big implications for performance. With tile rendering, the GPU has fast tile memory internally, but it only covers a small area of the screen e.g. 64x64 pixels. It renders those pixels and then pushes them out to RAM. Ideally it needs to do all the rendering before pushing, because pulling that data back in to do another rendering pass is quite expensive.
Of course, most games designed for consoles and PCs aren't built for tile rendering; they assume the GPU has massive memory bandwidth and can do multiple passes over every pixel with little performance penalty. So Mac ports are a problem. Mobile games will come over okay though.
This feature allows the M2 GPU to render at a lower resolution, which really helps with this bottleneck.
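To make that concrete: the standard way to exploit tile memory in Metal is to mark intermediate attachments as memoryless, so a G-buffer that is produced and consumed within a single render pass never touches system RAM at all. A minimal Swift sketch of that idea follows; the texture format and attachment index are placeholders, not anything from the article.

    import Metal

    // Memoryless attachments live only in the GPU's on-chip tile memory on
    // Apple-silicon (tile-based) GPUs; they are never backed by system RAM.
    func makeGBufferAttachment(device: MTLDevice, width: Int, height: Int) -> MTLTexture? {
        let desc = MTLTextureDescriptor.texture2DDescriptor(
            pixelFormat: .rgba16Float, width: width, height: height, mipmapped: false)
        desc.usage = [.renderTarget]    // consumed in-pass (framebuffer fetch / tile shading)
        desc.storageMode = .memoryless  // no RAM allocation, tile memory only
        return device.makeTexture(descriptor: desc)
    }

    // A memoryless attachment cannot be stored: everything that needs it has
    // to happen before the tile is flushed at the end of the render pass.
    func configure(pass: MTLRenderPassDescriptor, gbufferAlbedo: MTLTexture) {
        pass.colorAttachments[1].texture     = gbufferAlbedo
        pass.colorAttachments[1].loadAction  = .clear
        pass.colorAttachments[1].storeAction = .dontCare  // never written out to RAM
    }

Engines that instead ping-pong full-resolution intermediate targets through RAM across multiple passes give up exactly this advantage, which is the porting problem described above.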
The M1 GPU is slower than an RTX 3050, the lowest end model that Nvidia makes. It only benchmarks well in certain applications that Apple has added dedicated acceleration for, mostly video encoding/decoding.
Re: (Score:1)
Re: (Score:2)
Right, Apple's rendering model isn't radically different. Its design is a lot different, though, and I think it's a better overall tactic. A 64-core M1 Ultra is only a bit slower in benchmarks from actual games than a 3050 with 2,500 CUDA cores. Practically, it's quite a bit slower, mostly due to platform support, so what does this look like if game devs target M1/M2?
And what does it look like with 128 or 256 GPU cores on an M2 Pro/Ultra/Max/whatever?
MetalFX enables an interesting 'pipeline' for game builds though.
Re: (Score:2)
"AAA gaming" is not a "dying breed". Titles published in the "AAA" category set a revenue record in 2020, and all signs point at another record-breaker for 2021 (actual data is still very much paywalled).
Re: (Score:2)
That pretty much means modern AAA gaming, which is a dying breed. Sony and Ubisoft put out a handful of such games every generation. Square puts out one FF title. And Bethesda puts out a Doom game. Oh, and Microsoft puts out a Forza game. If we've been _really_ good children, EA puts out a Need For Speed game and a Star Wars game.
That's basically that.
So you basically aren't paying attention to the industry all then? Let me clue you in:
a) The number of games that can't reach 4K resolutions at playable framerates is constantly increasing, largely because of the adoption of raytracing. You don't need to be a AAA developer to hit these limits. Hell, any kid with a demo version of Unreal can produce a game that struggles to run well even at 1440p.
b) Apple is making a move towards VR. VR *requires* both high resolution and high frame rates to prevent motion sickness.
Re: (Score:2)
> The Wii had actually inferior hardware.
I shipped a couple of Wii games. The Wii was literally an ~1.5x over-clocked GameCube. The GPU (Hollywood [wikipedia.org]) had the exact same hardware bugs and feature set as the GameCube's GPU (Flipper): no shaders; it had a TEV (Texture Environment Unit) instead.
Nintendo has never focused on pushing tech as much as Sony and Microsoft do.
Unfortunate (Score:2)
FSR is GPU-agnostic and can potentially work on any hardware that supports shader model 5.0
https://www.pcgamesn.com/amd/f... [pcgamesn.com]
It already supports various NV GPUs. If Apple didn't have its head so far up its own posterior, they'd save themselves the trouble and just support FSR.
Re: (Score:2)
Re: (Score:3)
DLSS has moderately higher performance and a small increase in image quality (mostly in fine detail and motion handling). Both of these differences can probably be attributed to DLSS's use of the GPU's tensor cores, which both allows it to offload more of the work from the shader cores (leaving more shader resources for actual game rendering) and probably allows it to make better decisions about how to integrate the temporal data. DLSS takes less time to process a frame than FSR 2.0, which has interesting implications.
Re: (Score:2)
If I were a game developer... (Score:2)
Let me see,
I could optimize my game for nVidia's DLSS + raytracing, which has the most features but works only on nVidia cards in the Super Large Windows ecosystem...
Or, I could optimize my game for AMD's FSR 2.0, which has fewer features but works on ANY (AMD, nVidia, Intel, Innosilicon) graphics card in the Super Large Windows ecosystem...
Or, I could optimize my game for Intel's (upcoming) XeSS, which has fewer features but works on ANY (AMD, nVidia, Intel, Innosilicon) graphics card in the Super Large Windows ecosystem...