Nvidia's DLSS tech now in over 760 games and apps — native and override DLSS 4 support has broad reach

Blackwell RTX Pro
(Image credit: Nvidia)

Nvidia has just announced a new list of recently released and upcoming games with native Nvidia DLSS support, including RuneScape: Dragonwilds. Counting the six new games discussed in Nvidia's latest blog post, DLSS and RTX support is now natively available at some level across 769 video games and applications, a figure that outpaces AMD and Intel's comparable tech by a wide margin.

DLSS, or Deep Learning Super Sampling, is Nvidia's suite of AI-powered rendering and upscaling enhancements. The suite has grown to encompass numerous features, from "super resolution" upscaling and anti-aliasing to ray reconstruction (improved denoising with ray tracing) and frame generation (on RTX 40-series and later). The latest DLSS 4 release added Multi Frame Generation (MFG) along with a transformer-based upscaling model, a more computationally intensive but higher-quality algorithm than the older CNN model.

DLSS has been a major part of Nvidia's RTX marketing since the first RTX 20-series cards launched in 2018. The newest DLSS 4 is a key facet of the current-gen Blackwell RTX 50-series GPUs, with Nvidia's performance claims for the new graphics cards resting primarily on the generational gains from MFG, alongside DLSS upscaling improvements.

Nvidia's latest DLSS announcement covers six games that have just launched or are about to launch with native RTX and DLSS support. Steel Seed, from Italian studio Storm in a Teacup, launched with DLSS 4 MFG today. RuneScape: Dragonwilds, an open-world survival multiplayer title from Jagex, along with the anticipated re-release The Talos Principle: Reawakened and new title Tempest Rising, all launched in the past week with DLSS 3 Frame Generation and Super Resolution, but without MFG. New titles Clair Obscur: Expedition 33 and Commandos: Origins will also launch this week with DLSS 2 Super Resolution, without any of Nvidia's frame generation tech. (Technically it could be DLSS 3.7 with super resolution, but that's getting into the weeds...)

With Nvidia's DLSS tools now an industry-standard inclusion even in indie titles, the list of 769 currently supported games and apps is sure to balloon even larger as time marches on. Compare this to AMD's FSR and Intel's XeSS, the rival suites to DLSS, which currently sit at 356 and 161 supported games, respectively, according to their websites.

Nvidia holds the lead in quantity, and its manual override system also gives it the edge in putting its higher-grade tech into older games. FSR and XeSS require game developers to put in the work to support the more modern features of FSR 3.1 or XeSS 2 over previous gens, leaving some major games like Baldur's Gate 3 still on FSR 2.2 due to developer preference. AMD's FSR 4 can override FSR 3.1, but that's still only in a relatively limited number of games.

Nvidia has provided an override switch in its Nvidia App for select games, with Nvidia engineers tweaking titles that already have some level of DLSS support to shoehorn in features like MFG where the developers haven't specifically added them. We still prefer native support, but that tends to take a lot longer to materialize.

DLSS 4's breadth of features and widespread adoption are a major part of Nvidia's dominance in today's consumer graphics card market. As Nvidia's share increases, developers may grow increasingly tempted to support DLSS without including FSR or XeSS, further widening the gap. Though AMD's latest GPU launch was a roaring success for the company, it will likely need much more than one limited-release boom to catch up to Nvidia's pace in any corner of the gaming market.


Dallin Grimm
Contributing Writer

Dallin Grimm is a contributing writer for Tom's Hardware. He has been building and breaking computers since 2017, serving as the resident youngster at Tom's. From APUs to RGB, Dallin has a handle on all the latest tech news. 

  • hotaru251
    I read it another way.
    "Nvidia's helped devs make over 700 poorly optimized games"

    dlss was made to help lower-powered machines run games...which is great.

    Devs, however, didn't do that and instead used it to save money and brute-force games with higher requirements.
    Reply
  • Alvar "Miles" Udell
    As much as I dislike games leaning on AI like DLSS, and people and publications saying they can do things like 4K120 max details with a midrange card with DLSS, it also helps cards age more gracefully as it gets more widespread, which is important in the age of near-$1000 midrange cards, and it's a boon to consoles and handhelds with limited GPUs.

    As for AMD, all they have to do is flood the market with cards costing half as much as Nvidia's and their market share will boom, as will FSR adoption, like they did with the HD 4000 and 5000 series cards. Undercutting them by $100 doesn't work in the age of AI, and not having DLSS support is quite a bit more important in the long run than not having PhysX.

    Also, yes, DLSS is an algorithm, but in 2025 any algorithm can be called AI.
    Reply
  • edzieba
    hotaru251 said:
    I read it another way.
    "Nvidia's helped devs make over 700 poorly optimized games"

    dlss was made to help lower-powered machines run games...which is great.

    Devs, however, didn't do that and instead used it to save money and brute-force games with higher requirements.
    From the "MSAA is just cutting corners from proper SSAA!", "mip-mapping isn't rendering all your texture texels at full resolution, CHEATING!" and "VRR is just a crutch for not meeting !" departments.

    Games have performed well and performed poorly long before DLSS existed, and will continue to do so.
    Reply
  • JarredWaltonGPU
    Alvar Miles Udell said:
    As for AMD, all they have to do is flood the market with cards costing half as much as Nvidia's and their market share will boom, as will FSR adoption, like they did with the HD 4000 and 5000 series cards. Undercutting them by $100 doesn't work in the age of AI, and not having DLSS support is quite a bit more important in the long run than not having PhysX.
    LOL. And all Intel has to do is flood the market with Battlemage GPUs sold at a loss and it can also gain massive market share!

    Realistically, the reason AMD's RDNA 4 graphics cards cost as much as they cost is because they need to generate a profit. It could reduce prices a bit, but "half as much" as Nvidia's cards would only be viable if we're talking about the RTX 5080 — a card that's faster than AMD's top solution by a decent amount. Best-case, I think AMD could probably sell 9070 XT for $499, and 9070 for maybe $399. But margins would be razor thin in that case and it wouldn't stop scalpers and AIBs from pushing the prices higher. Witness the current going rate of $800+ for RX 9070.
    Reply
  • Notton
    My guess is Intel ARC breaks even after all the R&D, and purchasing relatively large dies from TSMC.
    This is acceptable from a standpoint where ARC (Xe cores) is a key component of their mobile chips.
    Lunar Lake iGPU also manages to beat Ryzen Z1E by a hair.

    As for DLSS, when used properly it's not a crutch.
    However, it's quite evident that quadruple-A gaming companies, like noobisoft, have decided it's a crutch for their AAAA gaming titles, and it's an eyesore.
    We can thank games like... SW: Outlaws (noobisoft) for being an eyesore.

    Whereas DLSS+FG+RG (depending on the game) look just fine in Space Marine 2, Wukong, MH:Wilds, Indiana Jones: GC, Alan Wake 2, etc.
    Although some of those games still have LoD pop-in issues despite not running out of VRAM, looking at you, Indiana Jones and MH:Wilds.

    And then there are titles that have a mixed bag like... Cyberpunk 2077.

    I can't be bothered to go through the entire list of games.
    Reply
  • hotaru251
    edzieba said:
    Games have performed well and performed poorly long before DLSS existed, and will continue to do so.
    yes, but it was never as harmful as it is now.
    Look at games like Starfield, MH Wilds, etc.

    They are crap quality for what they demand, because you can brute-force it via DLSS.

    Yes, devs cheaped out on optimizing in the past, but it's never been as common as it is nowadays...because DLSS makes it easier to do.

    Some devs do optimize games properly (Doom Eternal & BG3 being prime examples), but it's way more common now for games to demand high requirements they don't feel like they should need, because the devs cheaped out on optimizing.
    Reply
  • JarredWaltonGPU
    Notton said:
    My guess is Intel ARC breaks even after all the R&D, and purchasing relatively large dies from TSMC.
    This is acceptable from a standpoint where ARC (Xe cores) is a key component of their mobile chips.
    Lunar Lake iGPU also manages to beat Ryzen Z1E by a hair.
    I suspect that, if Intel is being real, it has lost billions on its GPU efforts over the past five years. I don't know how many people worked on Alchemist and Battlemage, but it's got to be hundreds, each probably earning in the $200K or more range. R&D for a dedicated GPU is massive.

    I think the hardware for a B580 is a little bit more than break even at $250. But the problem is you have several years of R&D that needs to be covered. There's no way Intel makes more than $50 off of a B580, and even that's being VERY generous. How many would Intel need to sell at $50 net profit to cover even $1 billion? Well, that's easy math: 20 million.

    Does anyone — anyone!? — actually think Intel has sold more than five million Arc GPUs? I certainly don't. The Steam Hardware Survey shows "Intel(R) Arc(TM) Graphics" at 0.22% of the total, and that's going to be Alchemist laptops most likely. Maybe if every one of the 9.7% of the "Other" category of GPUs is an Arc graphics card, it might be 20 million. But considering the "Other" category has been at around 8~10 percent for as long as I can remember, I'm pretty confident it's not millions of Arc GPUs hiding in there.
    Reply
  • CelicaGT
    JarredWaltonGPU said:
    I suspect that, if Intel is being real, it has lost billions on its GPU efforts over the past five years. I don't know how many people worked on Alchemist and Battlemage, but it's got to be hundreds, each probably earning in the $200K or more range. R&D for a dedicated GPU is massive.

    I think the hardware for a B580 is a little bit more than break even at $250. But the problem is you have several years of R&D that needs to be covered. There's no way Intel makes more than $50 off of a B580, and even that's being VERY generous. How many would Intel need to sell at $50 net profit to cover even $1 billion? Well, that's easy math: 20 million.

    Does anyone — anyone!? — actually think Intel has sold more than five million Arc GPUs? I certainly don't. The Steam Hardware Survey shows "Intel(R) Arc(TM) Graphics" at 0.22% of the total, and that's going to be Alchemist laptops most likely. Maybe if every one of the 9.7% of the "Other" category of GPUs is an Arc graphics card, it might be 20 million. But considering the "Other" category has been at around 8~10 percent for as long as I can remember, I'm pretty confident it's not millions of Arc GPUs hiding in there.
    Much of that R&D will be recovered via integrated graphics, net profit targets are typically around 20% company wide (cost centre, rather) so they could indeed be taking a net loss on Arc whilst making it up in other areas. I work for a very large international and we will absolutely take a hit to break into a market, as long as we can hit that magic 20%.
    Reply
  • JarredWaltonGPU
    CelicaGT said:
    Much of that R&D will be recovered via integrated graphics, net profit targets are typically around 20% company wide (cost centre, rather) so they could indeed be taking a net loss on Arc whilst making it up in other areas. I work for a very large international and we will absolutely take a hit to break into a market, as long as we can hit that magic 20%.
    That's the big question, though: How much did Intel expand its graphics division to create Ponte Vecchio, Alchemist, and Battlemage? How much of the additional cost was purely for dGPU as opposed to the iGPU variants? And the Ponte Vecchio sequel got axed. That was a huge cost I'm sure. Intel is definitely trying to catch up, and maybe it still can, but I don't believe for an instant that the dedicated GPUs have been a success.

    Even Battlemage B580, which is much better overall than Alchemist, is not doing that well. There aren't enough of them to go around, prices are higher than expected, and the result is that Intel isn't making as many as it needs to make. I'd really love to know how many BMG-G21 wafers Intel ordered from TSMC! I suspect even 10,000 is probably higher than the real number, but who knows?

    (At 272 sqmm per chip, a 300 mm wafer yields a maximum of around 212 chips. That would mean up to 2.1 million BMG-G21 chips if Intel did 10K wafers... but I suspect the real number might be more like a couple thousand wafers. This is, however, just a seat-of-the-pants guess. I'm skeptical that even Nvidia has done 10K wafers for any of the Blackwell RTX chips! Long-term it will do that many, but short-term it's probably 5K or less for GB202/GB203, it seems.) (A worked sketch of this die-per-wafer math appears after the comments.)
    Reply
  • Alvar "Miles" Udell
    JarredWaltonGPU said:
    LOL. And all Intel has to do is flood the market with Battlemage GPUs sold at a loss and it can also gain massive market share!

    Realistically, the reason AMD's RDNA 4 graphics cards cost as much as they cost is because they need to generate a profit. It could reduce prices a bit, but "half as much" as Nvidia's cards would only be viable if we're talking about the RTX 5080 — a card that's faster than AMD's top solution by a decent amount. Best-case, I think AMD could probably sell 9070 XT for $499, and 9070 for maybe $399. But margins would be razor thin in that case and it wouldn't stop scalpers and AIBs from pushing the prices higher. Witness the current going rate of $800+ for RX 9070.

    Yes, actually, they're going to have to sell them at a loss in order to gain market share and get their technologies supported more broadly. In any given month they have around a 15% share on the Steam hardware survey compared to Nvidia's 75% (right now it's 17% vs 75%), and with that much of a difference there's zero reason for any game studio not to go just with Nvidia technologies. Also, now is logically the best time for them to do it, as the enterprise market is making gobs of profit and can easily cover any loss on gaming cards, not to mention the RTX 5000 series is fairly lackluster vs the 4000 series and seems to constantly receive negative press, so Nvidia is more vulnerable now than it has been since all those years ago when AMD pulled the HD 4870 out of nowhere. Once they have more parity in market share they can raise prices back up, assuming performance is also at parity. Once the enterprise market starts drying up, and it will because AI will move to more efficient dedicated accelerators over GPUs at some point, it'll be harder to justify a hole in the balance sheet.

    It's basically the old "loss leader" tactic, and AMD is in the strongest position it has been in for a long time to play it to the fullest; the biggest issue would be having to mandate that retailers sell the cards at those prices and not take exorbitant markups.
    Reply
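
For anyone who wants to check the die-per-wafer arithmetic in JarredWaltonGPU's comment above, here is a minimal sketch using the standard gross-dies-per-wafer approximation. It assumes a 300 mm wafer, a roughly square die of about 272 mm², and no yield losses; the die size and the 10,000-wafer figure are the estimates from the comment, not confirmed numbers from Intel or TSMC.

```python
import math

def max_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Gross dies per wafer: wafer area divided by die area, minus an
    edge-loss term for partial dies around the circumference.
    Assumes a roughly square die and ignores defects entirely."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Estimates from the comment above (not confirmed figures):
# ~272 mm^2 per BMG-G21 die, a hypothetical 10,000-wafer order.
dies_per_wafer = max_dies_per_wafer(272)   # ~219 gross dies per 300 mm wafer
total_chips = dies_per_wafer * 10_000      # ~2.2 million chips, before yield
print(dies_per_wafer, total_chips)
```

That lands in the same ballpark as the roughly 212 chips per wafer and 2.1 million total cited in the comment; real output would be lower once defect yield and binning are accounted for.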