Topic: Another generation, another AMD GPU disappointment
KamenRiderBlade
10/28/20 6:27:58 PM
#58:


https://www.reddit.com/r/nvidia/comments/agp6a5/how_does_a_game_get_dlss_support/

Let me first answer the technical part of how a game gets DLSS support.
First, Nvidia and the developer train Nvidia's AI supercluster with images and renders of the game, taken at both 1440p and 4K (ideally you could do this for any two resolutions, but right now it is just this pair). The supercluster's AI comes up with a pipeline for upscaling images from 1440p to 4K with minimal loss in quality; the longer you train the AI, the smaller the quality loss will be. Next, Nvidia puts that pipeline into its drivers, and cards with dedicated Tensor cores use it to upscale the game from 1440p to 4K. It does not require much developer involvement, as Nvidia does most of the work.
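
A rough sketch of that training idea in PyTorch (a minimal toy, not Nvidia's actual model; the network, layer sizes, and random stand-in tensors are all assumptions for illustration):

import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    # Toy 1.5x super-resolution net (1440p -> 4K is a 1.5x scale per axis).
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, x):
        # Upsample first, then let the conv stack add back detail (residual).
        x = F.interpolate(x, scale_factor=1.5, mode="bilinear", align_corners=False)
        return x + self.features(x)

model = TinyUpscaler()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Stand-ins for the (1440p frame, 4K frame) pairs a developer would capture;
# the real training set is vastly larger and runs on Nvidia's supercluster.
low_res  = torch.rand(4, 3, 96, 96)    # pretend 1440p crops
high_res = torch.rand(4, 3, 144, 144)  # matching 4K crops (1.5x per axis)

for step in range(100):               # "longer you train, smaller the loss"
    optimizer.zero_grad()
    loss = F.l1_loss(model(low_res), high_res)
    loss.backward()
    optimizer.step()

The "pipeline" the post describes is just this learned model: fit to low-res/high-res pairs offline, then shipped in the driver for inference.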

More detail is given here:
https://news.developer.nvidia.com/dlss-what-does-it-mean-for-game-developers/
DLSS is the first time a deep learning model has been placed directly into a 3D rendering pipeline. This is made possible through heavy use of the Turing TensorCores. But the developer need not worry about the difficulties of getting the DLSS model to run in only a few milliseconds. NVIDIA has had many teams that span NVIDIA Research, our Hardware & Architecture groups as well as many in Developer Technologies working on both image quality and performance and we will continue to improve both over time.
At this time, in order to use DLSS to its full potential, developers need to provide data to NVIDIA to continue to train the DLSS model. The process is fairly straightforward with NVIDIA handling the heavy lifting via its Saturn V supercomputing cluster.
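
To make the "few milliseconds" constraint concrete, here is a hypothetical timing sketch (plain PyTorch on CPU, not the DLSS runtime; upscale() is a stand-in for the trained model the driver would ship):

import time
import torch
import torch.nn.functional as F

def upscale(frame: torch.Tensor) -> torch.Tensor:
    # Stand-in for the trained model; real DLSS runs on Turing Tensor cores.
    return F.interpolate(frame, scale_factor=1.5, mode="bilinear", align_corners=False)

frame = torch.rand(1, 3, 1440, 2560)          # one rendered 1440p frame
start = time.perf_counter()
out = upscale(frame)                          # -> 1 x 3 x 2160 x 3840 ("4K")
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"upscale took {elapsed_ms:.1f} ms; the whole frame budget at 60 fps is ~16.7 ms")

Whatever time the model takes per frame comes straight out of the frame budget, which is why the quote stresses that Nvidia's teams keep working on performance as well as image quality.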

In terms of who needs to contact whom, I assume the developer needs to approach Nvidia. Nvidia might suggest it to some big AAA titles from its long-term partners (like EA with BFV and Anthem), but most developers would need to approach Nvidia themselves. It is not clear whether this costs the developer anything; I have read several different opinions:
- The developer would need to rent compute time on Nvidia's supercluster.
- The developer pays a fixed "Nvidia GameWorks" fee, which covers DLSS and whatever other Nvidia options the developer wants.
- The developer needs to put the Nvidia badge on their game and actively promote Nvidia with the game.
- The developer pays nothing, as the cost of DLSS is covered by DLSS-compatible card owners when they purchase the card.
Developers need to "Pay nVIDIA" just to get DLSS; it ain't even free to the developers.

<sarcasm>Adding a fee to DLSS, that's going to help feature adoption for sure! </sarcasm>

This is the first time in GPU history that I've heard of game developers needing to pay the GPU manufacturer to use a feature.

Talk about DLC / Nickel & Diming; I never would've thought they'd have the stones to put a feature behind a paywall.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'