It makes games run worse and look horrible, because devs now target DLSS performance mode for 60 fps on mid-range hardware. The game is rendered at 720p and blurred to death until you can’t notice the half-rendered frames and the pixelated effects that depend on it (and on TAA).
It’s also a marketing ploy to lie to consumers, claiming their cards get 3x the performance of the previous gen by generating fake frames in between real ones. It’s displaying garbage, but because the benchmarks say 100+ fps, it must be better, right? Even though there are more presentation errors, latency is through the roof, and the end result is a mess.
DLSS 5 continues the trend of lies by simply slapping a yassification filter on top: it lies to devs, and it lies to consumers.
That’s like saying new graphics cards make games run worse because developers target their games at new hardware.
You’re conflating two issues. DLSS uses upscaling to improve framerate at the cost of image quality.
The decision of developers is a completely different issue.
That’s hella weird. 720p only upscales cleanly at integer multiples, so 1440p or 2880p, but last I checked only about 15% of Steam users have QHD monitors. They should’ve gone with 960×540 at the very least.
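For what it’s worth, the integer-multiple claim is easy to sanity-check. Quick sketch below (`integer_scale` is a made-up helper for illustration, not any real scaler API):

```python
# Returns the integer scale factor if `target` is an exact multiple of
# `render` on both axes, else None. (Illustrative helper, not a real API.)
def integer_scale(render, target):
    rw, rh = render
    tw, th = target
    if tw % rw == 0 and th % rh == 0 and tw // rw == th // rh:
        return tw // rw
    return None

# 720p divides 1440p exactly, but not 1080p:
print(integer_scale((1280, 720), (2560, 1440)))  # 2
print(integer_scale((1280, 720), (1920, 1080)))  # None (1.5x, needs filtering)

# 960x540 divides both 1080p and 4K exactly:
print(integer_scale((960, 540), (1920, 1080)))   # 2
print(integer_scale((960, 540), (3840, 2160)))   # 4
```

Non-integer ratios like 1.5× are exactly why a 720p internal res looks soft on the 1080p monitors most people actually own.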
Of course, they shouldn’t half-ass their quality regardless. Frame gen is fine I guess, but not rendering at full res and using AI to distort the original is a huge issue.
Frame gen, in my experience, is worse than upscaling. At least upscaling has some uses (though only about 10% of what the marketing materials claim). Frame gen just adds latency in exchange for disgusting frames that barely look like the base game.
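The latency cost of interpolation-based frame gen has a simple back-of-envelope explanation (my own simplified model; real pipelines add generation and pacing overhead on top): real frame N can’t be displayed until frame N+1 exists, so it slips by roughly one real frame interval.

```python
# Simplified model: interpolating between frames N and N+1 means holding
# frame N back until N+1 is rendered, adding ~one real-frame interval of delay.
def added_latency_ms(base_fps):
    return 1000.0 / base_fps

for fps in (30, 60, 120):
    print(f"{fps} fps base -> ~{added_latency_ms(fps):.1f} ms extra delay")
```

So the lower your real framerate (exactly the situation where people reach for frame gen), the bigger the hit.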
I disagree with your premise. It does make games run better, which has let devs get lazy and use it as a crutch. When devs don’t lean on it as a crutch and just let it be a bonus, it’s great.
Though I just replayed Just Cause 3 for kicks, and I think it visually holds up to just about any of today’s games and runs like a top without any of the fancy stuff.
I sure hope the games run better when it’s enabled; it’s effectively running at a lower resolution and upscaling. It’s just a fancy resolution selector marketed as a genuine improvement.
They don’t run better, though. DLSS just makes fake frames based on guesses, which makes it look like it’s running better.
It’s the same basic premise as LLM slop: it looks like the thing without actually being the thing.
Frame gen makes fake frames. Standard DLSS does no such thing, which is what the question was about. It uses temporal upscaling. Not the same thing.
Upscaling = artificially increasing the sampling rate through some sort of inter/extrapolation.
Temporal = it’s happening on the temporal axis.
Samples on the temporal axis are frames.
Therefore, temporal upscaling = artificially sampling more frames = frame gen?
No, that’s like saying TAA is frame gen.
Using temporal data to upscale is different than inserting frames that weren’t there to begin with.
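To make the distinction concrete, here’s a toy sketch (my own 1D simplification, nothing like the actual algorithms): temporal upscaling refines the frame at the current timestamp using history, while frame generation fabricates a frame at a timestamp that was never rendered.

```python
# Toy 1D "frames". Not the real algorithms, just the shape of the difference.

def temporal_upscale(history, current, blend=0.1):
    # Output still corresponds to `current`'s timestamp; the history buffer
    # only contributes extra samples to clean it up.
    return [h * (1 - blend) + c * blend for h, c in zip(history, current)]

def interpolate_frame(prev_frame, next_frame):
    # Output corresponds to a NEW timestamp halfway between two real frames.
    return [(p + n) / 2 for p, n in zip(prev_frame, next_frame)]

frame_t0 = [0.0, 0.0, 1.0]
frame_t1 = [0.0, 1.0, 1.0]

print(temporal_upscale(frame_t0, frame_t1))   # still frame t1, just smoothed
print(interpolate_frame(frame_t0, frame_t1))  # a brand-new frame at t0.5
```

The first never adds a timestamp; the second exists only to add one, which is the whole dispute over whether “temporal upscaling” counts as frame gen.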
TAA as in temporal anti-aliasing? Is that not frame generation? It’s interpolating between frames to create a frame that wasn’t previously there. Just like how spatial anti-aliasing generates pixels that weren’t previously there.
I think maybe we have a different idea of what “generation” means. I’m guessing your idea of “generation” is when it surpasses some threshold of information added through the process.
The early versions only did upscaling, but in its current form DLSS does fake frame gen on top of upscaling to inflate fps.
Only if it’s turned on. I have yet to see a game that forces it. You get to pick if it’s on or not.
You sound like you’ve never used a modern NVIDIA card. Not a slam, just saying that if you had one you’d know it’s optional.
Keep moving those goalposts.