Upscaling is actually good (as an option)
-
Upscaling is a fabulous technology, and it sensibly splits the work of delivering image quality between hardware upgrades and software support. Overall, the existence of the technology is definitely a positive.
However, people are worried about a development we are already seeing, where games are just not efficient with their resources and require way too much computing power. People are afraid studios will decrease the amount of work they put into optimising because they feel like upscaling will solve all performance problems for them. But optimisation needs to happen on both ends. That's what people are afraid of.
Look at the PC release of Rise of the Ronin: the game has a huge CPU bottleneck and poor performance around big cities, and it looks like it renders things that shouldn't be rendered. The last patch they released? "Graphics mode"...
-
I've seen a lot of people lately saying that upscaling (FSR, DLSS, etc.) is a bad thing, including some calling it 'fake frames', which is probably due to them confusing it with frame generation.
What upscaling does is take an input (a frame rendered at 1080p, for example) and attempt to improve it by generating more information (bringing that 1080p frame up to 1440p). This does make things a little fuzzy, but it also frees up resources, allowing things like improved lighting to be rendered, which makes games like Cyberpunk playable at a decent framerate without a $5,000 GPU.
Frame generation is different. It also takes an input (the same 1080p frame, for example), but it doesn't improve that frame. It makes a new one based on it, sometimes several. These actually are 'fake frames', and they are what the people who called upscaling 'fake frames' were really talking about.
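If it helps, here's a toy sketch of the difference. This is nothing like the actual ML-based pipelines in DLSS or FSR (those use motion vectors and neural networks); it only shows the shape of the two operations — upscaling transforms one real frame, while frame generation fabricates an extra frame between two real ones:

```python
def upscale_nearest(frame, factor):
    """Toy upscaler: one real frame in, one bigger frame out.
    Real upscalers (DLSS, FSR) use motion vectors and ML models,
    but the shape of the operation is the same."""
    out = []
    for row in frame:
        big_row = [px for px in row for _ in range(factor)]
        out.extend([big_row] * factor)
    return out

def generate_frame(prev, nxt):
    """Toy frame generator: fabricate a brand-new frame between two
    real frames by averaging them. This frame was never rendered by
    the game engine -- the 'fake frame'."""
    return [[(a + b) / 2 for a, b in zip(r1, r2)]
            for r1, r2 in zip(prev, nxt)]

frame_a = [[0, 0], [0, 0]]          # a 2x2 stand-in for a rendered frame
frame_b = [[255, 255], [255, 255]]  # the next rendered frame

upscaled = upscale_nearest(frame_a, 2)      # same frame, 4x the pixels
between = generate_frame(frame_a, frame_b)  # a new in-between frame

print(len(upscaled), len(upscaled[0]))  # 4 4
print(between[0][0])                    # 127.5
```

The point of the contrast: the upscaled frame is still the frame the game rendered, just with more pixels, while the generated frame never existed in the engine at all.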
I won't lie: upscaling is definitely a crutch, and the goal should be to render all that cool stuff at native resolution. However, the hardware that can do so is too expensive to be worth buying unless you have money to throw away, which most people don't. It's up to you whether a little fuzziness in the graphics is worth it, but the fact is it gives you the leeway to choose between a higher framerate and prettier lighting. Without it, most people are stuck setting their graphics to 'no', because they can't afford the kind of processing power that making things look good at native resolution takes.
Part of why I'm making this post is that I wanted to see what other people think of this take, and more importantly get feedback so I can improve it later. I'm currently running a laptop with a GTX 1650, and I've had it for years. I'm used to balancing framerate and quality and making compromises, and upscaling tends to be one of the compromises worth making.
Don't pay attention to idiots who don't know what they're talking about. Lemmy and Reddit are full of tech misinformation. It's not even worth replying to.
-
Yeah, that is more or less where I come down. "AI" upscaling is spectacular. Frame gen is much more hit or miss.
The main problem is that, as with most things, people are stupid. They don't understand that outlets like Digital Foundry or even Gamers Nexus are going to be harsh on upscaling/frame gen because it actively makes it hard for them to give you guidance on what performance you can expect. So "this is horrible for benchmarking" becomes "this is horrible".
I'm confused. Digital Foundry clearly spells out the performance you will get both with FG and without. They're not harsh on it at all?
-
I agree in principle, but it's a crutch that shouldn't substitute for good code. It's like having a powerful car that runs sluggishly, and someone suggesting that removing a couple of seats could improve things.
-
Meh, I'd rather play at lower settings than upscale. DLSS just looks like muck to me, and the rest of them are indistinguishable from just dropping the resolution. I don't know of any game my RTX 3090 can't crush, GPU-wise, at 1440p at decent settings. I even turned off path tracing in 2077 because I didn't want to ever use DLSS.
But there is one situation where I support the brainrot, and that's on portables. The Steam Deck did the right thing by having a nice OLED 1280x800 16:10 screen instead of chasing resolution; it looks great. But if they do raise it, as most g*mers seem to want for mostly nonsensical reasons, FSR could work there, and something like DLDSR could help in games with small details if they go for an 800p screen again.
One thing I won't miss for sure is TAA; fuck TAA so much, all my homies hate that shit. DLAA is at least better, but never forget they took SMAA and MSAA away from you for this absurd world where consoles advertised themselves as 4K and 8K while running games at barely 720p via checkerboarding and the like.
-
I think upscaling is a good idea. Most of the time I'm running around dodging bullets, arrows or fireballs, so I don't really have time to examine the foliage around me at the pixel level. I'm also not going to buy an overpowered space heater just so the grass in my hand looks more realistic; I don't want a triple-fan monster sounding like a turbojet next to me.
-
Upscaling an old game on fixed hardware that can’t output at high resolutions is good.
Upscaling a new game as part of the graphics pipeline, instead of optimizing it, is terrible and shouldn't be accepted by gamers who have to spend $1,000+ on a GPU.
-
I don't think I've ever seen anyone say that upscaling OPTIONS are bad, but I'm worried about games like Monster Hunter, where upscaling and frame gen are needed just to make the game playable in most cases.
-
Totally agree, though I'd say it's more along the lines of the car needing premium gas, new performance air filters and tires, when you'd think it should be capable out of the box.
-
I'm fine with the concept of upscaling tech. DLSS 4 with the transformer model looks excellent, and FSR 4 is looking pretty damn decent as well. The earlier attempts weren't as good. Ideally it would act more like DLAA, but 8.3 million pixels (4K) is a lot to render. And if 8K is going to be a thing one day, it makes even more sense there.
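For scale, the pixel counts behind that claim — just arithmetic on the standard resolutions, nothing assumed:

```python
# Pixels a GPU must shade per frame at common resolutions
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} million pixels")
# 4K is ~8.3 million pixels per frame; 8K quadruples that to
# ~33.2 million, which is why upscaling makes even more sense there.
```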
I think too many people focus on the now and can't imagine what things will be like in the future as the technology progresses.
Now frame generation, that one I feel less optimistic about, especially when I see people using it at 60fps or less. It should really only ever be used at 80fps or higher, where the lag is less of a problem. But one day inferred frames, which only look at prior frames and don't wait for the next frame, might make it a better experience.
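To put rough numbers on that lag (my own back-of-the-envelope estimate, not a vendor figure): interpolated frame gen has to hold the newest real frame until the in-between frame has been shown, so it adds roughly one base frame-time of delay, and that delay shrinks as the base framerate rises:

```python
def added_latency_ms(base_fps: float) -> float:
    """Rough estimate: interpolation must wait for the *next* real
    frame before displaying the generated one, so it adds about one
    base frame-time of latency. Inferred/extrapolated frames, which
    only look at past frames, would avoid this wait entirely."""
    return 1000.0 / base_fps

print(round(added_latency_ms(30), 1))  # 33.3 ms extra -- very noticeable
print(round(added_latency_ms(80), 1))  # 12.5 ms extra -- much easier to tolerate
```

This is why a high base framerate matters: the same "one frame of waiting" costs almost three times as much at 30fps as at 80fps.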
Lastly, it's Nvidia's and AMD's marketing departments' fault that these all get conflated. DLFG and FFG are what the frame gen tools should have been called, rather than shoehorning them under the super sampling and super resolution branding.
-