It's like seeing those high-tech pop machines for the first time
-
This post did not contain any content.
Reminder: Temporal, proprietary upscalers are only made mandatory by devs who actively refuse to make a properly functioning product.
-
Reminder: Temporal, proprietary upscalers are only made mandatory by devs who actively refuse to make a properly functioning product.
And/or consumers insisting on playing in 4K because "big number", even though fill rate is a huge issue with modern games and you can barely tell the difference on most setups. Which would not be so bad if they also didn't want ever-increasing graphical fidelity and 120+ fps on top of that.
-
If you pick a resolution that requires fractional scaling (e.g. 1080p on your 1440p monitor) it'll look real dogshit, because it's trying to represent one game pixel with about one and a half real ones along either direction. A resolution that uses integer scaling (i.e. 720p for your monitor) will just use two real pixels in either direction to show one game pixel (so four pixels all showing the same thing), which is more pixelated but much less blurry and gross. FSR is the better solution most of the time, but if you did want to go below native again, that'd make it a little less gross.
So what should I downscale 4K to? 4K annoys the shit out of me on my laptop because it's pointless on a display that size.
-
I'll take fake resolution in exchange for higher framerates as long as it looks good enough! I play at 1440p though, because I don't think mid-to-high-end hardware is really there for 4K 120+ yet.
My 7900XT works reasonably well for 4K in most games, though admittedly I have to turn graphics down to Medium in a lot of cases to get 100-ish fps with upscaling and frame-gen on quality settings. Except Cyberpunk, which ran really well on high settings.
I'd guess that in about 3 years it should be much better.
-
I'll take DLSS, frame gen, and dynamic resolution over dropping to a lower static resolution any day.
Without a doubt, but the problem is that developers no longer care whether their games run at all without DLSS. DLSS should not be the baseline.
-
This post did not contain any content.
I use DLSS in conjunction with DSR and it's the best way to avoid jagged edges. Much better than any anti-aliasing.
-
Reminder: Temporal, proprietary upscalers are only made mandatory by devs who actively refuse to make a properly functioning product.
Reminder: Most devs actually care about the things they make. This is a management/timeline problem, not a developer one.
-
So what should I downscale 4K to? 4K annoys the shit out of me on my laptop because it's pointless on a display that size.
You should go 1080p
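If you want to sanity-check other combinations, here's a rough sketch in plain Python (the resolutions are just the examples from this thread, nothing more authoritative than that): it flags whether a render resolution divides evenly into the panel, which is the difference between the "chunky but clean" and "blurry and gross" cases described upthread.

```python
# Rough sketch: does a render resolution integer-scale onto a native panel?
# Fractional ratios (e.g. 1080p on a 1440p panel) smear one game pixel across
# ~1.33 physical pixels; integer ratios (720p on 1440p, or 1080p on a 3840x2160
# "4K" laptop panel) map one game pixel onto an exact 2x2 block and stay sharp.

def scale_report(native, target):
    sx = native[0] / target[0]
    sy = native[1] / target[1]
    kind = "integer (clean)" if sx.is_integer() and sy.is_integer() else "fractional (blurry)"
    return f"{target[0]}x{target[1]} on {native[0]}x{native[1]}: {sx:.2f}x / {sy:.2f}x -> {kind}"

for native, target in [
    ((2560, 1440), (1920, 1080)),  # 1.33x: the bad case from upthread
    ((2560, 1440), (1280, 720)),   # 2x: pixelated but not smeary
    ((3840, 2160), (1920, 1080)),  # 2x: the 4K laptop case, hence "go 1080p"
]:
    print(scale_report(native, target))
```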
-
Reminder: Temporal, proprietary upscalers are only made mandatory by devs who actively refuse to make a properly functioning product.
I'll take DLSS over any other AA solution any day.
We no longer use forward renderers, so traditional AA either looks like ass or comes with a massive performance cost, and it can't fix noise from foliage, alpha effects, smoke, etc. DLSS addresses all three issues at once.
-
I'll take DLSS over any other AA solution any day.
We no longer use forward renderers, so traditional AA either looks like ass or comes with a massive performance cost, and it can't fix noise from foliage, alpha effects, smoke, etc. DLSS addresses all three issues at once.
Well, Half-Life: Alyx uses forward rendering and has a brilliant MSAA implementation. It is optimised because it needs to be: you cannot have a VR title chugging along at 30 Hz at full HD; you need 4K or more running at 90 Hz or more. So they invested a good amount of time into making sure it functions properly before releasing it.
Also, foliage really doesn't need to be "fixed" if it is done properly. For example, look at nearly 20-year-old games like Halo 3 or the Crysis games.
I take issue with modern games because why the hell are they forgetting the lessons of the past? Crysis and Halo 3, for example, are nearly 20 years old and have better-looking foliage than most modern games, because those teams knew how to avoid pop-in and noise. Yes, modern games have more foliage, since there is more VRAM to work with, but the older games' foliage looks better to me because it lacks the wonky artifacts. On top of that, the proprietary TAA and TSR implementations, in my experience, add a ton of input latency, which makes games feel worse.
MSAA, because it uses geometry information to build its anti-aliasing, enhances image quality significantly and gives a cleaner, more coherent picture than any other anti-aliasing implementation, including proprietary TSR. MSAA isn't my religion, though; I realise there are aspects where TAA and TSR can be useful. The problem is that in modern games they get abused, because devs can say "we'll just do the absolute minimum, make sure the game runs on the hardware at HD 30 Hz, and let the magic TSR and frame generation handle the rest".
Well, the problem with MSAA is that it needs good geometry in the first place. If quad overdraw is complete shit because no one bothered to make tessellation or proper LOD models and just let some automatic tool handle everything without any supervision, then yes, it will be horrible. If devs say "it makes my geometry timings horrible", then we already know their geometry is utter rubbish.
A brilliant example of why this bothers me is Payday 3: it looks like a late PS3 game, runs like complete trash, and has a massive CPU bottleneck no matter what you do, even if you tinker with the engine settings themselves.
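To make the "geometry information" point concrete, here is a toy sketch (made-up sample positions, nothing like production renderer code): MSAA tests coverage against the actual triangle edges at several sample points per pixel but shades only once per pixel, which is why it smooths geometric edges cheaply yet cannot clean up noise generated inside a shader.

```python
# Toy MSAA-style resolve for a single pixel. Coverage comes from geometry
# (edge tests) at four sample points, but the "pixel shader" colour is
# computed only once per pixel - the key difference from brute supersampling.

def edge(a, b, p):
    """Signed area test: >= 0 means p is on the inner side of edge a->b (CCW winding)."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

SAMPLES = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def resolve_pixel(px, py, tri, shaded, background):
    """Blend the once-per-pixel shaded colour with the background by coverage."""
    covered = sum(
        all(edge(tri[i], tri[(i + 1) % 3], (px + ox, py + oy)) >= 0 for i in range(3))
        for ox, oy in SAMPLES
    )
    w = covered / len(SAMPLES)
    return tuple(w * s + (1 - w) * b for s, b in zip(shaded, background))

# A pixel straddling a triangle edge lands part-way between the two colours,
# i.e. a smoothed edge without shading four times per pixel.
tri = [(0.0, 0.0), (30.0, 0.0), (0.0, 21.0)]  # counter-clockwise triangle
print(resolve_pixel(10, 13, tri, (1.0, 0.0, 0.0), (0.0, 0.0, 0.0)))  # (0.75, 0.0, 0.0)
```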
-
Reminder: Most devs actually care about the things they make. This is a management/timeline problem, not a developer one.
Well, I should have clarified: by devs I mean the entire companies, not the individuals. It's a collective problem, not an individual one.
-
And/or consumers insisting on playing in 4K because "big number", even though fill rate is a huge issue with modern games and you can barely tell the difference on most setups. Which would not be so bad if they also didn't want ever-increasing graphical fidelity and 120+ fps on top of that.
In my opinion, fidelity is getting worse than what we had 10 or 20 years ago, because now we have noise, pop-in, and temporal smearing caused by proprietary TAA and TSR. Examples being Payday 3 and that new Justice League/Batman game where you play as the four characters, whose name I couldn't be bothered to remember, because everything about it is way worse than Arkham Knight, which is almost 10 years old by now.
-
Well, Half-Life: Alyx uses forward rendering and has a brilliant MSAA implementation. It is optimised because it needs to be: you cannot have a VR title chugging along at 30 Hz at full HD; you need 4K or more running at 90 Hz or more. So they invested a good amount of time into making sure it functions properly before releasing it.
Also, foliage really doesn't need to be "fixed" if it is done properly. For example, look at nearly 20-year-old games like Halo 3 or the Crysis games.
I take issue with modern games because why the hell are they forgetting the lessons of the past? Crysis and Halo 3, for example, are nearly 20 years old and have better-looking foliage than most modern games, because those teams knew how to avoid pop-in and noise. Yes, modern games have more foliage, since there is more VRAM to work with, but the older games' foliage looks better to me because it lacks the wonky artifacts. On top of that, the proprietary TAA and TSR implementations, in my experience, add a ton of input latency, which makes games feel worse.
MSAA, because it uses geometry information to build its anti-aliasing, enhances image quality significantly and gives a cleaner, more coherent picture than any other anti-aliasing implementation, including proprietary TSR. MSAA isn't my religion, though; I realise there are aspects where TAA and TSR can be useful. The problem is that in modern games they get abused, because devs can say "we'll just do the absolute minimum, make sure the game runs on the hardware at HD 30 Hz, and let the magic TSR and frame generation handle the rest".
Well, the problem with MSAA is that it needs good geometry in the first place. If quad overdraw is complete shit because no one bothered to make tessellation or proper LOD models and just let some automatic tool handle everything without any supervision, then yes, it will be horrible. If devs say "it makes my geometry timings horrible", then we already know their geometry is utter rubbish.
A brilliant example of why this bothers me is Payday 3: it looks like a late PS3 game, runs like complete trash, and has a massive CPU bottleneck no matter what you do, even if you tinker with the engine settings themselves.
There's a reason you had to fish for an exception to find a modern game with a forward rendering engine.
-
There's a reason you had to fish for an exception to find a modern game with a forward rendering engine.
Okay then, but it still works. It is hard to claim that Half-Life: Alyx runs badly or looks bad. I can only judge from my perspective as a customer: why do we use these weird, wonky, hacky workarounds for deferred rendering if the alternative can look just as good and run just as well without needing any of them?
-
Okay then, but it still works. It is hard to claim that Half-Life: Alyx runs badly or looks bad. I can only judge from my perspective as a customer: why do we use these weird, wonky, hacky workarounds for deferred rendering if the alternative can look just as good and run just as well without needing any of them?
I didn't claim it doesn't work. I claimed there's a reason that, out of hundreds of releases, you have a single example of a forward renderer.
Which means TAA will keep being a problem, so my remark that DLSS is miles ahead applies to pretty much all games, even if once in a blue moon you find an exception.
-
So what should I downscale 4K to? 4K annoys the shit out of me on my laptop because it's pointless on a display that size.
4K would go to 1080p for best results (for 3840x2160 screens rather than true 4K, but I'm assuming that's what you've got), and it should be much more playable on laptop hardware that way.
Edit: oops, didn't see Beryl already answered this lol
-
Reminder: Temporal, proprietary upscalers are only made mandatory by devs who actively refuse to make a properly functioning product.
Honestly, I couldn't care less, because DLSS/FSR looks better than native with AA at this point. It's so good that I even turn it on in games where I don't need to.
Quality comparable to supersampling, and I get an FPS boost too? Sign me the fuck up. It's like magic.
-
This post did not contain any content.
I couldn't even imagine what seeing PC games for the first time in 2025 feels like, after not seeing them since 2011.
Do you think they were blown away? Or maybe disappointed that we still don't have photorealistic graphics yet? I wish I could speak with this person so I could pick their brain.
-
I couldn't even imagine what seeing PC games for the first time in 2025 feels like, after not seeing them since 2011.
Do you think they were blown away? Or maybe disappointed that we still don't have photorealistic graphics yet? I wish I could speak with this person so I could pick their brain.
Dude, we're still playing Classic WoW and RuneScape; that guy hasn't missed anything.
-
This post did not contain any content.
A buddy of mine was locked up from '03 to '17. He was asking me questions like "Do you have a PlayStation 3? What kind of phone do you have?" ...
He said, "Man, I know I missed a lot, but people are so rude now. I was talking to my cousin and instead of talking to me he was looking at his phone. That is disrespectful." I said, yeah man, the world has changed a lot. Felt terrible for him trying to integrate back into this bullshit.
He was away for the craziest shift in society I could imagine.