It's like seeing those high-tech pop machines for the first time.
-
Let's not forget Nvidia created DLSS and ray tracing and directly helped devs integrate them into their games to create demand for their newer cards.
Yeah, they laid out the bait and got them hook, line and sinker.
-
Man. I went back and played some native raster graphics games with no AA.
It was like I took the drunk glasses off. Everything made sense again. The headache went away. I could see. Object permanence in the engine was insane... Because it all just was.
In the late 00s and early 10s we had bad console ports. But before then, things were actually amazing. And after, when TB putting up a stink about options finally got traction, games were reaching a screaming peak and devs were finally figuring it out. I really do believe that right now, we're just in that awkward early phase of a technology (like the late 90s, when the earliest 3D was really awkward), where people are trying new things and, regardless of rhetoric or stubbornness, will eventually have to face the cold, nuanced truth, no matter what:
TAA is dung and should be flung into the sun.
I hear you, but what do you mean by a transitional phase? Transitioning to what? I'm curious.
-
This guy games.
Also, if your game can't look decent without any kind of DLSS or AA, you need to stop and fix that before relying on AA. Personally, I can't stand the blurriness of any kind of AA, including DLSS, and almost always turn it off.
Games are not still images, and our brains are super good at interpolating motion between discrete pixels. To me, the raw, unfiltered image always looks sharper, clearer, and truer to life (I have very good vision IRL, so blur is unwelcome, and TAA is just... why would you want that outside of an effect like being drunk or stunned?).
Fuck TAA. 100%, forever.
Amen. But in all honesty, TAA has its place for correcting some artifacts, with clouds for example, where blur really doesn't matter. See the Minecraft comment above, that's interesting.
Edit: typo.
-
Not sure why most games can't/don't do this, but I've seen Minecraft shaders use temporal upscaling exclusively on the clouds, reflections, and shadows, while using FXAA for the rest of the image.
Because you need to dig into the rendering engine to do that, and if you didn't build it yourself you might not be able to do that easily.
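For a rough picture of the idea (just a toy sketch in Python/NumPy, not real shader code; the blend factor and composite weights below are made-up assumptions): the noisy passes like clouds, reflections, and shadows get blended against the previous frame's result every frame, while the base image only gets a single cheap spatial pass like FXAA.

```python
import numpy as np

def temporal_accumulate(history, current, blend=0.1):
    """Exponential moving average against last frame's (reprojected) result."""
    return (1.0 - blend) * history + blend * current

def box_blur(image):
    """Stand-in for a spatial AA pass like FXAA (FXAA is edge-aware; this isn't)."""
    padded = np.pad(image, 1, mode="edge")
    return sum(
        padded[1 + dy : 1 + dy + image.shape[0], 1 + dx : 1 + dx + image.shape[1]]
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
    ) / 9.0

rng = np.random.default_rng(0)
cloud_history = np.zeros((4, 4))            # accumulated cloud/reflection/shadow pass
for frame in range(60):
    noisy_clouds = rng.random((4, 4))       # stochastic per-frame sample of that pass
    cloud_history = temporal_accumulate(cloud_history, noisy_clouds)

base_color = rng.random((4, 4))             # main color pass, one frame only
final = box_blur(base_color) * 0.7 + cloud_history * 0.3   # arbitrary composite weights
print(final.round(2))
```

The point is just that the temporal blend only touches the passes where a bit of smear doesn't hurt, so the rest of the frame stays sharp.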
-
I just wish it wasn't the general direction the industry had decided to push things.
It's become the expected norm. It's the performance metric games are optimized to hit now, and it's far from perfect.
I was just playing Red Dead 2 yesterday with DLSS and I was legitimately struggling to do some things due to the artifacting. There are some small missions and challenges that require you to find and shoot specific tiny birds with a bow, but DLSS struggles with small things flying across a dynamic background. The birds would literally fade in and out of existence.
Same thing with trying to snipe distant heads. The little red fatal-zone indicator would ghost like hell and fade in and out.
Like, it may be better than needing to drop your resolution, but it still kind of sucks sometimes.
95% of those issues would disappear if there were a rendering hint layer for games to use to mark which details need to be rendered at higher quality, so the engine could make sure important details don't disappear.
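Nothing like that exists in the public upscaler SDKs as far as I know, so this is purely a hypothetical sketch (every name in it is made up) of what such a hint layer could look like: the game submits screen-space regions it considers gameplay-critical, and the upscaler queries them to decide where it must not let detail drop out.

```python
from dataclasses import dataclass

# Hypothetical sketch only: no real upscaler exposes an API like this today.
# The idea: per-frame screen-space rectangles the game marks as gameplay-critical
# (tiny birds, hit indicators), which the upscaler could use to avoid aggressive
# history rejection or reduced sampling in those regions.

@dataclass
class ImportanceHint:
    x: int           # top-left pixel of the region
    y: int
    width: int
    height: int
    priority: float  # 0.0 = don't care, 1.0 = never let this detail drop out

class HintedUpscaler:
    def __init__(self):
        self.hints: list[ImportanceHint] = []

    def submit_hints(self, hints: list[ImportanceHint]) -> None:
        """Called by the game once per frame, before the upscale pass."""
        self.hints = hints

    def importance_at(self, px: int, py: int) -> float:
        """What the upscale pass would query per pixel (or per tile)."""
        return max(
            (h.priority for h in self.hints
             if h.x <= px < h.x + h.width and h.y <= py < h.y + h.height),
            default=0.0,
        )

# Example: mark a distant bird and a headshot indicator as high priority.
upscaler = HintedUpscaler()
upscaler.submit_hints([
    ImportanceHint(x=812, y=240, width=12, height=12, priority=1.0),  # bird
    ImportanceHint(x=960, y=520, width=8, height=8, priority=0.9),    # hit marker
])
print(upscaler.importance_at(815, 245))  # 1.0 -> resolve at full quality here
print(upscaler.importance_at(100, 100))  # 0.0 -> normal upscaling behavior
```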
-
You're being downvoted because you are correct. The culture in 2000s America was trashy at best. The CIA psyop was in full effect (Project Mockingbird), and everyone was dancing to the rhythm of the patriotic drum while being asleep at the wheel.
The cartoons of the mid to late 2000s were pretty good though, it must be said
-
Fair, but I'm mostly interested in how they feel about modern AAA games, with their path tracing and HDR support and whatnot.
Tbh, I haven't done time, but that's still me.
I upgraded from an old laptop to a 4070. I tried HDR and I don't see a difference at all. I turned off all the lights, closed the blinds, and turned the (HDR-compatible, I checked) screen to max brightness. I don't see a difference with HDR turned on or off.
Next I tried path tracing. I could see a difference, but honestly, not much at all. Not nearly enough to warrant reduced FPS and certainly not enough to turn down other graphics settings to keep the FPS.
To me, both are just buzzwords to get people to fork over more money.
-
Honestly, the jump from 2011 to 2025 doesn't seem nearly as steep as, say, 2000-2011. Sure, games look better today, but 2011 games still hold up. In 2000, 3D graphics were still new, and most titles from then are considered unplayable now in terms of graphics and controls.
And 3D was the "AI" of those times. They had to bring it to EVERYTHING. Micro Machines (a top-down toy car racer)? We'll make it 3D, buy a card. Frogger? Yup, Frogger 3D. They even tried to force 3D on poor Worms in 2003. I still prefer Worms World Party/Armageddon.
-
Don't forget all the 3D movies too.
-
I miss 2003. So many bangers from that year. Ignition by R. Kelly. Picture by Kid Rock/Sheryl Crow. P. Diddy's party anthems Shake Ya Tailfeather and Bump, Bump, Bump. You could tune into The Apprentice to learn about business and enjoy Donald Trump's timeless one-liners, or The West Wing to learn about the American presidency, maybe a little Chappelle's Show for some laughs. Apparently it was also the first year we could all hop on 4chan and Google AdSense. Anyway, it kinda makes you wonder what all those folks are up to now. I hope they're well.
I was a roller skating rink DJ when Shake Ya Tailfeather came out. It had the place so hyped up I had security tell me to cut the song off before it finished. People jumping up and dancing on tables and shit. It was wild. That song was definitely a banger.
-
Tbh, I haven't done time, but that's still me.
I upgraded from an old laptop to a 4070. I tried HDR and I don't see a difference at all. I turned off all the lights, closed the blinds, and turned the (HDR-compatible, I checked) screen to max brightness. I don't see a difference with HDR turned on or off.
Next I tried path tracing. I could see a difference, but honestly, not much at all. Not nearly enough to warrant reduced FPS and certainly not enough to turn down other graphics settings to keep the FPS.
To me, both are just buzzwords to get people to fork over more money.
Seems to me that you got an early or cheaper HDR display, then. To me the difference is night and day.
FWIW, HDR does its best work if you have a display that can do true blacks. If you don't have an OLED, mini-LED, or full-array local dimming display, you're going to have a hard time noticing the difference, especially if you don't know what you're looking for. HDR works best in either extremely dark or bright scenes, so having a display with a near-infinite contrast ratio is important.
Here's a hint for any display: Look at some HDR clouds while you toggle HDR on and off. You'll definitely notice the difference there. Also check the teals. It's less obvious but SDR displays can't do a proper teal.
-
I tried it on a few OLED smartphones too, couldn't see a difference.
I tried it with some HDR demo videos, so I expected that these would show off the difference especially well, but I couldn't see the difference at all.
I'll try it again with clouds and teals, but I don't have a huge affinity for distinguishing minute colour differences in general (I'm not colour blind or anything, but it's hard for me to differentiate between very similar colours), so that might play into it.
-
"I remember 14 years ago when my GPU used to draw almost 400 watts. Crazy right? Anyways, how is GPU power consumption these days?"
"I budgeted about $500 for my GPU, that should be able to get me a high end card right?"
(That's like $750 today, adjusted for inflation, btw)
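Quick sanity check on that, with ballpark US CPI numbers (both values below are rough approximations, not official figures):

```python
# Ballpark inflation adjustment; both CPI values are rough approximations.
cpi_2011 = 225.0   # approximate US CPI-U annual average for 2011
cpi_2025 = 320.0   # approximate US CPI-U for 2025
budget_2011 = 500

adjusted = budget_2011 * cpi_2025 / cpi_2011
print(f"${budget_2011} in 2011 is roughly ${adjusted:.0f} today")   # ~ $711
```

So "like $750" is in the right neighborhood, maybe a touch high depending on the index and the exact end date you use.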
-
I tried it on a few OLED smartphones too, couldn't see a difference.
I tried it with some HDR demo videos, so I expected that these would show off the difference especially well, but I couldn't see the difference at all.
I'll try it again with clouds and teals, but I don't have a huge affinity for distinguishing minute colour differences in general (I'm not colour blind or anything, but it's hard for me to differentiate between very similar colours), so that might play into it.
HDR is more for showing the "depth" of an image, not so much the color gamut (how many colors it can show).
HDR helps more with things like being inside a building and looking out at a daylight scene. You'll be able to see more of both the inside and the outside of the building. Of course it won't make your monitor better, but assuming you have more than a basic display, you should be able to see a difference.
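One way to picture it with rough numbers (the scene and display luminances below are assumptions, not specs for any real monitor): an indoor room with a bright window spans a huge brightness range, and an SDR pipeline has to crush the bright end much harder than an HDR one.

```python
# Toy illustration of dynamic range, not a real tone-mapping pipeline.
# All luminance values (in nits) are rough, made-up assumptions.
scene = {
    "dark corner indoors": 0.5,
    "lit wall indoors": 40.0,
    "sky through the window": 5000.0,
}

SDR_PEAK = 250.0    # assumed peak brightness of a typical SDR monitor
HDR_PEAK = 1000.0   # assumed peak brightness of a decent HDR monitor

def clip_to_display(nits: float, peak: float) -> float:
    """Crude stand-in for tone mapping: just clamp to the display's peak."""
    return min(nits, peak)

for name, nits in scene.items():
    sdr = clip_to_display(nits, SDR_PEAK)
    hdr = clip_to_display(nits, HDR_PEAK)
    print(f"{name:>24}: scene {nits:7.1f} nits -> SDR {sdr:6.1f}, HDR {hdr:6.1f}")
```

A real tone mapper compresses instead of clamping, but the gap between those two peaks is the "depth" people are talking about.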
-
Because you need to dig into the rendering engine to do that, and if you didn't build it yourself you might not be able to do that easily.
Which would be easier if you were a dev making your own game than if you were making a mod for an existing one, no?
-
And/or consumers insisting on playing in 4K because "big number", even though fill rate is a huge issue with modern games and you can barely tell the difference on most setups. Which would not be so bad if they also didn't want ever-increasing graphical fidelity and 120+ fps on top of that.
4K is absolutely an upgrade over 1440p. I have two of them (an LCD and an OLED) and I absolutely love them in every game I play. I will admit that I'm in the super minority, and because of my work history I've spent a lot of time looking at a lot of displays, so I'm more sensitive to various artifacts than the normal person. And in games I always prefer looks over resolution; it needs to drop down to like 40 fps or lower for me to start changing settings.
Basically, it was worth it for me but probably won't be for you. OLED, on the other hand, is an actual, significant upgrade. You should get an OLED; it'll change your life.