How did we get here?
I’ll be completely honest, that’s probably the coldest take someone can make about recent tech that I’ve seen, and it’s being presented as a hot take.
Virtually everyone prefers native, almost aggressively so. That being said, I think there’s important nuance that’s missing in most talks about upscaling. In my testing, my experience of blurring and smearing with upscaling/frame gen seems to be hugely dependent on pixel density. If you get a really dense screen, then upscaling, in my experience at least, becomes virtually undetectable even at 1080p.
probably the coldest take someone can make about recent tech that I’ve seen, and it’s being presented as a hot take
That’s exactly what the “we have you surrounded” meme template conveys (at least according to my understanding): a popular opinion, but ironically presented as a fringe opinion.
So no, this isn’t really intended as a “hot take”; there seem to be a decent number of people who dislike TAA, for example. I’m pointing out a trend in the industry: devs are using temporal or upscaling tools to make the game run/look better, and GPU vendors support those tools to squeeze the most fps out of their cards. At this point TAA is the standard AA method and is integral to how some games are rendered, and upscaling is advertised as basically free* performance. Unfortunately, by its nature, all this temporal tech doesn’t work too well at low framerates and resolutions, exactly the scenario where it would be most useful.
I would agree that most artifacts and the softening effects of upscaling will be less visible on higher-density screens, or when you’re sitting further away from a screen. Unless your TAA/upscaling implementation is absolutely botched, in which case it will always look like garbage, but that’s not really the fault of a specific technology.
Interesting - I was not aware that that was the intended use of that meme. Maybe I’m getting old?
I don’t play a lot of AAA games, but ngl I’m quite happy gaming at 1080p on my 27" monitors.
I get way better framerates, and it still looks plenty good with maxed out graphics, as long as I’m not sticking my face right up against the screen.
Have you actually tried 4k though? Yes, framerates are lower, but boy does it look better. For me, and that’s just my take on it, 1080p ends at 24" monitors.
If a person is running TAA, I don’t really see the point.
Same. Honestly my eyes aren’t good enough to notice a difference between 1080p and 1440p (or 4k) at the scale of my pc monitor, but I damn sure notice a difference between 60 fps and 200 fps…
Glad it works for you. Since I upgraded to a 1440p monitor (I still have the same GPU) I went from comfortable high-ultra settings to mid-high settings + FSR Quality in more demanding titles. From the games I played in both 1080p and 1440p, I’d say that less GPU-intensive titles definitely look better in high-res, but I found the overall experience quite whelming.
Simply playing on a higher-res monitor won’t necessarily give you better visuals if you don’t have the GPU power to match settings; however, at that point it’s not “higher res = better visuals” but “more powerful PC = better visuals”, which, duh, of course it will look better.
Well there’s your problem, you wanted better resolution but didn’t match it with a GPU upgrade.
Gotta have both or you’ll suffer a bit of loss.
That’s basically the feeling I get. I’ve been gaming on PC since the days of CRT monitors that could run many different resolutions. The tradeoff was always quality vs resolution vs framerate, but nowadays LCD/LED monitors have a fixed native resolution, so that’s one factor to take out of the equation. Nobody wants to play games at a non-native resolution.
I still play in 1080. I believe monitors shouldn’t cost as much as my build. (Yes I’m exaggerating for comedic effect).
1080p is still great. What giant ass screens are people running where it isn’t?
1080p is great yes but it could be better. Those graphics must be sharp enough to fucking cut me
27 and 34 inchers that are like 15 inches from our faces
Friendly tip: For singleplayer games, you can always disable the game’s built-in AA solution and use ReShade for AA instead. If you have extra GPU power, you can also use ReShade to add all sorts of other graphical effects if you’re willing to fiddle around with things to get it looking good.
If you have an Nvidia card, sometimes PCGamingWiki has instructions for tweaks you can do in Profile Inspector to adjust how the driver applies AA to a game too.
I’m still running a 1060. You’d be surprised what you can play if you’re willing to put up with shit graphics
And for anything else, I run it on my PS5.
Remember when you could force MSAA through Nvidia Control Panel on almost any game without issue? Pepperidge Farm Remembers.
Solution is to not play modern AAA garbage. For the $40–50 price tag, I could get a handful of great indie games off my wishlist. Games that won’t bat an eye at an aging GPU.
They would make a lot more money if they made games run on older hardware. Most people can’t play Cities: Skylines 2. Most people can’t play Kerbal Space Program 2. We don’t want photorealistic graphics. Just give us Fallout 3-era graphics because that’s good enough. Fuck.
I agree in theory but Fallout 3 is a horrible example, the art direction was the epitome of the era’s fascination with bland brown and sickening green landscapes.
The answer here is to ask whether photorealism matters to the game or not; if another type of art direction suits better, then do that. Hell, look at Boltgun, or even just games that used contrast and bold colors to their advantage like Halo 3 or Mass Effect 2.
The completely unfounded death of MSAA in modern games is devastating. It was (and still is!!!) so much better than every alternative.
Turns out the death of MSAA was actually quite founded! Here’s a great Digital Foundry video on anti-aliasing (YouTube).
tl;dw MSAA only affects geometry, so while it worked fine for older titles, it can’t handle textures, normal maps, shading etc.
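To make the “geometry only” point concrete, here’s a toy Python sketch of the basic idea behind edge anti-aliasing: average several sub-pixel coverage samples per pixel, which is what MSAA’s resolve step does at triangle edges. This is illustrative pseudocode, not real GPU code, and every name in it is made up for the demo.

```python
# Toy illustration: smoothing a hard geometric edge by averaging
# sub-pixel coverage samples (the core idea behind MSAA's edge resolve).
# Pure Python, no GPU; all names here are invented for this sketch.

def inside_triangle_edge(x, y):
    # "Geometry": everything below the line y = x is covered (value 1.0).
    return 1.0 if y < x else 0.0

def render(width, height, samples_per_axis):
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            # Average an n-by-n grid of coverage samples inside the pixel.
            n = samples_per_axis
            total = 0.0
            for sy in range(n):
                for sx in range(n):
                    x = px + (sx + 0.5) / n
                    y = py + (sy + 0.5) / n
                    total += inside_triangle_edge(x, y)
            row.append(total / (n * n))
        image.append(row)
    return image

aliased = render(4, 4, 1)   # 1 sample/pixel: every pixel is fully 0.0 or 1.0
smoothed = render(4, 4, 4)  # 16 samples/pixel: edge pixels get fractional coverage

print([aliased[i][i] for i in range(4)])   # hard-stepped values along the edge
print([smoothed[i][i] for i in range(4)])  # in-between gray values along the edge
```

Note that only the coverage test runs per sample here; if the surface’s *color* varied within the pixel (a noisy texture, a sharp specular highlight), averaging coverage alone wouldn’t help, which is exactly the shading aliasing MSAA can’t fix.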
Still looks better than all alternatives by far though.
edit: second best, actually; just remembered DLAA is a thing, whenever it’s actually implemented.
Games like Deep Rock Galactic have every reason to use MSAA but don’t anyway because game engines decided it was unnecessary, and small devs like that can’t be arsed to maintain their own implementation.
Also textures and normal maps don’t need anti-aliasing because they will already have it baked in. Shaders are a similar situation where any aliasing will be situational and should be handled by the shader itself. (If it even makes sense to do so)
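The “baked in” part is mipmapping: each mip level is a pre-filtered, half-resolution copy of the texture, so a distant surface samples already-averaged texels instead of skipping over high-frequency detail. Here’s a toy Python sketch of building that chain with a simple box filter; all names are invented for the demo, and real GPUs do this in hardware.

```python
# Toy sketch of why texture aliasing is "baked in": mipmapping.
# Each mip level is a 2x-downsampled, box-filtered copy of the previous
# one. Pure Python, illustrative only.

def next_mip_level(texture):
    """Box-filter a square texture (list of lists) down to half size."""
    size = len(texture) // 2
    return [
        [
            (texture[2 * y][2 * x] + texture[2 * y][2 * x + 1]
             + texture[2 * y + 1][2 * x] + texture[2 * y + 1][2 * x + 1]) / 4.0
            for x in range(size)
        ]
        for y in range(size)
    ]

def build_mip_chain(texture):
    """Generate the full chain down to a single 1x1 average."""
    chain = [texture]
    while len(chain[-1]) > 1:
        chain.append(next_mip_level(chain[-1]))
    return chain

# A 4x4 checkerboard: maximum high-frequency detail.
checker = [[float((x + y) % 2) for x in range(4)] for y in range(4)]
chain = build_mip_chain(checker)

print(chain[1])  # 2x2 level: every texel averages to flat 0.5 gray
print(chain[2])  # 1x1 level: also 0.5
```

A flat 0.5 gray is exactly what a checkerboard should look like from far away; sampling the full-res texture sparsely instead would give you shimmering black-or-white pixels.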
Anyone want to explain to me wtf anti-aliasing even is
Most AAA games have been so shitty lately that calling something “AAA” these days is almost like saying a bad word.
Gods, the amount of disappointments I’m glad I didn’t waste money on. The biggest spending I’ve done on gaming lately is buying myself a Steam Deck. Now I’m enjoying my backlog of indies I got from Humble Monthly.
Sounds like you care more about eye candy than gameplay. To each their own.
I’ve replayed RDR2 recently and TAA absolutely DESTROYS the beautiful visuals of the game.
That said, stuff like DLSS is a godsend and looks 90% there depending on the situation. It’s simply another tradeoff you can make.
My old Nvidia card can punch above its weight because of it.
This is especially apparent, and a major downgrade, in War Thunder, because you need to be able to see player tanks and aircraft in the distance, which end up being like 3 pixels on your screen, and DLSS will make them invisible lol.
There are definitely some games that benefit from this, but the inconsistency is still enough that it’s not really useful for FPS and multiplayer games, where even the slightest change on your screen can dramatically affect your gameplay. A sharp 60 fps is much preferable to an AI-generated 120 fps which may remove detail and accuracy.
DLAA with frame generation seems pretty good to me in Cyberpunk.
Watch as Half-Life 3 comes out and you won’t be able to play it without DLSS or FSR.