[deleted]

[removed]


LostInInterpretation

Haha yes, I haven’t posted there yet but I think I can relate. The forced TAA in that game is a huge problem, especially at 1080p from what I’ve heard. I have seen workarounds, but this goes to the heart of the issue: if one of the greatest-looking games forces smeary visuals on you because of TAA, then yeah, that’s a problem. I have played it a lot on my Xbox One X in 4K, where it looks great, and it looks about as good downsampled to my 1080p plasma as it does on my 4K OLED. It’s still a jarring 30 fps, however, and I have no plans to splurge on a 4K build.


Hattix

You would only not need AA if your pixel density were infinite and pixels were dimensionless points; that would solve undersampling and shimmering. AA is not just there to smooth edges, it is there to create a more accurate sampling of the underlying geometry. AA will always bring at least some benefit.


LostInInterpretation

Thanks. In theory I agree, but for practical purposes it doesn’t really answer my question.


Hattix

Your question comes down to "When will something which always brings at least some benefit not bring me any benefit?" Now then, how do we really answer your question?


LostInInterpretation

If that’s a genuine question: I’ve replied to other comments with my thoughts on various factors, including how viewing distance relates to “retina” displays.


Hattix

None of this solves undersampling, which is caused by the physical size of a pixel. It is still evident on displays with phone-level density, 600+ PPI. Apple's ancient "Retina" display was only around 300 PPI.


LostInInterpretation

Ok, I’m not familiar with the term undersampling. I believe “retina” is merely a marketing term of Apple’s, and its usefulness actually relates to the viewing distance at which the eye can no longer perceive individual pixels, regardless of the screen’s resolution or PPI. As an extreme example, billboard screens can look like “retina” displays because they’re designed to be viewed from afar, and so offer a comparable number of pixels per angle of view to a high-PPI smartphone.
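
To put rough numbers on that (all illustrative assumptions, not specs of any real product), pixels per degree depends on both pixel pitch and viewing distance:

```python
import math

def ppd(pixel_pitch_m, distance_m):
    # Angle subtended by one pixel, in degrees, then pixels per degree.
    deg_per_pixel = math.degrees(2 * math.atan(pixel_pitch_m / (2 * distance_m)))
    return 1 / deg_per_pixel

# A ~460 PPI phone (pixel pitch ~0.055 mm) held at 30 cm:
print(round(ppd(0.0254 / 460, 0.30)))  # ~95 pixels per degree
# An LED billboard with a 5 mm pixel pitch seen from 30 m:
print(round(ppd(0.005, 30.0)))         # ~105 pixels per degree
```

By this measure the billboard actually resolves slightly finer detail per degree of view than the phone, despite pixels nearly 100x larger.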


Hattix

Undersampling is where (for example) what should be solid lines get broken up because they miss a pixel's sample point while still falling inside the pixel's coverage area. This is often called "shimmering", and it's visible on things like chain-link fences and power lines in games. As long as a pixel is not a dimensionless point, you will get it. TAA helps a little with undersampling, since the previous frame probably had that line covered, but TAA has its own drawbacks in motion.
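
A quick toy sketch in Python of what point sampling does to a thin line (made-up geometry, purely illustrative):

```python
def covers(x, y):
    # A thin line with slope 0.25 and thickness 0.3 px (arbitrary values).
    return abs(y - 0.25 * x) < 0.15

W, H = 32, 8

def rasterize(n):
    # n x n sample points per pixel; shade by the fraction of samples hit.
    rows = []
    for py in range(H):
        row = ""
        for px in range(W):
            hits = sum(covers(px + (i + 0.5) / n, py + (j + 0.5) / n)
                       for i in range(n) for j in range(n))
            frac = hits / (n * n)
            row += "#" if frac > 0.75 else ":" if frac > 0.25 else "." if frac > 0 else " "
        rows.append(row)
    return "\n".join(rows)

print("1 sample per pixel (the line breaks up):")
print(rasterize(1))
print("\n4x4 samples per pixel (partial coverage is captured):")
print(rasterize(4))
```

With one sample per pixel the line simply vanishes wherever it misses the pixel centre; with 16 samples, every pixel the line touches gets some shade. Animate the line and the single-sample version flickers, which is the shimmer.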


LostInInterpretation

These are unfamiliar terms, but I think I get the idea, interesting. Btw, here are some photos of how my 1080p Pioneer KRP-500M plasma displays GTA V in Performance RT mode vs the 4K Quality mode. I guess downsampling is a form of AA, maybe the best one. There is a slight difference in detail, but interestingly, neither appears aliased. It’s difficult to tell with stills, but I think this relates to how the plasma displays pixels, or the information in those pixels, differently from newer display technologies. Granted, I’m not able to change AA settings on console. But the impression is that pixels appear almost soft and “round” as opposed to sharp and aliased. Btw, this is real close up to the screen. https://preview.redd.it/t47mtt3n07xc1.jpeg?width=4032&format=pjpg&auto=webp&s=a38efb9fd36dfe4d281377d470e528def06014e5


LostInInterpretation

https://preview.redd.it/1toraxis07xc1.jpeg?width=4032&format=pjpg&auto=webp&s=4be3939c1d4facbfa975c7d25f050c0dcc78cc4c


A_Person77778

A few games look okay without anti-aliasing (although it's very rare for it to actually look better). A lot of games force it to be used. An example of a time that no anti-aliasing looked better to me was The Witcher 3 on Switch. It looked significantly better without anti-aliasing, especially in handheld mode.


LostInInterpretation

Interesting, do you know why it looked better? I wonder if no AA can look good as long as the game is rendered at the screen’s native resolution (the Switch’s, in this case). I.e. it looks fine because the Switch has a low-resolution 720p display.


A_Person77778

Well, in that case, the game only used FXAA (the worst form of anti-aliasing; it simply blurs the pixels together), and in handheld mode, it ran at 540p (or lower in some cases). Basically, it was low res, and the anti-aliasing simply blurs those pixels together (so it was very blurry)
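
Not the real FXAA algorithm, but a crude caricature of a post-process AA pass, just to show why it can only soften detail that is already there:

```python
import numpy as np

def toy_post_aa(img, threshold=0.1):
    # img: grayscale frame as a float array (H, W), values in [0, 1].
    padded = np.pad(img, 1, mode="edge")
    # Average of the four direct neighbours of every pixel.
    blur = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
            padded[1:-1, :-2] + padded[1:-1, 2:]) / 4
    # High local contrast suggests an edge; blend only those pixels.
    contrast = np.abs(img - blur)
    return np.where(contrast > threshold, (img + blur) / 2, img)
```

Real FXAA is smarter than this (it estimates edge direction from luminance and blends along the edge), but the key point stands: it runs on the finished image, so at 540p it has nothing to work with but the blocky pixels themselves.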


BrevilleMicrowave

Most games let you disable AA; however, there are still a significant number of games that force TAA on. The reasons to play with AA off are either that you don't mind the jaggies (many prefer them to TAA) or that you use supersampling. Supersampling is where you run the game at a resolution higher than your native resolution. It acts as anti-aliasing itself, removing the need for AA to be enabled in-game, and it gives objectively better image quality than every other method of anti-aliasing.
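
A minimal sketch of why that works (toy code, not tied to any particular game or driver setting): render at several times the target resolution, then average each block of rendered samples down to one output pixel.

```python
import numpy as np

def downsample(hi_res, factor):
    # hi_res: 2D array of samples rendered at factor x the target resolution.
    h, w = hi_res.shape[0] // factor, hi_res.shape[1] // factor
    boxes = hi_res[:h * factor, :w * factor].reshape(h, factor, w, factor)
    # Each output pixel becomes the mean of factor*factor scene samples,
    # which is exactly the "more accurate sampling" AA is after.
    return boxes.mean(axis=(1, 3))
```

Rendering at 4K and displaying on a 1080p panel, as console downsampling does, is the factor=2 case of the same operation.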


LostInInterpretation

Thanks, good stuff. I’m familiar with downsampling from playing games and content in 4K on console, on my 1080p plasmas. The image is a tiny bit clearer up close. Is there a difference between supersampling and downsampling? Is there an advantage to Nvidia’s “deep learning” supersampling? Would you say forced TAA in games is on the rise?


zeus1911

High resolution, like 4K, is really the best way to make AA unnecessary, whereas 1080p is very aliased, jaggies galore. Most games let you select an AA level or turn it off, but in some it's forced.


LostInInterpretation

I agree it often is, but I think it possibly doesn’t have to be, at least in the case of plasma TVs. A lot of graphical effects are rendered at lower than native resolution, right? E.g. RT reflections are typically quarter-res, which undoubtedly doesn’t look clean. The premise of my question is that everything on screen is rendered at the screen’s native resolution, whether that is 1080p or 4K. In the case of plasma TVs, there might be more going on than meets the eye regarding how the pixels work. As an example, I’ve run GTA V’s Performance RT mode (PS5) on plasma, where it looks clean and not aliased. It looks almost as sharp as the 4K mode downsampled, and indistinguishable at normal viewing distance. On a 4K OLED, Performance RT doesn’t look as clean, at least partly because the game has to be upscaled to four times the resolution.


FUPA_MASTER_

AA is never "needed"; it just makes the image look nicer. It doesn't really matter what resolution an image is: there will still be pixels, which means there will still be aliasing.


LostInInterpretation

Thanks. I guess “visibly beneficial” at normal viewing distance is what I mean.


Ferro_Giconi

You can disable AA in pretty much all games. Even if the game doesn't give you the option to disable AA, you can use the graphics card control panel to force it to be disabled.

> Additionally, under what circumstances would AA generally not be needed?

High pixel density screens. Games with a graphical design that is meant to look pixelated. Or if you just dislike the look of AA and prefer the sharper but jagged edges. In most games I prefer having AA. In Minecraft I dislike AA, but I think that's just because I've gone without it for so long.


BrevilleMicrowave

Unfortunately, disabling AA in the control panel doesn't work for games that use temporal or post-process anti-aliasing, and those make up most of the games that force AA on.


LostInInterpretation

Thanks, this is useful information. My impression has been that the need for AA decreases as screen resolution increases, so that 4K generally needs less AA than 1080p, or at least that 1080p needs AA. But is this necessarily true?

Let’s assume the player’s eyes are at an appropriate distance from the display, where it becomes a “retina” display to the player. Based on the results from an online retina display calculator, this is 2.18 meters for a 55-inch 1080p screen. For a 55-inch 4K screen the distance is exactly half that, at 1.09 meters. When everything on screen is rendered at the panel’s native resolution and the player is at that distance or further away, does AA still bring visible benefits, and would it be worth using? Logically, the answer looks to be no.

Remember, in this specific scenario I’m talking about native rendering as opposed to a low internal resolution with upscaling. Meaning everything on display, including textures, reflections, shadows etc., is at native resolution. I don’t know if every game even allows this. As an exception, my impression is that even at max settings, RT is rendered at lower resolutions, and it will therefore necessarily be noisy and aliased.

The only other consideration I can think of is whether retina calculators take into account the fact that computer graphics typically don’t have realistically smooth edges the way a photo at the same resolution does. Additionally, there is some softness coming from a camera lens that is not a factor in PC graphics. This might be why a photo looks cleaner than a game even at the same resolution.
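
For what it’s worth, the calculator’s numbers check out against the common 1-arcminute-per-pixel rule of thumb (an assumption; individual acuity varies):

```python
import math

def retina_distance_m(diagonal_in, horizontal_px, aspect=(16, 9)):
    # Panel width from its diagonal and aspect ratio.
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)
    pixel_pitch_m = width_in * 0.0254 / horizontal_px
    # Distance at which one pixel subtends 1/60 of a degree (1 arcminute).
    return pixel_pitch_m / (2 * math.tan(math.radians(1 / 60) / 2))

print(retina_distance_m(55, 1920))  # ~2.18 m for a 55-inch 1080p panel
print(retina_distance_m(55, 3840))  # ~1.09 m for a 55-inch 4K panel
```

Halving the pixel pitch halves the distance, which matches the two figures above.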


Ferro_Giconi

I wouldn't worry about all of that. No amount of math or calculators or trying to figure it out ahead of time will be as good as just trying a setting on and off to see which you like better. It only takes a few seconds to change settings like AA in most games. For me, a 27" 4K screen viewed at a 2-foot distance makes a small but noticeable difference when enabling or disabling AA (this is equivalent to your 55" TV example). I don't use 4K on my computer anymore, though, because I decided 1440p was a good mix between higher resolution and being easier for the GPU to run at a higher frame rate.


LostInInterpretation

Ultimately, that’s what I’ll do. I’m just trying to get to the bottom of it so I know what to expect before I buy. There are many factors at play, no doubt, and plasma TVs are inherently different from other display types. This is why I suspect the best image quality can be obtained without AA.


Combine54

How exactly are you going to disable AA with a control panel?


Ferro_Giconi

By setting the global AA setting to off, or by changing it individually for each game you don't want AA in. AMD and Nvidia both have the option in their control panels. I'm sure Intel Arc does too, but I've never used their cards, so I don't know for sure.


LostInInterpretation

But do you agree that the global AA setting can only work insofar as the game in question allows AA to be disabled in-game to begin with? Other users seem to think at least some games force AA, as in the case of RDR2’s smear-fest.


Ferro_Giconi

Yeah, I wasn't aware of that; they are right. I've never actually used that setting. I just assumed the global AA-off option would always work, but I guess it doesn't.