Thank god you're not forcing TAA like the current modern game trends. Always hate that as soon as the camera moves, the image becomes a blurry, smeared mess or has ridiculously jarring ghosting artifacts.
Certainly! I've noticed a lot of the gaming community complaining about this topic, and I'd like to take that general feedback on board and ensure players have the option to use whatever they prefer! I also don't really like the effect of TAA and would rather have no AA than TAA.
@RiftDivision I'm in the same boat, in that I prefer an aliased image to one with TAA. It's unfortunate that both DLSS and FSR are based on TAA and share many of the same drawbacks.
Come on bro, you would not rather use no AA over TAA. Blurring is better than scratching. Also, allow TAA, but yeah, don't force it. I'd use it if the game were unoptimized and I couldn't run MSAA. MSAA is never worth it over a render scaler + TAA imo.
The new Satisfactory version uses TAA. I'm not sure who thought that adding _temporal_ antialiasing to a game with _fast-moving_ conveyor belts was a good idea, but it basically causes those moving objects to dissolve into a featureless blur.
@@Lyazhka I am pretty sure I was using FXAA before the update to Unreal Engine 5, and now FXAA just doesn't seem to be an option for me but maybe I need to disable DLSS or something to get that option back. It's kinda confusing.
@@TerrisLeonis Well yeah, DLSS acts as anti-aliasing, so they lock the AA option; you already have AA enabled by having DLSS. You can also use TSR instead of TAA, which can render at native resolution, giving better results than TAA (if you're given the option to do so).
Came here to say exactly this. No 1-minute intro to the subject, no BS about their childhood stories, no cuts to them changing locations and sitting somewhere unconventional or some random stuff, no filler editing, just straight to the point.
Good to see people talk about this, I've also decided to use MSAA purely. The loss in visual clarity from TAA is unacceptable in my opinion. Developers should not be using TAA or upscaling features as excuses not to optimize their games. That wasn't the case with older games, and we shouldn't be doing it today.
@@MaxIronsThird My god, you're right. Everyone, arm the 4090s! To battle! We must tame the beast and conquer this monster no matter how many pixels it may possess, even if it is 32 million strong. FOR THE ANTIALIASING, FOR THE RETURN OF NON-BLURRY GAMES, FOR GAMERS!
Gotta love Arma :D Btw, I've noticed that many modern games tend to look worse even with TAA off, almost like they were made to look good with TAA but look weird when it's turned off. Modern Warfare 2019, for example, had some "grainy" textures that did that.
Arma is a fantastic game! I've noticed that too. I think there must be some more complex graphical enhancements that require TAA to be on in order to work correctly, which is probably why a lot of games force TAA on as well!
From my understanding, many games use deferred rendering, where they will intentionally undersample effects rendered on screen to save compute power. An extreme example is the hair completely breaking in Cyberpunk 2077 when TAA is off. They rely on TAA stacking the undersampled pixels into one complete image.
The reason a lot of games use TAA is that TAA can be used for many more things than just AA; because of how it works, it allows other aspects of a game's rendering pipeline to be changed as well. TAA basically works by storing previous frames and combining them together into the current frame. Once you understand this, what you can do is render only parts of certain effects over the course of several frames to save performance. This is pretty often done with shadows, sometimes AO, and other effects like SSR. Another really common method is fake transparency, especially with hair and grass, which is why disabling TAA makes some stuff look really bad. For example, using an actual transparent shader to render hair cards is really expensive, but an alpha-cutout method doesn't look as good because it isn't as soft. If you run the hair card's alpha texture through random noise every frame, so that the gradations in the texture are rendered over the course of several frames and then combined together, it creates the effect of transparency, which can make the object look a lot softer at the edges. This is just one of many reasons why TAA is so prevalent: it allows for effects like this without falling back on actual transparency rendering, which is much more expensive, and applying the same idea to effects like shadows can save a lot of performance. When games have to constantly push graphical boundaries, you constantly have to look for more ways to save performance so you can squeeze in other effects. People act like TAA is the worst thing ever, but they don't realize it's a necessary evil, and saying "just optimize better" is a really ignorant response that shows a complete lack of understanding of game development, especially when the hardware doesn't really change but games are urged to look better and better.
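To make the history-blend idea described above concrete, here is a minimal Python sketch (a toy model, not any engine's actual code): each new frame is lerped into a running history buffer, so an effect that is only partially rendered each frame still converges to a complete image over time.

```python
def taa_accumulate(history, current, alpha=0.1):
    """Blend the current frame into the history buffer (per-pixel lerp)."""
    return [(1 - alpha) * h + alpha * c for h, c in zip(history, current)]

# Hypothetical 4-pixel image whose true value is 0.5 everywhere, but each
# frame only "renders" half the samples (the rest read as 0 or 1),
# alternating which half, like noise-dithered transparency would.
history = [0.0, 0.0, 0.0, 0.0]
frames = [[1.0, 0.0, 1.0, 0.0], [0.0, 1.0, 0.0, 1.0]] * 50
for frame in frames:
    history = taa_accumulate(history, frame)
# history now hovers around 0.5 in every pixel: the undersampled effect
# has been reconstructed across frames.
```

The `alpha` blend weight is the usual trade-off: a small value gives smoother accumulation but more ghosting when things move, which is exactly the artifact people complain about.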
It's not always just about optimization, either. Using actual transparency comes with a host of other issues, like z-fighting due to how transparent pixels are sorted. There are some ways around this by modifying an engine or using a custom one; for example, I believe the new Alan Wake game and its custom engine use multiple separate depth buffers for different transparent objects to mitigate the z-fighting issues transparent objects run into (i.e., particles using transparency use one depth buffer, but other objects use a different one, so if you have a ton of smoke particles near glass on a building, you won't run into the z-fighting issue where the glass sometimes renders on top of the smoke). But it's not perfect, and using multiple z-buffers isn't exactly cheap either.
I've known about these techniques for a while, but your video did a great job of explaining each one in detail while keeping things brief. Your editing also complements your narration well! Great job!
One of the understated benefits of TAA is that its temporal nature gets used in a lot of other effects as a denoiser - because it averages the pixel over time, we can use it to create dithered transparency (which is what you might use in place of alpha to coverage), we can use it to denoise SSR, SSS, etc. The artifacts really suck but a lot of devs use it because it makes so many other effects easier to optimize.
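The dithered-transparency trick mentioned above can be sketched in a few lines of Python (illustrative only, with made-up values): each frame the pixel is either fully covered or not, decided by noise, and temporally averaging those binary results recovers the fractional alpha that a real transparent shader would have produced.

```python
import random

def dither_coverage(alpha, rng):
    """Stochastic alpha test: fully opaque this frame with probability alpha."""
    return 1.0 if rng.random() < alpha else 0.0

rng = random.Random(1)   # fixed seed so the sketch is repeatable
alpha = 0.3              # hypothetical 30%-opaque hair-card pixel
frames = [dither_coverage(alpha, rng) for _ in range(20000)]
average = sum(frames) / len(frames)   # the temporally "denoised" result
# Each individual frame is pure noise (0 or 1), but the average over many
# frames lands near the true alpha of 0.3.
```

This is also why turning TAA off breaks the image in such games: you are left looking at the raw 0/1 noise with no accumulator to smooth it.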
That's a double-edged sword. Say you undersampled a ton of effects and also decided to leverage TAA as a denoiser; the people who turn it off or force it off will now be left with a broken image. The most egregious examples of this are RDR2 and Cyberpunk 2077. The solution would be to provide higher sampling quality.
Cyberpunk 2077 is a great example of this. TAA is horrible there; I hate it. It seems to have gotten somewhat better in recent updates, but it still sucks. You can turn it off, but then you see the problem you mentioned: noise. Everything in Cyberpunk is noisy, and the asphalt and its reflections are the most notable, since they have random pixels that are just sharp points of light (literally just a white pixel) that TAA normally denoises. It's horrible, and the ghosting is just ugly.
And crucially, it DOESN'T actually provide much in the way of anti-aliasing, because edge aliasing is THE LEAST of your concerns when all those shaders cause aliasing themselves.
@@djwhitepeople Not sure how you got those implications from a minor correction for a footnote in the video, but okay. edit: they deleted their comment but it insisted I was implicating developers were lazy for not implementing MSAA and then called MSAA obsolete because it doesn't solve for specular aliasing. Which, by the way, we've had other solutions for for a while, notably for VR titles like HL:A, where temporal solutions just aren't good enough.
It's worth noting that MSAA wasn't ditched simply because of the performance hit; it was ditched because it doesn't really work with modern games. The issue is twofold. First, MSAA only affects the edges of geometry, meaning in-surface aliasing, such as with specular highlights, is left completely untreated. Second, it's a spatial technique, meaning it can only consider each frame individually, so it misses what's called temporal aliasing, AKA shimmer. Temporal aliasing is when high-contrast details appear and disappear on a frame-by-frame basis. A common example is tree branches in the distance: a lot of the detail in the thin branches will be smaller than a single pixel, which means that as the camera moves and the tree animates, a branch will oscillate between covering enough of a pixel to be visible and being small enough to be invisible. This creates a highly distracting shimmering effect around any area of dense detail (turn off TAA in any modern game to see what I mean). These two issues were not a big deal in the 90s and 00s, when lighting models were too simple to contain a lot of fine specular highlights and geometry was too large and sparse for shimmer to be prevalent, but in the 10s that all changed. MSAA just couldn't cope with modern renderers, leading to an image that still looked plenty aliased despite costing half your performance. So simply put, even if MSAA were free, it still would've been replaced by TAA; it's the only technique other than SSAA that can handle more complex visuals (TAA is, in effect, SSAA spread across time). Switching over to TAA and spending the time and effort to alleviate its shortcomings was really the only option, plus the development of the technology would lead to things such as upscaling. So we can moan about TAA till we're blue in the face; it's not going anywhere.
But don't worry too much: it has gotten significantly better over the years, and personally I feel a lot of the problem boils down to monitor resolution. At 1080p the blur is obvious; at 4K it's almost invisible. Eventually everyone will be using 4K, so this problem will be solved one way or another. For more info on the drawbacks of both MSAA and TAA, along with how they work, check out Digital Foundry's recent video on the subject; their visual examples and benchmarks demonstrate clearly why MSAA was doomed.
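The shimmer described above is easy to reproduce in a toy Python model (not any renderer's code): point-sampling a sub-pixel-wide feature at the pixel centre makes it pop in and out of existence as it drifts, even though its true coverage never changes.

```python
def point_sample(line_pos, line_width, pixel_center=0.5):
    """1.0 if the single sample at the pixel centre lands inside the thin
    feature (e.g. a distant branch), else 0.0 -- no partial coverage."""
    return 1.0 if abs(pixel_center - line_pos) < line_width / 2 else 0.0

# A branch 0.3 px wide drifting across one pixel in 0.1 px steps:
coverage = [point_sample(0.1 * f, 0.3) for f in range(10)]
# coverage flips between 0.0 and 1.0 from frame to frame, even though the
# true coverage is a steady 0.3 -- that on/off flicker is temporal aliasing.
```

MSAA's extra spatial samples shrink this effect within one frame, while TAA hides it by averaging the flicker across frames, which is where the blur comes from.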
Specular anti-aliasing exists, and your tree branch example would be resolved by MSAA, because it samples subpixels and adds that detail in. With TAA, the frame averaging ends up erasing tiny details if they don't cover every sample.
'at 1080p the blur is obvious at 4k its almost invisible' You do realize that TAA is quite literally stealing image information from you, right? The fact that you need small 2K monitors (
@@TheJohn_Highway Pure deferred renderers went out of fashion years ago; pretty much every game nowadays is some sort of hybrid, deferred mixed with a forward renderer. Unreal is deferred for everything, with only a forward pass for transparencies, although UE5 has OIT, so I don't know if that's a forward pass anymore. But even games that are pure forward+ renderers, like Doom 2016, Doom Eternal, and Detroit: Become Human, still use TAA and force it on, because they want the 'unified rendering' it provides (as in, denoising all their undersampled effects) and because it means they don't have to put in the work to author art that doesn't shimmer. Source 2 is the shining example of what games COULD look like now.
@@TheJohn_Highway A 1080p image with MSAA doesn't look as good as a 4K image with TAA; I game at 4K and it's a huge upgrade even with TAA. My point was that at 4K, a proper TAA implementation looks almost as sharp as a native 4K image (not a 1080p one), and any reduction in clarity is slight enough to be worth it for proper AA coverage. The way the algorithm works means it makes fewer errors at higher resolutions. And you also realise deferred renderers and other modern techniques aren't used for the lols. They're used because without them, modern graphical fidelity would be impossible. Your proposed solution would make games look even worse than they currently do, going backwards to PS3-era rendering for a bit of extra sharpness. Making games look worse because you think they look bad is a crazy solution. As I explained in my other comment, if TAA blur bothers you that much, you can simply combine supersampling with DLSS/FSR to get near-perfect AA without any blur. Yes, it has a performance hit, of course, but no worse than the one MSAA comes with, and it delivers better results than MSAA could at its best. This is a solution people can use right now that, unlike yours, actually improves the visuals, and yet none of the TAA haters actually do it, because of the performance downsides, showing that even if MSAA came back, 99% of gamers aren't actually willing to sacrifice their framerate for a bit more sharpness. Oh, and TAA isn't "stealing image information"; it's not turning a high-res image into a low-res one, that's not how it works. It combines multiple frames together with a slight offset, then downsamples the result. So the real issue is adding information in the wrong places, which is why it combines well with sharpening: the details are all there, they just need more contrast around certain edges.
@@TheJohn_Highway It's about time you realized that, even disregarding the whole deferred-vs-forward rendering debate, MSAA is useless unless your game has a lighting model straight out of the 90s/early 00s. By its very nature it only works on geometry edges.
Thing is, Valve did use MSAA in HL: Alyx too (or was it FXAA?). The key thing is, they found through research that aliasing happens because of materials, especially normals, and they determined that adding noise greatly reduces the amount of aliasing, with slight MSAA on top serving as the cherry on top to get rid of the remaining aliasing. Because of that, even if you disable AA in HLA, the image will still look like AA is on. I might be wrong about exactly how they did it, since my dyslexia kind of doesn't let me read stuff properly, and I cannot add a link here since YT deletes my comment, but you can find it yourself: Alex_Vlachos_Advanced_VR_Rendering_GDC2015
@@Frisbie147 -- MSAA doesn't perform better in forward rendering, what? It isn't compatible with deferred rendering at all, because deferred rendering fundamentally can't work with transparency, and it's still a big performance hit in forward rendering. Don't just make bullshit up.
@@SkedgyEdgy MSAA can be done with deferred rendering; Control uses deferred and has MSAA, and afaik RDR2 is deferred too, and MSAA is extremely heavy in those games.
The reason they add noise is to remove colour banding, which is extremely apparent in VR and results in more shimmer, not for anti-aliasing. They apply a screen-space dither algorithm that blends colour bands so the colour depth appears greater than what the screen actually provides. This can be seen in CS2 in the water shader, and when you spawn in and have the glow around your character: it softens the harsh gradient lines at the cost of looking a bit fuzzy, and in the water shader it lets them do volumetric scattering at a lower sample count. They resolved surface aliasing by applying a specular anti-aliasing technique where the roughness of a material is augmented based on the derivative of the object's surface normal, so the change in value between two pixels is proportional to an increase in material roughness, which essentially clamps the roughness value so you don't get specular shimmer. This technique was further improved afterwards by various people; I have a video on my channel showcasing it in Unreal, and it's built into Godot and Unity.
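The roughness-augmentation trick described above can be sketched like this in Python (a simplification of the real screen-space-derivative version; the function name and the sample normals are made up for illustration):

```python
def specular_aa_roughness(roughness, normal_a, normal_b):
    """Raise roughness by the variance of the surface normal across the
    pixel footprint (normal_a/normal_b stand in for neighbouring samples,
    i.e. the screen-space derivative), so tight specular highlights get
    widened instead of flickering as sub-pixel glints."""
    diff = [a - b for a, b in zip(normal_a, normal_b)]
    variance = sum(d * d for d in diff)
    return min(1.0, (roughness * roughness + variance) ** 0.5)

# Flat surface: identical normals, roughness is left alone.
smooth = specular_aa_roughness(0.1, (0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
# Bumpy surface: normals diverge across the pixel, roughness is pushed up.
bumpy = specular_aa_roughness(0.1, (0.0, 0.0, 1.0), (0.3, 0.0, 0.95))
```

The key property is that the clamp only kicks in where the normal actually varies within a pixel, so smooth surfaces keep their sharp highlights.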
Something else to note is that most games made by big studios are made in Unreal Engine, and Unreal Engine has motion blur enabled by default. I have to remember to manually turn it off in every new Unreal game I start. It wouldn't surprise me if most of those studios working in Unreal Engine leave motion blur on, which would contribute to blurry graphics.
A bad TAA implementation does look like a 2000's era accumulation buffer based motion blur. Which in my experience seems to be most things made in unreal.
Every 3D game I own has motion blur and, even worse, "depth of field" turned on, as well as lens flare... all effects that only make the game look worse and blurrier, and they make my eyes hurt. The first thing I do is turn all those blurring effects off.
Answer: Consoles. More specifically, The Great Consolization of 2008, the event marking the end of AAA PC games, and the first year in which all AAA games** were conceived, designed, and coded for console hardware and for the expectations and sensibilities of the console market. ** With the possible exceptions of Left 4 Dead (2008), Left 4 Dead 2 (2009), and Portal 2 (2011).
@@ConcavePgons That's true, but for my part I don't think it actually improves anything. At really low framerates it might make it more difficult to notice the abrupt change between frames, but it also makes it more difficult to notice everything, lol.
i hate fake pixels. i hate fake frames. i hate TAA. i hate DLSS. i hate nvidia for gaslighting the entire gaming industry into thinking it's ok for games to look like they were dipped in vaseline. give me my jagged edges back
Yeah, I hate AA too. I also hate DLSS and V-Sync; I even hate upscaling. I can't stand it when a game studio decides to do upscaled 4K over native 4K. Console games are starting to look worse because of all these "features".
Wow, an informative video that doesn't go back to the very first example in the history of the subject at hand and then spend an hour gradually moving toward the answer.
Devs seem to forget that hardware DOES improve over time and that once-unobtainable MSAA 8x/16x may be possible 4 years down the line, giving someone a whole new reason to play the game over again! I for one LOVE seeing the MSAA option in any modern video game, because I know I'm going to be getting the purest image possible with no smudging or blurriness, so good on ya for choosing that AA option for your game. It may be demanding now, but just think: in 10 years, even an APU will likely be able to run all of these modern games with maxed settings, including MSAA cranked! edit: Deferred renderers are the reason why we have TSAA and not MSAA in the first place; while faster, that doesn't mean devs should completely neglect the old API that still does 90% of what the newer API does visually, and in most cases the old API performs better (at least on my 1080 Ti / era-appropriate API). Ultimately it does come down to API, with most modern devs forcing DX12, but hell, any DX game will work on a DX card so long as it is DX-capable, so why not offer the older revisions, especially DX11, which is still MSAA-capable?
With the exception that resolution tends to go up, and with it the cost of MSAA increases drastically. I'm playing at 4K, and dropping MSAA is the first thing I do to improve performance. To be honest, I think the hatred from PC gamers towards temporal AA solutions stems from TV and console manufacturers moving to 4K while most PC gamers, due to the shitty GPU market, have been stuck at Full HD for more than 10 years. TAA, with the exception of motion blur, looks great on a high-PPI screen while costing next to nothing in performance, but its quality suffers more the lower your resolution. With MSAA the situation is different: it scales badly with resolution while only affecting jagged edges (textures like distant ropes or wires still flicker).
You seem to forget that the pinnacle of graphics from 10 years ago is the low settings of today. If you want graphical fidelity to remain high, you'd simply not have enough computational resources to also do MSAA... Hardware improves, but so does the amount of shading a game applies for higher-quality output.
Problem is more that a deferred renderer requires the developer to manually write MSAA variants of nearly every single shader, and use special MSAA compatible texture formats on top. It's a lot of extra work, particularly compared to just storing a few buffers and slapping a post-processing shader on the screen.
@@krazownik3139 And to that I'd say: 1080p at 16x MSAA has been pretty hard to pixel-peep for a long time, and that's coming from someone who sits 2 ft away from a 24" 1080p screen. So really, any resolution past 1080p with 16x MSAA has been overkill for quite a while. I do understand that the base resolution of everyone's monitor will eventually go up, but that just means you can lower the MSAA and get the same performance/visuals back as you would at 1080p with 16x MSAA. Texture resolution is a different story for sure, but at least with MSAA, as I said, you're getting the cleanest textures and not a temporal, blurry mess like in modern APIs. 1080p textures, while not the cleanest, are definitely on the most bearable side of textures looking 'low' (with non-temporal AA), and I've dealt with those textures for many years. But the real holy grail is SSAA.
It's not going to be feasible, because the rate of graphics advancement means all these shaders will require exponentially more horsepower to apply MSAA to. It's the same fundamental issue as driving native resolution towards 4K and beyond, which is why new temporal upscalers like XeSS and DLSS were developed. 4K screens are becoming the norm, yet the law of diminishing returns means it's a terrible waste to render things at native 4K, let alone MSAA the 4K render. You only need enough information to reconstruct a perceivably good 4K image.
Whenever I start a new game, I go straight to the settings and turn off anything related to anti-aliasing and depth-of-field effects. It feels like staring at a game with those effects is making me go blind. If it's a particularly evil game, it will also have chromatic aberration and that fish-eye lens effect; I turn those off too. Making effects for games for the purpose of realism is cool and great, but in some cases it just doubles an effect our eyes already have and hurts to look at.
Antialiasing is usually among the first settings I tweak. And by tweak, I mean I turn it off. I would much rather be able to tell if there's a camouflaged enemy hiding in bushes in the distance than have the image look less jagged but get shot in the face.
my favorite AA method is just playing at a high resolution, it makes everything sharp and the jagged edges are so small that they're barely noticeable. I'd much rather lose framerate to higher resolution than to simulating myopia. developers can put as many AA methods as they want into their game, I'm going to turn them all off
Finally, someone who knows how things should work! Also, common misconception #1: you actually can use MSAA with deferred. Common misconception #2: MSAA is resource-hungry; no, with render-target compression it isn't. And #3: sometimes still-noticeable edges; a custom MSAA resolver solves that. So the only remaining downside of MSAA is shader-triggered aliasing, but that is an uncommon issue, usually solved by either altering your shaders or, if nothing helps, relying on SSAA for those particular parts of the image.
Those quirks are very difficult to counteract properly when applying MSAA, though. If time crunch is the issue, a cheaper brute-force method like high-res SMAA is far more practical.
@@nathanlamaire Which ones? ;) Resource-(un)hungriness? It's there just out of the box. Custom resolver? You can just build one on top of your color-grading pipeline (and since DX12 that is essential anyway): just average the linear-light final outputs while color-grading every sample. No big deal, as they mostly share the same color, so it just causes a same-texel read if a LUT is used somewhere. Tried it, looks cute ;) Deferred? Either go forward(+), which is decent btw, or do in-shader texel-type detection (e.g. same- or differently-sampled). SSAA? Just add a sample semantic to the PS in question to trigger SSAA there. So no, no reason to watch blurry SMAA where neat, sharp MSAA should be (everywhere, that is)!
I absolutely hate TAA. It's subtle enough that you don't really notice it at first glance, but it always feels like something is off, and it drives me crazy; it's like playing with bad eyesight.
I prefer AA turned off entirely. It's sharper without it, and I don't mind the pixelation, which is barely visible on a modern hi-res screen anyway. Plus you get a performance boost that way.
It does work; it's just a little more complex to implement (in OpenGL), and might be a lot more expensive with lots of lights (because lights then have to be evaluated per sample, too). Source: my game engine "Rem's Engine" does that.
They're actually different. Forward rendering (i.e. the traditional method) can provide FSAA (this is Full Scene Anti-Aliasing) directly in the hardware/driver level, which requires no developer effort. MSAA, which is done for deferred renderers, requires you to both use specialised texture formats AND write the shaders yourself to do the proper amount of samples manually. And you need to write those shader variants for every single shader that matters (yes, all of them!), with a variant for each sample level (i.e. Off 2x 4x 8x 16x is 5 variants) to boot. It's not impossible to do, but it requires significant extra developer time and testing/validation on top. Off also isn't as simple as just doing a 1x sample rate.
@@billy65bob "specialised texture formats" Yeah, you use glTexImage2DMultisample() instead of glTexImage2D() in OpenGL, and texelFetch(tex, coord, sampleIndex) instead of texture(uv) in GLSL, what a difference XD. No, you don't need different variants for each MSAA level, that's what variables are for: uniform int numSamples; for (int i = 0; i < numSamples; i++) color += texelFetch(tex, coord, i); color /= float(numSamples);
@@AntonioNoack It's a big enough difference to be a pain in the ass :) I'll admit I'm probably overthinking it in terms of optimisation. I kind of subscribe to the "generate the sampler statements in a preprocessor" school of thought.
@@billy65bob *"They're actually different... MSAA, which is done for deferred renderers..."* Perhaps I am misunderstanding, but it _appears_ that you are saying MSAA is a new technique for deferred rendering, which is false of course. In forward renderers MSAA is flawless: it hits every edge and has only a modest performance impact. *"...requires you to both use specialised texture formats AND write the shaders yourself... And you need to write those shader variants for every single shader that matters (yes, all of them!)..."* Further, I don't understand what you're talking about regarding MSAA requiring specialised texture formats and custom shaders. The entire point of MSAA is that it treats only geometry edges; it has no effect on surfaces/textures/shaders.
Antialiasing never really needed to be considered in the days of CRT monitors. You'd have to play at like 640x480 or less to notice any pixel edges in 3d games because of how CRTs technically don't have pixels. Something like Half Life 2 on a CRT still looks leagues prettier than a modern game abusing TAA if you ask me. One day though, OLED tech will be refined enough and 4K resolutions accessible enough that we'd be back to not needing antialiasing tech of any kind and reaping the benefits of superior black levels/color depth and basically no pixel response latency.
You know what the biggest salt in the wound is, so to speak? The fact that 90% of games with forced TAA now offer "image sharpening", which creates an even worse image. So you fix one problem by adding a thing that causes three problems, then you add a fix for one of those that piles two more problems on top.
If you don't know game settings, things always go the way the devs are biased. First things I do in any game: 1. Turn Motion Blur off (or to lowest). 2. Adjust anti-aliasing to be actually good (if the TAA is good, it's fine). 3. Turn off Depth of Field. 4. Turn off stupid effects like Vignette and Film Grain. Done: better graphics and LESS GPU usage.
MSAA is also useful for volumetric lighting and alpha effects, since you can use the multiple samples per pixel to alpha-blend the volumetric diffusion without another pixel shader for it. Source 2 uses this; that's why it doesn't support TAA, since without MSAA some of its effects won't even work.
Thanks! So far I have used a custom imposter system for the trees and a custom GPU instancing system for the grass and other details. I will be making a video on this topic in the near future so stick around for an in-depth break down!
SSAA is the best, but it eats far too much GPU processing power. I am currently replaying Half-Life 2, and I use VSR from 7680x4320 down to native 4K *plus* 8x MSAA on top (a 6900 XT allows that with such a dated game while maintaining 90+ FPS). The resulting quality is outstanding, although I'd rather have even higher SSAA quality, along the lines of 3x (11520x6480) or even 4x (16K, for really old games modded to support unlimited resolutions, or modern indies), but the maximum VSR (or DSR, for that matter) allows is 2x. MSAA is simply inferior because too many games just don't support the alpha-to-coverage technique, and it looks weird, to say the least, to have perfectly anti-aliased geometry right next to fully aliased grass, leaves, bushes, or fence grates. To the point where I sometimes disable MSAA to have a more consistent, even if aliased, image.
The blurrier the better. Foggy? Increases immersion. Motion blur? Increases immersion. Idc if you need a 3090 Ti to run it. Everything being blurry isn't something to fix, it's an experience-enhancing feature. Actually... people should pay for the blur. Thinking of an obligatory monthly subscription. Also, making the game very dark will increase tension and immersion, and reduce the effort needed for quality. "Set screen brightness" selector? Fuck it, monthly subscription if they really want it.
I'd rather have no AA than TAA or some flickery upscaler. I've modded games before to do just that: take out the TAA. I don't care if it breaks screen-space reflections, since those are already so broken that, you guessed it, I'd rather have no reflections.
My opinion: anti-aliasing is not needed at resolutions higher than 1080p. One of the first things I do in any game is turn AA off. I play at 1440p and everything looks great.
Every time I boot a new game for the first time, AA is the first thing I disable. I cannot stand it at all. You may consider it to be a minor thing but it infuriates me to no end.
ArmA 2 & 3 are among the few games to use AtoC, and they even have differing options for the type of foliage to apply it to (only trees, only grass, both, certain trees, etc.).
Am I the only person who likes TAA? I tried turning it off in Starfield and the result looked so terrible even with every other antialiasing method that I turned it back on.
@@trucid2 TAA isn't inherently bad. It mostly gets a bad rep due to poor or older implementations. These days we have a lot of ways of minimising blur, through things like motion vectors, which tell TAA which direction things will move in the next frame so it can counter the ghosting and thus the blurriness. I've seen a fair few games that look way worse without TAA, even ones like RDR2, which is a ghosting fest: turn off TAA in RDR2 and all the shadows turn into noise.
@@trucid2 I think it depends on the game, and on personal experience. For me and in those games I play, anything which causes the image to be blurry or introduces input lag, makes me nauseous, so I turn all of those off. Specifically TAA seems to cause both an input lag and also makes the image seem as if your eyes are unfocused, which is why I dislike it.
This, coupled with LODs and many other "optimization" techniques... I strongly believe blurriness could give a more authentic experience in certain targeted types of games.
Low resolutions were not an issue even 20 years ago. Modern games simply have a shit ART STYLE, and the devs take tons of shortcuts to lazily optimize their bloated titles.
There's also the part where textures are being designed for higher-resolution monitors, with 1080p being superseded by 1440p. Downsampling makes things look too smudged, and the TAA spam isn't helping, since it's built for monitors above 1080p.
I just like it without any anti-aliasing. I don't see crystal clear in real life; in fact, I can't even make out objects from each other past 50 meters. Anti-aliasing feels just like that. At least without anti-aliasing I feel like I can see better, even if it looks more gamey.
@@theRPGmaster I like it because it gets rid of pixelation at lower resolutions. It does look horrible in motion; I guess I just got used to it and don't mind it because it's the default option in every game.
@@MrEditorsSideKick I think you are lacking the proper terminology to describe what you are witnessing. But essentially, TAA is a poor man's blurry-AF way of faking what SSAA and MSAA already did years ago. These days it's become a "necessary evil", because for some unholy reason devs started using noisy, dithered transparency instead of soft and smooth translucency in places like character hair and even shading... resulting in these outright Sega Saturn-tier pixelated dither patterns EVERYWHERE.
@@theRPGmaster only at low framerate/resolution. I thought TAA looked bad before, but now I have 1440p 144Hz and TAA always looks better than no AA. There is no blur when moving the camera, or ever.
Before I knew how these anti-aliasing techniques worked, I usually just used FXAA or TAA. FXAA is extremely performant and has basically no drawbacks, but it only reduces the visibility of sharp edges by about 50%. TAA is less performant but still efficient, because it makes pixelated edges disappear completely. The drawbacks, though, are blurriness, artifacts and a generally unnatural look while moving. MSAA is the only technique I didn't know anything about, and I always avoided it because it looked worse than FXAA and was less performant than TAA in every case where I could use it (there were many).
@@SinaelDOveromand MSAA costs too much FPS, what are you talking about? MSAA on grass, trees, buildings... is a fucking FPS killer. Playing a game at 1440p or 4k, TXAA is the best.
@@SinaelDOverom MSAA used to be the go-to default in the 00s, but these days it's borderline unusable for three reasons: 1. By default, deferred-rendering games do not support it, and implementing it is kind of a pain in the ass. It also becomes even more taxing. 2. Resolutions and polycounts have gone up exponentially since the early 2000s, meaning MSAA has more work to do, often resulting in DOUBLED rendering times. 3. There are tons of non-polygonal jaggies and other artifacts that MSAA cannot touch. The worst contenders are pixel shader shimmering and the butt-fugly DITHERED transparency effects that devs started using post-2014 to speed up rendering things like hair, glass and even water. Even FXAA cannot fully handle that crap, leaving TAA (and brute-force downsampling) the only options for glitch-free visuals. There are ways to improve image quality if TAA is used, but waaay too many ignorant devs just stamp the stock effect on their games and call it a day. The Like A Dragon games, for example, do a pretty okay job with TAA, and even apply a sharpening filter on it to combat the blurriness.
@@SinaelDOverom Calm the hell down, I just told you what my experience with MSAA was. If you play different games, that's your thing, not a general rule. Or are you offended by that?
It's nice for once to have an informational video that isn't filled with 20 minutes of unnecessary cinematic shots, dramatic beats and random scenes of the youtuber contemplating their idea. Thanks for the intuitive video!
The reason TAA is used is as a middle ground between performance & visual quality. MSAA is far more computationally taxing, and this shows in modern games that offer it. Three examples: go to Forza 5, GTA V or War Thunder and turn MSAA up from 2x to 8x; your FPS should drop significantly (especially in War Thunder & GTA V) for a MARGINAL improvement in edges. With MSAA turned on you will probably still notice flickering, especially at lower MSAA settings, which is not present in some other AA options. It's also kind of tied to more important settings like LODs. MSAA is only good in certain use cases*
One reason is that most games use FXAA, which can only blend pixels in two directions and doesn't attempt to reconstruct sub-pixel data. MLAA does reconstruct that data and can blend in any direction.
yeah the 2010-2012 period has a LOT of games I really enjoy. and a few years after that all the games just seem to get so fuzzy. it's something that's been bugging me for a while, glad to have an answer. great vid !
2:10 Actually, about that: the traditional MSAA technique is FSAA, or Full-Scene Anti-Aliasing. It only works with forward renderers, but it was hardware/driver-based and affected everything rendered. MSAA is fully available in deferred renderers, but unfortunately it requires not only special MSAA-compatible texture formats, but also manually writing specialised shaders that take the requisite number of samples in every single part of the renderer that matters.
I wish AA only applied to certain objects, like power lines, chain-link fences, things like that. A lot of games also didn't use to have depth of field, which is always way over-exaggerated. It's partially a trend, partially tech. I used to develop games (not anymore), but the dev experience comes in handy for modding games to my liking; this is one thing I often tweak if possible. Really nice explanation of AA!
I only use anti-aliasing when I take screenshots, for better looks, but in gameplay I just leave it off completely for max performance, even in non-competitive games.
So many in the comments say they hate anti-aliasing... I can not live without it; I hate it when the game world looks like it's built out of Lego bricks. I either run TXAA combined with 2x or 4x AA, or 8x MSAA / SSAA in most games, if my GPU can handle it. If it struggles to keep up the FPS, I often run no anti-aliasing and render at a higher resolution than my 55" 4k TV, which gives the same effect as SSAA but at less GPU usage.
I hate TAA because it's blurry, I hate most AA because it eats the framerate. I can't stand random frame drops and AA is usually responsible for that. I usually turn AA off, the jaggies don't bother me.
I don't know if it's just nostalgia, but I really miss early-2000s FPS games without AA, like CS 1.6. Everything was incredibly clear and sharp despite the fairly low resolution. With newer games, the higher the graphics settings you use, the blurrier and less defined everything gets.
I remember playing HL2 with a GeForce 7300 LE and it was very aliased, but then I upgraded to an 8600 GT and could finally turn on MSAA, and the game looked beautiful, as there were so many fine details in it.
I'm a student pilot, so I use flight sims all the time. My biggest gripe with X-Plane 12 and MSFS is that there's a constant but subtle blurriness on literally EVERYTHING, including the cockpit screens, which makes it hard to fly, because I now need to zoom in on each individual display rather than just setting my camera at the right POV where I can see everything I need to, like I could in XP11.
Great video explaining what I have recently been thinking about but haven't been able to properly understand. I play RDR2 in 4K, and without any AA the shimmering is absurd. The AA methods that fix the shimmering either come at a huge performance penalty (MSAA/2x) or end up blurry for distant foliage and during movement (TAA/medium). FXAA is also available but doesn't make much difference in my experience. I've also been playing around with different driver-side AA methods in AMD Adrenalin, but so far I have not had all that much success. At the moment it seems like I am stuck with TAA/medium and the blurriness and lack of distant detail that comes with it, because the shimmering drives me insane and MSAA drops my FPS too much.
if you made a video covering TAA, SMAA, FXAA, other weird AA methods seen in random games, that would be super cool and probably get a lot of views. youtube recommendation brought me here
I played a demo recently that only had TAA and it was bundled in some generic "Quality" setting so to turn it off you had to turn off a bunch of other things. I've never uninstalled and unwishlisted a game so fast.
Very informative video about something that I'll turn off as soon as I reach the options menu. Now I have more understanding of the reasoning. I like it.
This is why games need more graphics choices in their menus. Don't just optimize the game for the standard PC; add options so you can optimize it for the customer's PC as well. Ark was pretty good with that, down to how much vegetation even spawned, the type of anti-aliasing, and whether there was any anti-aliasing at all. It cannot be that hard to add graphics options, like, seriously.
MSAA is nigh impossible to do with deferred shading because you just render a single fullscreen triangle every pass. So TAA is the best option available if you want complex PBR and a few hundred lights on your screen. It also depends on the implementation: a good TAA algorithm sticks to subpixel jitter and uses a velocity map to avoid excessive blurriness. A velocity map is basically a UV displacement that tells the renderer how much each object moved relative to the previous frame, so you can sample stuff from the exact position it was in one frame ago, reducing blurriness. And yeah, sadly AtoC is out of the question.
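A tiny sketch of the sub-pixel jitter mentioned above (my own toy code; `halton` and `taa_jitter` are illustrative names, not a real engine API): TAA implementations commonly offset the camera projection each frame by a low-discrepancy sequence such as Halton(2,3), so that the accumulated frames end up sampling different sub-pixel positions instead of the same one over and over.

```python
def halton(index, base):
    """Low-discrepancy Halton sequence value in [0, 1)."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def taa_jitter(frame):
    """Sub-pixel camera offset for this frame, centered around zero."""
    return (halton(frame + 1, 2) - 0.5, halton(frame + 1, 3) - 0.5)

# The first few offsets spread evenly across the pixel instead of clustering:
for frame in range(4):
    print(taa_jitter(frame))
```

Because each offset stays within half a pixel of center, a correct resolve pass converges toward a supersampled image on static scenes; it's the moving parts that need the velocity map.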
Great explanation. I didn't know about the transparency portion. It's unfortunate that for a lot of games MSAA isn't an option anymore because of "deferred rendering". When TAA is the only option, I usually end up using some kind of upscaling option, because that sometimes makes the image less blurry. But it differs from game to game; TAA can be even more blurry depending on what the developer configured. In Unreal Engine games you can at least change these parameters and make the overall experience less blurry, but that requires fiddling, and not all settings work for everyone. Which brings me to: TAA is a pain in the ass to make a universal setup for; every resolution, every game, its art style, etc. Combine that with the fact that TAA is overall worse and you get a nightmare. TAA is also a lot less blurry if there are more pixels to work with, so higher-resolution monitors benefit more from TAA than a standard 1920x1080 monitor would: it looks less blurry and less aliased.
Thank you for not using TAA. Ever since it was the only option in Skyrim Special Edition's first release, I have absolutely hated it. I would literally rather play with no AA than TAA.
That's how I got "Brothers: A Tale of Two Sons Remake" refunded. I was mad to see that they messed so much with the original artistic direction through a poorly handled technology.
I recently noticed this specifically with Helldivers 2: the game looks like a blurry mess at long distance with anti-aliasing turned on. The game looks stunning if you turn off anti-aliasing and enable 1440p/4k supersampling.
honestly at 3:00 I prefer OFF, I have always preferred a jagged block over a blurry blob, especially when compared to the tree in my back yard. The one with everything off looks truer to life in a side-by-side comparison of similar portions. Just to add some quantification, I have 30/20 vision, that's better than normal.
Thank god you're not forcing TAA like the current modern game trends. Always hate that as soon as the camera moves, the image becomes a blurry, smeared mess or has ridiculously jarring ghosting artifacts.
Certainly! I've noticed a lot of the gaming community complaining about this topic and I would like to take their general feedback and ensure players' have the option to use what their preference is! I also don't really like the effect of TAA and would rather have no AA over TAA.
@RiftDivision I'm in the same boat, in that I prefer an aliased image vs. one with TAA. It's unfortunate that both DLSS and FSR are based on TAA and share many of the same drawbacks.
i don't notice the ghosting much because i use a 60 hz monitor, but i do definitely notice the blurriness
@@UsernameAwesomeSauce So thats why I couldn't stand DLSS
Come on bro, u would not rather use no AA over TAA. Blurring is better than scratching. Also, allow TAA, but yeah, don't force it. I'd use it if the game were unoptimized and I couldn't run MSAA. MSAA is never worth it over a render scaler + TAA imo.
No time wasted, not even a minute. Keep creating great videos like this!
The chad MSAA mogging blurry Temporal-cels.
Stop talking
Yareli mating press
Your Chad MSAA fails miserably in every modern game.
RUclips deleted my “yareli mate press” comment , literally 1984
@@DragonOfTheMortalKombat how does it fail if it's not even implemented most of the time?
The new Satisfactory version uses TAA. I'm not sure who thought that adding _temporal_ antialiasing to a game with _fast-moving_ conveyor belts was a good idea, but it basically causes those moving objects to dissolve into a featureless blur.
Haven't played update 8 at all, but wonder if it's not a consequence of updating to Unreal Engine 5
but it... always used TAA??? It now uses TSR by default with like 80% scaling, so yeah, I get it. I, for example, play it with FXAA and it's fine.
@@Lyazhka I am pretty sure I was using FXAA before the update to Unreal Engine 5, and now FXAA just doesn't seem to be an option for me but maybe I need to disable DLSS or something to get that option back. It's kinda confusing.
@@TerrisLeonis well yeah, DLSS acts as anti-aliasing, so they lock the AA option: you already have AA enabled by having DLSS. You can also use TSR instead of TAA, which can render at native resolution, giving better results than TAA (if given the option to do so).
That seems unsatisfactory.
Others would make this a 20-minute video. It's great that you keep it short and straight to the point.
Came here to say exactly this. No 1-minute intro to the subject, no bs about their childhood stories, no cuts to them changing locations and sitting somewhere unconventional or some random stuff, no filler editing, just straight to the point.
It doesn't really describe TAA very well. The cost of shorter videos is losing information
sometimes I prefer a long, in-depth video. I'm not a brain-rotted zoomer that can't watch a video that's more than 5 mins long.
What is wrong with this? This was not a satisfying video. I enjoy sitting down and enjoying a longer video.
Tik-tok attention span
I watched half this video at 480p by accident and was like "Damn he's really leaning into this blurriness thing, this is almost unwatchable" lmao
same and im on full speed internet... never goes under 60mbps during the day
@@ex4mple69THANK YOU FOR SHARING
@@ex4mple69 THANK YOU FOR SHARING THIS MASTERPIECE
@@ex4mple69 THANK YOU SO MUCH I'M PRAYING TO GOD RIGHT NOW AFTER U SHARED THIS
I _had_ to watch in 480p because there's no non-60fps 720p option and my internet is ass right now, lol
Good to see people talk about this, I've also decided to use MSAA purely. The loss in visual clarity from TAA is unacceptable in my opinion. Developers should not be using TAA or upscaling features as excuses not to optimize their games. That wasn't the case with older games, and we shouldn't be doing it today.
I mean, often enough there's just *nothing* to optimize
It's your hardware that's a bottleneck
@@blinded6502 hard disagree on this: modern games are often rushed, and the first thing that gets skipped when pushing for a release date is optimization...
why dont you just go with fxaa or smaa?
@@DiamondFireball Maybe SMAA combined with modest multisampling. But FXAA and SMAA are not very effective.
@@mow_cat Using TAA instead of MSAA *is* an optimization (of performance).
Who needs MSAA when you can throw performance out the window and use the best antialiasing method: SSAA
Just be sure to render your UI without it to retain proper crispness.
SSAA or no AA is the only way I play :P
4k dsr
yes, render the game in 8K on a 2K screen
@@MaxIronsThird my god, you're right, everyone arm the 4090s. To battle! Tame the beast, we must conquer this monster no matter how many pixels it may possess, even if it is 32 million strong. FOR THE ANTIALIASING, FOR THE RETURN OF NON-BLURRY GAMES, FOR GAMERS!
Gotta love Arma :D Btw, I've noticed that many modern games tend to look worse even with TAA off, almost like they were made to look good with TAA but look weird when it's turned off. Modern Warfare 2019, for example, had some "grainy" textures that did that.
Arma is a fantastic game! I've noticed that too; I think there must be some more complex graphical effects that require TAA to be on to work correctly. Probably why a lot of games force TAA on as well!
From my understanding, many games use deferred rendering, where they will intentionally undersample effects rendered on screen to save compute power. An extreme example is the hair completely breaking in Cyberpunk 2077 when TAA is off. They rely on TAA stacking the undersampled pixels into one complete image.
I noticed this with Deadside. Trees look like dogshit without antialias.
Teardown doesn't support standard transparency, so it uses TAA to fake it!
Same for its smoke effects and such; that's why it looks grainy.
The reason a lot of games use TAA is that TAA can be used for many more things than just AA. Because of how it works, it allows other aspects of a game's rendering pipeline to be changed as well. TAA basically works by storing previous frames and combining them into the current frame. Once you understand this, what you can do is render only parts of certain effects over the course of several frames to save performance. This is pretty often done with shadows, sometimes AO, and other effects like SSR. Another really common use is fake transparency, especially with hair and grass, which is why disabling TAA makes some things look really bad. For example, rendering hair cards with an actual transparent shader is really expensive, but an alpha-cutout method doesn't look as good because the edges aren't as soft. If you run the hair card's alpha texture through random noise every frame, so that the gradations in the texture are rendered over the course of several frames and then combined together, it creates the effect of transparency, which can make the object look a lot softer at the edges. This is just one of many reasons why TAA is so prevalent: it enables effects like this without falling back on actual transparency rendering, which is much more expensive. Applying the same idea to effects like shadows saves a lot of performance too, and since games constantly have to push graphical boundaries while the hardware doesn't change much, you constantly have to look for more ways to save performance so you can squeeze in other effects. People act like TAA is the worst thing ever, but they don't realize it's a necessary evil, and saying "just optimize better" is a really ignorant response that shows a complete lack of understanding of game development.
It's not always just about optimization either; using actual transparency comes with a host of other issues, like Z-fighting due to how transparent pixels are sorted. There are some ways around this by modifying an engine or using a custom one. For example, I believe the new Alan Wake game and its custom engine use multiple separate depth buffers for different transparent objects to mitigate the Z-fighting issues transparent objects run into (particles using transparency use one depth buffer while other objects use a different one, so if you have a ton of smoke particles near glass on a building, you won't run into the Z-fighting issue where the glass sometimes renders on top of the smoke). But it's not perfect, and using multiple Z-buffers isn't exactly cheap either.
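The "noise every frame, then let TAA average it" trick described above can be sketched in a few lines (a toy illustration under my own assumptions, not real engine code): each frame, a pixel with alpha `a` is drawn fully opaque with probability `a`, and the temporal average converges to what true alpha blending would give, without ever sorting transparent pixels.

```python
import random

def dithered_coverage(alpha, rng):
    """One frame of stochastic transparency: the pixel is either fully
    drawn (1) or fully skipped (0), decided by per-frame noise."""
    return 1 if rng.random() < alpha else 0

def accumulate(alpha, frames=10000, seed=0):
    """TAA-style temporal average of many dithered frames."""
    rng = random.Random(seed)
    return sum(dithered_coverage(alpha, rng) for _ in range(frames)) / frames

# A 30%-opaque hair strand: the temporal average approaches true alpha.
print(round(accumulate(0.3), 2))  # ~0.3, what real alpha blending gives
```

This is also why turning TAA off in such games exposes the raw dither pattern: each individual frame really is just on/off noise.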
Never knew how Anti Aliasing worked on a deeper level until now, great video!
There is a truly excellent video on anti-aliasing by 2kliksphillip, I recommend you watch it.
I've known about these techniques for a while, but your video did a great job of explaining each one in detail while keeping things brief. Your editing also complements your narration well! Great job!
One of the understated benefits of TAA is that its temporal nature gets used in a lot of other effects as a denoiser - because it averages the pixel over time, we can use it to create dithered transparency (which is what you might use in place of alpha to coverage), we can use it to denoise SSR, SSS, etc. The artifacts really suck but a lot of devs use it because it makes so many other effects easier to optimize.
That's a double-edged sword.
Say you undersampled a ton of effects and also decided to leverage TAA as a denoiser: the people who turn it off or force it off will now be left with a broken image, the most egregious examples of this being RDR2 and Cyberpunk 2077. The solution would be to provide higher sampling quality.
Cyberpunk 2077 is a great example of this. TAA is horrible there; I hate it. It seems to have gotten somewhat better in recent updates, but it still sucks. You can turn it off, but then you see the problem you mentioned: noise. Everything in Cyberpunk is noisy; the asphalt and its reflections are the most notable, since they have random pixels that are just sharp points of light, literally just a white pixel, which TAA normally denoises. It's horrible, and the ghosting is just ugly.
@@spartanxander-0476 those would still be there with MSAA; MSAA doesn't help at all with in-surface aliasing, only edge aliasing.
Pls use human words
@@catastic9394 no. Learn or don't.
MSAA CAN be used for deferred rendering, but it becomes much more expensive and complicated to implement correctly.
And crucially, it DOESN'T actually provide much in the way of anti-aliasing, because edge aliasing is THE LEAST of your concerns when all those shaders cause aliasing of their own.
@@djwhitepeople Not sure how you got those implications from a minor correction for a footnote in the video, but okay.
edit: they deleted their comment, but it insisted I was implying developers were lazy for not implementing MSAA, and then called MSAA obsolete because it doesn't solve specular aliasing. Which, by the way, we've had other solutions for for a while, notably in VR titles like HL:A, where temporal solutions just aren't good enough.
Deferred rendering shouldn't be used for anything other than a tech demo.
It's worth noting that MSAA wasn't ditched simply because of the performance hit, it was ditched because it doesn't really work with modern games.
The issue is twofold. MSAA only affects the edges of geometry, meaning in-surface aliasing, such as with specular highlights, is completely untreated.
Secondly, it's a spatial technique, meaning it can only consider each frame individually; this misses what's called temporal aliasing, AKA shimmer.
Temporal aliasing is when high contrast details appear and disappear on a frame by frame basis. A common example is tree branches in the distance, a lot of the detail in the thin branches will be smaller than a single pixel. This means that as the camera moves and the tree animates the branch will oscillate from taking up enough of a single pixel to be visible to being small enough to be invisible, this creates a highly distracting shimmering effect around any areas of dense detail (turn off TAA in any modern game to see what I mean).
These two issues were not a big deal in the 90s & 00s; lighting models were too simple to contain a lot of fine specular highlights, and geometry was too large & sparse for shimmer to be prevalent, but in the 10s that all changed.
MSAA just couldn't cope with modern renderers, leading to an image that still looked plenty aliased despite costing half your performance.
So, simply put, even if MSAA were free it still would've been replaced by TAA; it's the only technique other than SSAA that can handle more complex visuals (TAA is essentially supersampling amortized over time).
Switching over to TAA and spending the time & effort to alleviate its shortcomings was really the only option; plus, the development of the technology would lead to things such as upscaling.
So we can moan about TAA till we're blue in the face; it's not going anywhere. But don't worry too much: it has gotten significantly better over the years, and personally I feel a lot of the problem boils down to monitor resolution. At 1080p the blur is obvious; at 4k it's almost invisible. Eventually everyone will be using 4k, so this problem will be solved one way or another.
For more info on the drawbacks of both MSAA & TAA, along with how they work, check out Digital Foundry's recent video on the subject; their visual examples & benchmarks demonstrate clearly why MSAA was doomed.
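The branch-shimmer effect described above is easy to demonstrate with a one-dimensional toy (my own illustration, hypothetical function names): a feature thinner than a pixel pops in and out of existence at one sample per pixel, while supersampling tracks its coverage smoothly.

```python
def sample_thin_line(line_start, width, sample_x):
    """1 if the sample point lands inside the thin feature, else 0."""
    return 1 if line_start <= sample_x < line_start + width else 0

def pixel_coverage(line_start, width, samples_per_pixel):
    """Coverage of pixel [0, 1) for a feature of sub-pixel width."""
    n = samples_per_pixel
    hits = sum(sample_thin_line(line_start, width, (i + 0.5) / n)
               for i in range(n))
    return hits / n

# A 0.3-pixel-wide "branch" drifting across the pixel, 1 sample per pixel:
one_tap = [pixel_coverage(s / 10, 0.3, 1) for s in range(10)]
print(one_tap)    # snaps between fully visible and invisible: shimmer

# The same branch with 8x supersampling: coverage changes gradually.
eight_tap = [pixel_coverage(s / 10, 0.3, 8) for s in range(10)]
print(eight_tap)
```

At one sample per pixel the branch is either there or gone from frame to frame; the supersampled coverage is a fraction that slides smoothly, which is exactly the information TAA tries to recover by averaging jittered frames over time.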
Specular anti-aliasing exists, and your tree branch example would be helped by MSAA, because it samples subpixels and adds that detail in; with TAA, the frame averaging ends up erasing tiny details if they don't cover every sample.
'at 1080p the blur is obvious at 4k its almost invisible'
You do realize that TAA is quite literally stealing image information from you, right? The fact that you need small 2K monitors (
@@TheJohn_Highway Deferred renderers went out of fashion years ago; pretty much every game nowadays is some sort of hybrid deferred/forward renderer. Unreal is deferred everything, with only a forward pass for transparencies, although UE5 has OIT so idk if that's a forward pass anymore. But even games that are pure forward+ renderers, like Doom 2016, Doom Eternal and Detroit: Become Human, still use TAA and force it on, because they want the "unified rendering" it provides (as in, denoising all their undersampled stuff) and because it means they don't have to put in the work to author art that doesn't shimmer. Source 2 is the shining example of what games COULD look like now.
@@TheJohn_Highway A 1080p image with MSAA doesn't look as good as a 4k image with TAA, I game at 4k and it's a huge upgrade even with TAA.
My point was that at 4k a proper TAA implementation looks almost as sharp as native 4k (not 1080p), and any reduction in clarity is slight enough to be worth it for proper AA coverage.
The way the algorithm works means it makes fewer errors at higher resolutions.
And you also realise deferred renderers & other modern techniques aren't used for the lols.
They're used because without them modern graphical fidelity would be impossible.
Your proposed solution would make games look even worse than they currently do, going backwards to PS3 era rendering for a bit of extra sharpness.
Making games look worse because you think they look bad is a crazy solution.
As I explained in my other comment, if TAA blur bothers you so much you can simply combine Supersampling with DLSS/FSR to get a perfect AA without any blur.
Yes, it has a performance hit of course, but no worse than the one MSAA comes with, and it delivers better results than MSAA could at its best.
This is a solution people can use right now and unlike yours actually improves the visuals, and yet none of the TAA haters actually do it because of the performance downsides, showing that even if MSAA came back 99% of gamers aren't actually willing to sacrifice their framerate for a bit more sharpness.
Oh, and TAA isn't "stealing image information"; it's not turning a high-res image into a low-res one, that's not how it works. It combines multiple frames together with a slight offset and then downsamples the result. So the real issue is adding information in the wrong places, which is why it combines well with sharpening: all the detail is there, it just needs more contrast around certain edges.
@@TheJohn_Highway It's about time you realized that, even disregarding the whole deferred-vs-forward rendering debate, MSAA is useless unless your game has a lighting model straight out of the 90s/early 00s. By its very nature it only works on geometry edges.
Thing is, Valve did use MSAA in HL: Alyx too (or was it FXAA?). The key thing is that they found through research that aliasing happens because of materials, especially normals; they determined that adding noise greatly reduces the amount of aliasing, and a slight bit of MSAA on top serves as the cherry to get rid of the remaining aliasing. Because of that, even if you disable AA in HLA, the image will still look like AA is on.
I might be wrong about exactly how they did it, since my dyslexia is kind of not letting me read stuff properly, and I can't add a link here since YT deletes my comment, but you can find it yourself: Alex_Vlachos_Advanced_VR_Rendering_GDC2015
that's because VR games are mostly forward rendered, so the cost of MSAA is lower
@@Frisbie147 MSAA doesn't perform better in forward rendering, what? It isn't compatible with deferred rendering at all, because deferred rendering fundamentally can't work with transparency; it's still a big performance hit in forward rendering. Don't just make bullshit up.
@@SkedgyEdgy MSAA can be done with deferred rendering; Control uses deferred and has MSAA, and afaik RDR2 is deferred too, and MSAA is extremely heavy in those games.
The reason they add noise is to remove colour banding, which is extremely apparent in VR and results in more shimmer, not because of anti aliasing. They apply a screen space dither algorithm that blends colour bands so the colour depth appears greater than what the screen actually provides. This can be seen in CS2 in the water shader and when you spawn in and have the glow around your character, it softens out the harsh gradient lines at the cost of looking a bit fuzzy, and with the water shader it allows them to do volumetric scattering at a lower sample count.
They resolved surface aliasing by applying a specular anti-aliasing technique where the roughness of a material is augmented based on the derivative of the object's surface normal, so the change in normal between two pixels translates into an increase in material roughness, which essentially clamps the roughness value so you don't get specular shimmer. This technique was further improved afterwards by various people; I have a video on my channel showcasing it in Unreal, and it's built into Godot and Unity.
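A rough sketch of that roughness-clamping idea (my own simplified illustration loosely in the spirit of the Vlachos GDC talk, not Valve's actual code; `specular_aa_roughness` is a hypothetical name): the screen-space variance of the surface normal is folded into the material roughness, so sub-pixel normal detail widens the highlight instead of producing single-pixel sparkles.

```python
import math

def specular_aa_roughness(roughness, normal_variance, strength=1.0):
    """Widen the specular lobe by the geometric normal variance so that
    sub-pixel normal detail cannot produce single-pixel sparkles."""
    # Treat roughness^2 as the lobe variance and add the screen-space
    # normal variance to it, clamping the result to a sane range.
    r2 = roughness * roughness + strength * normal_variance
    return min(1.0, math.sqrt(r2))

# A mirror-like surface (roughness 0.05) on bumpy, high-frequency geometry
# gets noticeably rougher, so the highlight spreads instead of shimmering:
print(specular_aa_roughness(0.05, 0.04))
# The same surface on flat geometry (zero normal variance) is untouched:
print(specular_aa_roughness(0.05, 0.0))  # ~0.05
```

The key property is that the correction is driven entirely by how fast the normal changes across the pixel, so flat surfaces keep their sharp reflections while dense detail gets pre-blurred in the material rather than flickering on screen.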
@@epoch151 thx for clearing it up, but damn, I can't read this wall of text without forcing and torturing myself into reading it
Something else to note is most games made by big studios are made in Unreal Engine. Unreal Engine has motion blur enabled by default. I have to remember to manually turn it off in every new Unreal game I start.
Wouldn't surprise me if most of those studios working in Unreal Engine leave motion blur on, which would contribute to blurry graphics.
I call it Blurry Engine 4 and 5. I don't ever remember Unreal Engine 3 games looking blurry.
I like motion blur, but you should never pair it with TAA.
A bad TAA implementation does look like a 2000's era accumulation buffer based motion blur.
Which in my experience seems to be most things made in unreal.
Every 3D game I own has motion blur, and even worse, "depth of field" turned on, as well as lens flare... all effects that only make the game look worse and blurrier, and that make my eyes hurt... The first thing I do is turn all those blurring effects off.
@@FordHoard UE3 has even worse motion blur with lower samples
Answer: Consoles
More specifically, The Great Consolization of 2008, which is the event marking the end of AAA PC games, and the first year in which all AAA games ** were conceived, designed, and coded for console hardware, and the expectations and sensibilities of the console market.
** With the possible exceptions of Left 4 Dead (2008), Left 4 Dead 2 (2009), and Portal 2 (2011).
Especially motion blur, since (when used correctly) makes motion in lower framerate console games look a bit smoother.
@@ConcavePgons That's true, but for my part I don't think it actually improves anything. At really low framerates it might make it more difficult to notice the abrupt change between frames, but it also makes it more difficult to notice everything, lol.
My favorite part about AA and all other "video options" is the option to turn them off.
Great video, very concise and straight to the point.
i hate fake pixels. i hate fake frames. i hate TAA. i hate DLSS. i hate nvidia for gaslighting the entire gaming industry into thinking it's ok for games to look like they were dipped in vaseline. give me my jagged edges back
Yeah I hate AA too. I also hate DLSS and V-Sync, I even hate upscaling. I can't stand it when a game studio decides to do 4K upscaled over native 4K. Console games are starting to look worse because of all these "features".
I couldn't agree more! I hate these lazy BS tricks.
I don't know who told them we wanted those features @@thedarkdragon89
We are two my men!!!
@@nitroxylictv maybe devs chose upscaled res because the hardware can't handle the native res? Just saying
Wow an informative video that doesn't go back to the very first example in history of the subject at hand, and then spend 1h gradually moving toward the answer.
Devs seem to forget that hardware DOES improve over time and that once unobtainable MSAA 8/16x may be possible 4 years down the line, giving someone a whole new reason to play the game over again!
I for one LOVE seeing the MSAA option in any modern video game, because I know I'm going to be getting the purest image possible with no smudging or blurriness, so good on ya for choosing that AA option for your game. It may be demanding now, but just think: in 10 years, even an APU will likely be able to run all of these modern games with maxed settings, including MSAA cranked!
edit: Deferred renderers are the reason why we have TSAA and not MSAA in the first place; while faster, that doesn't mean they should completely neglect the old API that still does 90% of what the newer API does visually, and in most cases the old API performs better (at least on my 1080 Ti / era-appropriate API).
Ultimately, it does come down to API, and most modern devs force DX12; but hell, any DX game will work on a DX-capable card, so why not offer the older revisions too, especially DX11, which is still MSAA capable.
With exception that resolution tends to go up and with it the cost of MSAA increases drastically. I'm playing in 4k and dropping the MSAA is the first thing I do to improve performance.
To be honest, I think the hatred from PC gamers towards temporal AA solutions stems from TV and console manufacturers moving to 4K, while most PC gamers, thanks to the shitty GPU market, have been stuck at FullHD for more than 10 years. TAA, motion blur aside, looks great on a high-PPI screen while costing next to nothing in performance, but its quality suffers more the lower your resolution is. With MSAA the situation is different: it scales badly with resolution while only having an effect on jagged edges (textures like distant ropes or wires still flicker).
You seem to forget the pinnacle of graphics from 10 years ago is the low settings of today. If you want graphical fidelity to remain high you'd simply not have enough computational resources to also do MSAA...
Hardware improves, so does the amount of shading a game applies for a higher quality output.
Problem is more that a deferred renderer requires the developer to manually write MSAA variants of nearly every single shader, and use special MSAA compatible texture formats on top.
It's a lot of extra work, particularly compared to just storing a few buffers and slapping a post-processing shader on the screen.
@@krazownik3139 and to that I'd say: 1080p at 16x MSAA has been pretty hard to pixel-peep for a long time, and that's coming from someone who sits 2ft away from a 24" 1080p screen - so really, anything past 1080p with 16x MSAA has been overkill for quite a while. I do understand that everyone's base monitor resolution will eventually go up, but that just means you can lower the MSAA and get the same performance/visuals back as you would at 1080p 16x MSAA. Texture resolution is a different story for sure, but at least with MSAA, as I said, you're getting the cleanest textures and not a temporal, blurry mess like in modern APIs. 1080p textures, while not the cleanest, are definitely on the most bearable side of 'low' (with non-temporal AA), and I've dealt with them for many years - but the real holy grail is SSAA.
It's not going to be feasible because the rate of graphics advancements and all these shaders will require exponentially more horsepower to do MSAA on them. It's the same fundamental issue with driving native resolution towards 4K and beyond, which is why these new temporal upscalers like XeSS and DLSS are developed. 4K screens are becoming the norm yet the law of diminishing return means it's a terrible waste to render things at native 4K let alone MSAAing the 4K render. You only need enough information to reconstruct a perceivably good 4K image.
Whenever I start a new game I go straight to the settings and turn off anything related to anti-aliasing and depth of field. Feels like I'm going blind just staring at a game with those effects. If it's a particularly evil game it will also have chromatic aberration and that fish-eye lens effect; I turn those off too. Making effects for games for the purpose of realism is cool and great, but in some cases it just doubles an effect our eyes already produce, and it hurts to look at.
Antialiasing is usually among the first settings I tweak. And by tweak, I mean I turn it off. I would much rather be able to tell if there's a camouflaged enemy hiding in bushes in the distance than have the image look less jagged but get shot in the face.
Sadly many games have it force enabled.
wow I was feeling alone in this opinion. Even at 3:00 I prefer the tree with MSAA off.
Rift, please keep the concise format. More of this is needed on this platform!
my favorite AA method is just playing at a high resolution, it makes everything sharp and the jagged edges are so small that they're barely noticeable. I'd much rather lose framerate to higher resolution than to simulating myopia. developers can put as many AA methods as they want into their game, I'm going to turn them all off
Finally, someone who knows how things should work! Also, common misconception #1: you actually can use MSAA with deferred. Common misconception #2: that MSAA is resource-hungry; no, with render-target compression it isn't. And #3, the sometimes-still-noticeable edges: a custom MSAA resolver solves that issue. So the only remaining downside of MSAA is shader-triggered aliasing, but that's an uncommon issue, usually solved by either altering your shaders or, if nothing helps, relying on SSAA for those particular parts of the image.
Those quirks are very difficult to counteract properly to apply MSAA though. If time crunch is the issue, the cheaper brute-force method like high res SMAA is far more practical.
@@nathanlamaire Which ones? ;) Resource-(un)hungryness? It is there just out of the box. Custom resolver? You can just build one on top of your colorgrading pipeline (and since DX12 it is essential anyway) - just average the linear-light final outputs while colorgrading every sample, no big deal as they share the same color mostly, so just cause a same-texel read if using a LUT somewhere in it. Tried, looks cute ;) Deferred? Go either forward(+) which is decent btw, or do an in-shader texel type detection (e.g. same- or differently-sampled). SSAA? Just add a sample semantic into the PS in question to trigger the SSAA there. So no, no reason to watch the blurry SMAA where the neat and sharp MSAA should be (everywhere, that is)!
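The "average the linear-light final outputs" point above is worth illustrating, because a naive resolve that averages encoded sRGB values darkens antialiased edges. Here is a small Python sketch of that custom-resolve idea, with plain float math standing in for the shader (the function names are mine, not from any engine):

```python
def srgb_to_linear(c):
    """Standard sRGB decode (IEC 61966-2-1 piecewise curve)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Standard sRGB encode, inverse of the above."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def resolve_msaa(samples_srgb):
    """Custom MSAA resolve: average the samples in linear light,
    then re-encode, instead of averaging the encoded values."""
    avg = sum(srgb_to_linear(s) for s in samples_srgb) / len(samples_srgb)
    return linear_to_srgb(avg)

# A black/white geometry edge with 2 samples: the naive sRGB average
# gives mid-grey 0.5, the linear-light resolve gives ~0.735, which is
# the value that actually emits half the light.
naive = sum([0.0, 1.0]) / 2
correct = resolve_msaa([0.0, 1.0])
```

The same averaging would slot naturally at the end of a color-grading pass, as the comment above suggests, since that's where you already have the final linear values in hand.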
Very clear video! I subscribed. Can't wait to see where this project goes
I absolutely hate TAA. It's subtle enough that you don't really notice it at first glance, but it always feels like something is off, and it drives me crazy. It's like playing with bad eyesight.
I prefer AA turned off entirely. It's sharper without it, and I don't mind the pixelation, which is barely visible on a modern hi-res screen anyway. Plus you get a performance boost that way.
Same, completely the same. I don't hate AA, but I hate TAA a lot.
That explains it! I've noticed moving text has been incredibly hard to read in games recently, it just becomes a smudge
Problem is MSAA doesn't work with deferred rendering which seems to have become quite popular in the industry
It does work, it's just a little more complex to implement (in OpenGL), and might be a lot more expensive for lots of lights (because lights have to be evaluated per-sample then, too).
Source: my game engine "Rem's Engine" does that.
They're actually different.
Forward rendering (i.e. the traditional method) can provide FSAA (this is Full Scene Anti-Aliasing) directly in the hardware/driver level, which requires no developer effort.
MSAA, which is done for deferred renderers, requires you to both use specialised texture formats AND write the shaders yourself to do the proper amount of samples manually.
And you need to write those shader variants for every single shader that matters (yes, all of them!), with a variant for each sample level (i.e. Off 2x 4x 8x 16x is 5 variants) to boot.
It's not impossible to do, but it requires significant extra developer time and testing/validation on top.
Off also isn't as simple as just doing a 1x sample rate.
@@billy65bob "specialised texture formats"
Yeah, you use glTexImage2DMultisample() instead of glTexImage2D() in OpenGL,
and fetchPixel(uv*size) instead of texture(uv) in GLSL, what a difference XD.
No, they don't need different variants for each MSAA level, that's what variables are for:
uniform int numSamples;
for (int i = 0; i < numSamples; i++) color += texelFetch(tex, coord, i);
@@AntonioNoack It's a big enough difference to be a pain in the ass :)
I'll admit I'm probably overthinking it in terms of optimisation.
I kind of subscribe to the "generate the sampler statements in a preprocessor" school of thought.
@@billy65bob *"They're actually different... MSAA, which is done for deferred renderers..."*
Perhaps I am misunderstanding, but it _appears_ you are saying that MSAA is a new technique for Deferred Rendering, which is false of course. In Forward Renderers MSAA is flawless: it hits every edge and has only a modest performance impact.
*"...requires you to both use specialised texture formats AND write the shaders yourself... And you need to write those shader variants for every single shader that matters (yes, all of them!)..."*
Further, I don't understand what you're talking about regarding MSAA requiring specialized texture formats and custom shaders. The entire point of MSAA is that it treats only geometry edges --- it has no effect on surfaces/textures/shaders.
Antialiasing never really needed to be considered in the days of CRT monitors. You'd have to play at like 640x480 or less to notice any pixel edges in 3d games because of how CRTs technically don't have pixels. Something like Half Life 2 on a CRT still looks leagues prettier than a modern game abusing TAA if you ask me.
One day though, OLED tech will be refined enough and 4K resolutions accessible enough that we'd be back to not needing antialiasing tech of any kind and reaping the benefits of superior black levels/color depth and basically no pixel response latency.
You know what's the biggest salt in the wound, so to speak? The fact that 90% of games with forced TAA now offer "image sharpening", which creates an even worse image. So you fix one problem by adding a thing that causes three problems, then add a fix for one of those that piles two more problems on top.
If you don't touch the game settings, things always go the way the devs are biased. First thing I do in any game:
1. Turn off (or to lowest) Motion Blur
2. Adjust Anti-Aliasing to be actually good (if the TAA is good, it's fine)
3. Turn off Depth of Field
4. Turn off stupid effects like Vignette and Film Grain
Done, better graphics and LESS GPU usage.
Thanks even with me building pc's and knowing alot about stuff like this I still didn't know about how anti-aliasing works.
*a lot, "alot" isn't a word
MSAA is also useful for volumetric lighting and alpha effects, since you can use the multiple samples per pixel to alpha-blend the volumetric diffusion without running another pixel shader for it. Source 2 uses this, and that's why it doesn't support TAA: without MSAA some of its effects wouldn't even work.
Tbf the effects in HLA are amazing.
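For anyone unfamiliar with the multi-sample transparency trick being discussed: alpha-to-coverage converts a fragment's alpha into a bitmask of covered MSAA samples, so ordinary depth-tested geometry can fake partial transparency and the resolve averages it back out. A toy Python sketch (real hardware dithers which bits get set per pixel; setting the lowest bits here is a deliberate simplification):

```python
def alpha_to_coverage(alpha, num_samples=4):
    """Map alpha in [0, 1] to a coverage bitmask over the MSAA samples."""
    covered = round(alpha * num_samples)
    return (1 << covered) - 1  # set the lowest `covered` bits

def resolve(mask, num_samples=4):
    """The MSAA resolve averages the samples, so the covered
    fraction comes back out as the effective alpha."""
    return bin(mask).count("1") / num_samples

half = alpha_to_coverage(0.5)    # two of four samples covered
opaque = alpha_to_coverage(1.0)  # all four samples covered
```

Because coverage is quantized to the sample count, a 4x target only gives five distinct alpha levels per pixel, which is why foliage rendered this way benefits from higher sample counts.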
I always go without anti-aliasing. I'd rather have edges.
I usually play without anti-aliasing, or with very low anti-aliasing. I prefer slightly sharp edges to the blurriness.
Cool video! What techniques did you use to optimize the vegetation in your open world?
Thanks! So far I have used a custom imposter system for the trees and a custom GPU instancing system for the grass and other details. I will be making a video on this topic in the near future so stick around for an in-depth break down!
I like your delivery, intuitive and straight to the point :)
TAA looks worse when the FPS or resolution is lower, which for many people is part of the problem.
Your game and your videos are really high quality. I was here before the channel got big, keep it up!
SSAA is the best, but it eats too damn much GPU processing resources.
I am currently re-playing Half-Life 2 and I use VSR from 7680x4320 to native 4K *plus* 8x MSAA on top (a 6900XT allows that with such a dated game while maintaining 90+ FPS). The resulting quality is outstanding, although I'd rather have an even higher SSAA quality, along the lines of 3x (11520x6480) or even 4x (16K, for really old games modded to support unlimited resolutions, or modern indies), but the maximum VSR (or DSR for that matter) allows is 2x.
MSAA is simply inferior because too damn many games just don't support alpha-to-coverage technique, and it looks weird, to say the least, to have perfectly antialiased geometry right by the fully aliased grass, leaves, bushes or fence grates. To the point where I sometimes disable MSAA to have a more consistent, even if aliased, image.
Played the HL2 VR mod?
The blurrier the better.
Foggy? Increases immersion.
Motion Blur? Increases immersion.
Idc if you need a 3090 Ti to run it. Everything being blurry isn't something to fix, it's an experience-enhancing feature. Actually... people should pay for the blur. Thinking of an obligatory monthly subscription.
Also making the game very dark will increase tension and immersion, and reduced needed effort for quality. "Set screen brightness" selector? Fuck it, monthly subscription if they really want it.
many games include one of the best antialiasing methods, typically labeled something like "off" or "none"
Correct. Who needs it with 4k resolution?
I'd rather have no AA than TAA or some flickery upscaler. I've modded games before to do just that, take out the TAA. I don't care if it breaks screen-space reflections, since those are already so broken that, you guessed it, I'd rather have no reflections.
I like jagged lines. Give me more jagged lines and fewer blurry lines.
👍 give me aliasing or give me death
My opinion: anti-aliasing is not needed at resolutions higher than 1080p. One of the first things I do in any game is turn AA off. I play at 1440p and everything looks great.
Stalker my beloved
this is one of the most clever ways I've seen someone secretly advertise their game
Every time I boot a new game for the first time, AA is the first thing I disable. I cannot stand it at all. You may consider it to be a minor thing but it infuriates me to no end.
I really really do not understand why they do this.
I cant stand taa.
Thank God someone finally explained this clearly, with examples and in like 3 min
TAA and FXAA. If you disable them, the game looks soooo bad.
If game uses deferred rendering, I use only FXAA.
Direct rendering games I use MSAA.
FXAA is peak and no one can tell me otherwise.
i hate TAA and how its forced
ArmA 2 & 3 are among the few games to use AtoC, and they even have options for which type of foliage to apply it to (only trees, only grass, both, certain trees, etc.)
Ah yes, TAA, Motion Blur, Bloom and Vsync - the four horsemen of nausea-inducing apocalypse.
Am I the only person who likes TAA? I tried turning it off in Starfield and the result looked so terrible even with every other antialiasing method that I turned it back on.
@@trucid2 TAA isn't generally bad. It mostly gets a bad rep due to poor or older implementations. These days we have a lot of ways of minimising blur, like motion vectors, which tell TAA in which direction things will move in the next frame so it can counter the ghosting and thus the blurriness. I've seen a fair few games which look way worse without TAA, even ones like RDR2, which is a ghosting fest. Turn off TAA in RDR2 and all the shadows become noise.
@@trucid2 I think it depends on the game, and on personal experience. For me and in those games I play, anything which causes the image to be blurry or introduces input lag, makes me nauseous, so I turn all of those off.
Specifically TAA seems to cause both an input lag and also makes the image seem as if your eyes are unfocused, which is why I dislike it.
motion blur + vsync in racing games isn't that bad (when done right)
what about chromatic aberration and vignette? are those lesser daemons?
This, coupled with LODs and many other "optimization" techniques... I strongly believe blurriness could give a more authentic experience for certain types of games.
Because we are rendering at lower resolutions than ever before. Freaking 4090 cannot run modern games at native 4k.
Low resolutions were not an issue even 20 years ago.
The modern games simply have shit ART STYLE, and the devs take tons of shortcuts to lazily optimize their bloated game titles.
@@GugureSux well, they were not, because we had analog displays and the rest of the graphics was blocky too.
What game can’t you run on a 4090 at native res?
The first thing I do in every game is turn off AA. My eyesight is bad enough to replace those algorithms anyways.
There's also the part where textures are being designed for higher resolution monitors, 1080p is being superseded by 1440p.
Downsampling is making things look too smudged and the TAA spam isn't helping since its built for monitors above 1080p.
yep my screens are 4k and i dont have this issue. taa looks great on them
I just like it without any anti-aliasing. I don't see crystal clear in real life; in fact, I can't even make out objects from each other past 50 meters. Anti-aliasing feels just like that. At least without anti-aliasing I feel like I can see better, even if it looks more gamey.
I hate antialiasing. I hate it when a game doesn't let you turn it off.
THANK YOU for this!! I thought I was going insane or that TAA had been broken for me on my games...
Am I the only one who likes the look of TAA? The worst thing about it is that it is forced in a lot of AAA games.
Apparently some like it, but I really don't get why. Details in motion get smeared out of existence.
@@theRPGmaster I like it because it gets rid of pixelation on lower resolutions. It does look horrible in motion, i guess i just got used to it and don't mind it cus it's in every game as the default option.
@@MrEditorsSideKick I think you are lacking proper terminology to describe what you are witnessing.
But essentially, TAA is a poorman's blurry-AF way of faking what SSAA and MSAA did already years ago.
These days it's become a "necessary evil", because for some unholy reason devs started using noisy, dithered transparency instead of soft, smooth translucency in places like character hair and even shading... resulting in these outright Sega Saturn-tier pixelated dither patterns EVERYWHERE.
@@theRPGmaster only at low framerate/resolution. I thought TAA looked bad before, but now I've got a 1440p 144Hz monitor and TAA always looks better than no AA. There's no blur when moving the camera, or ever.
When I played the 2019 Modern Warfare it took quite a while to get the graphics to not be blurry. I had to put the screen resolution as high as possible.
try reshade sharp filter or just use nvidia's sharp filter
you can just disable taa in mw19, select either no aa or smaa1x
TAA also creates a ghosting effect sometimes with moving things, every time I see it, it makes me want to explode
It’s also important to mention that some games just lower the render resolution to optimize their games quickly
Before I knew how those anti-aliasing techniques worked, I usually just used FXAA or TAA. FXAA is extremely performant and has basically no drawbacks, but only reduces the visibility of sharp edges by about 50%. TAA is less performant but still efficient, because it makes pixelated edges disappear completely. The drawbacks, though, are blurriness, artifacts, and a generally unnatural look while moving. MSAA is the only technique I didn't know anything about, and I always avoided it because it looked worse than FXAA and was less performant than TAA in every case where I could use it (there were many).
MSAA is much better looking than TAA or FXAA, WTF are you talking about? MSAA uses higher sampling at the visible edges.
@@SinaelDOverom and MSAA costs too much FPS, what the fuck are you talking about? MSAA on grass, trees, buildings... is a fucking FPS killer. Playing a game at 1440p or 4K, TXAA is the best.
@@SinaelDOverom MSAA used to be the go-to default in the 00s, but these days it's borderline unusable because of three reasons:
1. By default, Deferred Rendering games do not support it, and implementing it is kinda pain in the ass. It also becomes even more taxing.
2. Resolutions and polycounts have gone up exponentially since the early 2000s, meaning MSAA has more work to do, often resulting in DOUBLED rendering times.
3. There are tons of non-polygonal jaggies and other artifacts that MSAA cannot touch. The worst contenders are pixel-shader shimmering and the butt-fugly DITHERED transparency effects that devs started using post-2014 to speed up rendering things like hair, glass and even water. Even FXAA cannot fully handle that crap, leaving TAA (and brute-force downsampling) as the only options for glitch-free visuals. There are ways to improve image quality if TAA is used, but waaay too many ignorant devs just stamp the stock effect on their games and call it a day. The Like A Dragon games, for example, do a pretty okay job with TAA, and even apply a sharpening filter to combat the blurriness.
@@SinaelDOverom Calm the hell down, I just told you what my experience with MSAA was. If you play different games, that's your thing, not a general rule. Or are you offended by that?
It's nice for once to have an informational video that isn't filled with 20 minutes of unnecessary cinematic shots, dramatic beats and random scenes of the youtuber contemplating their idea. Thanks for the intuitive video!
The reason TAA is used is as a middle ground between performance and visual quality. SMAA is far more computationally taxing, and this shows in modern games with the feature. Three examples: go to Forza V, GTA V or War Thunder and turn SMAA up from 2x to 8x; your FPS should drop significantly (especially in War Thunder and GTA V) for a MARGINAL improvement in edges. With SMAA turned on you will probably also notice flickering, especially on lower SMAA settings, which is not present in some other AA options.
Its kind of tied to more important settings like LODs.
SMAA is only good in certain use cases*
One reason is that most games use FXAA, which can only blend pixels in two directions and doesn't try to reconstruct sub-pixel data.
MLAA does reconstruct that data and can blend in any direction.
yeah the 2010-2012 period has a LOT of games I really enjoy. and a few years after that all the games just seem to get so fuzzy. it's something that's been bugging me for a while, glad to have an answer. great vid !
2:10 Actually about that.
The traditional MSAA technique is FSAA, or Full Scene Anti-Aliasing - It only works with forward renderers, but it was hardware/driver based and affected everything rendered.
MSAA is fully available in deferred renderers, but unfortunately it requires you to use not only special MSAA compatible texture formats, but also to manually create specialised shaders that do the requisite number of samples in every single part of the renderer that matters.
I wish AA only applied to certain objects, like powerlines/chainlink fences- things like that. A lot of games didn't used to have Depth of Field either, which is always way over-exaggerated. It's partially a trend, partially tech. I used to develop games (not anymore), but the dev experience comes in handy for modding games to my liking- this is one thing that I often tweak if possible.
Really nice explanation of AA!
To me, the no-AA example at the beginning looks way better and easier to look at than the AA-sampled one.
I only use anti aliasing when I take screenshots for better looks, but in gameplay i just leave it off completely for max performance, even in non competitive games.
I still miss the "dry" crisp and sometimes shiny look of og modern warfare 2 and mirrors edge.
Mirror's Edge is still one of the best looking games ever. Even looks better than the sequel.
So many in the comments say they hate anti-aliasing... I can not live without it; I hate it when the game world looks like it's built out of Lego bricks. I either run TXAA combined with 2x or 4x AA, or 8x MSAA / SSAA in most games if my GPU can handle it. If it struggles to keep the FPS up, I often run no anti-aliasing at a higher resolution than my 55" 4K TV, which gives the same effect as SSAA but at less GPU usage.
it really depends, any game with a lot of foliage or well any new game with TAA dependent effects is a pixelated shimmery mess without aa
I hate TAA because it's blurry, I hate most AA because it eats the framerate. I can't stand random frame drops and AA is usually responsible for that. I usually turn AA off, the jaggies don't bother me.
I don't know if it's just nostalgia, but I really miss early-2000s FPS games without AA, like CS 1.6. Everything was incredibly clear and sharp despite the fairly low resolution. With newer games, the higher the graphics settings, the blurrier and less defined everything gets.
I remember playing HL2 with a GeForce 7300 LE and it was very aliased, but then I upgraded to an 8600 GT and could finally turn on MSAA, and the game looked beautiful, as there were so many fine details in that game.
I'm a student pilot, so I use flight sims all the time. My biggest gripe with X-Plane 12 and MSFS is that there's a constant but subtle blurriness on literally EVERYTHING, including the cockpit screens, which makes it hard to fly because I now need to zoom in on each individual display, rather than just setting my camera at the right POV where I can see everything I need, like I could in XP11.
I always wondered why Tarkov looked so blurry. Thanks for the explanation.
Here I thought it was my astigmatism acting up
Damn this felt longer than 3 minutes and 26 seconds. You are doing a great job! Much love from India :)
Great video explaining what I have recently been thinking about but haven't been able to properly understand. I play RDR2 in 4K and without any AA the shimmering is absurd. The AA methods that fix the shimmering either come at a huge performance penalty (MSAA/2x) or ends up blurry for distant foliage and during movement (TAA/medium). FXAA is also available but doesn't make much difference in my experience. I've also been playing around with different driver-side AA methods in AMD Adrenalin, but so far I have not had all that much success. At the moment it seems like I am stuck with TAA/medium and the blurriness and lack of distant detail that comes with it, because the shimmering drives me insane and MSAA drops my FPS too much.
if you made a video covering TAA, SMAA, FXAA, other weird AA methods seen in random games, that would be super cool and probably get a lot of views. youtube recommendation brought me here
There's been tons of videos on those topics the past 15 years.
@@GugureSux so outdate for now
I always turn off motion-sharpness effects and everything else like that as far as possible, because it looks awful and annoys me in general.
I played a demo recently that only had TAA and it was bundled in some generic "Quality" setting so to turn it off you had to turn off a bunch of other things. I've never uninstalled and unwishlisted a game so fast.
Very informative video about something that I'll turn off as soon as I reach to the options menu. Now I have more understanding for a reasoning. I like it.
Perfect video i like that you didn't waste any of our time and started talking directly on the subject, thank you.
Alpha to coverage works with TAA: you just treat multiple pixels over time like they're one multisampled pixel.
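That trick can be sketched: instead of spreading alpha across one frame's multisamples, dither the alpha test across frames with a shifting threshold, and let TAA's temporal accumulation average the binary visibility back into smooth coverage. A Python sketch, with the golden-ratio threshold sequence as an illustrative choice (real engines use various jitter sequences):

```python
def visible(alpha, frame):
    """Stochastic alpha test: a per-frame pass/fail against a
    low-discrepancy threshold instead of a fixed cutoff."""
    threshold = (frame * 0.6180339887) % 1.0  # golden-ratio sequence
    return alpha > threshold

def taa_average(alpha, num_frames=256):
    """TAA's history accumulation effectively averages the binary
    visibility, recovering the original coverage value."""
    hits = sum(visible(alpha, f) for f in range(num_frames))
    return hits / num_frames

coverage = taa_average(0.3)  # converges toward the original alpha
```

The catch, as the video's topic suggests, is that with TAA disabled you are left staring at the raw dither pattern, which is exactly the Saturn-style pixelated transparency people complain about.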
This is why games need more graphics choices in their menus. Don’t just optimize the game for the standard of PCs, add options so you can optimize it for the customer’s pc as well. Ark was pretty good with that, down to how much vegetation even spawned, the type of antialiasing, if there was even anti-aliasing. It cannot be that hard to add graphics options, like seriously.
Can’t remember the last time I willingly turned on anti aliasing on in a game
MSAA is nigh impossible to do with deferred shading, because in the lighting pass you just render a single fullscreen triangle. So TAA is the best option available if you want complex PBR and a few hundred lights on screen. It also depends on the implementation: a good TAA algorithm sticks to sub-pixel jitter and uses a velocity map to avoid excessive blurriness. A velocity map is basically a UV displacement that records how much each object moved relative to the previous frame, so the shader can sample the history from the exact position the object was at one frame ago, reducing blurriness.
And yeah sadly AtoC is out of the question.
Great explanation. I didn't know about the transparency portion. It's unfortunate that for a lot of games MSAA isn't an option anymore because of "deferred rendering". When TAA is the only option, I usually end up using some kind of upscaling option because that does make the image less blurry sometimes. But it differs from game to game. TAA can be even more blurry based on what the developer configured. In Unreal Engine games you can at least change these parameters and make the overall experience less blurry. But that does require fiddling and not all settings work for everyone, which brings me to:
TAA is a pain in the ass to make a universal setup for: every resolution, every game's art style, etc. Combine that with the fact that TAA is overall worse, and you get a nightmare. TAA is also a lot less blurry when there are more pixels to work with, so higher-resolution monitors benefit more from it than a standard 1920x1080 monitor would: it looks less blurry and less aliased.
Thank you for not using TAA.
Ever since it was the only option in Skyrim Special Edition's first release, I have absolutely hated it.
I would literally rather play with no AA than TAA.
Go to settings disable depth of field.
That's how I got "Brothers: A Tale of Two Sons Remake" refunded. I was mad to see that they messed up the original artistic direction so much with a poorly handled technology.
I recently noticed this specifically with Helldivers 2: the game looks like a blurry mess at long distance with anti-aliasing turned on. It looks stunning if you turn off anti-aliasing and enable 1440p/4K supersampling.
honestly at 3:00 I prefer OFF, I have always preferred a jagged block over a blurry blob, especially when compared to the tree in my back yard. The one with everything off looks truer to life in a side-by-side comparison of similar portions. Just to add some quantification, I have 30/20 vision, that's better than normal.