Get an exclusive @Surfshark deal! Enter promo code ACEROLA for an extra 3 months free at surfshark.deals/acerola #ad
Thanks for watching!!
Can we all just appreciate Acerola's genius idea in using a double video, a technique that emerged with the advent of TikTok and quick videos on social platforms as a method of hooking the audience with super-stimulation so that people don't skip videos that might seem uninteresting on their own (like an advertisement)? Very well thought out and implemented, with the use of a species that dominates the internet (cats). Genuinely brilliant and honest, so good in fact that I didn't skip an ad for the first time in a long time. Thank you Acerola, thank you.
Hey Acerola!
Great content so far! Really look forward to your take on how glass works in games, this had been the most daunting topic for me as a videogame artist 😟
Would be a blast if you showed how to deal with it and achieve best results.
I personally haven't achieved even a somewhat adequate refraction behavior in the years I've been working in Unreal 🙃
I want to make a mobile game where the galaxy is full of basically dandelions that float around instead of planets, and each sphere can be traversed like Mario Galaxy.
Could you maybe do a video on how to do gravity that connects to bodies in space instead of indiscriminately falling down?
Also, I'm curious if I could just declare a point in space and project these shell texture grass effects in all directions from there? Also, by extension, if I can declare one point in space, can I also declare a distance from that point representing the radius of the body in space, creating the ground?
Not gonna lie, it would be pretty ideal if I could just render each body in space as a basic point in space, and move them around from there.
Also, I just realised the directional tilting could lend itself well to an effect like the tail of a comet, giving it its distinct flaring light look.
Also, how do you think the smoke effects you previously spoke of would apply to stuff like nebulas and space clouds? Does it scale well? I want it to have a very "cosmic ocean" aesthetic, and making big areas of space different colored clouds would be cool
Just wanted to say this is a super useful video, really appreciate that the code is also available. Especially for the fake physics.
Shaft studio?
God I just realized he's wearing more layers as time passes.
Amazing.
Dude, spoilers
@@Yixdystill opens comments before watching video 😅
And some pretty good bands among them too!
I watched the entire video without realising even once
I literally watched this video like 5 times and only now realised when i read this comment lmao he's god of comedy
one technique I've seen to hide the gaps between shells is to distribute the shells so that the gaps are bigger at the bottom and smaller near the top, since the dense fur near the base does a good job of masking the gaps, but the sparse upper strands need more closely spaced layers to hide them
This functionality is in my shader code I didn't expressly mention it tho
Couldn't you slightly offset the normal of the quads using noise?
@@ludfde Instead of being randomly offset, wouldn't we always want the normals to slightly face the camera?
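Picking up the spacing idea from the top comment, here's a minimal CPU-side sketch (assuming a simple power-curve remap of the normalized shell index; the exponent and shell count are arbitrary, not anything from the video):

```python
def shell_heights(shell_count: int, fur_length: float, exponent: float = 0.5) -> list:
    # Remap the normalized shell index with a power curve. With exponent < 1
    # the gaps are large near the base and shrink toward the tips, where the
    # sparse strands would otherwise expose the layering.
    heights = []
    for i in range(shell_count):
        t = i / (shell_count - 1)
        heights.append(fur_length * t ** exponent)
    return heights

print(shell_heights(8, 1.0))  # gaps shrink toward the top
```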
I wonder if there's some aspect of gaussian splatting that could be applied.
As far as I understand, it's using blurry splats that take camera position into account to replace the points in a point cloud.
Perhaps blurry shells that also take into account camera position?
@@ludfde unfortunately no. That operation has to be done in the vertex shader pass, and each layer only has four vertices, so you could only apply it to the very corners of the grass, not to each individual blade. If you wanted something like that, you'd have to find a way to bake it into the pixel rendering shader, which would get extremely complicated very quickly
These videos are genuinely one of two things I look forward to on YouTube, the other being Sebastian Lague videos. Great stuff.
same
Sebastian Lague my beloved
and Stuff Made Here.
Code Parade is one I can recommend.
He is another great example of this style of technical, but beautifully explained coding projects.
Yess
I'm just going to take a picture of my dog, put it on a simple square mesh and call it a day.
I imagine Acerola in a party corner like "they dont know how to render realtime hair"
Viva Piñata’s grass may be just a visual trick, but you can’t imagine how much I wish I could just lay down in it. It always looked so soft! You can tell me that I need to go outside and touch grass, but the only thing I really want is to touch THAT grass!
same
1:40 "a game with a protagonist with a beautiful head of hair but gets [transported?] into a universe where everyone is bald and is trying to kill them with having hair" and that was how Acerola made his fortune
The word you're looking for is "isekai'd".
But "transported" is a fair synonym.
literally just the world of Bobobo-bo Bo-bobo
As a guy with long hair entering my thirties, this is an accurate description of my life imo
Brave video game, all the enemies are helmeted warriors. The end boss is a bear that glitches the game out but you use the glitches to defeat them.
i am incredibly bored so i'm gonna be a massive nerd for a second. isekai is a popular genre of anime, where a character is magically transported into another universe. the method of transportation is usually the character getting hit by a truck and a god taking pity on their untimely death. the popularity and weirdness of the genre gets it a more emotional reaction than just "transportation"
Nintendo in particular has gotten really good at shell texturing across a ton of their recent games, particularly the rendering of fur on characters like Donkey Kong. Fun fact: Super Mario Galaxy 2 (for Nintendo Wii) was originally going to make prolific use of shell texturing for its grass, which was showcased in various early development/promotional screenshots. However, come release, nearly all shell texturing was removed from the game. Why this was done I don't believe was ever specified; it may well have been due to performance or technical complications, though personally I believe the more likely answer is that the team saw the effect as too uncanny applied in this specific instance, and jarringly dissimilar from the presentation of grass in the first game, so it was scrapped (outside of a few one-off planetoids).
I think they learned a lot from making nintendogs lol, fur on low power devices is hard
Paper Mario TTYD used a similar method (not _procedural_ texturing, just layered textured shells) in quite a lot of areas. The flower beds in Boggly Woods, and the carpeting in the Glitz Pit Champ's Room come to mind...
Adding on a prior comment, both galaxy games use shells for the bee fur.
@@romajimamulo oh yeah truue
@@romajimamulo You and Mizox are absolutely correct, just loaded up Honeyhive Galaxy in the Noclip website, and bam, that's absolutely shell-texturing right there. It was just unnoticeable in the games because the Wii is 480p. ...Really, the Galaxy games in general are a masterclass in making things look _good_ with limited hardware resources.
A game with a protagonist with amazing hair where everyone is bald and tries to kill the protagonist... You mean... Bald'ers gate?
Oh, so that's why Sif is like that. Might be just me, but the uncanny look of the fur actually makes Sif look almost ethereal. I bet you could use this to intentionally trigger an uncanny valley effect when designing some otherworldly creatures. Neat stuff!
This is the actual good way to utilize psychologists in game dev, instead of using them to figure out how to further exploit people prone to gambling addiction
In the original Dark Souls the gaps were more subtle, the lighting was hiding them better, but in the remastered version he shows the lighting makes some parts darker, which makes the difference way more visible.
As well, unless you were on PC running DSFix, the game was locked to 720p with heavy FXAA blurring. I imagine it helped a little.
Can't believe no one is mentioning this bit at 19:35 saying "...what's basically a layered cake of meshes" while "Layer Cake" is playing...
Masterfully done 👏
Wow
I'd like to also say, about the Half Lambert solution - one of the reasons it works so well is because hair is, as you say, transmissive. Real hair has light pass through and refract through it, so any part of a contiguous set of hair that is in total darkness will look really weird, since that's nearly impossible in most situations with a light source present. Half Lambert fakes the indirect lighting you would get from a path- or ray-tracing lighting model even in a forward rendering environment, so it feels more natural.
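For anyone who wants to see the actual remap: a tiny Python sketch comparing plain Lambert and Half Lambert (the squaring follows Valve's description of the trick; the example vectors are made up):

```python
def dot3(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambert(normal, to_light):
    # Standard diffuse: anything facing away from the light goes fully black.
    return max(0.0, dot3(normal, to_light))

def half_lambert(normal, to_light):
    # Remap dot(N, L) from [-1, 1] to [0, 1] (then square it, per Valve's
    # original formulation) so back-facing hair never reads as pitch black.
    return (dot3(normal, to_light) * 0.5 + 0.5) ** 2

n = (0.0, 1.0, 0.0)   # surface normal pointing up
l = (1.0, 0.0, 0.0)   # light coming from the side
print(lambert(n, l), half_lambert(n, l))   # 0.0 vs 0.25
```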
I actually don't understand that much of what you say but it's just fun to see you magically combine values into complex lighting models (:
Same
One day
I just love hearing nerds talk about topics that interest them 😂
All complex things are just a collection of simple things
I think one can do it by just using black values as transparent pixels
Planet Zoo has an amazing fur system too for LOTS of animals. I'm not totally sure but it also looks like shell texturing, but it looks good even on grazing angles!
Another Acerola banger? You can't be fur real!
I've worked with fur throughout my life in 3D, seeing such clever, concise techniques here gives me an incredible basis / tool kit for this. Thanks again, king. 👑
24:43 THIS IS THE REASON why old Dark Souls bosses with fur always brought my laptop to a crawl when I moved the camera in close! Now I finally understand.
It never made sense to me that when the camera got stuck on the groin of the taurus demon that my pc almost crashed.
Our game, Steel Heel Jam, has a non rotating, almost top down camera, and while watching your live streams I realized this would be a cool addition to the environment art for some grass. I plan to easily add some basic vertex animation with a noise texture for wind on the grass.
I would advise you to look into raymarched grass on ShaderToy instead (look up "grass field with blades"). This technique is outdated, slow, and a pain in the ass to set up comparatively.
Brave of you to assume that I've only watched four videos about rendering grass.
I know next to nothing about graphics programming but these videos are always so entertaining
> what is basically a layer cake of meshes
> Layer Cake - Persona 5
well done
This has become one of my favorite YouTube channels. I don’t have any interest in creating games or other things that require real-time rendering techniques, but I have a deep love for video games (although I don’t really play anymore). 30 minutes of extremely well articulated, concise, funny, and entertaining explanations to technologies and concepts I knew nothing about draws me in. This is what the internet was made for: experts of their craft sharing knowledge publicly. Even if I’m not going to internalize this information so that I can create with it, it’s beautiful to learn about the foundation of how things work and it brings me happiness. Keep sharing mr. Rola! You’re inspiring!
You verbalized why I love the Internet! Free expert knowledge everywhere!
17:23 another way to do this is to lerp between last frame’s value and whatever new value by some small amount. technically it’s dependent on the frame rate so maybe do that in the physics update and then a bit of smoothing in the per frame update, or calculate it every time. the benefit of this tho is that it’s an exponential approach, which feels very natural. it’s a first order lowpass filter, and eventually i’d like to make a video going thru it in depth because it’s a really good way to smooth things and i’ve been using it a lot in my game to make things smoother, and i think more people should know about it for game use.
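A frame-rate-independent form of that first-order low-pass, as a small sketch (the decay rate and target are arbitrary constants):

```python
import math

def smooth_toward(current: float, target: float, rate: float, dt: float) -> float:
    # First-order low-pass / exponential approach: the remaining error decays
    # by exp(-rate * dt), so the smoothing behaves the same at any frame rate.
    alpha = 1.0 - math.exp(-rate * dt)
    return current + (target - current) * alpha

value = 0.0
for dt in (0.016, 0.033, 0.016, 0.008):   # variable frame times
    value = smooth_toward(value, 1.0, rate=8.0, dt=dt)
    print(round(value, 3))
```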
20:10
>If they didn't fix it, you don't have to either!
Perfectly said! Tech art is all about compromise lol
This is all too complicated for me. I’m just going to simulate everything perfectly and tell consumers to get better rigs -Crisis devs, probably
I've run into plenty of rendering issues with hair, it can cause some really *hairy* opacity problems. Ba dum tis
This technique reminds me of sprite stacking. A good example is NIUM.
It is used in 2D games to simulate a 3D object without actual 3D graphics. Instead of layers being randomly generated, they contain "voxel" layers to represent a model.
This would mean that the higher the object is, the more resource intensive it becomes.
Acerola, have you ever heard about particle-rendered hair? I heard about the graphics of Pikmin 3 a while ago and one of the bosses uses this technique to render fluffy fur from particles that are locked to dark, unseen areas of the texture. It's very interesting stuff
yeah, it's a more expensive solution to the geometry problem, definitely really cool that it even works on the switch
I like to imagine Acerola is a senior graphics technician at Nvidia or something and just every so often releases youtube videos on his off time haha
He's talked about it in previous videos, he _was_ an engineer at nVidia, and left (quit or was laid off, I don't remember) not too long ago and decided to focus on youtube videos.
I was a graphics programmer for Intel and PlayStation before quitting to do this full time
@@Acerola_t Well hot diggity! That explains how you are so well versed in all of this
Always enjoy your videos, and I appreciate that you don't shy away from the intense "formulas and calculations" side of "how do I make this look good" lol
Thanks again for the video ✨
Nah Nvidia wouldn't use all this el cheapo flawed stuff. I mean they're pushing realistic lighting with ray tracing, and this technique directly contradicts that, as shown in the video.
@temotskipuri3151 realtime raytracing is el cheapo flawed stuff dude. It's nowhere near as high fidelity as your average offline renderer.
These videos give off huge high school project presentation vibes with all the clipart (in the best way possible) and I absolutely love it
I've seen something similar to shell texturing used in 2D as a method of creating nearly-3D voxelized objects in low-resolution games. By generating vertical slices of a given object and drawing them from lowest to highest with a gradually decreasing Y value between layers, sometimes as small as a single pixel, the slices stack on top of each other and appear to build depth. This creates an effect which is something like front-on isometric perspective, because if you rotate the slices around a common origin so that every part of the image remains in the same relative location it creates an illusion of a three-dimensional object spinning but always maintaining the same 45-ish degree angle relative to the camera.
Dolly mixtures, yum. :P
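A rough sketch of the draw math being described above (no actual rendering, just where each slice would be drawn; the 1-pixel step and all names are assumptions):

```python
import math

def slice_draw_list(origin_x, origin_y, slice_count, angle, step=1):
    # Sprite stacking: every slice is drawn at the same screen position but
    # raised by `step` pixels per layer, and all slices share one rotation
    # angle around a common origin, which reads as a 3D object spinning at a
    # fixed viewing angle.
    return [(origin_x, origin_y - i * step, angle) for i in range(slice_count)]

for x, y, a in slice_draw_list(100, 200, slice_count=5, angle=math.radians(30)):
    print(f"draw slice at ({x}, {y}), rotated {a:.2f} rad")
```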
Regarding RNG in GPUs, what do you think about the paper "Parallel Random Numbers: As Easy as 1, 2, 3" by John K. Salmon et al., which seems to be an approach close to the hashing method you describe, but their implementation is pretty damn good statistically.
I'll check it out!
@@Acerola_t happy you've seen this comment, I know about it because in deep learning the Jax framework uses it to generate random numbers. The core idea comes from counter mode ciphers, where a seed + counter allow you to get a block of random bytes. The seed remains fixed across the generation and you only need to increase the counter over all the blocks!
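To make the counter idea concrete, here's a toy sketch built on a generic integer hash; this is emphatically not Philox or the algorithm from the paper, just the seed-plus-counter shape of it:

```python
def hash_u32(x: int) -> int:
    # Generic 32-bit integer hash (lowbias32-style); a stand-in for a real
    # counter-based generator like Philox, NOT the algorithm from the paper.
    x &= 0xFFFFFFFF
    x ^= x >> 16
    x = (x * 0x7FEB352D) & 0xFFFFFFFF
    x ^= x >> 15
    x = (x * 0x846CA68B) & 0xFFFFFFFF
    x ^= x >> 16
    return x

def counter_random(seed: int, counter: int) -> float:
    # Counter-mode shape: the seed fixes the stream, and each counter value
    # maps independently (hence trivially in parallel on a GPU) to [0, 1).
    return hash_u32(seed ^ hash_u32(counter)) / 2**32

print([round(counter_random(42, i), 4) for i in range(5)])
```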
Not much of a gamedev person myself, but I have always wondered how shell textures work, and this video explained it super well! Great video, I would love to see how nvidia hairworks and AMD tressfx works!
This guy teaches me more than I ever learned paying for a $100,000 degree
I noticed when you said "layer cake" and then it started playing _Layer Cake._
'Twas not lost on me.
This video and the Acerola Furry Challenge is great, I love promoting learning!
I like the detail of having the background song being "Layer Cake" while talking about layer cakes at 19:33
Saying average person, and then showing Jerma is a war crime
Was just coming down to say the same thing
I've always been fascinated with how good Fox's fur looked in Star Fox Adventures, considering it was running on the Gamecube. Such a simple trick, but it works wonders, and I think the part that impresses me most is not just how impressive the fur and its "physics" looked, but how well it smoothed out Fox's model and made him look higher fidelity than anything you'd see in other games of the era simply because the fur hid the hard edges of the polygons he was made of.
I figured out that Star Fox Adventures was using this technique back in my university days when I borrowed a friend's Gamecube for a weekend.
Also, don't be surprised if your Furry Challenge gets a lot of entries involving characters from that franchise.
Honestly it's amazing that this isn't more well known among that franchise's fans, or used on fan-made models. I seem to remember reading that the actual models from that game only recently finally got ripped and released to the public-and even then I've only seen them rendered without the shells or even the subtle sheen on Fox's snout-so Rare must have done a really good job obfuscating their file structure. Because it sure wasn't for lack of demand.
AFAIK, there is one way to "fix" the gap between shells by slightly turning the surface towards the camera in the vertex shader. This allows a camera to be parallel to the shell planes, but the math is so complex I never got it to work right.
A different technique that I also know of is parallax hair/grass, but that requires an unholy amount of preprocessing so that you can render curves in depth.
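The rough shape of that first fix, sketched on the CPU (heavily simplified; the blend factor and the idea of renormalizing a lerp between the surface normal and the view direction are my assumptions, not the comment author's exact math):

```python
def tilted_shell_direction(normal, to_camera, blend):
    # Blend the shell extrusion direction between the surface normal and the
    # direction toward the camera, then renormalize. blend = 0 gives plain
    # shells; larger values tilt the layers toward the viewer to hide gaps.
    mixed = [n * (1.0 - blend) + c * blend for n, c in zip(normal, to_camera)]
    length = sum(v * v for v in mixed) ** 0.5
    return [v / length for v in mixed]

print(tilted_shell_direction((0.0, 1.0, 0.0), (0.0, 0.0, 1.0), blend=0.3))
```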
we all appreciate how long it took you to record this, you even grew clothes over time
Putting a video of your pet during the length of the sponsorship might be the most brilliant move i've ever seen on youtube
WTF, I never realised that's how traditional fur rendering worked!! That's so smart, and also explains the strange stepped look it always has.
Shadow of the Colossus was what made me interested in shell texturing long before I even knew there was a name for it. Makes sense that they'd want to draw your eye to it, since fur patches are key to climbing the colossi and all that. It blew me away in 2005 and it still holds up today.
Another amazing video! I feel like this is one of those things that I will see EVERYWHERE now that I know to look for it.
Very excited to participate in this challenge. Love that you're encouraging people to do homework and learn the material :3
If anyone is interested in how to do human hair, most high fidelity games do "hair cards". It's similar to this, but more handcrafted. You can search that term to find tutorials.
It's a transparent texture with many rectangular hair parts (the "cards"), each with different densities. Then you model hair strips with long rows of quads and assign their UVs to one of the cards. The more you stack one above the other, the more real it looks, the same principle as shown in this video. You just use a normal transparent material to display it.
It's pretty hard to model I think, even with specialized tools. From what I know you should sculpt it first, use that for generating the strips, and then do manual touch ups
High budget modern games do other more realistic approaches probably, share if you know
18:54 OH MY GOD! I kept wondering how they made this short soft grass in Genshin, which I really like. Thanks for the video, now I know it
The Far Cry series also uses it (as any modern game needing more than one furry creature). It's very noticeable in Primal, since the animals are half a meter away from the camera when you pet them.
I'm pretty sure CP2077 uses it too, noticeable, again, while petting cats.
I've always liked gaps in shell textures, I think they look neat
Tbh, I almost forgot that the video was about hair when he talked about the grass field
I was invested in it
Funny timing. Just last week CIG, the makers of Star Citizen and Squadron 42 had a panel at the annual event CitizenCon where they showed off how they're rendering hair.
Are links allowed? youtube.com/watch?v=kLTZfAcaJpc
it took me until 22:18 to realize you were layering your shirts throughout the video. Mad props.
Wouldn't it be possible to billboard the quads towards the camera? That way, there's never a gap between the layers (might ruin the narrowing though)
Hair physics in GTA-6 trailer was insane. Let's see how that turns out to be in future when it's released...........
Could you do shell rendering with billboarding? That might solve the illusion breaking at an angle. (though if big game studios haven't done it I assume this just isn't useful)
Exactly my thought as well. It's such an obvious solution that there must be a serious downside. Upvoting in the hopes that people who know stuff see this and respond.
Yeah, this is the first thought of every person who knows billboarding. The problem is that billboarding is done to a quad or triangle during the vertex shader phase, but the strands in the shell are made of rasterized pixels, and a pixel is already just a dot facing the screen. So billboarding the shell quad itself (just two triangles) does nothing extra for the strands inside it.
The first time I ever noticed this was the Werehog in Sonic Unleashed. His fur looks soooo ridiculous.
It’s a bit after my bedtime but I had to watch this now ;)
Could the rendering discussed at the end be helped by logarithmic search? So checking the mid layer first and then closing in recursively?
That sounds like an interesting optimization, but you'd have to mess around with the ordering of the draw calls manually.
never done graphics programming, never coded, i don't even own a computer
but these videos are top tier
Unfortunately I don't have much time in the next few weeks, but here is what I'd attempt to improve rendering:
Get rid of shells entirely and only have a single outer shell. Then, in the fragment shader, use the incoming "ray" direction (i.e. the angle to the camera plane) and the position (UV) to know through which shell pixels the ray will travel. Then go through these, stopping when hitting the first grass, and perhaps use statistics if too many shell pixels would need to be traversed, thus setting an upper bound on the shader's per-pixel runtime.
This would basically be like converting each shell pixel into a box between two shell layers.
I imagine this being similar to a parallax shader with multiple layers. It still has some artefacts unless the statistical guess is good. It would however eliminate the overdraw issue and may reduce the amount of artefacts when looking at it from the side. The per-grass-blade lighting code could then also be more complex, although that has other difficulties since each "ray" doesn't know about neighboring grass blades, since it is guaranteed to happen only once per fragment (each fragment will sample the random function multiple times however, but that shouldn't be as big of an issue, since it doesn't even take communication between threads or bandwidth to read from a texture).
Anyways, I'm curious what we'll see in the follow up video.
Edit: dynamically changing it by applying displacement would be more difficult and would have more performance overhead, however, since that will require some per-shell-pixel math evaluation (or to be more precise: integration into the code for finding the next shell pixel), as just displacing the shells would not be possible.
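Very roughly, the per-fragment loop being proposed might look like this on the CPU (a sketch only; the layer spacing, the UV stepping, and the demo hash are all assumptions, not the commenter's or the video's code):

```python
def march_shells(uv, view_dir_uv, shell_count, fur_length, density, hash_fn):
    # Start at the outermost layer and walk inward along the view ray,
    # offsetting the UV by how far the ray travels sideways before reaching
    # each layer, and stop at the first cell whose strand reaches that layer.
    for layer in range(shell_count - 1, -1, -1):
        h = layer / (shell_count - 1)                      # normalized height
        depth = (1.0 - h) * fur_length
        sample_uv = (uv[0] + view_dir_uv[0] * depth,
                     uv[1] + view_dir_uv[1] * depth)
        cell = (int(sample_uv[0] * density), int(sample_uv[1] * density))
        if hash_fn(cell) > h:                              # strand tall enough here
            return h                                       # hit: shade at this height
    return None                                            # miss: ground / transparent

demo_hash = lambda c: ((c[0] * 73856093) ^ (c[1] * 19349663)) % 1000 / 1000.0
print(march_shells((0.25, 0.5), (0.3, 0.1), 16, 0.2, 64, demo_hash))
```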
Correction:
Cat hair gets measured in volume, not surface area... because it's freaking everywhere...
Waiting the whole video to see if you'd mention Shadow of the Colossus, and it being mentioned at 25:49... RIP in pepperonis, sweet prince.
Definitely one of the best looking uses of shell texturing for large creatures, imo, and done with very little processing power at its disposal.
Hey, I've been thinking about this some more lately, and I've come up with a simple way this could be further optimized: by having the number of shells vary depending on angle from the camera. The more the "strand" points toward the camera, the fewer it needs to be convincing. Though if you're planning to simulate AO, you'll still need a few at minimum to pull off that effect-and on that note, the tapering effect will also be slightly more work because you'll need to go based on distance from the solid surface rather than shell count. It also wouldn't work for most grass applications since a low camera would suddenly cause almost ALL the grass on screen to need more shells, but for something like Sif or the _Genshin Impact_ moss where it wraps around a round object, it might be ideal.
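A tiny sketch of the angle-based shell count described above (the min/max counts and the linear falloff are arbitrary choices):

```python
def shells_for_view(normal_dot_view: float, min_shells: int = 4, max_shells: int = 32) -> int:
    # When the strands point straight at the camera (|dot| near 1), few layers
    # are enough; at grazing angles (|dot| near 0) fall back to the full count.
    facing = min(1.0, abs(normal_dot_view))
    return round(max_shells - (max_shells - min_shells) * facing)

for d in (1.0, 0.5, 0.1):
    print(d, shells_for_view(d))
```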
Getting every achievement in Viva Piñata is no joke. I remember using spreadsheets for that game I found online
I had the strategy guide lol
I never realized this, one great game I found out that uses it is fluffy fall and you can see it quite clearly
I love how you present these so casual and yet so informative
This dude is an actual genius for putting his cat on the side of his sponsorship, it's like subway surfers but I don't feel like a deranged 2 year old for watching it. Thank you. Your cat is beautiful. God bless
I freaking love Viva Piñata and its art style, I appreciate the VP appreciation. GAMING.
I don't think Genshin Impact's moss should be "fixed". IMO it lends itself quite well to the art style and I feel that would be lost if it were made entirely solid.
i really would love to see how you would tackle hair strand rendering, like in Ratchet and Clank Rift Apart
streets are saying rift apart uses shell texturing
I have been trying to understand fur in Unity for weeks, and now you, kind sir, have provided me with all the answers. Thank you very much
Somehow your videos have opened my eyes to how good Rare are at making graphics in their games look stylised by using incredibly simple techniques in unique ways 😁
I came across this video right before the final for my Intro to Modern Graphics Programming class, where I had to research a graphics programming technique and give a mini lecture on it. I only covered about half of what this video does, but it was a pretty fun experience, and while I’m more of a game designer than a programmer, I’m starting another graphics programming class today and I’m excited to learn more! Thanks for making these videos and keeping me curious about graphics programming!
I don't even do anything with this knowledge other than put it in a space inside my head. Sheer curiosity towards how things are done for real-time rendering is what makes me enjoy videos like these.
You're probably the only YouTuber whose ads I don't skip. That cat cam is pure genius!
I still feel viva pinata looks good. Choosing a style can really help avoid things aging poorly when graphics level up
Someone make the shells wavy, and collide with each other. Bet it'd make the side profile better.
I always intuitively felt that, as soon as I saw a game, the quality of hair modelling spoke to me about the overall graphic design quality of that game. Learning about how complicated rendering hair is made me more aware of this.
2:28 generally there would just be a ton of instancing going on for grass and other repeating objects.
Instancing is taking an object and duplicating it in such a way that the geometry only has to be stored and submitted once. For example, a cube has 6 sides, and when you duplicate it normally you would then have 12 faces' worth of data and draw calls, but with instancing you still only store 6 and issue a single draw. So you can do that with millions of grass blades and get really nice optimization.
I'm further in now and really fascinated by how these ideas came to be. Can't wait to see how physics works.
Dude I had no idea Genshin did it too. I'm probably going to unfortunately have to figure this out myself.
". . .where everyone is bald and is trying to kill them for having hair" [ 1:40 ]
this analogy just won you an immediate sub 😂
It is unfathomable that you still only have a few over 100k subs. Your videos are the peak of YouTube
He just gained a subscriber a few minutes ago. :) I was actually surprised I wasn't already subscribed.
I don't play Genshin, I've barely seen any footage of it. But comparing the super high resolution sharp edged character models to the soft shell textures is really off putting. They don't look and feel like they belong in the same game.
The early z test issue is actually much worse than you claimed, discarding completely disables early z tests on a lot of gpus because reading from the depth texture and writing to it are a single, atomic instruction. So it doesn't actually matter if you draw your shells from front to back, all pixel shaders will be executed every time.
I can't link in YouTube comments without sending my post into the shadow realm, so I'll just leave this sub-link to a Unity forum thread discussing it:
threads/general-questions-regarding-early-z.1065779/
That's what I thought but I wasn't quite sure since I saw differing statements on it so I sorta just went inbetween the logic for the explanation. Either way, low shell counts are ideal.
Your explanations are so thorough and have just enough balance of the technical and non-technical to be interesting and educational even to those with no expertise. You’re an excellent teacher.
Putting the cute cat videos next to the sponsor ads so you can't skip them if you want to see a cute cat is pretty smart!
Didn't expect my favorite game of all time to get so much love in a video about simulations! I gotta say, Viva Piñata on PC is the bomb (even though I can't save my run) and with its simplicity comes a bunch of benefits! Thank you for this amazing video, I'll surely come back to it!
You should probably use acceleration instead of velocity to calculate the displacement or ideally a blend of both after applying a low-pass IIR filter.
Yeah, assuming you're in a fluid medium (e.g. air), velocity would definitely play a role, too, but I had the same thought about using acceleration.
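A sketch of the blended, low-passed displacement these two comments describe (the weights and cutoff are arbitrary; this is a generic one-pole IIR, not any particular game's code):

```python
import math

def smoothed_displacement(state, velocity, acceleration, dt,
                          vel_weight=0.7, accel_weight=0.3, cutoff=6.0):
    # Blend velocity and acceleration into a target displacement direction,
    # then run it through a one-pole IIR low-pass so the fur doesn't snap.
    target = tuple(vel_weight * v + accel_weight * a
                   for v, a in zip(velocity, acceleration))
    alpha = 1.0 - math.exp(-cutoff * dt)
    return tuple(s + (t - s) * alpha for s, t in zip(state, target))

disp = (0.0, 0.0, 0.0)
disp = smoothed_displacement(disp, velocity=(1.0, 0.0, 0.0),
                             acceleration=(0.0, -9.8, 0.0), dt=0.016)
print(disp)
```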
the cat video on the ad segment is pure genius ngl I sat through all that
Why can't we have a shader that turns each shell 90 degrees, basically turning it into a fin? To avoid having all the fin-shells aligned, you could then rotate them around the normal axis by an amount determined by the index of the facet of the mesh.
This is the first of your videos where I could understand virtually every step that we took! I'm actually really excited for this because you're helping me get the hang of graphics programming while I have zero programming experience, and it also shows that your videos are getting better!
Interesting, I have been doing the shell texturing method for years without knowing it existed; I called it "MRI parallax-mapping" because it mimics the slices you get from an MRI machine. Use it for shag rugs, tank tracks, bicycle chains, chainsaws, you name it. Thanks for making this video, glad to finally be in the same pool.
I just LOVE the humor in your videos. There is so much there hidden in plain sight. And it isn't even distracting (when you aren't currently writing a comment about it).
Regarding the physics section, what I'd personally do is have various control points attached to the base mesh that have physics simulated on them, and attach the shells to those control points. You can control the density of controls, how the material moves, etc etc. Kinda wanna try that myself, now.
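A minimal spring-damper sketch of one such control point (the constants and the integration scheme are arbitrary; the shells would then read their bend from these simulated points):

```python
def step_control_point(pos, vel, anchor, dt, stiffness=60.0, damping=8.0):
    # Pull a simulated control point back toward its anchor on the base mesh
    # with a damped spring; semi-implicit Euler integration.
    force = tuple(stiffness * (a - p) - damping * v
                  for p, v, a in zip(pos, vel, anchor))
    vel = tuple(v + f * dt for v, f in zip(vel, force))
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return pos, vel

pos, vel = (0.1, 0.0, 0.0), (0.0, 0.0, 0.0)
for _ in range(3):
    pos, vel = step_control_point(pos, vel, anchor=(0.0, 0.0, 0.0), dt=0.016)
    print(tuple(round(v, 4) for v in pos))
```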
I- did you just trick me into watching a whole sponsorship segment by putting a cute cat video next to it?
10:30 This can be done using POM too. Yes there are differences, but POM can do this too, it just uses a different process.
Rage 2 had shell textures for its buffalos wandering around the overworld. It was really well executed and looked great.
19:34
Acerola: "layer cake"
OST playing: Persona 5 - Layer Cake
👏
the opening is a *Everything Everything reference?*
and I thought the VA-11 Hall A music was based
It's nice to see an in-depth examination of this effect.
The first time I saw it was in a magazine that covered the release of Shadow of the Colossus on PS2. But I also noticed this effect used in Star Fox Adventures on the GameCube, another game from Rare. Pretty much all their games used this effect.
That being said, I don't think triangles are that big of a deal on modern hardware; you could probably get away with blades of grass using 5 vertices and various mesh instancing techniques. This would give you multiple benefits like proper shadow casting, culling, and no need for expensive pixel shading or pixel transparency testing.
You probably wouldn't be able to get the same density or long distance views, but that calls for additional visual tricks too.
Note, you kept saying this was and still is commonly used, but apart from a couple of AAA console games and Breath of the Wild clones, I have mostly seen the classic X grass. Does anybody know of other games using this technique (outside of small indie projects)?
Thanks for the tip on overdraw, I can now abuse this in Garry's Mod to pull 100% GPU usage on my friend's 4090 (lol)