Any way to create a grid and not strips? Have you considered a way to possibly do this with a flipbook to remap UVs? I think it can be done. I'd like to create a 10x10 grid.
Hey there! You can do strips in the X and Y axis and then multiply them together to create a grid. The only trouble with a flipbook is that it takes a pre-existing texture atlas and blows each section up to full size, whereas what we're doing is completely modular
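(For anyone curious, a minimal HLSL sketch of the strip-multiply idea above, in the style of an Unreal Custom node. The function and parameter names and the saturate() clamp are illustrative assumptions, not the exact node setup from the video.)

```hlsl
// Build a single cell of a gridSize x gridSize grid by multiplying a U strip with a V strip.
// keyUV would be the "key" UV channel (e.g. TexCoord[2]); cellX/cellY pick the cell to isolate.
float GridCellMask(float2 keyUV, float cellX, float cellY, float gridSize)
{
    // Quantize each axis into gridSize strips
    float2 cell = floor(keyUV * gridSize);

    // 1 inside the chosen strip, 0 everywhere else (saturate clamps far-away strips to 0)
    float maskU = 1.0 - saturate(abs(cell.x - cellX));
    float maskV = 1.0 - saturate(abs(cell.y - cellY));

    // Only the one cell where both strips are 1 survives the multiply
    return maskU * maskV;
}
```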
The shader calculation isn't that much until you start layering tens and tens of materials, and even then it is still up to 10x faster than using separate material slots
Followed along and everything works great! The only bummer is copying and renaming 16 parameters each time I want to add a new one. Is there a way to tell Unreal to increment the parameter name number instead of keeping it the same? For instance, texture 1, texture 2 etc.
It does increase the complexity slightly, but the benefit vastly outweighs the cost. Draw calls are the main bottleneck for rendering times, so this method can be up to 10x faster (not an exaggeration)
Hi Charlie, thanks for the awesome video! Do you think this would be possible with the use of vertex colours and Z-up too in a PBR workflow? Thinking of trying it on a project I'm currently doing :)
Just gotta know where to cut corners! And also there are a lot of Static Switches which are defaulted to "off" so they don't get counted in the shown instruction count in the material graph.
@@PrismaticaDev do you offer a master material that works with this function? The example mat you created is a little hard to follow and I'm currently using Easy Mapper, which would be tough to integrate without help.
What if our assets use multiple *textures*? As I see it, all the pieces of these assets use just one texture (stone). This is just so we can overlay some color over one individual texture? It is still a hell of an improvement for sure, but I am trying to understand the limitations here. Great work and thanks for sharing :) edit: oh, it doesn't work like this. It does work with multiple textures! It is not just for overlaying colors then? So, if I have a huge asset with a lot of individual textures, this will work too? What about the sampler limit?
@@PrismaticaDev So the inputs in the function are Vector 4, but it's throwing an error when I use the setup from the pastebin saying it can't connect Vector 4 to Vector 3 or something like that, so I changed all the inputs to Vector 3 and it worked. Is that fine?
This is great, but only if you are going for a color-driven stylized style, as this doesn't really explain how to deal with the 16-sampler max in a single material. Would be great to get a vid on this subject: "Workaround for 16/16 samplers". I tried to use the UE5 Material Layering system but that has the same limitation - I can only have about 3 layers of materials before I run out of the 16 samplers.
Wait, say I want to use this thing in my character system and I want to have hair, armor, pants, boots and so on - do I have to put them all in 1 Blender file in order to organize their UV atlas map thing?
You could use a secondary UV channel, you could also use Vertex colour which you can assign on import which might be easier. But either way you'll need to have a system figured out beforehand :) Actually now that I think of it, you can edit UV's in Unreal 5 now so that might be worth looking in to!
Soo I have successfully implemented an atlas for my character material & merged the mesh. It has dropped the draw calls from 60ish (I kinda separated the mesh into quite a lot of parts) to about 5, so thank you so much for this video! It's still a bit weird though how there's still 5 - I thought having 1 mesh and 1 material means like 2 draw calls, or is there something else I don't know?
@@masgondi1 awesome to hear!! I'm sure your GPU will thank you haha. The extra draws might be from the lighting pass or from cloth sim if you're using it
Is it normal that every mesh that has an instance of this master material will load every texture that is in the master material? So I reduce the draw calls to one, but the size map of every actor is huge because every texture is a hard reference (disabling the static switch parameters doesn't seem to do anything about the hard references). Did I miss something? Thanks in advance
Is there a way to do this with UDIM slots? I imagine it would be way easier if there was a material node that could point to each UDIM slot - then you would just put each UV piece in its own slot: 1001, 1002, 1003 etc.
You can handle it that way, however this method is a little more flexible as you can have texture UVs that are outside of UV bounds. Although I'm not sure how UDIMs are handled in Unreal, so I might be wrong
Thanks for the reply, still trying to understand UDIM in Unreal - I don't see how to make it into a material atlas.. I have two questions: Does this video's atlas method support tiling down for high pixel density? And if you choose the same slot, would it blend the textures on top of each other? @@PrismaticaDev
@@Retrocaus Yep, you can scale/offset the textures to any size within the material without breaking anything, since we're only fiddling with the primary UV channel, not the "key" UV channel
So if, let's say, my material on section 1 was a 4k tiled cloth material, on the second section I could have a 4k hand-drawn texture in a material and just offset and scale it to overlap the UV in the position I want, right? If so, that just makes UDIMs pretty much pointless lol, makes you wonder why this is not an engine feature.. I see they are adding UV tools to the engine so maybe we will get this built in. @@PrismaticaDev
I don't think Nanite bypasses drawcalls completely - from my understanding, it can "batch" together instances of the same material across different meshes since it essentially merges all meshes in to one GIGA-mesh. But it would be interesting to do some performance comparisons. This method greatly reduces draw time at the cost of shader complexity, so with Nanite it might be more performant (if using the same material across multiple meshes) to use smaller, simpler materials. Performance aside, this method is still extremely useful for ID-mapping and sectioning different parts of an object even when using Nanite
@@PrismaticaDev I did some really basic tests using a mesh with 3 materials vs 1 mesh with 1 material (fancy uv setup). Even using the same more complex 'UV mapped material' for both nanite and non-nanite version. It seems the nanite one is more performant at first glance of stats. I know most people talk about nanite using 'high poly stuff' and LOD aspect of it but my game is low poly, so the performance of material pipeline in nanite isn't talked about as much. I'll continue to investigate but I think it might be more performant, or at least have no performance loss of having independent materials now. If I find anything interesting, I'll post to prismatica discord.
Correct - the textures will be very "low resolution" because it's taking the colour of the texture at that vertex point and using it to colour the vertex, so it's dependent on the density of the mesh. On the topic of the instruction count, the Vertex instructions are orders of magnitude cheaper. Imagine running those instructions for only 1,000 vertices vs potentially 100,000 pixels.
I understand why I need it, and mostly how to integrate it into the MM, but... is it possible to individually combine it with (MF Map Adjustment) for every map, to have a fully parametric material?
2 days later - maybe it's working. I simplified it from 16 to 4 for reasons, but it has a nice parametric and group structure now. It's testing time - I need to remap the tree in Blender. . . PS: Thank you for your creativity.
Can't for the life of me get that MF to work. In UE5 I get `[SM5]: Cannot cast from smaller type float3 to larger type float4.` It seems to think the output of the MF is a Float3, and I assume the base color of my MM is expecting a Float4. Is there a step I'm missing?
what is the "1 - x" node? This is way cool! You've created an infinate texture atlas. I'm an artist, not a coder, but I was able to follow the concept of this and incorporate it in my Cell Shader that uses directonal light. But I got stuck on the "1 - x" node... never mind. I found it on your blueprintue link.... for anyone else that ran into this, the node is called "OneMinus"
As said before, you should really look into texture arrays - this would make this a lot less complex. Anyway - besides this, great information. I also use this technique for a city I am creating and it's really a lifesaver. I do use vertex colors, because it's easier to remap these values if necessary and I only need 1 channel - so it's sufficient. Also I can blend materials more easily :D
I've had a look in to them a few times - definitely a very useful tool, although I'll need to look in to how they're allocated in memory. For example, if only a few of the textures are being used, the entire array is still loaded in memory (probably not the biggest deal) but if a texture needed to be changed as a one-off, would creating another texture array allocate space for ALL of the duplicate textures? From what I've read the texture arrays are more performant because they have a fixed address or something that makes it easier to find and sample, my only concern is the flexibility of it. And also apparently mipping doesn't work with them (not a big issue for me, but could be for others)
@@PrismaticaDev Yes, the texture array is always in memory (as long as it's used) and in my opinion really nice for objects that are static - and there are often a lot of those. For example the city I am working on: the buildings are static so there won't be any changes over time. These buildings are also always present, so the textures were in memory nonetheless. So an array is really suitable and, for me, also pretty flexible. Before doing anything you need a plan for what is needed, but once that step is done it's easy to maintain (drag & drop into the array). In Blender I have a script that converts the materials to vertex colors (every material has an ID and this is applied to the vertex color), and you also get a nice visual representation. About mip mapping - in my experience this is a rumour that is wrong, and maybe was a thing when the arrays were new in Unreal. Yes, by default you don't have mips, but when you go down the options and search for the regular mip map settings, you can turn them on. Tested it, worked. EDIT: Oh, and one of the most important benefits of using texture arrays: my whole city is consuming just two texture sampler slots (four, if you want to blend).
@@aukehuys2297 I can't find the docs or information on texture arrays. Is this an array in the material or a data asset that can be set up and accessed by a material?
Having everything compacted into 1 shader - doesn't that make the shader really heavy to load and to generate? I mean, every frame the GPU has to calculate this heavy shader. If we check Shader Complexity, is it still green, or does it become red or white? :O Thanks!
RGB or any texture keying requires you to have a unique texture for every asset, which then needs its own individual unique Material Instance (so you can set the Texture param). In this method, the data is built directly into the UV data of the meshes themselves. You could also use Vertex Colour, however it's quite difficult to do things like "select all verts with colour between 0.1 and 0.2" if you want to make adjustments like you can with the UV keying method
I do not understand how it doesn't quite matter where you put the islands to UV2 other than just.. somewhere on the section it needs to be in for the lookup. Care to elaborate more on that?
In the same breath as I think of that, if a texture occupies 0-1 space on its UV0, I doubly don't understand how that's going to map to the correct textures.. *confused*
Hey hey!
So as for the first question, the beginning section of our math will quantize the 0-1 space in to equal sections by using Floor, which rounds the value down to the nearest integer. We do this after multiplying it by the number of sections we want.
So let's say we want 10 UV sections, for simplicity's sake. If our shrunken UV island has values like 0.24, 0.25 and 0.26, then we multiply it by 10 (now 2.4, 2.5 and 2.6) and then we Floor it. (Now all the values are 2)
Then if we subtract 2, those values become 0 while all the other values around them are non-zero. We can Absolute them (turn the negatives in to positives) and the 0 values stay at 0. When we OneMinus the result, we get our beautiful bitmask! :)
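(A minimal HLSL sketch of that Multiply -> Floor -> Subtract -> Abs -> OneMinus chain, roughly what the nodes boil down to. The function/parameter names are illustrative, and the saturate() is an added clamp so sections more than one step away don't go negative.)

```hlsl
// keyU = one axis of the "key" UV channel; sectionIndex = which section this layer owns.
float SectionMask(float keyU, float sectionIndex, float sectionCount)
{
    float section = floor(keyU * sectionCount);   // e.g. 0.24, 0.25, 0.26 * 10 all floor to 2
    float dist    = abs(section - sectionIndex);  // 0 inside the wanted section, >= 1 elsewhere
    return 1.0 - saturate(dist);                  // 1 inside the section, 0 everywhere else
}
```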
----------
For the second part, imagine we have 2 groups of vertices: ABC and DEF. Both of these are mapped to a texture in UV0 as per normal and appear to be overlapping.
ABC's UV2 values are in the "3" section of our grid, while DEF's UV2 values are in the "5" section of our grid.
Texture 3 gets put through the "atlas function" to be masked on to triangles that are in the "3" section of our UV2 grid, and appears black on any other triangles.
Texture 5 gets put through the "atlas function" to be masked on to triangles that are in the "5" section of our UV2 grid, and appears black on any other triangles.
We can then sum the results for our desired outcome :)
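(Continuing the ABC/DEF example, a hedged HLSL sketch of masking two textures by their UV2 section and summing them. Texture and parameter names are made up for illustration - the real graph does this per layer inside the atlas function.)

```hlsl
// uv0 = normal texture UVs, keyUV = the UV2 "key" channel, sectionCount = number of grid sections.
float3 BlendTwoSections(Texture2D Tex3, Texture2D Tex5, SamplerState Samp,
                        float2 uv0, float2 keyUV, float sectionCount)
{
    float section = floor(keyUV.x * sectionCount);
    float mask3   = 1.0 - saturate(abs(section - 3.0));   // 1 only on triangles in section "3"
    float mask5   = 1.0 - saturate(abs(section - 5.0));   // 1 only on triangles in section "5"

    // Each texture is sampled with the normal UV0, masked by its section, then summed
    return Tex3.Sample(Samp, uv0).rgb * mask3
         + Tex5.Sample(Samp, uv0).rgb * mask5;
}
```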
To call a sampler only once, people came up with a great invention - texture arrays. =))
@@williamgodwinACAB Texture arrays don't have mip maps
A fun little analogy that we talked about during the premiere:
Let's replace "draw call" with "pizza call". You want to order 10 pizzas!
Would you rather call and order 1 pizza, wait for it to arrive, call up again and order another pizza and repeat 10 times?
Or would you rather order 10 pizzas and have them delivered together? They might take more time to cook and process individually this way but you cut out all the delivery time.
This is basically the problem that the CPU runs in to when telling the GPU what to draw.
Hope that helps explain what's going on behind the scenes! :)
made me hungry
Great analogy! I love it!
Does this work for wings too?
I love seeing smart techniques like these being used
You are an extremely talented solo Game Dev. You learn very quickly and acquire practical skills
If your UVs don't overlap, you can use a mask texture instead. A lot of things don't have that many materials, so red, green and blue give you three material switches. You can combine them and/or use 128 (half intensity) to get more masks.
Another method if you don't want to use another UV map is to use the same technique as UDIM. You move the original UVs by 1, 2, 3, etc. horizontally. Yes, they will be outside the 0-1 range, but that's fine. You can even move them vertically. In the material graph, you use frac on the UV's to get the original UV's back. The integer part is the UDIM tile and you can group texture lookup and whatnot by UDIM tile. The other stuff in the video still needs to be done like merging the materials into one.
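(A hedged HLSL sketch of that UDIM-style idea: the integer part of the UV picks the tile and frac() recovers the original 0-1 UVs. Two tiles and the texture names are just for illustration.)

```hlsl
// uv = the offset UVs (islands moved to 0-1, 1-2, 2-3, ... along U)
float3 SampleByTile(Texture2D TileA, Texture2D TileB, SamplerState Samp, float2 uv)
{
    float  tile    = floor(uv.x);   // which tile the island was moved to (0, 1, 2, ...)
    float2 localUV = frac(uv);      // the original 0-1 UVs

    float maskA = 1.0 - saturate(abs(tile - 0.0));   // islands left in tile 0
    float maskB = 1.0 - saturate(abs(tile - 1.0));   // islands moved to tile 1

    return TileA.Sample(Samp, localUV).rgb * maskA
         + TileB.Sample(Samp, localUV).rgb * maskB;
}
```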
A texture requires another texture sample/lookup, which is a lot more expensive than using some math to separate the sections. I guess the upside is that you can mask between materials on a per-pixel level. You could also use gradient-packing to get 12 masks for an RGBA texture (0, 0.5 and 1 per channel)
As for using a UDIM-style method, it's definitely a banger! I do something like that for my characters which need non-overlapping UV's for their damage painting, and it doubles up as a mask.
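(For the gradient-packing idea mentioned in the reply above, a minimal sketch of pulling one of the three levels (0, 0.5, 1) out of a channel - 12 masks across RGBA. The 0.5 divisor assumes the mask values stay clean, i.e. an uncompressed texture without filtering across level boundaries.)

```hlsl
// Returns 1 where channelValue matches the level, fading to 0 at the neighbouring level.
float LevelMask(float channelValue, float level)
{
    return 1.0 - saturate(abs(channelValue - level) / 0.5);
}

// Example usage: float midMaskR = LevelMask(maskTex.Sample(Samp, uv).r, 0.5);
```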
Charlie has unlocked the first stages of Unreal Ultra Instinct, he's too powerful.
Your hair is glorious, man. Also thanks for all the amazing tutorials!
Huge thanks for everything you're doing. These tutorials are just the right amount of info to not be spoon-feeding. You have really pushed me to rub my brain cells together, and I'm learning so much. Proud to say I got this functioning!
Very interesting stuff. Obviously the general concept of being able to use shaders in these kinds of networks suited to your project... is excellent. I guess at that stage you've essentially built a shader 'tool' for your requirements (i.e. performance and behaviour). It's also useful to see the kind of scales shaders can grow to. Plus... free outsider art
I honestly learned a lot from this video and your "Unlimited Customization" dev log... My understanding of how to apply multiple materials in performant ways has significantly expanded! Keep up the good work, mate!
If you don't mind please can you share the link to the dev blog that you mentioned.. cheers...
@@0rdyin It was dev blog #11 and was about character customization: ruclips.net/video/A-P0llMckSw/видео.htmlsi=am-0shksVqkSW-Ai
This is great! I make all of my own assets, from skeletal meshes and animations to static meshes and whatnot. But I really wanted my game to have a stylized look with a color palette that I've created... So initially I had just a master material to hold my base color and such, with material instances from it for each color in my palette... But that meant that each of my models had a material slot for each different color from the palette they used. The draw calls were getting out of hand the larger my world became. So using this technique to simply organize my palette and have it run on one material slot is going to save my entire project. Thank you!!!
Thank you for making this overview Charlie. This helped bridge the gaps that I was missing from your character overview. Cheers!
This is DOOOOPE!!! Charlie you're Awesome!
yes i have been too lazy to look into draw call reduction perfect timing.
Brilliant! Thx for taking the time to make it. I had to watch it twice to really absorb it all 😅
Dude that is so cool
It is such a hacky way to add variation
It's useful for a lot of things
I only took a glance at this video, but I must comment before I dive into this further. At a glance, it gives me an excellent idea for resolving my multi-purpose road texture. In the beginning, I was trying to develop a way to differentiate between the number of lanes, passing lanes, no-passing lanes, and so forth. I thought I could separate the R-G-B-A channels for masks. However, I needed more than 4 channels, and I could not figure out how to implement that. I wish I could afford supporting your channel, but, if you do not mind, I will certainly spread the word about you and this video. Please let me know if that is okay. Excellent job Charlie!!!!!!!!😁
Haha that's the exact same problem I ran in to - Vertex Colour is sacred and should be saved for more important things. They're also a bit hard to manage if you want to re-order them etc. UV maps are much easier to edit! Thanks for the kind words :)
That looks really cool. Thank you for sharing your experience with us.
This is game-changing :o, thanks a lot! I will be testing this workflow soon :D
Update: I've been adapting lots of assets to this system and it works PERFECTLY in VR with Forward Shading - it's such a game-changing feature! It saved us like half the draw calls in most cases, in some even more than half :D
Thanks for sacrificing 3 hrs of your life to film this, because the amount of time you gifted me by being able to add this into my project is likely going to save months. You're a hero Charlie-boy, thanks as always for your well-presented knowledge dump :)
I was wondering though, when you started your project did the gameplay abilities system exist and do you use it? Would love to hear your thoughts on it, been tryna wrap my head round the whole thing the last two weeks in spare time, was wondering if you had any insights you might be sharing in the future on it at all.
Damn. Always miss the live streams as I'm working. Thanks again chief.
Downloaded that video and will definitely learn from that for my own Models. That's awesome!!❤
Sheesh charlie hitting the new year with some bangers
Man I love this spooky material magic! It took me a bit to get it working with my materials, but man, once it does, it's so neato! I was a little confused by the texture coordinates - so if you're like me and using this with plugging in normal/occ/textures as well as colors - MAKE SURE YOU LEAVE THE TEXCOORD NODE AT 2 (if that's where your atlas is) otherwise you end up with a stripey mess. >.>
Whew. Super cool stuff as always. :)
Haha glad to hear you're digging it!!
@@PrismaticaDev Next I get to figure out if I can get some random color variation options (or get it to work with the custom data colors). I've got it partially but it's only applying to the first texture so I've got some exploring to do! :D
This is like an atlas material on steroids :D
Thanks for sharing your knowledge
Holy hell, what an interesting approach
It's blown my mind.
i just learned so much watching this thank you so much!
This is genius! I am using a similar method, nearly a half-step away from doing just this and I can't believe I never made this connection. Still learning shader stuff though, so my method is a little bit more bone-headed if you will haha.
I also have a separate UV channel, but I fill faces/islands in with R, G, and B to create an RGB mask texture, then in Unreal that RGB mask takes in that separate UV channel as its texcoord, and then I use the RGB channels to lerp out each albedo (which is receiving the traditional UV channel). So unfortunately I only get 3 maximum textures/colors per material instance this way and I need another texture sampler for the RGB mask.
I'm seriously considering shifting to this method! Thanks!
Thanks for this. I saw the other video last week and this clarifies it.
Thanks mate! Very helpful explanation
Maybe it would be good to collect some of the limitations of this technique in the comments.
We tried using this on our foliage, but we use early-z on masked materials and unfortunately this is a bad combination with this setup. The foliage with default Unreal materials (8 different materials per tree) had almost, if not more than, twice the fps of the same foliage with this supermaterial applied. We made this observation both on PC and on Switch, but it really made the biggest difference when we tested on Switch. With the supermaterial, the SceneDepthZ pass made up almost half of the whole rendering pass.
It would be great to get to read some more experiences where this technique actually excelled over the default Unreal method, but also where it absolutely failed like for us.
Thank you anyways for the inspiring video!
That's a very good find! I can see why it would affect the Early Z-pass negatively, since it would be testing against all textures at once
@@PrismaticaDev That's what we thought too, yes
@@riversandwine out of curiosity, are all of your texture samplers set to "shared: wrap or just left default?
@@PrismaticaDev "shared: wrap" due to the amount of texture samplers and because it doesn't seem to have any downsides.
Fantastic idea! This is going to help me a ton in one of my projects. :)
amazing tutorial again!
engagement, via comments.
Thanks though, found you on reddit and i'm learning so much.
This is Still the Best way to do it, In my opinion
Great video! Can you tell me why you pass the atlas function output through Vertex Interpolator on your master material at 29:42, but you didn't need to do that with the example material you were building at 27:50?
Follow-up: How expensive would it be to convert the default value of 0 to an input parameter that sets the default value for all other inputs?
Follow-up-follow-up: Any reason why you put the switch statements before the multiplication and not after? As far as I can tell, putting it after the multiplication would simplify to ignoring the atlas operations for that material. Am I missing something?
0:00 A disney princess lmao 🤣🤣🤣
Great video! Do you advise using a master material for foliage too?
Absolutely - I have a few master materials. Characters, Foliage, Props, Landscape and Trees all have logic that is very different to one another so they all have their own specific material setup.
@@PrismaticaDev Thank you very much, you are the best!
nice functionality
Thanks for the tutorial! I'm a bit confused and wasn't able to follow too well going from master material to material function, but is there a way to use this to go around the 16 texture sampler cap (especially if more texture types are added like roughness, normal, etc)? That list of Material Color at 5:09 is particularly scary. Were these blueprints meant to eventually replace those v4 parameters by textures? Or were they meant to allow us to have some color variety (organized by type/UV clusters) blended into the diffuse map while keeping the same microdetail normals, roughness, metallic, etc? For instance, if I have a bunch of leaves, I use this to have different hues of green, but the leaves would have the same normals/cutout masks, etc? Thanks in advance!
This is my question as well. By default you can only use 16 texture samples in a single material before hitting the limit on the sampler register index. So that would basically invalidate this entire process if you wanted to use full PBR materials for each of the slots. I've read that you can change the Material Expression's Sampler Source to "Shared: Wrap" or "Shared: Clamp" to get around that limit, but I'm unsure if that would be the correct way to address the issue. @PrismaticaDev It would be great to have some clarification on this.
Draw Calls = Demonic !
This is awesome stuff!
But you want to be careful about the quantity of switches.
In Unreal 4, each side of a switch gets compiled into your shader cache as a different variant.
So if you have many many many switches, the worst case being switches under switches, Unreal will bloat out a lot of variants.
Depending on your platform limitations and your cook configuration it might be better to avoid the switch and use Multiply & Add to create the masking effect.
Probably less relevant for Indies and NextGen consoles. But if you do something on mobile, memory limitations are a big deal.
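(A hedged sketch of the Multiply & Add alternative mentioned above: instead of a static switch that compiles a new shader variant per combination, a scalar parameter simply zeroes out unused layers. Names are illustrative. The trade-off is that a disabled layer's instructions still run at runtime, whereas a static switch strips them out at compile time.)

```hlsl
// Accumulate one layer: layerEnabled = 0 contributes nothing, with no extra shader permutation.
float3 AccumulateLayer(float3 result, float3 layerColor, float layerMask, float layerEnabled)
{
    return result + layerColor * layerMask * layerEnabled;
}
```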
Yep, very good point there! Every material instance that has a different switch configuration gets recompiled as a completely different shader. I think finding the balance between having them be broad enough to encompass multiple objects, but also with as low complexity as possible is the key. For example, my “floors” instance had a couple of mostly redundant layers that only appear in 2 assets out of 20 etc
26:49 Can someone explain why he's using TexCoord[2] instead of default? Is there any special setup on channel 2 that I missed?
Hey hey - TexCoord index 2 is being used for our "material mask" channel.
If the Assign Material doesn't work, you should be able to shift-select an object it did work on so that it becomes the active object, and then hit Ctrl + L and Link Materials.
So cool, amazing job! One question though: I'm new to the channel and I haven't heard which platforms you are planning to release the game on? Cause this level of optimization looks like a bit too much for a regular PC? Or am I just inexperienced :D?
Hey there! Just planning on PC currently, although using this method is a big time-saver as well when it comes to designing levels :) And the extra juice is always good to have for some cool effect later on
@@PrismaticaDev Right! That seems logical, thanks! :)
Kind of necroing the comment section here but I got an idea while watching your video... First of all: great work on those tutorials you make! I recently found your channel and I greatly enjoy your videos!
An idea flew through my head for blending between the different materials... another approach would be to create material attributes for each submaterial and then BlendMaterialAttributes them based on the material mask. I have seen you are using a whole bunch of Multiply and Add nodes for all sorts of different textures (diffuse, normals, etc.) - it may be easier to combine them into material attributes and just blend them in one after another (maybe using a simple black material to switch out when not using those, for optimization purposes). That way you can have each submaterial grouped up separately in the material graph and just switch the results based on the mask.
Yep yep! You could absolutely do that - the only reason I opt to do otherwise is since I only tend to use 1 texture per "layer" and the rest is weird colour logic. I'm not sure how MaterialAttributes blends handle empty entries (might be doing a lot of unnecessary operations even if you aren't using specular/roughness) but yeah, for big photo-real materials you sure could :)
@@PrismaticaDev Thank you for your fast reply :D
I finally got around to stress-testing my approach last night... BlendMaterialAttributes does seem to take into account whether the values are actually different from one another - like if you are blending between materials which all have their specular channel set to 0, the instruction count goes down a bit... nonetheless the instruction count is ridiculously high for the blend. I have tried it with an If-node, which seems to work and gets the instruction count down a bit further. You can't Switch MaterialAttributes unfortunately though - it will implicitly cast the whole attributes to a single float value and blow up in your face when you try to use it as input for the MaterialAttributes node.
I have another problem with this... using one texture for each material subslot seems to use a texture sampler each... When I get to a count of 16 material subslots, the material complains about having too many texture samplers. I looked at your video and you seem to have just 4 texture samplers active at once, despite having a texture for each subslot active... how did you do that?
@@Develavid nice finds! Also, for the texture samplers thing - click the texture sample node and check the dropdown on the left hand side and set it to one of the "Shared" modes. It will combine them and allow for 128+ textures per material. It's a must-have for landscapes hahah. I don't know why it isn't the default option since it has 0 downsides, I think it's just for backward compatibility with REALLY old hardware/rendering
@@PrismaticaDev oh MANY many thanks for this tip! You are fantastic!
@@PrismaticaDev I have given this approach a bit more testing... and it seems that when using If-nodes it doesn't matter if the If applies before or after putting together the material. An If seems to use just one shader instruction either way, and it's the same instruction count to do it on the texture as to do it on materials put together with a SetMaterialAttributes node (the MakeMaterialAttributes node doesn't seem to work in Ifs... I have no [swearword] clue why that is)
So it might be even cheaper on an instruction-count basis to create the materials separately and If them at the end - maybe even doing some SetMaterialAttributes afterwards for global things like rain, wind and such. When you set up the StaticSwitchParameter so that you don't even calculate the If when the material is not used, it saves some juice as well. I may put up a pastebin in the near future if I can get myself to it so I can share the stuff - if you are interested, that is.
EDIT: Nevermind, it seems it doesn't matter if you are doing two Ifs which hook into one material or two materials that hook into one If... The instruction count is the same. Apparently it implicitly calculates the If for every material input and even optimizes itself against things that are hooked up the same way in different materials... so it may not even matter how you line it up.
Aw yeah, knowledge
Aw yeah
Damn... this is cool. Still working on my first little boring racing game and i don´t need this but if i will stick with games in the future i will come back for this. :D
Thank you for the insight
another excellent video
Great video. Have you used HLODs? Do HLODs do something similar and generate their own atlas material?
Hey there! I haven't actually looked in to HLODs yet since Prismatica is a top-down game and we don't run in to many LOD issues yet haha. Definitely something I want to cover in the future
Great video, thanks
Hey Charlie, great tut!
Not sure if you know by now, but unreal has a checkbox to combine all meshes on .fbx import.
Hey hey! Yeah I did realise that (a little too late :P )
Nice! Does it work for realistic styles where you need to blend stuff like moss or dirt, or does it only work for stylized?
It works for all styles. You can blend the moss/dirt using Vertex Colour/Paint, which I also have a video on :)
@@PrismaticaDev Cool! Thank you for the video
39:45 lmao you got my like for the content. You got my subscribe right in this moment lmao
Thanks for this, yet I've a few questions.
I'm new to Unreal. I've been doing tutorials on making a game, and am now getting ready to start my first demo. I've been downloading assets from the marketplace and had noticed that some (probably all) have many materials, and knew from other videos that that meant multiple draw calls.
So, tonight I came upon your video. In the beginning of your video it seemed you were going to delete all the materials and make it just one or two. If I understand it, you are creating a stack of paper (i.e. one per material - the atlas) in one master material, is that right? Though if there are more than 16, are you just chucking the extra? Or did you choose 16 because the asset pack used 16? Once done with this new atlas / master material, is that now just one draw call?
Also, is the master material for just this asset pack, or can you use it across other asset packs? Though I'd likely have to go through each pack and do the export-to-Blender thing and back again. And would any of this benefit from C++?
Great tutorial! Some really cool ideas here that I've had a look at myself in some experiments - great to see someone else using similar methods. However, have you explored the material layering system in Unreal? It is almost what you have created here; I'd definitely check it out if you have not seen it. Judging by your performance, it's not too important, but the user experience might be nicer :)
Hey hey! I have looked at it indeed, and it should be implementable using this same UV-island method as the masking/blending method. The only reason I haven't used it myself is because my materials are quite simple in terms of texturing compared to other games and it seemed a bit overkill, and it's also hard to know how well it scales/its performance since it's a bit more under-the-hood
Using material layers, how would you map the relevant faces onto their respective material layer? Say I've got an asset that's a wooden chair with metal legs - how would I tell Unreal that the legs need to use the metal layer and the rest needs to use the wooden one? As I understand it, the layer blending happens using a texture mask, but would that not come down to the same system as this video, still needing the separate UV channel to put the UVs on the relevant part of the mask (say we use 0 (black) for metal and 1 (white) for wood, and put those UVs on either black or white)?
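(Whichever system you pick, the blend described in that question boils down to something like this hedged HLSL sketch - the texture, sampler and function names are made up for illustration, and this is only conceptually what a layer blend does, not Unreal's actual generated code.)

```hlsl
// Hedged sketch: blending two "layers" (wood and metal) by a 0-1 mask.
// Texture, sampler and function names are illustrative.

Texture2D WoodAlbedo;
Texture2D MetalAlbedo;
Texture2D LayerMask;     // black = metal, white = wood (as in the question)
SamplerState LinearWrapSampler;

float3 BlendLayers(float2 uv0, float2 maskUV)
{
    float3 metal = MetalAlbedo.Sample(LinearWrapSampler, uv0).rgb;
    float3 wood  = WoodAlbedo.Sample(LinearWrapSampler, uv0).rgb;
    float  mask  = LayerMask.Sample(LinearWrapSampler, maskUV).r;

    // mask = 0 -> metal legs, mask = 1 -> wooden seat
    return lerp(metal, wood, mask);
}
```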
Thank you for the video! Very interesting! I have a question. Would this method still be convenient even if we use multiple Albedo Textures instead of pure colors for the materials?
It should still work perfectly, and all the Albedo/Normal/RoughnessMetallicAO will be in sync
This material looks quite complex. Is it performance heavy, especially when used with albedo textures?
Other than that, great video!
Cheers!
It's definitely more expensive than a regular shader; however, compared to the cost of even just 1 extra draw call, it's minuscule. Check out my Devlog number 11 about my character material to see a comparison of 4 material slots compared to 1 (it's about a 300% gain)
@@PrismaticaDev I see, I will definitely check that out!
Hey Charlie, would using the UE material layers system be a bit simpler than this? It was used extensively for props in the UE5 Matrix demo. Have you experimented with it? Would be great to get a vid on the advantages/disadvantages.
There are definitely upsides like the modularity, and downsides like the potential for extra performance impact, but at the end of the day both systems require a way to separate the sections of the material, which is the real focus here (using UVs instead of an ID texture or vertex colours)
When making a ruined building model, do you use 1 normal map and the 16 channels just for colour selection, or is there some funky thing that I'm missing?
How do you align the wood grain or the wall brick on the model?
For these assets, they use multiple different textures - some of them are simple tiling textures (like the brick wall pattern) and some of them are baked atlases/high-low bakes. All of the textures are aligned using UV index 0, and then swapping between the textures/colours is done in UV index 1
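(To make that concrete, here's a hedged HLSL sketch of the idea, written as if it were a custom node - the texture names are made up and only two layers are shown for brevity. Each texture is sampled with UV0 as usual, masked by which strip of UV1 the island sits in, and the masked results are summed.)

```hlsl
// Hedged sketch: textures are aligned with UV0, and UV1 picks which
// texture a UV island receives. Names and layer count are illustrative.

Texture2D BrickTexture;
Texture2D WoodTexture;
SamplerState LinearWrapSampler;

// 1 when uv1.x falls inside the given strip, 0 otherwise.
float StripMask(float2 uv1, float numSegments, float sectionIndex)
{
    return 1.0 - saturate(abs(floor(uv1.x * numSegments) - sectionIndex));
}

float3 AtlasColour(float2 uv0, float2 uv1, float numSegments)
{
    float3 result = 0;
    result += BrickTexture.Sample(LinearWrapSampler, uv0).rgb * StripMask(uv1, numSegments, 0);
    result += WoodTexture.Sample(LinearWrapSampler, uv0).rgb  * StripMask(uv1, numSegments, 1);
    return result;
}
```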
Would it be possible to have some textures use world-position mapping while still using the UV channel for layering as you do in your system?
Sure would be! :) You'd just need to add a switch for each "layer" that changes its main UV channel's coordinates
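(A hedged sketch of what that switch could look like in HLSL - names and the tiling value are illustrative. One layer swaps its sampling coordinates from UV0 to a world-position projection while the "key" UV channel used for masking is left alone.)

```hlsl
// Hedged sketch: one layer samples with world-position planar coordinates
// instead of UV0, while the UV1 "key" channel stays untouched for masking.
// Names and the 0.01 tiling value are illustrative.

Texture2D MossTexture;
SamplerState LinearWrapSampler;

float3 SampleLayerWorldAligned(float3 worldPos, float2 uv0, bool useWorldPosition)
{
    // Top-down planar projection: use world XY, scaled to a tiling rate.
    float2 worldUV  = worldPos.xy * 0.01;
    float2 sampleUV = useWorldPosition ? worldUV : uv0;
    return MossTexture.Sample(LinearWrapSampler, sampleUV).rgb;
}
```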
@@PrismaticaDev awesome thanks for the info!
Great video as always man!
A few questions though: isn't it the same as the layered material system?
How does it differ, and is this one more efficient?
What if I want to add an AO+Curvature (packed) texture slot, so I can add dirt and edge damage on top of all that soup? Where would I have to add the slot for the packed texture?
What would be enough is to have one slot per material instance. This slot is unique for THIS mesh only.
Wall1 uses Mat_Wall1 and has WearTexture_Wall1
Wall2 uses Mat_Wall2 which is a copy of Mat_Wall1 but has WearTexture_Wall2.
Obligatory: obviously I can't have 1 draw call every time - not with unique textures anyway. I would also need another UV channel, of course, and bake down and pack all the stuff into one texture, unwrapping properly too just for AO and Curvature (which isn't that hard).
So the main issues here are the extra UV channel and adding the slot for the packed AO+Curvature "mask" texture, which feeds 2 extra materials for dirt and edge damage.
Because as far as I can see you aren't using any AO dirt or edge damage, so it's pretty straightforward - but what if you wanted to? ;D
Ty Charlie
4:55 aight imma head out
KEKW
Great video.
Any way to create a grid and not strips? Have you considered a way to possibly do this with a flipbook to remap UVs? I think it can be done. I'd like to create a 10x10 grid.
Hey there! You can do strips in the X and Y axes and then multiply them together to create a grid. The only trouble with a flipbook is that it takes a pre-existing texture atlas and blows each section up to full size, whereas what we're doing is completely modular
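(In HLSL terms, the strip-times-strip trick looks roughly like this - a hedged sketch with illustrative names.)

```hlsl
// Hedged sketch: a grid mask built from an X strip and a Y strip
// multiplied together. Parameter names are illustrative.

// 1 inside the chosen strip, 0 elsewhere.
float StripMask(float coord, float numSegments, float index)
{
    return 1.0 - saturate(abs(floor(coord * numSegments) - index));
}

// 1 inside grid cell (cellX, cellY), 0 everywhere else.
// For a 10x10 grid, pass numSegments = 10.
float GridMask(float2 uv, float numSegments, float cellX, float cellY)
{
    return StripMask(uv.x, numSegments, cellX) * StripMask(uv.y, numSegments, cellY);
}
```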
@@PrismaticaDev thanks! I just did the vertical strips. Too much trouble to worry about a grid
How would you handle material physics using this technique, specifically in terms of playing different sounds depending on which texture is hit?
There is a paper on Rendering Wounds in Left 4 Dead 2 by Alex Vlachos and it would be interesting to see your take on it.
Sounds like an awesome read! I'll have a look into it :)
I wonder, isn't this kind of mad material logic very CPU costly to run in a game?
Or is that just to prep your assets that all go through baker later?
The shader calculation isn't that heavy until you start layering tens and tens of materials, and even then it's still up to 10x faster than using separate material slots
Followed along and everything works great! The only bummer is copying and renaming 16 parameters each time I want to add a new one. Is there a way to tell Unreal to increment the parameter name number instead of keeping it the same? For instance, texture 1, texture 2 etc.
You could technically write a for loop in an HLSL custom node and declare the variables there, perhaps (something like the sketch below)? I'm not too sure
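(A heavily hedged sketch of what such a custom-node loop might look like - the hard-coded palette and NUM_SECTIONS are purely illustrative, and how well this plays with exposed material parameters is untested.)

```hlsl
// Hedged sketch of a custom-node style loop that picks a colour per UV1 strip,
// instead of wiring up 16 individually named parameters by hand.
// The palette values and NUM_SECTIONS are illustrative.

#define NUM_SECTIONS 4

float3 SectionColour(float2 uv1)
{
    static const float3 palette[NUM_SECTIONS] =
    {
        float3(0.8, 0.2, 0.2),  // section 0
        float3(0.2, 0.8, 0.2),  // section 1
        float3(0.2, 0.2, 0.8),  // section 2
        float3(0.8, 0.8, 0.2),  // section 3
    };

    float3 result = 0;
    for (int i = 0; i < NUM_SECTIONS; i++)
    {
        // 1 for the strip this pixel's island sits in, 0 for all others.
        float mask = 1.0 - saturate(abs(floor(uv1.x * NUM_SECTIONS) - i));
        result += palette[i] * mask;
    }
    return result;
}
```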
Thank you!
Awesome!
But doesn't that greatly increase the complexity of the material, which in turn reduces performance?
It does increase the complexity slightly, but the benefit vastly outweighs the cost. Draw calls are the main bottleneck for rendering times, so this method can be up to 10x faster (not an exaggeration)
Hi Charlie, thanks for the awesome video! Do you think this would be possible with the use of vertex colours and z up too in a PBR workflow? Thinking of trying it on a project I'm currently doing :)
Can someone tell me how to create the "Num of Segments" node in the material editor? It's like a local variable but I don't know what the node is called.
Hello - press "1" and left click. It's a static scalar value
5:06 this huge blueprint has fewer base instructions than my tiny little master material.. 👀🤤🤯
Just gotta know where to cut corners! And also there are a lot of Static Switches which are defaulted to "off" so they don't get counted in the shown instruction count in the material graph.
@@PrismaticaDev Do you offer a master material that works with this function? The example mat you created is a little hard to follow, and I'm currently using Easy Mapper, which would be tough to integrate without help.
Any worry about the materials that get created/compiled when using static switches in an uber shader?
What if our assets use multiple *textures*? As I see it, all the pieces of these assets use just one texture (stone). Is this just so we can overlay some color over one individual texture? It is still a hell of an improvement for sure, but I am trying to understand the limitations here. Great work and thanks for sharing :)
edit: Oh, it doesn't work like this - it does work with multiple textures! So it is not just for overlaying colors then? So if I have a huge asset with a lot of individual textures, this will work too? What about the sampler limit?
Sorry for asking a stupid question, but why does the Num of Segments node look blue?
Blue nodes are usually Material Functions - you can check out my Material Functions video for more info :)
@@PrismaticaDev So the inputs in the function are Vector 4, but it's throwing an error when I use the setup from the pastebin saying it can't connect Vector 4 to Vector 3 or something like that, so I changed all the inputs to Vector 3 and it worked. Is that fine?
This is great, but only if you are going for the same color-driven stylized style, as this doesn't really explain how to deal with the 16-sampler maximum in a single material. Would be great to get a vid on that subject: "Workaround for 16/16 samplers".
I tried to use the UE5 Material Layering system, but that has the same limitation - I can only have about 3 layers of materials before I run out of the 16 samplers.
If you set the Sampler Source to "Shared: Wrap" you can have 100+ texture samples in the shader :)
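(Roughly speaking - and this is a hedged sketch with illustrative names, not Unreal's actual generated code - a shared sampler means many textures reuse one sampler object, and it's the sampler count rather than the texture count that hits the 16 limit with the default per-texture setting.)

```hlsl
// Hedged sketch: with a shared sampler, many Texture2D objects reuse a single
// SamplerState, so the per-stage sampler limit stops being the bottleneck.
// Names are illustrative.

SamplerState SharedWrapSampler;   // one sampler shared by every lookup below

Texture2D AlbedoA;
Texture2D AlbedoB;
Texture2D AlbedoC;
// ...potentially dozens more Texture2D declarations, still only one sampler.

float3 SampleMany(float2 uv)
{
    float3 colour = 0;
    colour += AlbedoA.Sample(SharedWrapSampler, uv).rgb;
    colour += AlbedoB.Sample(SharedWrapSampler, uv).rgb;
    colour += AlbedoC.Sample(SharedWrapSampler, uv).rgb;
    return colour / 3.0;
}
```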
@@PrismaticaDev 👀oh
Wait, say I want to use this thing in my character system and I want to have hair, armor, pants, boots and so on - do I have to put them all in 1 Blender file in order to organize their UV atlas map thing?
You could use a secondary UV channel, or you could use Vertex Colour, which you can assign on import and might be easier. But either way you'll need to have a system figured out beforehand :)
Actually, now that I think of it, you can edit UVs in Unreal 5 now, so that might be worth looking into!
@@PrismaticaDev thanks kanye, very cool
Soo I have successfully implemented an atlas for my character material & merged the mesh. It has dropped the draw calls from 60-ish (I kinda separated the mesh into quite a lot of parts) to about 5, so thank you so much for this video! It's still a bit weird though how there are still 5 - I thought having 1 mesh and 1 material meant like 2 draw calls, or is there something else I don't know?
@@masgondi1 awesome to hear!! I'm sure your GPU will thank you haha. The extra draws might be from the lighting pass or from cloth sim if you're using it
Why not use separate UV layers for materials, or UDIMs?
Is it normal that every mesh that has an instance of this master material will load every texture that is in the master material? So I reduce the draw calls to one, but the size map of every actor is huge because every texture is a hard reference (disabling the static switch parameters doesn't seem to do anything about the hard references). Did I miss something? Thanks in advance
Is there a way to do this with UDIM slots? I imagine it would be way easier if there was a material node that could point to each UDIM slot - then you would just put each UV piece in its own slot (1001, 1002, 1003, etc.)
You can handle it that way; however, this method is a little more flexible as you can have texture UVs that are outside of the UV bounds. Although I'm not sure how UDIMs are handled in Unreal, so I might be wrong
Thanks for the reply. Still trying to understand UDIMs in Unreal - I don't see how to make them into a material atlas.. I have two questions: does this video's atlas method support tiling down for high pixel density? And if you choose the same slot, would it blend the textures on top of each other? @@PrismaticaDev
If it supports tiling then I think this video's method is better, because virtual textures in Unreal are more expensive.
@@Retrocaus Yep, you can scale/offset the textures to any size within the material without breaking anything, since we're only fiddling with the primary UV channel, not the "key" UV channel
So if, let's say, my material on section 1 was a 4K tiled cloth material, on the second section I could have a 4K hand-drawn texture in a material and just offset and scale it to overlap the UVs in the position I want, right? If so, that just makes UDIMs pretty much pointless lol - makes you wonder why this is not an engine feature.. I see they are adding UV tools to the engine, so maybe we will get this built in. @@PrismaticaDev
Can you put the link for the Blender add-ons in the description?
Niiiice!
You are!
oof, Runescape nostalgia. I need to sit down for a minute.
With Nanite now bypassing draw calls completely, I wonder if it's still useful to do material atlasing like this?
I don't think Nanite bypasses draw calls completely - from my understanding, it can "batch" together instances of the same material across different meshes since it essentially merges all meshes into one GIGA-mesh. But it would be interesting to do some performance comparisons. This method greatly reduces draw time at the cost of shader complexity, so with Nanite it might be more performant (if using the same material across multiple meshes) to use smaller, simpler materials.
Performance aside, this method is still extremely useful for ID-mapping and sectioning different parts of an object even when using Nanite
@@PrismaticaDev I did some really basic tests using a mesh with 3 materials vs 1 mesh with 1 material (fancy UV setup), even using the same more complex 'UV mapped material' for both the Nanite and non-Nanite versions. It seems the Nanite one is more performant at first glance of the stats.
I know most people talk about Nanite for 'high poly stuff' and the LOD aspect of it, but my game is low poly, so the performance of the material pipeline in Nanite isn't talked about as much. I'll continue to investigate, but I think it might be more performant, or at least have no performance loss from having independent materials now. If I find anything interesting, I'll post to the Prismatica discord.
@@namrog84 Would love to hear how it scales as the scene gets more complex and more different materials/meshes are added. Cheers!
Can you tell me how you work with the Vertex Interpolator node? My textures break and the shader only gets heavier.
Correct - the textures will be very "low resolution" because it's taking the colour of the texture at each vertex and interpolating it across the triangle, so it's dependent on the density of the mesh. On the topic of the instruction count, the vertex instructions are orders of magnitude cheaper - imagine running those instructions for only 1,000 vertices vs potentially 100,000 pixels.
Hey! You should look into texture arrays - they would simplify this a lot.
Can you elaborate on this more please? :)
This is now broken in UE5 - any possibility of an update?
I understand why I need it, and mostly how to integrate it into the master material, but... is it possible to individually combine it with (MF Map Adjustment) for every map, to have a fully parametric material?
2 days later: maybe it's working. I simplified it from 16 down to 4 for reasons, but it has a nice parametric and group structure now. It's testing time - I need to remap the tree in Blender...
PS: Thank you for your creativity.
Nice
Can't for the life of me get that MF to work. In UE5 I get `[SM5]: Cannot cast from smaller type float3 to larger type float4.` It seems to think the output of the MF is a float3, and I assume the base color of my MM is expecting a float4. Is there a step I'm missing?
Did you ever figure this out?
Take the float3 and run it through an AppendVector node to add the missing fourth channel
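(In HLSL terms, that append is just adding a fourth channel - a tiny hedged sketch with illustrative names.)

```hlsl
// Hedged sketch: appending a fourth channel to a float3 colour so it can
// feed an input that expects a float4.
float4 AppendAlpha(float3 baseColour)
{
    return float4(baseColour, 1.0); // equivalent of an AppendVector node
}
```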
What is the "1 - x" node? This is way cool! You've created an infinite texture atlas. I'm an artist, not a coder, but I was able to follow the concept of this and incorporate it into my cel shader that uses directional light. But I got stuck on the "1 - x" node...
Never mind, I found it on your blueprintue link... for anyone else that ran into this, the node is called "OneMinus"
As said before, you should really look into texture arrays - they would make this a lot less complex.
Anyway - besides this, great information. I also use this technique for a city I am creating and it's really a lifesaver. I do use vertex colors, because it's easier to remap these values if necessary and I only need 1 channel - so it's sufficient. Also I can blend materials more easily :D
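(A hedged sketch of how that can look in HLSL - names, the slice count, and the ID encoding are all illustrative: one vertex-colour channel carries the material ID, which becomes the slice index of a Texture2DArray.)

```hlsl
// Hedged sketch: a Texture2DArray where one vertex-colour channel encodes
// the material ID, used as the array slice index. Names and slice count
// are illustrative.

Texture2DArray MaterialAlbedoArray;  // e.g. 16 slices, one per material
SamplerState LinearWrapSampler;

float3 SampleByVertexColourID(float2 uv0, float4 vertexColour, float numSlices)
{
    // Assumes the ID was baked into the red channel as id / (numSlices - 1).
    float slice = round(vertexColour.r * (numSlices - 1.0));
    return MaterialAlbedoArray.Sample(LinearWrapSampler, float3(uv0, slice)).rgb;
}
```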
I've had a look into them a few times - definitely a very useful tool, although I'll need to look into how they're allocated in memory. For example, if only a few of the textures are being used, the entire array is still loaded in memory (probably not the biggest deal), but if a texture needed to be changed as a one-off, would creating another texture array allocate space for ALL of the duplicate textures? From what I've read, texture arrays are more performant because they have a fixed address or something that makes them easier to find and sample; my only concern is the flexibility. And apparently mipping doesn't work with them (not a big issue for me, but could be for others)
@@PrismaticaDev Yes, the texture array is always in memory (as long as it's used) and in my opinion really nice for objects that are static - and those are often the majority.
For example, the city I am working on: these buildings are static, so there won't be any changes over time. Also, these buildings are always present, so the textures were in memory nonetheless. So an array is really suitable and, for me, also pretty flexible.
Before doing anything you need a plan for what is needed, but once that step is done, it's easy to maintain (drag & drop the array).
In Blender I have a script that converts the materials to vertex colors (every material has an ID and this is applied to the vertex color), and you also get a nice visual representation.
About mip mapping - in my experience this is a rumour that is wrong, and maybe was a thing when the arrays were new in Unreal. Yes, by default you don't have mips, but when you go down the options and search for the regular mip map settings, you can turn them on. Tested it, worked.
EDIT: Oh, and one of the most important benefits of using texture arrays: my whole city consumes just two texture sampler slots (four, if you want to blend).
@@PrismaticaDev Does this mean that with your method, if a material slot is not being used, those textures won't be loaded into memory?
@@aukehuys2297 I can't find the docs or information on texture arrays. Is this an array in the material or a data asset that can be set up and accessed by a material?
Having everything compacted into 1 shader - isn't that making the shader really heavy to load and to render? I mean, every frame the GPU has to calculate this heavy shader. If we check Shader Complexity, is it still green or does it become red or white? :O Thanks!
It’s heavier, but the cost of sending multiple draw calls is much much more costly :)
@@PrismaticaDev oh ok thanks :D didn't know it was much more costly to have multiple draw calls. Thanks dude, still learning! :P
Why not just use an RGB map and 1 main master material & instance?
RGB or any texture-based keying requires you to have a unique texture for every asset, which then needs its own individual Material Instance (so you can set the Texture param)
In this method, the data is built directly into the UV data of the meshes themselves. You could also use Vertex Colour; however, it's quite difficult to do things like "select all verts with colour between 0.1 and 0.2" if you want to make adjustments, like you can with the UV keying method
@@PrismaticaDev Ah that makes sense. Thanks for the in-depth response 🫂