I was interested in the UV space stuff before but this is actually a literal game changer for me
You're such a genius with these!
He cooked once more, good one!
Super cool, still somewhat complex but really really cool. I have rendered out textures to be used in realtime shaders (like complex noise) this way for years, but never anything to this extent.
That's actually a really smart way to bake in Eevee, thank you!
this looks like it could be immensely useful!
This is a game changer!🎉
If you add a few other nodes (edge angle, face area, mesh island) you can bake other useful maps.
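For reference, the quantities those nodes expose can be sketched outside Blender in a few lines of plain Python (this is just the underlying math, not Blender's implementation): face area from a cross product, and the edge angle as the angle between the normals of the two faces sharing an edge.

```python
import math

# Sketch of what Face Area and Edge Angle compute:
# triangle area via a cross product, edge (dihedral) angle
# via the angle between the two adjacent face normals.

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def norm(a):
    return math.sqrt(sum(x * x for x in a))

def tri_area(p0, p1, p2):
    # half the magnitude of the cross product of two edges
    return 0.5 * norm(cross(sub(p1, p0), sub(p2, p0)))

def edge_angle(n1, n2):
    # angle between the normals of the two faces sharing an edge
    cos_a = sum(x * y for x, y in zip(n1, n2)) / (norm(n1) * norm(n2))
    return math.acos(max(-1.0, min(1.0, cos_a)))

print(tri_area((0, 0, 0), (1, 0, 0), (0, 1, 0)))          # 0.5
print(math.degrees(edge_angle((0, 0, 1), (0, 1, 0))))     # 90.0
```

Rendering either value as a color in front of the UV layout bakes it into a map, same as any other attribute.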
Yes! And high res to low res works if you have the same UV map (i.e., the high res was made from the low res). Otherwise you need to transfer UVs from low to high, which can get tricky.
absolutely incredible. thanks dawg
😎
Awesome video, just binged a lot of your videos and I am very impressed. Question: in your video on clean toon shading with object normals, how do you bake the normals to a UV map? I tried following the Customizing Normals series but gave up; geometry nodes are not my cup of tea. I would appreciate the advice.
Great tutorial. Is it possible to bake shader normals directly to the mesh normals using this method in Eevee?
If they are based on a Normal saved to vertex data with geonodes, yes. The Normal from the Geometry node in the shader will be based on the mesh's current state in world space, so that doesn't work. If it's an attribute, it will. If you want to bake with a Normal Map too, then you'll need to make a UV Tangent in geonodes as well (see my normal mapping video).
Can we use this to bake from another object (eg. high poly) with different/overlapping/no UVs down to the unwrapped low poly object? If so I would love a video or even example file of how that could work!
This method relies on the meshes sharing a UV space, so that when you render a texture on one, it can be used on another. You can always transfer the UVs from your low poly to your high poly mesh via Nearest Surface or raycasting. Or another method would be to take the low poly object, subdivide it until it's as dense as the high poly (non-smooth subdivision, not Subsurf), transfer the attribute you want to bake via Nearest Surface, then bake to itself.
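At its core, that Nearest Surface transfer step is a nearest-point attribute lookup. A minimal Python sketch of the idea (simplified to nearest-vertex rather than true surface interpolation; the data and helper name are made up for illustration):

```python
# Sketch of attribute transfer by nearest point.
# Blender's Nearest Surface mode interpolates across faces;
# this simplified version snaps to the nearest source vertex.

def transfer_attribute(src_points, src_values, dst_points):
    """For each destination point, copy the attribute value of the
    nearest source point (squared-distance comparison)."""
    out = []
    for p in dst_points:
        best_i = min(
            range(len(src_points)),
            key=lambda i: sum((a - b) ** 2 for a, b in zip(src_points[i], p)),
        )
        out.append(src_values[best_i])
    return out

# Low poly vertices carry an attribute to bake (e.g. a mask):
low_points = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
low_values = [0.0, 1.0, 0.5]

# Points of the dense (subdivided) mesh pick up the nearest value:
dense_points = [(0.1, 0.0, 0.0), (0.9, 0.1, 0.0)]
print(transfer_attribute(low_points, low_values, dense_points))  # [0.0, 1.0]
```

In Blender itself this would be the Sample Nearest Surface node (or the Data Transfer modifier) rather than hand-rolled code; the sketch just shows why the transfer gets noisy when the two meshes don't fit closely.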
@@aVersionOfReality Just FYI, I wrote and maintain a baking extension for Blender, so I'm always interested in ways to get around the limitations of Cycles baking. OSL can help a bit, but those shaders execute so slowly. I'm currently writing an extension to bake animation data, and I'm looking to use Eevee, because while Cycles is quick enough for a couple of textures, it's way too slow when you want a few thousand frames!
@ Ah, yeah, makes sense
Wait so you are telling me we can use grease pencil to bake textures? This is awesome!
Interested in seeing some low poly stuff.
You can bake anything you can render in front of the UV layout. The catch is that you need UVs that will work for it. Although I suppose you could also use multiple layers of decals to do it, hmm. Will think about that.
thank you i learned a lot from this
In the next video, is it possible to show the node setup of your geonodes Corrective Smooth implementation paired with mesh decals? Preserving the mesh shape gave me headaches when I tried making my own decal system, and I'm too dumb to figure out how to get the mesh tangent information in geo nodes ;(
Probably not in the next video, but I am going to get to Corrective Smooth eventually. It's just very complicated, and I'm trying to build up through less complex stuff that introduces similar setups. So it'll probably be mesh projection, then Surface Deform, then Corrective Smooth.
For baking the normals with this method, what color setting do I use in the color management tab? Standard, Agx, Filmic?
Always Standard, as everything else applies a color correction. And when saving the Normals, you want to use .png with Non-Color, or .exr (which will be Linear), so not sRGB. And generally avoid Save as Render. You should be able to see the settings on my File Output node.
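A small Python sketch of why the sRGB curve corrupts baked normals: a normal component in [-1, 1] is packed into [0, 1] for storage, and if the file applies the sRGB transfer function on top of that, the stored values no longer round-trip (the formula below is the standard sRGB OETF; the example numbers are just illustrative).

```python
# A normal component n in [-1, 1] is packed as (n + 1) / 2 for storage.
# Saving with the sRGB transfer curve distorts those packed values.

def pack(n):
    """[-1, 1] -> [0, 1] for storage in an image channel."""
    return (n + 1.0) / 2.0

def unpack(v):
    """[0, 1] -> [-1, 1] when reading the map back."""
    return v * 2.0 - 1.0

def srgb_encode(x):
    """Standard sRGB transfer function (OETF)."""
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1 / 2.4) - 0.055

n = 0.0                              # a flat normal component
stored_linear = pack(n)              # 0.5, round-trips exactly
stored_srgb = srgb_encode(pack(n))   # ~0.735 if the curve is applied

print(unpack(stored_linear))         # 0.0, correct
print(unpack(stored_srgb))           # ~0.47, badly skewed
```

The same reasoning applies to any data map (roughness, masks, displacement): store it linearly / Non-Color and only let the view transform touch actual beauty renders.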
Would it be possible to use an already existing image texture with this? I saw that you put the color on after setting everything up, and I don't know if I can put an already painted texture on. I wasn't planning on having to bake the shader originally, but plans changed.
Yes, that should work fine as long as the texture matches the UV map you unwrap to.
@@aVersionOfReality Aight, Thanks.
Does it work "backwards"? I wanna bake a procedural texture I created. I know in Cycles it's simply a matter of selecting which map to bake from a dropdown.
Only if the texture is UV mapped. Then it should be the same on both meshes.
thanks for helping me
Do you think it's possible with GN to take a second body mesh and flatten it to the UVs of the first body mesh?
Say I have two body meshes with different topology, different UVs, and different texture maps,
and I have shrinkwrapped the second to fit the first perfectly.
I want to take the texture of the second and use it on the first. Through GN, could I edit the second body mesh to flatten out to the UVs of the first? Something that takes the UV seams of the first and slices them onto the second body mesh (making new edges), then flattens the second mesh to the same shape the first has when it's flattened. Then we could bake the second mesh's texture to the first's UVs and use that baked texture on the first mesh.
I hope I explained that well enough.
Also wondering if we could make a sort of auto body-mesh-fit system.
If both body meshes' vertex groups are named the same, GN could move the geometry using each vertex group as a selection, so each section of the body mesh moves and rotates to match the other body mesh, then shrinkwrap the result.
Do you have Discord? I wonder what your thoughts are, as you seem very dedicated to figuring out complex things like this :D
I need something like this, as I have several assets and I like the mesh of one but the textures of another, so a nice procedural way of converting assets would be epic.
You can transfer the UV map of one mesh to another, and then unwrap to it. This can get complicated when Seams don't line up well, or when there's transfer distortion from not having a clean fit of one mesh to the other. But those things can be worked around.
And yes you can make a fit system like that. If you make Surface Deform in GN, you can then also mask it in various ways.
You can find me in the BlenderNPR.org discord.
Hey man! :) Did you see the new Ray Portal BSDF in 4.2? With this node you can bake textures in Cycles. Theoretically even high-to-low-poly bakes.
Unfortunately no one has figured out the object-to-tangent-space normal conversion yet. I tried the MikkT space within the compositor, but had no luck. This might be a challenge for a bright guy like you.
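The core of that conversion is just a change of basis: given a tangent T, bitangent B, and surface normal N forming an orthonormal TBN frame at a point, the tangent-space normal is the object-space normal projected onto each axis. A minimal sketch (the example vectors are made up; real tangents come from MikkTSpace or the mesh's UV tangents, which is the hard part being discussed here):

```python
# Convert an object-space normal into tangent space by expressing
# it in the orthonormal TBN basis (one dot product per axis).

def to_tangent_space(n_obj, T, B, N):
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    return (dot(n_obj, T), dot(n_obj, B), dot(n_obj, N))

# A toy basis where the surface normal points along +Z:
T = (1.0, 0.0, 0.0)
B = (0.0, 1.0, 0.0)
N = (0.0, 0.0, 1.0)

# An object-space normal equal to the surface normal maps to
# (0, 0, 1), the "flat" blue of a tangent-space normal map:
print(to_tangent_space((0.0, 0.0, 1.0), T, B, N))  # (0.0, 0.0, 1.0)
```

The per-pixel math is easy; the difficulty raised in the comment is getting a tangent basis in the compositor that matches what the target engine (MikkTSpace) computes, since any mismatch shows up as shading seams.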
Yes the ray portal node is great! I'm not familiar with the object to tangent space issue. Do you have a link?
@@aVersionOfReality Mmh, I think YouTube doesn't like my link. Did you receive the message?
@@YT775 Hmm, no. You can send it to my email or DM me on Twitter. Info is on the YouTube About page.
Good on ya
Brilliant. But what about the influence of Eevee's sampling?
What do you mean exactly?
@@aVersionOfReality Eevee rendering is not so precise.
@@waltage Ah yes. That matters a lot when it matters, but often it does not. And of course this same method can be used with Cycles too; then Eevee is just a preview.
you saved my life
We can turn samples to zero for baking... you don't need samples in most scenarios.
Yes. But there are plenty of other issues that often come up. I'm not saying Cycles can't be used or always has problems, though. And this method has downsides too.
Oh thank god there's a solution for those seam lines. I was racking my brain over that problem for at least a week.