Most likely you will be using the Multiresolution modifier for sculpting. Use the Cycles render engine, go to the bake settings, and check the "Bake from Multires" checkbox. Switch the bake type from Normals to Displacement. From there, use a 2.80 bake tutorial. Note: for the time being there isn't a basic displacement bake yet, nor baking of textures in Eevee.
Considering how rarely baking is updated in Blender, and that it's missing a lot of useful baking functionality, I'd suggest baking somewhere else. Marmoset Toolbag is excellent and Substance Painter is decent enough, for example.
How would you use this as part of a full workflow with a program like Substance Painter? For example, say I have a high-res character bust that I sculpted in ZBrush, and a low-res retopologized model I made in Blender. The high-resolution mesh has low-to-mid frequency details sculpted in: small forms/muscles, wrinkles, etc. From there I would want to bake a displacement map and bring it into Substance Painter to both create the skin materials and add the finest wrinkles, pores, bumps, etc. (overkill to sculpt those in ZBrush imo). From Painter I then need to export texture maps and either a very detailed displacement map with all the high-frequency detail, OR a displacement map with low-mid frequency detail + a normal map with the high-frequency detail. This then needs to be added to the model in the main 3D app (here Blender) and display the combined details properly. There is very little info out there on how to go through the full workflow properly; do you know a way to get good results reliably?
Not sure if this is 100% correct, but here's what I did that (mostly) seems to work:
- Export your map and base model from ZBrush, likely after reprojection onto the remeshed topology.
- In Blender, plug the .exr into a Displacement node (found under the Vector category), and the output of that into the Displacement input of the Material Output node. Make sure your midlevel and scale are correct.
- In Painter, set up your low- and high-poly meshes and bake your normals.
- I put random normal stamps all over the mesh here to add details other than the baked normals.
- Export your maps and plug them into the shader network (through a Normal Map node, found in the Vector category).
Both the Displacement node and the Principled BSDF have a Normal input and I'm not 100% sure which is correct; however, the input on the Displacement node seems to give weak normals, so I think it's the other one, but I'm not sure (I checked the Blender docs and the description for that input is just blank *facepalm*). The surface jiggles a little if you toggle between just the displacement and the displacement + normals, so I'm not sure if this is the right way to do it, but the extra high-frequency details from the normal map get overlaid on top of the displaced geometry without over-exaggerating the low-mid frequency details present in both maps. If anyone knows more about this, please share.
So, does anyone else have Blender crash on render with displacements? I have a character face, low (medium) poly, and a displacement map. I followed all the steps, but Blender 3.1 crashed every time.
Thanks for this. I've been doing it the normal way, using the 2.79 nightly build that has a displacement node in Cycles, and was wondering how they carried that over to 2.8. Also, is there a chance you could do a UDIM tutorial when the new update comes out?
Are you guys beginning to transition professionally to Blender? I know you've only been using it extensively for a month or so, but what's your initial verdict? Is it indeed time to look past Maya and move here instead?
No, we're not. I really don't see that happening anytime soon. You have to think about why: from a production point of view, you would have to change your entire pipeline, rewrite all your plugins, and retrain all staff to use Blender. And ultimately you have software that's still lacking basic features. Don't get me wrong, Blender has so many cool things to offer, and the openness of it makes it really accessible. But ultimately, no.
If you're referring to the sculpt overall, that looks like the one they made a couple videos back in Blender. They may have finished it in ZBrush, or maybe retopologised it, I'm not sure.
@@charltonrodda Just sculpts in general. I saw a sculpt that they did in Blender recently which looked to be of ZBrush quality, certainly better than the model used here.
Never EVER use smooth UVs out of ZBrush (unless Blender smooths the UVs the same way ZBrush does at render time?). Standard production practice is not to use smooth UVs in the Multi Map Exporter. Using them will result in ugly seams around the UV borders, because the software used for rendering does not subdivide the model the same way as ZBrush; the algorithm is different.
It actually depends on the pipeline. Some studios do use it from time to time. When you're working for yourself, you're probably better off never using it like you say.
I'm really impressed with how quickly you guys have gotten up to speed in blender. Valuable contributions to the community. Thanks!
Thanks a lot Adam! We've spent hundreds of hours in Blender over the last month which have been very focused, so we're glad it's working
Awesome tutorial.
Also want to put this here in case someone has these issues for any reason.
1) If you import your model and its scale is set to 0.01 (or anything other than 1), and you happened to apply scale - Simply set the Strength value on your displacement modifier to 0.01 to get the correct result
2) If your height map was a PNG or something, by default its Color space would be set to sRGB. Make sure you set it to Raw/Non-Color
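Both tips above come down to small numeric facts you can check outside Blender. A quick Python sketch (the function names are mine for illustration, not Blender API calls; the sRGB formula is the standard decode):

```python
def compensated_strength(strength, applied_scale):
    """Tip 1: the Displace modifier works in object space, so if a 0.01
    scale was applied to the mesh, the Strength must shrink by the same
    factor to keep the same world-space displacement height."""
    return strength * applied_scale

def srgb_to_linear(v):
    """Tip 2: if a PNG height map is left set to sRGB, Blender decodes
    each texel with this curve before displacing."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

print(compensated_strength(1.0, 0.01))   # 0.01
# Mid-grey 0.5 (the usual midlevel) decodes to ~0.214, so the whole
# surface sinks inward instead of staying put -- hence Raw/Non-Color:
print(round(srgb_to_linear(0.5), 3))     # 0.214
```

The second print is the whole reason for tip 2: a midlevel that should read as "no displacement" becomes a negative offset once the sRGB decode is applied.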
Thanks guys, you have no idea how much your tutorials have been helping me. Thanks a lot!!
I subscribed because I love the way that there are two of you guys. When there is only one person, it kinda feels like you're watching a video, but when there are two of you, it feels more like a conversation that I'm in.
I'm an instant fan of you guys. Blender 2.8 was a shock to my system, but damned if you guys haven't helped me immensely. I'm a hobbyist who started with Blender in 2014; it shocked me then too. Graduated Full Sail in 2006 for CA. Maya was it back then. Blows me away how much the workflow has changed. People getting into it today don't have a clue how hard it used to be. Love you guys.
Whoa!! The timing... I just completed my character sculpt in ZBrush this evening and was wondering how I can get all the details from ZBrush to Blender after retopologizing it in Blender. Thanks guys! You saved me time. Looking forward to more tutorials. :)
Perfect timing :D We'd love to see your character
@@FlippedNormals Thanks for your awesome support, guys! 🥰 It's my first complete character project and I don't know how weird it's gonna be. I'm trying to retopologize everything and am going to texture it with Substance Painter for the very first time, but I feel pretty confident, thanks to the 'Introduction to Substance Painter' course on the FlippedNormals marketplace. There are many digital resource websites, but most of them don't assure the quality of each and every product; all the waste is dumped there, and it's so hard to find a good course. FlippedNormals, on the other hand, has only quality resources, which makes it worth saving some money to purchase some of the courses.
And also, thank you so much for making this much high-quality content and making it available to the community for free through YouTube! We're looking forward to seeing more awesome content on this channel. Thank you!
This has been incredibly useful for me to understand. An hour ago I had no idea about anything involving retopology! This, plus a couple of your other videos, have been a godsend. Amazing work!
Amazing work in these tutorials. One of the main reasons I was reluctant to move to Blender was the integration with programs like ZBrush.
Thanks for the tutorial, I learned a lot from it, love how clear it is!
You're very welcome!
I agree as well. Please make a tutorial on baking maps from a high-res sculpt.
Thanks, lads! Very useful!
great video, and it works! i remember this process being so difficult for me before. thanks guys!
Whoaa zbrush has some pretty fancy animations for going into UV mode and back
They take like 3 seconds, not worth it at all.
Real UV mapping is coming in later updates; this is just a UV preview. You can use UV Master to skip all of this, and it does a decent job on organic stuff.
Awesome video! Would be great to have a quick tip video on using your skin alphas in Blender (I'm really struggling to make them work properly).
Create a texture node + image, go into Texture Paint mode (making sure the texture editor has your newly created image), then set your brush texture to the alphas you're referring to. Your texture node can then be linked to whatever channel you need to affect (i.e. bump, displacement, roughness, etc.).
Use a mix node if you have more than one layer of texture, i.e. freckle layers, discoloration, and so on.
@@aiamfree Hey man! It appears YouTube never notified me of your reply. So sorry for the late response, and thank you a ton! Gonna try that tomorrow :D (I kinda gave up on it after a while)
Will you guys make a tutorial on the workflow below?
Sculpt hi-res in Blender 3.5 > retopo > bake a displacement map from the hi-res sculpt > apply the displacement map to the retopo mesh > render with Cycles.
It's better if you use material vector displacement combined with micro-displacement. This makes it possible to get much higher and more accurate detail.
They mentioned they're not doing it in the shader because vector displacement isn't supported by EEVEE
@Matt Curtis Yet they end up using Cycles anyway.
Can you explain a bit about it? I see where to export from ZBrush. Can you write out the settings and the node setup? I'm totally interested in this.
hope we get to see a retopo video for this guy
You bet!
@@FlippedNormals If I were you, I would wait for 2.81, because there are some very interesting updates coming to the Poly Build tool!
t.co/S462pP3cuJ
Pablo Dobarro is doing an amazing job with the sculpting workflow ♥️
Client: "I need a 3D model of a head stung 1000 times by bees, can you model this for me?"
Artist: "sec" 8:03
Thanks that was really practical!
the question that has been on my mind is:
could we make the displacement map inside Blender instead of ZBrush?
Hello @FlippedNormals. I have a problem with baking displacement for Blender. I made geometry using DynaMesh, then used subdivision for more details. After that I copied the high-poly mesh and ZRemeshed it for the low-poly version, then used subdivision again to project details from the high poly onto the low poly. Next I tried to bake displacement, but there's a problem. When I exported an .exr displacement for Blender it didn't work at all: it was either a black image or had only a little information (and in the wrong direction). I tried exporting a TIFF as displacement; it works better but still not well. It generates information on the mesh, but there were mesh artefacts because of it. I tried this map in Substance Painter and it worked well. I don't know why the problem only occurs in Blender. Could you tell me something about that? Thanks. Regards.
This is great! But a video about how to get the model UV unwrapped in ZBrush would be cool!
Do you guys knows how fucking much I love you ?
Thank you!
Amazing stuff, by far the best approach to explaining things without confusion!
Just wondering, did you choose Blender over Cinema 4D for any reason? I've been using Cinema 4D for a few years now, but I'm seeing more and more people opting for Blender, and I'm noticing tools that don't even seem to be available in Cinema.
Thanks again!
Is there a video on the whole UV border issue?
Like, ZBrush exports displacement maps with purely frozen UV borders, while the Subdivision Surface modifier can only do linear or smooth borders.
Are the seams on this just hidden very well or how did you get around that?
Nice tutorial. My question is: does it matter which subdivision level I export from ZBrush? You took subd 1, but in Blender it shows every detail up to subd 6, so it seems like it doesn't affect it that much, right?
Amazing tutorial! I wonder, if you have GoZ for Blender, wouldn't this speed up the import/export workflow? Cheers!
So I created a horn with a VDM brush that I made, applied it to a mesh with UVs and subdivision, generated a displacement map like this, and used it in Blender. The horn points in the wrong direction: it comes out as a straight spike when it should curve like a rhino horn. What did I do wrong? Please help.
@FlippedNormals So you showed how to get one subtool into Blender. Great. But how many of us have models with just one subtool? How do you get an entire creation made of multiple subtools into Blender? Is there a way to do it all at once, or do you have to import every single subtool individually? What am I missing here?
Man, nothing's working. I tried this with a GoZ/GoB multi-subtool model I have, but I can't seem to get the details projected with this method or with the displacement map method. What do you think the problem is? I'm using Blender 3.4.
Quick question: why displacement and not vector displacement? I usually get better results on complex detail with VD.
Also, why the Displace modifier, and not plugging the displacement map into the Displacement input on the Cycles material?
@@Lostpx I wonder as well.
@@Lostpx Answer at 10:04
@@Lostpx Because a displacement map in the Cycles material only works at render time, not in the viewport, and it doesn't work in Eevee, which is important, for instance, if you want to paint on your character.
I'm trying this out with a bust model that is ~1.2 m tall in Blender. Exporting a 32-bit map (confirmed by opening it in Photoshop) gives correct displacement for details but also shows a stair-stepping effect. Examining the image, the value range is very narrow around 50% grey. I get the stair-stepping with enough subdivisions, and if I use Levels in Photoshop to view the range (a tiny spike around the 50% grey values), the stair-steps show up there too.
I think I found a solution; it should be accurate, theoretically. In ZBrush, set the Multi Map Exporter export scale to 10. It will complain that the 32-bit scale should be set to 1; ignore it. This makes the histogram range of the values 10× wider. In Blender, set the displacement strength to 0.1 to compensate.
This avoids crushing the dynamic range of the EXR if your model is small (even 32-bit isn't enough for the tiny range otherwise). Apparently even 1 meter is too small for 1:1 scale in Blender.
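The precision argument in this comment can be demonstrated in a few lines of Python. This is a sketch under the assumptions the comment describes: values clustered around a 0.5 midlevel, each texel stored as a 32-bit float:

```python
import struct

def to_f32(x):
    # Round-trip a value through 32-bit float storage,
    # as happens to each texel in a float EXR channel.
    return struct.unpack('f', struct.pack('f', x))[0]

mid, detail = 0.5, 1e-8        # a tiny displacement on a small model

# At export scale 1 the detail is below float32 precision near 0.5:
print(to_f32(mid + detail) == mid)        # True - the detail is lost

# Export scale 10 widens the range; the same detail now survives
# storage, and a 0.1 Strength in Blender scales it back down:
print(to_f32(mid + detail * 10) == mid)   # False - the detail survives
```

Float32 spacing near 0.5 is about 6e-8, which is why sub-1e-8 displacements quantize away at 1:1 scale but survive once multiplied by 10 at export.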
Hey guys, thank you so much for this tutorial. I'm a beginner, hoping to get my file size down a lot with this technique when I do skin textures in ZBrush. I'm facing some trouble trying to export the map, though. I'm following your tutorial and clicking Export All Maps, but no file appears. I've tried TIFF and EXR but nothing appears. Any help?
You need to have a UV unwrap for the process to work :)
I can't get my displacement to work after following this video... When exporting to TIFF the texture looks okay, but with EXR I can't see anything but a grey or black image... Also, the TIFF file is around 200 MB while the EXR is only 5 MB or so. The TIFF has a weird midlevel, so I can't use it. Any help? imgur.com/a/WSfz7ei
Your content is simply amazing! Keep it up!
Quick question: can I do this directly from blender?
EDIT: I suppose the shrinkwrap modifier is what I'm looking for...I'll test it out!
Hi.
You mean you projected the high res onto the low res using the Shrinkwrap modifier instead of using a displacement map?
OK! Thanks for the HDR and displacement map explanation. The only thing now is how to correctly set up the nodes in Blender when I have a metalness PBR workflow.
My question here: have you seen any other way to create a displacement map? I work with photogrammetry, so the optional step (hand retopo) gives a more or less good low-poly model, but reprojection always creates some kind of errors that take hours to fix. I'm fighting with Topogun now to see if it's possible there, but maybe there's other software?
8:01 first day at the studio
8:25 last day at the studio
Will this workflow work for Unreal 5? Like, is this the way they import characters, with displacement?
My map exports as pure gray. I followed the directions to the letter. Why is it doing this?
I think adaptive subdivision has many more advantages than the Displace modifier.
Thanks. I'm having an issue with the texture map lining up with the displacement. I know you don't cover the color/texture map in the tutorial, but I was wondering if you can help. I'm exporting everything from the Multi Map Exporter, and it's off slightly. Here is a link to the comparison.
This kinda works but I get a few janky artefacts on my model. Any ideas why?
You can set the scale in Blender. For example, my ZBrush models tend to be a hundred times bigger in ZBrush than in Blender, so when importing back in I clamp the object scale to 1 and set the displacement strength to 0.01 in Blender. Works fine.
I also have Blender set to a unit scale of 1.0 and cm, so my head is real size, i.e. around 25 cm from chin to top of head.
Setting the scale to 0.01 in ZBrush does not seem to work in my experience; I just leave it at 1.
There are so many things in ZBrush that make me wonder why. ZBrush has been out for a while, and yet it seems like they haven't bothered fixing bugs that have been around since those features were released. Projection Master, for example... I've found three bugs that people posted questions about some seven years ago.
Hi FlippedNormals, in the new version of ZBrush I can't do this, because first I need to go to Tools, do a UV unwrap, create the displacement map, and export it like in your video.
Please help! Would you then retopo the model and bake normal maps from the mesh with the displacement modifier?
Hellooo
When I try to save it into a folder, it doesn't show anything in the folder. I have no idea what the problem is.
I love you guys!!!!😂😉
amazing
Yess !
Can we bake displacement maps inside Blender, i.e. from high poly to low poly?
Question, is it possible to make the displacement map directly in Blender, or do you need ZBrush and its add-on to pull this off? Anyway, nice tutorial! :)
Unfortunately, displacement baking isn't in 2.8 yet. In fact, the whole baking system is kind of broken right now.
You could use a Shrinkwrap modifier and a Multiresolution modifier (to get your high-poly detail onto your low-poly mesh), then go to the Render tab in the Properties panel and find your bake settings:
- make sure you're using the Cycles engine
- make sure "Bake from Multires" is checked (in the bake settings)
- make sure the preview level in your Multiresolution modifier is at 0
Then make an image node in the Shader Editor and give it a new texture, and make sure you have the image node and the mesh selected before you click Bake.
How would I do multiple UDims for a character?
In production, would you get any other maps from the ZBrush model, like AO or something? Thank you!
Thanks very much for the instruction! The video raised another question for me, though: what's the point of generating displacement maps if we are going to add subdivision levels again? Sorry if the question is too basic.
If you create a sculpt in a program like ZBrush and you want to use it in a movie or render it in high quality, you have to export it into Blender. But Blender can't handle models that are millions of polygons in size, so you import a low-res version of the sculpt and subdivide it inside Blender until it's as high res as the ZBrush sculpt.
But if you just subdivide the low-res model, it's going to become a smoothed-out high-res version of the low-res model. So you displace the surface using a displacement map. Then you end up with the sculpture you made in ZBrush inside Blender, and you can texture or animate the low-res model while still getting a very high-res, realistic-looking result.
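A quick way to see why you subdivide at render time instead of shipping the dense mesh: Catmull-Clark subdivision roughly quadruples the face count at every level, so even a modest base mesh explodes fast. A minimal sketch (plain Python, with a hypothetical base face count):

```python
def subdivided_faces(base_faces: int, levels: int) -> int:
    """Approximate face count after Catmull-Clark subdivision.

    Each level splits every quad into four, so the count roughly
    quadruples per level (exact for an all-quad mesh).
    """
    return base_faces * 4 ** levels

# A 10k-quad retopologized character at 6 subdivision levels:
print(subdivided_faces(10_000, 6))  # 40960000 faces
```

So the displacement map is what lets a ~10k-face mesh stand in for a ~40-million-face sculpt.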
@@DanIel-fl1vc thanks for the time you took to help me out, that was very helpful 👌
@@DanIel-fl1vc But I am still having problems, because I only get good detail at subdivision level 6 and my Blender can't handle it. What should I do? 😔 The pores on my sculpt don't show up well at level 5.
So when I create the maps and save them in the folder, they're not turning up for some reason.
noob question here: I get that by creating a low-res mesh, the eventual goal is to 'fake' the appearance of the ultra-high-res sculpt by using a normal map to get all the tiny details baked in. And that will perform better in a game.
Does it 'perform' better in terms of doing a faster cycles render? Will the low-res mesh + normal map, render faster than high-res mesh + no map?
Also, what is the main purpose of doing the high-res version as a displacement on the low-res mesh? I guess so you can model other stuff around your character and keep him low-res and responsive in the viewport, but you can replace him with the high-res version at any time by changing the subdivisions?
Oh, also, is this true displacement, changing the geometry of the mesh? I remember that before you could have real displacement or a bump map.
The advantage of displacement is that all of that detail is being stored separately from the mesh so that you can disable it if you want, you can easily animate the model because it's pretty low resolution, and if you do your displacement with adaptive subdivision (which isn't supported in EEVEE yet), then it will remove detail as the camera gets further away which obviously improves performance.
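To put rough numbers on the performance point above, here's a back-of-the-envelope comparison of storing every vertex of a dense sculpt versus storing a displacement map for a low-res mesh (plain Python; the 16M-vertex sculpt and 4K map are made-up example figures, and real renderers add index buffers, normals and compression on top):

```python
FLOAT_BYTES = 4

def mesh_position_bytes(vertex_count: int) -> int:
    """Raw XYZ position data for a mesh (3 floats per vertex)."""
    return vertex_count * 3 * FLOAT_BYTES

def displacement_map_bytes(resolution: int) -> int:
    """Single-channel 32-bit float height map (e.g. an EXR)."""
    return resolution * resolution * FLOAT_BYTES

dense = mesh_position_bytes(16_000_000)  # hypothetical 16M-vertex sculpt
disp = displacement_map_bytes(4096)      # one 4K displacement map

print(f"dense mesh positions: {dense / 1e6:.0f} MB")  # 192 MB
print(f"4K displacement map:  {disp / 1e6:.0f} MB")   # 67 MB
```

And with adaptive subdivision, the map only gets turned into geometry where the camera can actually see the detail.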
Please show us how to bake that displacement map within Blender.
Blender Guru has a video on this. "How to bake perfect normals in Blender" I believe it's called. Baking displacements is done the exact same way.
Most likely you will be using the Multiresolution modifier for sculpting. Use the Cycles render engine. Go to the bake settings and click the "Bake from Multiresolution" checkbox. Switch the bake type from Normals to Displacement.
From here use a 2.80 bake tutorial.
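The steps above can also be set from Blender's Python console. This is only a sketch of the 2.8x settings, not a full bake script — it assumes the active object already has a Multiresolution modifier with sculpted levels, and that an Image Texture node with a new image is selected in the material:

```python
import bpy

scene = bpy.context.scene
obj = bpy.context.active_object

# Multires baking requires the Cycles engine
scene.render.engine = 'CYCLES'

# Tick "Bake from Multiresolution" and pick the bake type
scene.render.use_bake_multires = True
scene.render.bake_type = 'DISPLACEMENT'  # or 'NORMALS'

# Preview level 0 so the bake captures the full level difference
multires = next(m for m in obj.modifiers if m.type == 'MULTIRES')
multires.levels = 0

# Bakes into the image assigned in the active Image Texture node
bpy.ops.object.bake_image()
```

This is a configuration fragment for Blender's embedded interpreter, so it only runs inside Blender itself.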
Note: for the time being, there isn't a basic displacement bake yet, nor is there texture baking in Eevee.
Considering how rarely baking is updated in Blender, and that it's missing a lot of useful baking functionality, I'd suggest baking somewhere else. Marmoset Toolbag is excellent and Substance Painter is decent enough, for example.
How would you use this as part of a full workflow with a program like substance painter?
For example, say I have a high res character bust that I sculpted in ZBrush, and a low res retopologized model I made in blender. The high resolution mesh has low to mid frequency details sculpted in, small forms/muscles, wrinkles, etc. From there I would want to bake a displacement map and bring it into Substance Painter to both create the skin materials and add the finest wrinkles, pores, bumps, etc (overkill to sculpt those in ZBrush imo).
From painter I then need to export texture maps and either a very detailed displacement map with all the small frequency detail, OR, a displacement map with low-mid frequency detail + a normal map with the high frequency detail. This then needs to be added to the model in the main 3D app (here blender) and display the combined details properly.
There is very little info out there on how to go through the full workflow properly, do you know a way to get good results reliably?
Not sure if this is 100% correct, but here's what I did that (mostly) seems to work:
- Export your map and base model from ZBrush, likely after reprojecting onto the remeshed topology
- In Blender, plug the .exr into a Displacement node (found under the Vector category), and the output of that into the Displacement input of the Material Output node. Make sure your midlevel and scale are correct
- In Painter, set up your low- and high-poly meshes and bake your normals.
- I put random normal stamps all over the mesh here to add details other than the baked normals.
- Export your maps and plug them into the shader network (through a Normal Map node, found in the Vector category). Both the Displacement node and the Principled BSDF have a normal input, and I'm not 100% sure which is correct; the input on the Displacement node seems to give weak normals, so I think it's the other one, but I'm not sure (I checked the Blender docs and the description for that input is just blank *facepalm*).
The surface jiggles a little if you toggle between just the displacement and the displacement + normals, so I'm not sure if this is the right way to do it, but the extra high-frequency details from the normal map get overlaid on top of the displaced geometry without over-exaggerating the low-to-mid-frequency details present in both maps.
If anyone knows more about this please share.
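On the "make sure your midlevel and scale are correct" step above: Blender's Displacement node (and the Displace modifier's Midlevel/Strength pair) offsets each point along its normal by (height − midlevel) × scale. A tiny sketch of that relationship, with illustrative values:

```python
def displace(height: float, midlevel: float, scale: float) -> float:
    """Offset along the surface normal: (height - midlevel) * scale."""
    return (height - midlevel) * scale

# Mid-grey-centered map: flat areas sit at 0.5, so use midlevel 0.5
print(displace(0.5, midlevel=0.5, scale=1.0))  # 0.0 -> flat stays flat

# Zero-centered 32-bit EXR: flat areas sit at 0.0, so use midlevel 0.0
print(displace(0.0, midlevel=0.0, scale=1.0))  # 0.0 -> flat stays flat
```

So a wrong midlevel inflates or deflates the whole mesh, and scale is where you compensate for things like unit or object-scale mismatches.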
Nice. Zbrush has udim support?
Yup!
Does anyone else get a Blender crash on render with displacements? I have a character face, low (medium) poly, and a disp map. I followed all the steps, but Blender 3.1 crashed every time.
How do you get UVs on a high poly mesh in zbrush?
Thanks for this, I've been doing it the normal way and using the 2.79 nightly build that has a displacement node in cycles. I was wondering how they carried that over to 2.8
Also is there a chance you can give a Udim tutorial when the new update comes out?
It already supports udim in version 2.81
@@Solomon_Amoasi972 Great, now how do you create and use them in Blender? Is there documentation on that?
How do you export FiberMesh from ZBrush to Blender's particle system?
is it possible to do this all within Blender? like with a model you have sculpted and done the retopology for?
You can model and retopologize in Blender, but I don't think it will be able to do the maps and projection. P.S. just pirate ZBrush.
Are you guys beginning to transition professionally to Blender? I know you've only been using it extensively for a month or so, but what's your initial verdict? Is it indeed time to look past Maya and move here instead?
No, we're not. I really don't see that happening anytime soon. You have to think about: why?
From a production point of view, you would have to change your entire pipeline, rewrite all your plugins, and retrain all your staff to use Blender. And ultimately you'd end up with software that's still lacking basic features.
Don't get me wrong, Blender has so many cool things to offer, and the openness of it makes it really accessible. But ultimately, no.
Do you know how to, or if it's even possible to, bake displacement maps in Blender?
I froze the subdivision levels. Can I still do this with frozen subdivision? Is it possible?
Surprised you guys didn't use or recommend GOZ/GOB for blender.
How do you transfer your model between the Blender low-res and high-res versions?
Adaptive is really helpful for things that are spiky and curvy. Just fyi
Btw blender 2.81 supports UDIM
It's not there yet, and I heard in today's Blender Today livestream that it seems to have been pushed into 2.82.
How do I get the UVs?
Can Blender make a turntable movie from this? If so, how? Thanks :)
Here you go: ruclips.net/video/2r0KsLYr3wA/видео.html
How far is Blender from Maya now? Enough to be real competition?
Very far, but it's coming on full force. Maybe a few more years or so to catch up, and that's if Autodesk stands still and does nothing.
Once Blender gets full integration with the popular renderers, it will skyrocket to the top...
V-Ray, Redshift, Arnold
@@henrique-3d Redshift is comming bro vray they stop the support for 2.8 and arnold they said no way
@@thiogarces I would be 100% satisfied with only redshift then!
Could you have done the sculpting in Blender, and if so, would it have rendered better/faster?
If you're referring to the sculpt overall, that looks like the one they made a couple videos back in Blender. They may have finished it in ZBrush, or maybe retopologised it, I'm not sure.
@@charltonrodda Just sculpts in general. I saw a sculpt that they did in Blender recently which looked to be of ZBrush quality, certainly better than the model used here.
This sculpt was started in Blender and finished in Zbrush. Zbrush handles higher polycounts way better than traditional 3D apps.
Guys, if you use microdisplacement, it's a way better method for displacement maps.
It doesn't work with Eevee, though.
@@FlippedNormals It looked like you were using Cycles in there.
UDIMs don't exist in Blender yet? That's gonna be a big nope from me.
They're coming to 2.81. I believe November will be the release date for 2.81, but I'm not entirely sure yet.
Thanks for the tutorial guys, this looks way simpler than importing into maya. I'll be waiting for that UDIM support.
@@kassiag7579 I'll be downloading that as soon as it comes out
Can't do seamless UDIMs lol
Cinema 4d users can do the same instead of blender... 🙂
Change channel name to
FlippedBlender
Never EVER use smoothed UVs out of ZBrush (unless Blender smooths the UVs the same way ZBrush does at render time?). Standard production practice is not to use smoothed UVs in the Multi Map Exporter. This will result in ugly seams around the UV borders, because the software used for rendering does not subdivide the model the same way as ZBrush; the algorithm is different.
It actually depends on the pipeline. Some studios do use it from time to time. When you're working for yourself, you're probably better off never using it like you say.
I can confirm that this is an issue. I had all sorts of artifacts when displacing the model; then I turned off smoothing and it displaced correctly.