Real Human to Metahuman Photogrammetry Workflow

  • Published: 26 Jul 2024
  • This video provides an overview of a single-camera photogrammetry process for creating a custom MetaHuman in the likeness of a real person.
    Tips for photogrammetry photography are shared, including the use of strobes and cross-polarization to cancel specular reflection on real-world skin. The photogrammetry process is demonstrated in RealityCapture.
    All software tools used are free or low cost....
    XnView: www.xnview.com/en/
    Reality Capture: www.capturingreality.com/
    Unreal Engine: www.unrealengine.com/
    Metahuman Mesh To Metahuman Plugin: www.unrealengine.com/en-US/bl...
    Epic's Official 6min Tutorial for the Plugin: • Using Mesh to MetaHuma...
    MetahumanCreator: www.unrealengine.com/en-US/me...

Comments • 115

  • @FeedingWolves
    @FeedingWolves 10 months ago +1

    This video was beyond helpful! As usual you make incredible tutorials!!! Thank you so much!!!

  • @Zenosart
    @Zenosart 2 years ago +5

    Amazing Tutorial :) So simple and straight to the point :)

  • @malickfayediagne2888
    @malickfayediagne2888 1 year ago

    You, sir, deserve a statue; you are the best on YouTube. I really appreciate the amount of knowledge you share in your vids, keep going.

  • @fulltimespy
    @fulltimespy 2 years ago +2

    This was a lot easier than I expected, can't wait to play around with this :)

  • @frankb4793
    @frankb4793 2 years ago +2

    Thanks, You explained this very well, easy to understand.

  • @jarabito001
    @jarabito001 2 years ago +3

    Thanks for sharing this knowledge! really really appreciated :)

  • @jakecolefilms
    @jakecolefilms 1 year ago +1

    Absolutely incredible, I wish I had you as a professor!

  • @MrToroburn
    @MrToroburn 1 year ago +1

    Great video! One thing to add about cropping: the exception seems to be turntable capture. When shooting an object on a turntable, cropping to the object was the only way I could get RealityCapture to lock onto the object during image alignment.

  • @robertprescott9577
    @robertprescott9577 2 years ago

    This was great. Thank you

  • @karlmahlmann
    @karlmahlmann 1 year ago

    Really informative and fun to watch. thanks.

  • @6nupurbanait687
    @6nupurbanait687 1 year ago +1

    So smooth! Everything he says goes into my brain and sticks.

  • @SuperlativeCG
    @SuperlativeCG 2 years ago

    It's quite fascinating.

  • @erichiller21
    @erichiller21 2 years ago +5

    Great job on this tutorial! Nice to see the steps done beginning to end in a concise manner.

  • @panictryouts
    @panictryouts 2 years ago +3

    Pretty amazing. For the blacks on t-shirts and shoes, pick something less black, because computer black is like a black hole but real-world black is a bit more gray.

  • @dilzan9822
    @dilzan9822 1 year ago

    Nice tutorial, Michael.

  • @Soluchi-InfiniteCoCreatorGod
    @Soluchi-InfiniteCoCreatorGod 2 years ago

    Awesome. 👍

  • @camilosandoval4905
    @camilosandoval4905 11 months ago

    Thanks very much, it's very useful.

  • @Praudas
    @Praudas 4 months ago +1

    thank you!

  • @benjamindupuis528
    @benjamindupuis528 1 year ago

    Very good tutorial, very clean. I am happy you didn't start digging deep into any specific feature, something that happens to too many tutorials.

  • @meowme7644
    @meowme7644 1 year ago

    Thx Prof.

  • @opshredderytp
    @opshredderytp 2 years ago

    I'm gonna need that.

  • @danielakirsten768
    @danielakirsten768 2 years ago

    Overwhelming, but this was absolutely brilliant! Thank you!

  • @rayd1oo
    @rayd1oo 2 years ago

    Nice video

  • @Greenrobotvp
    @Greenrobotvp 1 year ago

    Thanks for making this, can't wait to try it. What do I do with my hair in the photogrammetry process, try to get it as flat as possible?

  • @nathaliagomez3431
    @nathaliagomez3431 2 years ago

    Crazyyyyy

  • @optimus6858
    @optimus6858 1 year ago

    Never thought I'd see myself as an avatar!
    Thanks

  • @Vixo.in.skatelul
    @Vixo.in.skatelul 1 year ago

    Keeps things simple, straight to the point and well organized.

  • @j2only007
    @j2only007 1 year ago

    Head-to-body proportion is off, and that is a very tough thing to get right. But it's still a pretty quick way to do it. There should be a normal map, spec map, displacement and roughness. Unfortunately I haven't seen the MetaHuman yet; it might be there but I didn't get access to it. RealityCapture might have those maps.

  • @linhmac0222
    @linhmac0222 2 years ago +1

    Thank you so much for this great tutorial! Very straightforward and easy to follow. I got an issue when I scanned in RealityCapture: my model's surface was pretty bumpy. Do you by any chance have any tips or tricks to solve this? Much appreciated!!

    • @PixelProf
      @PixelProf 2 years ago +2

      Sure!... Specular shine is often a contributor to this. Try cross-polarizing your lighting and camera: ruclips.net/video/rBwPGYBn0Tc/видео.html
      An alternative could be anti-shine makeup.

    • @linhmac0222
      @linhmac0222 2 years ago

      @@PixelProf Awesome!! thank you so much for your suggestion! I've just watched your "DIY Cross-polarization" tutorial. And I'm going to try with that to see how it turns out :) Thanks again for your helpful videos😆

  • @devmishra18
    @devmishra18 2 years ago +1

    Great video! Do you know if the metahuman plugin is creating custom skin weights for the custom face mesh or is it trying to find the closest metahumans in the metahuman library and blending them to approximately match the 3D scan?

    • @PixelProf
      @PixelProf 2 years ago +2

      Thanks for the kind words.
      I don't have any "insider" information on this, but "reading between the lines" of what is shared by Epic, it looks like the system first gets as close as possible to the target using just MH zygotes from their library (complete with blendshapes), then applies a final blend shape to close the remaining gaps.
      You can blend between the MH result and the corrective blend in MHC in the "Custom Mesh" panel.

    • @devmishra18
      @devmishra18 2 years ago

      @@PixelProf very cool! Thank you!

  • @k1ngdragonoid468
    @k1ngdragonoid468 1 year ago

    Your laundry is done sir 5:06

  • @aysenkocakabak7703
    @aysenkocakabak7703 4 months ago

    This is the most accurate one I have seen so far, but my problem is with the studio setup. Right now I can't get to a studio, so is it still all right shooting in a daylit room? Another question, about the second phase: can I use the FaceBuilder Blender addon? It just seems easier for me. But if you say RealityCapture is really the best one, I will try it. I also tried Polycam but did not find it accurate enough.
    Thank you again. ❤

  • @ArgoBeats
    @ArgoBeats 2 years ago

    Beautiful! Could we export the facial animations from the MetaHuman in Unreal to Maya?

    • @PixelProf
      @PixelProf 2 years ago

      You can export the custom MH geo and rig to Maya, animate in Maya and export to Unreal, but I don't think you can animate in Unreal and export to Maya. (MHs are not really set up for rendering from Maya anyway.)

  • @markmarrin4754
    @markmarrin4754 1 year ago +1

    When I import into Unreal there is so much bump; it looks 10 times better in RealityCapture, smooth. Not sure what I am doing wrong. I did the specular part and even removed bump mapping, but it's the actual exported model that is super jagged.

  • @EricTareDenedoteentalk
    @EricTareDenedoteentalk 2 years ago

    Nice tutorial, thank you. Have you tried these latest MetaHumans with iPhone Live Link Face? I faced some problems. It works great in 4.27 and previously in 5; however, since the latest release there are problems. Please let me know if you face the same problem. I need a solution.

    • @PixelProf
      @PixelProf 2 years ago +1

      Thanks for the kind words. I primarily use Faceware Studio rather than iPhone for face performance capture, so I don't have any specific information on the iPhone app.

  • @perspectivex
    @perspectivex 1 year ago

    Good tutorial in that it really shows the whole process from start to finish, including photogrammetry. Is it possible to export MetaHumans? E.g., after you create one and give it some expression, could you export that as a mesh for 3D printing, so basically give people expressions other than the one with which they were captured? Again, for printing, not for in-computer use. And if you add hair, is that just a kind of shader, or would there be some semblance of hair exported in the mesh (assuming a mesh can be exported)?

    • @PixelProf
      @PixelProf 1 year ago

      You can export an MH head from Unreal, but it will always be the blank, “neutral” pose. If you would like to export with an expression, you would need to use Maya.
      If you have Maya, you can use the full version of Quixel Bridge (downloaded free from Quixel.com, not the “built-in” version in Unreal) to export the MH for use in Maya. In Maya, you can use the fully rigged MH to pose the body and/or face as you like, and use the duplicate command to take a snapshot copy of the mesh in its posed state. That can then be exported to any of a number of export formats that are supported by 3D printing tools.
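
      For anyone who wants to script that snapshot-and-export step, here is a minimal Maya Python sketch of the idea. The mesh name and output path are placeholders, not names from the video; substitute the names in your own scene.

      # Minimal sketch (Maya Python): duplicate a posed MetaHuman head mesh and export
      # the copy for 3D printing. SOURCE_MESH and EXPORT_PATH are hypothetical values.
      import maya.cmds as cmds

      SOURCE_MESH = "head_lod0_mesh"              # placeholder: the posed MH head mesh in your scene
      EXPORT_PATH = "C:/exports/posed_head.obj"   # placeholder output file

      # Duplicate the mesh in its current posed state; the copy is a plain, un-rigged mesh.
      snapshot = cmds.duplicate(SOURCE_MESH, name="posed_head_snapshot")[0]

      # Export only the snapshot as OBJ (a format most 3D-printing tools accept).
      cmds.loadPlugin("objExport", quiet=True)
      cmds.select(snapshot, replace=True)
      cmds.file(EXPORT_PATH, exportSelected=True, type="OBJexport", force=True,
                options="groups=1;ptgroups=1;materials=0;smoothing=1;normals=1")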

    • @perspectivex
      @perspectivex 1 year ago

      @@PixelProf thanks a lot for the clarification. That's too bad. No Maya, just Blender. I guess the only way I could do it might be to pose the expression in MH then take ~70 screenshots from different angles and just redo the photogrammetry. That should work I think, in theory.

  • @videohelper2299
    @videohelper2299 2 years ago

    Hi, thanks for this very helpful and precisely concise tutorial. I'd like to ask one question. I'm not sure what I'm missing, but when I export the mesh from RealityCapture to Unreal Engine 5, it only imports the mesh and not the textures? Would you know which step I was missing? Thanks in advance for your kind attention.

    • @PixelProf
      @PixelProf 2 years ago

      In RealityCapture, if you're exporting an FBX, you will see an "Export Model" dialog box after selecting the destination and name of the file you're about to export. In this, make sure that the "Export Texture" and "Embedded Texture" options are both set to "Yes" before clicking OK. This should save your textures in the FBX file and Unreal should read them in.
      Hope this helps. Have fun.
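
      If you prefer to script the Unreal side of the import, a minimal sketch using the Editor's Python API is below; the file and content paths are placeholders. The key point is enabling texture and material import so the embedded texture comes in with the FBX.

      # Minimal sketch (Unreal Editor Python): import the RealityCapture FBX together
      # with its embedded texture. Paths are hypothetical.
      import unreal

      task = unreal.AssetImportTask()
      task.set_editor_property("filename", "C:/scans/head_scan.fbx")      # placeholder FBX path
      task.set_editor_property("destination_path", "/Game/Scans/Head")    # placeholder content folder
      task.set_editor_property("automated", True)
      task.set_editor_property("save", True)

      options = unreal.FbxImportUI()
      options.set_editor_property("import_mesh", True)
      options.set_editor_property("import_textures", True)    # pull the embedded texture out of the FBX
      options.set_editor_property("import_materials", True)   # build a material that uses it
      task.set_editor_property("options", options)

      unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])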

    • @videohelper2299
      @videohelper2299 2 years ago

      I see. Thanks so much for your reply Sir. I'll try it out.

  • @_casg
    @_casg 2 years ago +2

    Dammit, I should have gone to your school.

  • @snap-n-shoot
    @snap-n-shoot 1 year ago

    Why can't the texture map of the face from RealityCapture be used in MetaHuman Creator to make it more realistic?

  • @mikecarvelas4720
    @mikecarvelas4720 2 years ago

    The result of the 3D face in RealityCapture is bad; how can I fix it?

  • @Cano.34
    @Cano.34 1 year ago

    Best guy. I got everything for free also, but I'd rather buy the software so I can be happy and proud I have an official version.

  • @joshuacadogan5174
    @joshuacadogan5174 2 years ago

    My MetaHuman isn't showing up in the Creator, any ideas?

  • @REALVIBESTV
    @REALVIBESTV 2 years ago

    What kind of PC are you using with Unreal Engine 5? Your CPU and graphics card?

    • @PixelProf
      @PixelProf 2 years ago

      It's a pretty old machine. The CPU is an Intel(R) Xeon(R) E5-2640 v4 @ 2.40 GHz (2 processors)... but the GPU has been upgraded to a Quadro RTX A6000.

  • @kiomcreations
    @kiomcreations 2 years ago

    Hello, thanks for your workflow.
    In this case the MetaHuman can't open its mouth?!
    We have to make the scan with the mouth open, right?

    • @PixelProf
      @PixelProf 2 years ago

      For this workflow, scan with the mouth closed. If the tracking doesn't properly identify the seam between the lips, you can manually adjust it. The MetaHuman system will create a mouth interior based on pre-defined assets in the MH library.

    • @kiomcreations
      @kiomcreations 2 years ago

      ​@@PixelProf That good to know. Mouth close ! 😁 Thanks. 👍

  • @Pauliotoshi
    @Pauliotoshi 2 years ago

    Do I understand it correctly that only the mesh is altered on the final MetaHuman? Is it possible to blend (diffuse) texture data from the original scan into the MetaHuman?

    • @PixelProf
      @PixelProf 2 years ago +1

      It seems that the plugin and MHC do not currently use the texture information from the scan. It is possible to make use of this data through the use of external tools (Maya, Blender, Zbrush, etc)

    • @Pauliotoshi
      @Pauliotoshi 2 years ago

      @@PixelProf That's what I thought, thanks for the quick answer! Would be nice if the features of Mesh to Metahuman and Reallusion's Headshot plug-in were combined!

    • @PixelProf
      @PixelProf 2 years ago

      ​@@Pauliotoshi Well.... you can still use Headshot to generate the mesh to use in the UE5 plugin.

    • @abhishekpatra7954
      @abhishekpatra7954 2 years ago

      @@Pauliotoshi Yeah, external 3D software is always there for you if you want more skin textures, but you can only render through Unreal Engine.

  • @fabianoperes2155
    @fabianoperes2155 2 years ago +3

    Is there a way to reduce head size?
    I always think the MetaHuman head is too big compared to the body.

    • @PixelProf
      @PixelProf 2 years ago +1

      Yes. You can adjust head size in MetaHuman Creator. The control is in the body Proportions panel.

    • @RoCkShaDoWWaLkEr
      @RoCkShaDoWWaLkEr 2 years ago

      It stuck out to me too, Fabiano.

  • @pile333
    @pile333 2 years ago

    Is there a way to adjust the proportion of the head and neck to match the body's proportion on the fly?

    • @PixelProf
      @PixelProf 2 years ago +1

      You can make these adjustments in Metahuman Creator after generating the MH with the plugin.

    • @xaby996
      @xaby996 2 years ago

      It's kinda weird for all MetaHumans tbh.

    • @pile333
      @pile333 2 years ago

      @@PixelProf Oh, ok. Great. Thanks for the kind reply.

    • @adleralonsozamoraruiz7909
      @adleralonsozamoraruiz7909 2 years ago +1

      There is actually a slider in the MetaHuman Creator site; it's in the bottom-left section.

  • @MiguePizar
    @MiguePizar 2 years ago

    Is that the best software for photogrammetry, or is there something even better with higher quality or resolution out there? Thank you for the tutorial.

    • @PixelProf
      @PixelProf 2 years ago +1

      For photogrammetry, the output quality and resolution generally depend on the quality and resolution of the photos provided.
      RealityCapture is certainly one of the best. I don't think I've used every tool out there, so I can't really speak to whether it is "the" best, but I've routinely used RealityCapture, ReCap Photo and Agisoft Metashape, plus a variety of iOS tools. Agisoft and RC seem to be at least on par with one another, with RC having some more functionality and features that I don't have access to in Metashape.

    • @MiguePizar
      @MiguePizar 2 years ago

      @@PixelProf Thank you for answering. So the realism of the character's 3D mesh is mostly from MetaHuman Creator, then? I ask because I want the most realistic 3D human I can create. MetaHumans are amazing and very easy to use, but they still look like video-game characters, and what I'm planning is an Unreal Engine movie that looks real in every way. Not just the environment, which right now is almost impossible to tell apart from reality, but the characters too; with CGI or 3D characters you can notice it is not a real human, at least with MetaHumans. Only the Unity demo from a few months ago looks almost real, but we don't know if it will be that quality until it launches, or how easy it would be to create a CGI character the way MetaHuman does. Anyway, thank you and have a good day.

    • @PixelProf
      @PixelProf 2 years ago +1

      @@MiguePizar The "Enemy" demo from Unity is fantastic. I'd look at the "Matrix Awakens: An Unreal 5 Experience" as the corollary from the Unreal universe. Keanu Reeves and Carrie-Anne Moss are both almost entirely rendered in engine from about the 2min. mark on. Their characters, both present-day and "young" versions, are essentially customized metahumans operated and rendered in engine throughout the piece:
      ruclips.net/video/WU0gvPcc3jQ/видео.html

    • @MiguePizar
      @MiguePizar 2 years ago

      @@PixelProf Thank you again for the reply. Well, I tried RealityCapture but the 3D output was a complete mess. Is it because I didn't use a black background like you did? Or did I use too many pictures (180 or so)? I guess one of those was the problem; I'll try again tomorrow if I have time.

    • @PixelProf
      @PixelProf 2 years ago +1

      @@MiguePizar I'd need to see your source photos to provide input. The most common issues are:
      1) blurry photos
      2) reflections or transparency in photos
      3) changing light conditions from photo to photo
      4) movement of the subject between photos
      There are other conditions that can make automated photogrammetry return poor results, but those are the most typical and common.
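
      A quick way to screen for the first of those issues before running an alignment is to score each photo's sharpness. The sketch below uses variance of the Laplacian (OpenCV); the folder path and threshold are assumptions and should be tuned against a few known-good frames.

      # Minimal sketch: flag likely-blurry source photos before photogrammetry.
      import glob
      import cv2

      BLUR_THRESHOLD = 100.0  # hypothetical cutoff; lower variance means a softer image

      for path in sorted(glob.glob("scan_photos/*.jpg")):   # placeholder folder
          gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
          if gray is None:
              print(f"unreadable: {path}")
              continue
          sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
          status = "OK" if sharpness >= BLUR_THRESHOLD else "possibly blurry"
          print(f"{path}: sharpness={sharpness:.1f} ({status})")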

  • @omarcoslacerda
    @omarcoslacerda 1 year ago

    My MetaHuman plugin keeps crashing every time I try to track the frame. Any solution for this?

  • @artificial374
    @artificial374 1 year ago

    Is it also possible to just shoot a video around the person and then take snapshots from the video, or will this not work?

    • @PixelProf
      @PixelProf 1 year ago

      Photos work better, as each image typically stores metadata documenting camera information, including lens settings, which is useful to the photogrammetry software.
      Video can work, but it is generally less than ideal: it is much more prone to motion blur and lacks the per-frame metadata stored in most digital still images.
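
      If you do extract frames from video, one way to confirm what you are losing is to check the EXIF on your files. This is a minimal Pillow sketch that lists the camera/lens tags photogrammetry tools care about; the folder path is a placeholder.

      # Minimal sketch: report camera/lens EXIF tags for each image
      # (video frame grabs will usually show none).
      import glob
      from PIL import Image, ExifTags

      WANTED = {"Model", "FocalLength", "FNumber", "ExposureTime", "ISOSpeedRatings"}

      for path in sorted(glob.glob("scan_photos/*.jpg")):   # placeholder folder
          exif = Image.open(path).getexif()
          merged = dict(exif)
          merged.update(exif.get_ifd(0x8769))  # 0x8769 = Exif sub-IFD, where lens/exposure tags live
          named = {ExifTags.TAGS.get(tag, str(tag)): value for tag, value in merged.items()}
          found = {key: named[key] for key in WANTED if key in named}
          print(path, found if found else "-- no camera EXIF found --")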

  • @harshitcomputers
    @harshitcomputers 1 year ago

    god

  • @artificial374
    @artificial374 1 year ago

    Where can I find the MetaHuman Identity plugin??? All I can find is a MetaHuman toolkit plugin.

    • @PixelProf
      @PixelProf 1 year ago

      The plugin may have been renamed since the creation of this video. At the time of this comment, it is listed as "MetaHuman Plugin" in the Marketplace and can be used in 5.0 and 5.1.

  • @killerlifealbum
    @killerlifealbum 2 years ago

    Does this work on Mac (Apple), please?

  • @shekiba.646
    @shekiba.646 2 years ago

    I downloaded the UE 5.0.2 update but there's no MetaHuman in the list; I can't find it. Where did you get it?

    • @PixelProf
      @PixelProf 2 years ago +1

      Open the Plugins window (Edit Menu->Plugins), search for Metahuman, activate the Metahuman plugin... you'll probably need to restart Unreal after that.
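
      The same thing can be done by editing the project's .uproject file directly (which is what the Plugins window writes for you). A minimal sketch follows; the project path is a placeholder and the plugin name is an assumption, so use whatever name the Plugins window shows for your engine version.

      # Minimal sketch: enable the MetaHuman plugin by adding it to the .uproject file,
      # then restart the Unreal Editor so it loads.
      import json
      from pathlib import Path

      uproject = Path("C:/Projects/MyProject/MyProject.uproject")   # placeholder project file
      data = json.loads(uproject.read_text())

      plugins = data.setdefault("Plugins", [])
      if not any(p.get("Name") == "MetaHuman" for p in plugins):    # plugin name is an assumption
          plugins.append({"Name": "MetaHuman", "Enabled": True})

      uproject.write_text(json.dumps(data, indent=4))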

    • @shekiba.646
      @shekiba.646 2 years ago

      @@PixelProf I will try and see if it works. Thanks.

  • @thelazyphotographer822
    @thelazyphotographer822 1 year ago

    Is it possible to call you tomorrow (Wednesday) to answer a few questions? I'm shooting a full-length person in a studio.

  • @archananagaraj4054
    @archananagaraj4054 1 year ago

    Anyone know what version of soft soft he's using? Like, do I have to buy the $200 version for the stuff in the video, or is the $100 one enough?

    • @PixelProf
      @PixelProf 1 year ago

      Which software? Unreal, Mesh to Metahuman and MetaHuman creator are all free. RealityCapture has a PPI (pay per image) mode, so you don’t have to pay hundreds of dollars to use it. You would just pay a small fee based on the quantity and resolution of images used. (This example would have cost under $2 in PPI mode)

    • @spegss
      @spegss 1 year ago

      soft soft lmao

  • @Regna
    @Regna 2 years ago

    OK, so I have done all of this and I can't figure out how to get it into Unreal Engine; it's just on that website now =/

    • @PixelProf
      @PixelProf 2 years ago

      You need to use Quixel Bridge (built into Unreal 5) to bring the custom MH into your project.

  • @romannavratilid
    @romannavratilid 1 year ago

    LOL... this was done only with ONE camera :-O...? You were able to hold steady for the whole session...? How long did the capture session last...?

  • @baldeepsingh5516
    @baldeepsingh5516 1 year ago

    I lost track again

  • @unijascha3005
    @unijascha3005 1 year ago

    Is it also possible to put it in Unity?

    • @PixelProf
      @PixelProf 1 year ago

      The photogrammetry can be done and imported into Unity, but the MetaHuman portion of this tutorial (that creates the fully rigged digital double) only works in the Unreal Engine ecosystem

    • @unijascha3005
      @unijascha3005 1 year ago

      @@PixelProf okey thank you for your fast answer :)

  • @unijascha3005
    @unijascha3005 1 year ago

    It seems that it doesn't work with Unreal Engine 5.1; 5.0 is required, right? Or am I just stupid?

    • @PixelProf
      @PixelProf 1 year ago +1

      At this moment, the plugin only works in UE5.0, but you can have both 5.0 and 5.1 installed (even if one or both is on external drives). So you can use the plugin in 5.0 and upload the resulting head to MHC to finish your custom MH. Then the finished MH can be brought into 5.1 through Bridge and will work fine there.

    • @unijascha3005
      @unijascha3005 1 year ago

      @@PixelProf thank you

  • @aliosha123
    @aliosha123 2 years ago

    they look different

  • @sonymaxofficial398
    @sonymaxofficial398 1 year ago

    !

  • @Instant_Nerf
    @Instant_Nerf 2 years ago

    Honestly, the RealityCapture 3D mesh is 1000 times better than anything MetaHuman tried to do. Why not just work with that model? Take a realistic 3D model and create a cartoony version of it? Makes no sense. Thanks for the tutorial though. I see the attraction to using MetaHuman.

    • @PixelProf
      @PixelProf 2 years ago +3

      The raw "scan" output cannot be animated. It's like a marble bust. The utility of converting to MH is that it can be fully animated.

  • @antoparjiyo2785
    @antoparjiyo2785 1 year ago

    Tutorials like this are genuinely important; they help make the world a better place. Thank you for the amount you wrote too, I can tell you sincerely care.

  • @rraptor158
    @rraptor158 1 year ago

    Is there any way to make it an identical and fully customizable face? A through-line I've seen is that MetaHumans never quite look exactly like the scanned references and always have these low-testosterone-looking, poorly plucked eyebrows (which you don't have).

    • @PixelProf
      @PixelProf 1 year ago

      Yes, there is a way, but it involves going outside the pre-defined options in MH Creator....
      "out of the box" MH's have a pre-defined collection of 15 eyebrow sets to choose from. Many of these are fuller/thicker than others, but within the MH Creator, there are only these 15 eyebrow designs to choose from.
      That said, anyone with knowledge of how to create hair grooms in other software (Blender, Maya, C4D, etc) can create entirely custom hair definitions for the eyebrows and every other hair component of MHs.

  • @eddy-readysteady-go9001
    @eddy-readysteady-go9001 1 year ago

    This looks much easier than using Polycam and Blender... too bad it's not free.

  • @pAULEE_wORLi
    @pAULEE_wORLi 1 year ago

    OMG, it's 10 minutes in and you're still rambling on about strobes and boring stuff that has nothing to do with it.

  • @unijascha3005
    @unijascha3005 1 year ago

    Hey, when I click on MetaHuman Identity Solve, this error appears:
    Creating thread pool with 6 threads.
    (DnaDatabaseDescription.cpp, l40): DNA Database description does not contain blend_identity model.
    (DnaDatabaseDescription.cpp, l46): DNA Database description does not contain DNA database folder.
    (DnaDatabaseDescription.cpp, l52): DNA Database description does not contain archetype DNA.
    (ActorCreationAPI.cpp, l327): failure to rigid align: vector too long.
    I tried it with different 3D scans made with a 3D scanner app from my iPad Pro. It's an FBX model and it looks as good as your model looks, so I am not sure what the issue is.

  • @everend_xyz
    @everend_xyz 3 months ago

    Hey guys, I only get half of a head. I think it's something to do with the bounding box; can anyone help with the right settings? 🖲️ Much appreciated