Behind the Scenes on MetaHuman Animator Showcase ‘Blue Dot’ | Spotlight | Unreal Engine

  • Published: 30 Sep 2024
  • ‘Blue Dot’ is a short film that demonstrates how MetaHuman Animator unlocks the ability for teams to create cinematics of stunning fidelity and resonance by using a creative on-set process typical of traditional filmmaking to direct and capture a performance. In this video, we take you behind the scenes with Cinematographer and Director of Photography Ivan Šijak, renowned actor Radivoje Bukvić, and the 3Lateral team who collaborated on the project. Find out more at: epic.gm/blue-d...
    #MetaHuman, #MetaHumanAnimator, #BlueDot, #DigitalHuman, #UnrealEngine

Comments • 77

  • @Keji839 · 1 year ago +44

    Unity can go ahead and liquidate their assets now. It's over.

    • @ncard00 · 1 year ago +3

      Unity can still be used for loads of other stuff, including very simple stuff. Competition is good; it makes everybody want to be the best and improve their product all the time.

    • @IamSH1VA · 1 year ago +1

      @@ncard00 I want that too, but Unity doesn't want to compete with their product.
      Unity's CEO just wants to feed his greed.
      I have seen so many great, artistic, and unique games from Unity devs, all destroyed by one greedy CEO.

    • @vindicator879 · 1 year ago

      They should use Unity's source code to make a near-identical engine, just with a different name and some other minor changes.

    • @FromFame · 9 months ago +1

      Bro let people dev on what they want

  • @jakelofgren2510 · 1 year ago +57

    I would kill for them to do a deep dive into their process... kinda like how Quixel did the medieval environment deep dive. Like, what extra steps did they take on the model to nail his likeness, add custom clothes, custom hair, custom skin textures? Did they just use Mesh to MetaHuman to make his face, or did they use a custom scanning workflow? And what mocap tech did they use, and what was their cleanup process like? Please, I must know!!! :)

    • @gvdjurre · 1 year ago +7

      Agreed, this doesn't tell us a whole lot about the process. I'm 100% certain they did not use Mesh to MetaHuman. This (and the Senua demo) uses a custom model built similarly to the other MetaHuman templates: real people who get 3D scanned, optimised, and rigged.

    • @daneblacic7132 · 11 months ago +1

      They did a deep dive last week at Unreal Day 23 in Belgrade. Maybe Epic or SGA (Serbian Gaming Association) will publish something...

    • @superqd · 4 months ago +1

      I keep coming back to this video as an example of what's possible with Unreal and MetaHuman, but I have no idea how they accomplished it. This is definitely not the quality of capture you get out of the box, or the quality of textures, etc. I would love to see a deep dive on that as well.
      Most videos I've seen of MetaHuman clearly look robotic or fake. This did not. This looked photographic and live. I don't know what they did that others do not.

    • @Eirex · 3 months ago

      It feels a bit like false advertising. MetaHuman Animator out of the box does not look like this; the mouth movements alone are way better here.

  • @yearight1205 · 1 year ago +10

    Why doesn't Epic release the full video of the course they teach to animation developers trying to learn Unreal Engine professionally? It makes no sense not to at least let us watch what they learned. Gatekeeping this information is the wrong way forward...

    • @yearight1205 · 1 year ago +3

      @@B1gBut I think you misunderstood. Epic has courses you can sign up for where they teach you the most cutting-edge methods for using Unreal Engine for cinematics. As in, they have a full-on course for this very thing. The problem? You have to be a professional in the industry for them to let you sign up. So my point is that they should at the very least let us WATCH what those people get to see. It's not asking much.

    • @RetzyWilliams · 10 months ago

      The 'tease' gets more views and earns more $ than the reveal. And the reveal would show cheating: using an Alembic mocap blendshape mesh in conjunction with a MetaHuman used sparingly. So they couldn't advertise a MetaHuman if they showed themselves mainly using Alembic mocap.
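
      For readers unfamiliar with the jargon above, here is a minimal Python sketch of what an Alembic-style geometry cache is, as opposed to a rig-driven MetaHuman. Whether Epic actually did this is the commenter's speculation; this only illustrates the term, and every name and size in it is hypothetical (toy sizes).

          import numpy as np

          # A geometry (vertex) cache stores baked per-frame vertex positions,
          # e.g. ten seconds of 24 fps facial capture on a toy-sized head mesh.
          NUM_FRAMES, NUM_VERTS = 240, 5_000
          cache = np.zeros((NUM_FRAMES, NUM_VERTS, 3))  # frames x vertices x xyz

          def sample_cache(cache: np.ndarray, frame: int) -> np.ndarray:
              """Playback is a lookup: no bones, no blendshape weights, just data."""
              return cache[frame]

          verts = sample_cache(cache, 42)  # the exact captured shape at frame 42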

  • @ncard00 · 1 year ago +14

    This demo was otherworldly, and watching it at 60fps instead of 24fps made all the difference in realism. Can’t wait for video games in 2030 to become as realistic as reality.

  • @kennetherhabor5543 · 1 year ago +12

    Props to everyone who worked on this project. Soon movie studios will have to clearly specify whether their films are live-action or animated, because we won't be able to tell the difference just by looking. Lol

  • @TheUncutAngel · 1 year ago +4

    As a Unity dev, seeing this raises some red flags. What shaders are they using? Are they customizable? Are they dependent? Proprietary/locked? What's the rig? Can it be exported? What does importing your own animations onto a MetaHuman rig look like, and is it even a compatible thing? If I wanted to write my own shaders, would the MetaHuman model allow it without all the magic evaporating? Etc.
    Seeing something put together in an engine is one thing, but the tool still has to afford you the flexibility to do the things a game demands. If I needed to render the character in front of a UI, does it break the rendering dependencies of the MetaHuman skin/hair/eye shaders? It'd be interesting to see just how flexible this can be.

  • @AlberXR · 1 year ago +7

    It's truly amazing what can be achieved! Congratulations 🎉. It would be amazing for you to break down how to achieve the lighting, the camera presets, and the MetaHuman Animator fine-tuning. 🙏🏻🙏🏻 For us artists, the process is as important as the final result.

  • @CGeneralist92 · 1 year ago +6

    Please, Epic Games, can you release the eye shader you used for this? It looks incredible! I would love to learn more about the process behind this epic short film.

    • @sephirothcloud3953 · 11 months ago +1

      This is the standard MetaHuman eye shader, but with real texture scans, all the ray tracing cranked up, and an HDRI environment map.

  • @screenapple1660 · 1 year ago +5

    OMG, Unreal Engine is insane. The graphics are insane, crazy real. I like to use the word "insane" every time I see people blow up new video game realism.

  • @Mrbeara425 · 1 year ago +1

    Hey, trying to get the industry as a whole to listen. These advancements are amazing, and I get that we need to promote ease of use and break down how the technology works. But I think we need a video designed to show just how easy it is for experienced artists, while reminding everyone that art is not free and the creator still needs to be talented and intuitive enough to come up with something amazing. Lately the talk around tech, AI, and these programs is that "artists are no longer needed," and this video is another example of why we do need artists and thinkers in this medium. So I'm suggesting we challenge the medium for everyone to see, including big CEOs: just a simple art project between 5 completely new artists and 2 experienced artists.
    We need a definitive example of how brilliant the tech is that still reminds everyone, especially those quick to claim that "some robot or AI will do all of this for me," that talent is very much required.
    We need scope. Show scale. Provide hard, definitive proof that imagination and creativity can't be manufactured, nor should we ever doubt them.

  • @luxeomni · 1 year ago +2

    Quick question: does anyone know what camera is on the facial rig? It must have a depth sensor for the new MetaHuman Animator.

  • @SuperWiiBros08 · 8 months ago

    This is the quality ImageMovers wished so damn hard to achieve.

  • @LiamPatterson · 9 months ago

    "Look at the amazing fidelity in this image!!"
    Posts the video in 1080p........
    ffs

  • @behrampatel3563 · 1 year ago +2

    I'm hoping this is a series of talks around each topic (texturing, clothes, eyes, etc.), as it will help the rest of us level up. Cheers, and huge respect to all the talent that went into producing this. It is a turning point for real-time characters.

  • @valentinnist · 8 months ago

    Can I animate the faces of characters created in Character Creator with MetaHuman Animator?

  • @shamrocksessions4802 · 11 months ago +1

    PLEASE GIVE US A DEEP DIVE!!

  • @perfectionist724 · 1 year ago +1

    I'm always excited to see new releases of UE!
    Thank you, Epic!

  • @babyrmedet7398 · 8 months ago

    Can anybody tell me the name of the stereo camera used for face capture? Thank you.

  • @chrisnooitgedagt5144 · 1 year ago +1

    Everyone in this video looks CGI to me now 🤯

  • @templarjay · 8 months ago

    plot twist: this entire video was made in UE

  • @AgentSmith911 · 9 months ago

    Great job! It's obviously still possible for a trained eye to tell it's artificial, but it's more than good enough for most people watching TV at home. We're very close to it being impossible to distinguish a regular recording from VFX.

  • @RairAffair · 1 year ago +1

    I'm curious how the actor was re-created in such detail while still being a MetaHuman. Can anyone point me in the right direction? Mesh to MetaHuman?
    A normal MetaHuman doesn't have that quality and realism.
    Despite everything, this VIDEO pushed me to make my own, and it's looking nice :D

    • @matteo.grossi · 1 year ago +7

      My guess is that MetaHuman was used "only" as a backend.
      I think this was their workflow (more or less):
      - Scanned the actor's face
      - Used MetaHuman Animator to generate a base mesh for the solver (tracking)
      - Applied the scanned mesh (retopologized, cleaned, etc.) to the base mesh as a blendshape (see the sketch after this thread)
      - Transferred all textures (with manual authoring, clean-up, dramatization, etc.) to the MetaHuman head
      - Created custom grooms for hair and facial hair
      - Packed everything back together in engine
      Not saying it's 100% accurate, but I think it's close.
      P.S.
      The most important thing here is how authenticity is carried from the actor to the 3D model.
      This process only works this accurately if the actor and the 3D mesh are pretty much identical. The greater the difference, the more the performance and its subtleties get lost; an example would be an actor driving the digital performance who is not the same person the 3D model was created from.

    • @kichmadev · 8 months ago +1

      Lemme take a stab at explaining it.
      Default MetaHumans are simply too generic, and everyone uses them as they are; if you want higher fidelity, you need to edit the MetaHuman meshes.
      That can be as simple as editing the mesh in ZBrush and re-exporting some maps, or, if you have the resources this team has, doing a full face scan, expressions included.
      They have scanners built for this purpose, as well as very talented artists who did the cleanup, hair models, etc.
      So what makes this video MetaHuman? Well, it was used as a base: they are using the MetaHuman head mesh, and with that comes a facial animation workflow, which is what they wanted to highlight with this "very talented and very expensive" demo.
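
      Both replies above hinge on the same mechanic: a cleaned-up scan (or ZBrush edit) of the actor's head is stored as per-vertex deltas on top of the MetaHuman base mesh and blended in. Here is a minimal numpy sketch of that blendshape math, assuming matching topology; all names are illustrative, and none of this is Epic's actual API.

          import numpy as np

          def extract_delta(base: np.ndarray, sculpted: np.ndarray) -> np.ndarray:
              """Per-vertex difference between an edited/scanned head and the base."""
              assert base.shape == sculpted.shape, "topology must match for a blendshape"
              return sculpted - base

          def apply_blendshapes(base, deltas, weights):
              """base + sum(w_i * delta_i): how weighted shapes deform the base mesh."""
              out = base.copy()
              for delta, w in zip(deltas, weights):
                  out = out + w * delta
              return out

          # Toy 3-vertex "head" with one likeness shape dialled to full strength.
          base = np.zeros((3, 3))
          scan = np.array([[0.0, 0.1, 0.0], [0.0, 0.2, 0.0], [0.05, 0.0, 0.0]])
          likeness = extract_delta(base, scan)
          print(apply_blendshapes(base, [likeness], [1.0]))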

  • @KaVQ-artist · 1 year ago +1

    Thank you so much, very cool!!!

  • @ibrahimakca8628 · 1 year ago

    I need some training. I'm trying to do something that should be especially easy in UE5, but there are always mistakes, problems, Blueprints that don't work. For a week I've been trying to get the ship to float. If only they could make some things simpler.

  • @Doctor-ts4le · 1 year ago

    Yeah, sure, with a high-tech studio and a big budget you can do anything!! Notify us when you've made it accessible for indie developers like most of us!!! So far you've only made it possible for Apple phones to work with MetaHuman! What about Android?! Or do you have some sort of contract with Apple to create more customers for them?!

  • @50shadesofskittles9 · 10 months ago

    It's not so hard to avoid the uncanny valley... just don't exaggerate the emotions in the face like in that advert Unreal did, lol.

  • @BubbaSatori · 1 year ago

    The Hollywood Industrial Complex should be very worried. The writers' strike couldn't have come at a worse time.

  • @mehface · 10 months ago

    I would really like to see big improvements in the face-to-MetaHuman tech. Currently, trying to get a very realistic model to map to a MetaHuman is quite destructive. If the character is supposed to look like the actor, it's way better, but mapping someone else's capture onto another face that wasn't captured is extremely hard xD

  • @hadigarsooei · 1 year ago

    Please tell us the best motion capture system you used in this project for live recording of body, finger, and face movements in Unreal Engine.

  • @TRIPLE-rj3ey · 10 months ago

    We got this before GTA 6.

  • @BreakBackReality · 1 year ago

    01:57 Please, tell me how to distinguish this from a living person?

  • @artrush3603 · 1 year ago

    Can you tell me, sir, which depth camera you are using? Because I didn't get this result with an iPhone.

  • @gamecyborg3003 · 1 year ago +1

    Bethesda, look at this 😂 Let's go, Kojima 😁👍🏻

    • @khymaaren · 1 year ago

      Bethesda are more than happy with their engine, don't you worry.

    • @gamecyborg3003 · 1 year ago

      16 times better lol, good point @@khymaaren

  • @elvismorellidigitalvisuala6211

    Amazing. I could watch this for hours; the eyes are amazing. I've never seen such quality in real time before.

  • @safc237 · 5 months ago

    get the badge in

  • @vyacheslav_tashkin · 8 months ago

    👍👍👍👍👍👍👍

  • @Clukay416 · 11 months ago

    This short film is more than a demonstration of MetaHuman; it's a masterpiece. The story, the poem told in it, is very engaging.

  • @jerrywang592 · 1 year ago

    If only at the end there was text that said: everybody in this video was actually a MetaHuman.

  • @wdarnett560 · 9 months ago

    I'm waiting for the moment when everyone in the video is CG.

  • @vincenttoujas · 1 year ago

    Amazing work, and thank you for highlighting the actors' work. Without them, MetaHumans are just code.

  • @truthbringer8574 · 1 year ago

    I wonder what mocap system and cameras they're using, and what facial capture they're using too.

  • @perfectionist724 · 1 year ago

    The real world is over. Unreal Engine is now the real world lol

  • @g3nius · 9 months ago

    I am impressed

  • @mooglemog4726 · 1 year ago

    plot twist: the whole interview is in engine

  • @deworldzlastpunk666 · 1 year ago

    Can we download the project file?

  • @ChumleysArt · 9 months ago

    Stunning. Blown away by this!

  • @KizzCaTs · 1 year ago

    Wait, does Unreal support quad-based mesh models?

    • @kichmadev · 8 months ago +1

      Both yes and no.
      1. No: all 3D models are displayed as triangles in the end, regardless of the program (see the sketch below).
      2. Yes, although it's not officially supported: if you import a quad mesh into UE and export it back out, it will stay a quad mesh, proving that quad meshes survive in UE.
      Just because they work doesn't mean you should use them; for example, n-gons cause problems with Lumen in my experience.
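
      A quick illustration of point 1: on import or at render time, engines split every polygon into triangles, commonly by fanning around the first vertex. This is a minimal sketch of that idea, with hypothetical face indices, not UE's actual importer code.

          def fan_triangulate(faces):
              """Split convex polygons (quads, n-gons) into triangles around vertex 0."""
              tris = []
              for face in faces:
                  for i in range(1, len(face) - 1):
                      tris.append((face[0], face[i], face[i + 1]))
              return tris

          print(fan_triangulate([[0, 1, 2, 3]]))  # one quad -> [(0, 1, 2), (0, 2, 3)]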

  • @velocinaci · 1 year ago

    The uncanny valley is past.

  • @raynhornzxz · 10 months ago

    bravo!

  • @brick7113 · 1 year ago

    you are the hero

  • @EnterPlayMode · 12 days ago

    2:19 me when the edibles kick in🌿

  • @venturestar · 11 months ago

    Very impressive!!

  • @SGRuLeZ_Art · 1 year ago

    This is Epic!

  • @mistakenmeme · 1 year ago

    This looks awesome!

  • @马大圣 · 1 year ago

    So cool!

  • @raulgrt6301 · 1 year ago

    Amazing 😲

  • @DaVizzle_Bro · 8 months ago

    I was also blown away by the 4K... I mean, 1080p resolution this was uploaded in, to fully justify and represent the potential of this revolutionary new technology.

  • @TheTruthIsGonnaHurt · 11 months ago

    *Not realistic for an indie...*
    The tech is supposed to be empowering us, not creating barriers.
    Is everyone supposed to have access to a warehouse? Cameras on rails? A makeup artist to put dots on your face? Three guys at the ready to look at the monitors? And the budget for a motion suit and all the bells and whistles to compete with this? When are MetaHuman and motion capture companies going to focus on the individual indie with user-friendly solutions? And when is Unreal Engine going to showcase that?

    • @kichmadev · 8 months ago +1

      1. The tech is empowering you; it made thousands of dollars of work free for you.
      2. This demo shows what can be achieved with that same "free technology" as a base.
      3. Yes, there is a ton of work and money in this demo, because they are "pushing the limits".
      4. It's realistic for an indie to use MetaHumans as a base and have them look decent.
      5. It's not realistic to expect the same result as this video for free.
      Personally, I don't even like MetaHumans that much, but you still gotta call out BS when you see it. 😋

  • @arulmoorthymarappan4651 · 9 months ago

    This is a real milestone in cinema!