Insanely Fast Virtual Production | Nuke, Switchlight & Blender

  • Published: 23 Nov 2024

Comments • 71

  • @giovani_aesthetic 16 days ago +36

    dude your ADHD type of tutorial is the best thing I've found on the internet in a long time

    • @rafaelturang3514 13 days ago +1

      Agree😂

    • @edmungbean 13 days ago +1

      agree, works for my brain

    • @DeltaNovum 6 days ago

      The ADHD seems quite apparent, and is very much appreciated 😄!
      I know ADHD, like any disorder, is on a spectrum. But unlike the autism spectrum (for example), which can influence personality in a million different ways, I feel that people with ADHD are more alike than those shaped by any other influence on personality. Thus far every ADHDer I've met has been very easy to instantly vibe and communicate with. The same goes for YouTubers with ADHD: very easy to instantly follow along with or be fully captivated by.

  • @Zizos 8 days ago +4

    Dude is high on vfx... By the end of the video he's already seeing polygons.

  • @orcanimal 16 days ago +9

    It never ceases to amaze me how people continue to underestimate DaVinci Resolve (Studio version). You can literally do all of this in 5 minutes flat with DaVinci. You can roto out super quickly with Magic Mask + a Saver node to export the roto as an EXR image sequence. And DaVinci's Relight feature lets you generate a normal map sequence (you can also export depth texture maps) and throw them all into Blender. It's super fast and easy. DaVinci is the best.

    • @curtskelton 16 days ago +6

      I like Davinci but there's a reason I chose these programs. CopyCat is the greatest rotoscoper out right now. The fact that I can roto an hour-long clip in 10 minutes is insane, I literally use it every day lol. And I like that Davinci can make normal and depth maps, but I can't find anything about it also making roughness, metallic, specular, or occlusion maps. I still like Davinci! Don't get me wrong! But it's Nuke every day for me :)

  • @peterwojtechkojr 18 days ago +9

    Watching this video is the most important thing I've done today

    • @curtskelton 18 days ago +2

      @@peterwojtechkojr why vote when you can cgi?

  • @nehemiahsomers4141 18 days ago +12

    Another banger from whats quickly becoming one of my favorite VFX channels. Your channel is so underrated bro

  • @IncredibleGaming 18 days ago +7

    Thank you so much for this! This'll help so much for my school project, you deserve way more subscribers.

  • @behrampatel4872 16 days ago +2

    One of the most loaded videos on mixing AI with VFX. And a record-breaking burp 🥳

  • @thevfxwizard7758 18 days ago +4

    BTW, in Blender 4.3 you can press Shift+A -> Image -> Mesh Plane. No longer an addon.

  • @carbendi 18 days ago +4

    Have been using Omniscient for about 2 months now. It is greaattt. I wish Apple's ARKit allowed shooting not only HDR but also log format; then it would be nearly perfect for everything.

  • @thevfxwizard7758 18 days ago +3

    Dang dude you're doing great work! Very informative and engaging. Keep it up and you're sure to grow.

  • @animatz 16 days ago +1

    Coool! Depth anything combined with copycat is very good for temporally stable depth.

  • @morecarstuff 16 days ago +2

    THIS is an amazing video. thank you for showing all your workflow. 💯💯🙏🏾🙏🏾

  • @Halllbert 15 days ago +1

    This is game changing

  • @jackrasksilver6188 6 days ago +1

    36:29 For Blender 4.2 "import images as planes" was moved from the Import menu to the Add menu of the 3D Viewport.
    You can find it here: 3D Viewport ‣ Add (Shift A) ‣ Image ‣ Mesh Plane

  • @TheJan 7 days ago +2

    wizardry but im willing to learn from you. if a silly goose like you can do it and can teach it to a silly goose like myself... i am learning

  • @edmungbean 13 days ago +3

    maybe you could remap the depth map to make a "relative height" map. For it to push the back foot back and the front foot forward, your torso would have to be the midpoint (grey point) of the height map, but the generated height map is based on your distance from the camera, and you're walking towards it. It's probably putting you on a plane because you stay in roughly the same place relative to the treeline 300 metres behind you. You'd need to track the colour at the midpoint (torso, hips) and remap the depth map across your video so your torso and hips remain the grey point while your feet shift darker or lighter
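The remapping idea in this comment can be sketched with NumPy. This is a rough illustration of the suggestion, not anything from the video; the `spread` factor and the per-frame torso sample are assumed inputs you'd get from a 2D track:

```python
import numpy as np

def renormalize_depth(depth, torso_depth, spread=0.5):
    """Remap one depth frame so the tracked torso value lands at mid-grey (0.5).

    depth:       H x W float array in [0, 1] from the depth generator
    torso_depth: depth sampled under the 2D torso/hip track for this frame
    spread:      how much of the range to keep around the midpoint (assumed)
    """
    # Shift so the torso sits at 0.5, then clamp back into [0, 1].
    remapped = 0.5 + (depth - torso_depth) * spread
    return np.clip(remapped, 0.0, 1.0)
```

Run per frame: sample the depth under the torso track and pass it in, so anything nearer than the torso maps brighter and anything farther maps darker, no matter how close the subject is to camera.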

  • @1polygonewithme 16 days ago

    Bro you are a monster thank you for this video absolutely 100 percent value viewing

  • @MrDickyTea 13 days ago

    With the depth, you can probably take your alpha in Nuke and dilate out the depth so you won't get those depth errors on the edge.
    Regarding displacement, it needs to displace towards the camera's origin, not along the surface normal.
    Hope that helps.
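The edge-dilation idea could look roughly like this in NumPy, as a pure-Python stand-in for masking a Dilate/FilterErode onto the depth channel in Nuke. `iterations` and `thresh` are illustrative choices, not values from the video:

```python
import numpy as np

def dilate_depth(depth, alpha, iterations=2, thresh=0.99):
    """Grow trusted interior depth values outward under the soft alpha edge,
    so semi-transparent edge pixels don't keep bogus background depth."""
    d = depth.copy()
    solid = alpha >= thresh                      # pixels whose depth we trust
    for _ in range(iterations):
        p = np.pad(d, 1, mode="edge")            # replicate borders
        neighbours = np.stack([p[:-2, 1:-1], p[2:, 1:-1],
                               p[1:-1, :-2], p[1:-1, 2:]])
        grown = np.maximum(d, neighbours.max(axis=0))
        d = np.where(solid, d, grown)            # only rewrite edge pixels
    return d
```

Each iteration pushes the solid silhouette's depth one pixel further into the soft edge, which is the same effect as dilating the depth before premultiplying.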

  • @Anonymous-hs5nc 11 days ago

    Watching it on .75x is perfect for me

  • @aakashSharma-s3x 18 days ago +4

    crazy ❤

  • @SpectraVision-f5o 16 days ago +2

    ... A dell is not a hardware description.

  • @toapyandfriends 8 days ago +2

    Holy **** you could rotoscope while your f**kin' I gotta tell my girlfriend!!

  • @arielwollinger 10 days ago +1

    ahahahhah! I loved the burp at the end!

  • @guno6596 13 days ago

    This is soo cool! Thank you for sharing :)

  • @OLIVIA-h6r 16 days ago +1

    i love you curt skelton your videos are great

  • @poormartytoo 13 days ago

    dude. this is clutch

  • @exodus_legion7804 13 days ago

    broooooooooo i couldnt figure out how to put the footage in front of the camera thank youuuuu

  • @ltescre8 15 days ago +1

    Ed Sheeran now teaches 3D!!!!!!!!!! 😁😁😁😁😁😁🫨🫨🫨🤯🤯🤯🤯🤯🤯🤯🤯😲😲😲😲😲😲😱😱😱😱😱😱😱😱

  • @meme-lu6lc 12 days ago

    Yeah bro you owned a sub ☄️

  • @DominicGrzecka 18 days ago +3

    keep posting man, every tutorial I get closer to downloading nuke and deleting after effects. AFTER EFFECTS ROTO SUCKS!!! Great tutorial, explained perfect ✅, still entertaining ✅, keep posting

  • @alimkotwal5876 16 days ago

    Awesome ❤

  • @matchdance 18 days ago +1

    Great tutorial!

  • @sogoodelectric 14 days ago +1

    to avoid these displacement glitches I just set up the scene, add an orthographic camera, render the relit footage and import it back into the scene as a plane

  • @HelpingHand48 18 days ago +1

    @30:24 there's a note from you about 16-bit not being right, but back when you were choosing 8-bit in Nuke, a second later it was back to 16-bit, so I think you did export at 16?

    • @curtskelton 18 days ago

      @@HelpingHand48 lol what the heck that’s funny. That’s just an editing error because I constantly re-do takes I don’t like. My bad.

  • @vincelund7203 18 days ago +1

    Since you have the depth, and you have the camera track, I bet you could use geometry nodes to push your mesh away from the camera (along camera normals) by the depth map amount. So, the pixels of that back leg would move diagonally away from the camera. 🤷‍♂️

    • @curtskelton 18 days ago +1

      @@vincelund7203 I think you’re right. I can’t get it to displace the way I want it to with modifiers so I really gotta get better at geometry nodes lol. Thank you!
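The camera-ray displacement this thread describes can be sketched outside Blender with NumPy; in Geometry Nodes the equivalent would be a Set Position offset along the normalized vector from the camera to each point. All names and the depth scale here are illustrative:

```python
import numpy as np

def displace_toward_camera(vertices, camera_pos, depth, scale=1.0):
    """Push each vertex of the footage card along its own camera ray by the
    sampled depth, instead of along the plane normal, so a back leg moves
    diagonally away from camera rather than straight back."""
    rays = vertices - camera_pos                         # per-vertex view ray
    rays = rays / np.linalg.norm(rays, axis=1, keepdims=True)
    return vertices + rays * (depth[:, None] * scale)
```

A vertex straight ahead of the camera moves straight back, while an off-axis vertex slides along its own diagonal ray, which is exactly the behaviour a normal-based displace modifier can't give you on a flat card.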

  • @I3ra 11 days ago

    This could have been condensed way down for the people who already know Nuke.

    • @curtskelton 11 days ago

      And if my mother had wheels, she’d have been a bike

  • @Atsolok 18 days ago +2

    Great video bro! I'm not an expert like you but would it help if you would create a depth map with Davinci Resolve?

    • @curtskelton 18 days ago +1

      I don't know much about Davinci's depth generator but I'm fairly confident that Depth Anything V2 is currently the top dog rn. And it works really well! But I think V3 will fix my gripes lol. Thank you so much for telling me about Davinci though!! I didn't know that and that motivates me just a little more to ditch Premiere :)

    • @Venous_fx 17 days ago +2

      Resolve actually has an auto BG remover. In my experience it is extremely good and relatively fast. While you're at it you can generate your normals for your roto and any depth maps you need without jumping programs. Once you finish rendering in Blender just drop it back into Resolve for grading and final editing. Extremely powerful tool and cuts down on a lot of time hopping between programs.

  • @edh615 17 days ago +3

    I can get substantially more detailed results with Sapiens models + Bilateral Normal Integration, I could share the workflow if interested.

  • @1zymn1 17 days ago

    You need to do camera tracking so you stay in the right place while the camera does the moving.

  • @MarcusGes 10 days ago +1

    Can I rig the phone to my real camera and use the data in a VFX program on the camera footage?

    • @curtskelton 10 days ago

      I legit think you can, I was just talking about this same idea with a coworker lol. If you do, let me know how it goes

  • @alazaaaa 13 days ago +1

    what if you used your rotoscoped alpha from Nuke as a mask on the output from Depth Anything and exported it as grayscale? I'm a noob so this might be waaay off, and if it is I'd love it if any of you guys could correct me. Wanna see where I went off the rails. Awesome vid

    • @curtskelton 12 days ago

      You are 1,000 percent on the money lol. That will work, you did great :) My gripe though is that every depth generator I've used struggles with feet and legs. It bends the legs backwards and the feet fade into the ground too suddenly. So I'm kinda waiting until somebody else fixes this problem or Depth Anything v3 releases.

    • @alazaaaa 11 days ago

      @@curtskelton thanks for the reply, means a lot. Let's hope v3 solves this. But since you seem to be an experimenter like myself: what if you photo-scanned yourself into a 3D model with the same outfit, extracted the walk animation from the original video with the new video-to-animation AI models, cleaned the animation in Cascadeur and matched it almost perfectly to the original video, then superimposed it onto the original video in Blender, maybe even parented them? Or just used the rigged 3D model's depth output from Blender from the waist down and the original from the waist up. That might fix the shadows and maybe the lighting, not sure about the light reflection though. This might be the equivalent of using an axe to cut a piece of grass, but if it worked the applications might be endless.
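The alpha-masking idea that opened this thread is essentially a one-line composite. A hedged NumPy sketch, assuming float images in [0, 1]; the function name and `bg` parameter are illustrative:

```python
import numpy as np

def mask_depth(depth, alpha, bg=0.0):
    """Keep the generated depth inside the roto matte and force a flat
    background value everywhere else, producing a clean grayscale export."""
    return depth * alpha + bg * (1.0 - alpha)
```

Soft edges blend between subject depth and the background value, so the matte's feather carries through to the depth export.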

  • @apatsa_basiteni 16 days ago +1

    Cool tricks.

  • @choc_blok 17 days ago +1

    Ctrl shift t works too 37:30

  • @Enderblade18 18 days ago +1

    Really wish I had an NVidia GPU, can't run Switchlight 😭
    Cool video though!

  • @vlad_ivakhnov 11 days ago +1

    Is there a program similar to Omniscient for Android?

    • @curtskelton 9 days ago

      Maybe CameraTrackAR. That might be on the android app store but idk

  • @brianmolele7264 2 days ago

    There's another easier rotoscope method, Davinci Resolve 19, also free.

  • @edmungbean 13 days ago

    Resolve/Fusion can import image sequences, it's just annoying, do it in the Media tab
    Blender 4.2 can import images as planes, it's just integrated: Add>Images>Images As Planes

  • @bearfm 18 days ago +2

    so sad that surt ckelton got herpes

  • @FrancoisRimasson 12 days ago

    😢
    Interesting. But you speak too much