Insanely Fast Virtual Production | Nuke, Switchlight & Blender

  • Published: 11 Jan 2025

Comments • 89

  • @giovani_aesthetic
    @giovani_aesthetic 2 months ago +57

    dude your ADHD type of tutorial is the best thing that i found on the internet in a long time

    • @rafaelturang3514
      @rafaelturang3514 2 months ago +1

      Agree😂

    • @edmungbean
      @edmungbean 2 months ago +1

      agree, works for my brain

    • @DeltaNovum
      @DeltaNovum 1 month ago

      The ADHD seems quite apparent, and is very much appreciated 😄!
      I know ADHD, like any disorder, is on a spectrum. But unlike the autism spectrum (for example), which can influence personality in a million different ways, I feel that people with ADHD are more alike than with any other influence on personality. Thus far every ADHDer I've met was very easy to instantly vibe and communicate with. Same goes for youtubers with ADHD, being very easy to instantly follow along with or be fully captivated by.

    • @ariabk
      @ariabk 1 month ago +1

      @@DeltaNovum as someone with adhd and autism i approve this message

    • @andreybagrichuk5365
      @andreybagrichuk5365 1 month ago

      YEEES!

  • @nehemiahsomers4141
    @nehemiahsomers4141 2 months ago +13

    Another banger from what's quickly becoming one of my favorite VFX channels. Your channel is so underrated bro

  • @Zizos
    @Zizos 1 month ago +8

    Dude is high on vfx... By the end of the video he's already seeing polygons.

  • @carsonwl
    @carsonwl 29 days ago

    Easiest subscribe I've ever done. I run a YouTube channel based on teaching people to write fiction and I do a bunch of intros and skits using digital production, and this tutorial was super helpful. I never knew that Nuke had a non-commercial license, but I'm going to give it a shot! I appreciate all your work on this workflow!

  • @lulubeloo
    @lulubeloo 1 month ago +1

    Dude, I absolutely love your short, I think I've watched it at least 20 times in a row. If the entrance to afterlife is like this, I'm actually looking forward to it!

  • @IncredibleGaming
    @IncredibleGaming 2 months ago +7

    Thank you so much for this! This'll help so much with my school project, you deserve way more subscribers.

  • @thevfxwizard7758
    @thevfxwizard7758 2 months ago +6

    BTW, in Blender 4.3 you can press Shift+A -> Image -> Mesh Plane. No longer an addon.

  • @thevfxwizard7758
    @thevfxwizard7758 2 months ago +3

    Dang dude you're doing great work! Very informative and engaging. Keep it up and you're sure to grow.

  • @D123-f9k
    @D123-f9k 2 days ago +1

    The Dell livestream was great. I made sure to use your name in the “How did you find out?” section when registering 👍🏼

    • @curtskelton
      @curtskelton  2 days ago

      @@D123-f9k thank you so much! You have no clue how happy that makes me and thank you so much for watching :)

  • @behrampatel4872
    @behrampatel4872 2 months ago +2

    One of the most loaded videos on mixing AI with VFX. And a record breaking burp 🥳

  • @winkyflex7826
    @winkyflex7826 20 days ago

    Holy cow. So glad the algorithm gods led me to your channel

  • @peterwojtechkojr
    @peterwojtechkojr 2 months ago +9

    Watching this video is the most important thing I've done today

    • @curtskelton
      @curtskelton  2 months ago +2

      @@peterwojtechkojr why vote when you can cgi?

  • @The_Mitchell
    @The_Mitchell 4 days ago

    46:18 very underrated tip to improve Ian's camera tracking technique thank you.

  • @jackrasksilver6188
    @jackrasksilver6188 1 month ago +1

    36:29 For Blender 4.2 "import images as planes" was moved from the Import menu to the Add menu of the 3D Viewport.
    You can find it here: 3D Viewport ‣ Add (Shift A) ‣ Image ‣ Mesh Plane

  • @animatz
    @animatz 2 months ago +1

    Coool! Depth anything combined with copycat is very good for temporally stable depth.

  • @carbendi
    @carbendi 2 months ago +4

    Have been using Omniscient for about 2 months now. It is greaattt. I wish Apple's ARKit allowed shooting not only HDR but also log format, then it would be nearly perfect for everything.

  • @Halllbert
    @Halllbert 2 months ago +1

    This is game changing

  • @morecarstuff
    @morecarstuff 2 months ago +2

    THIS is an amazing video. thank you for showing all your workflow. 💯💯🙏🏾🙏🏾

  • @morveuxdanish1822
    @morveuxdanish1822 19 days ago

    really like the way you explained everything, keep it up !!! ADHD gang

  • @orcanimal
    @orcanimal 2 months ago +13

    It never ceases to amaze me how people continue to underestimate Davinci Resolve (Studio version). You can literally do all of this in 5 minutes flat with Davinci. You can roto out super quickly with magic mask + a saver node to export the roto into an EXR image sequence. And Davinci's Relight feature lets you generate normal map sequence (you can also export depth texture maps) and throw them all in Blender. It's super fast and easy. Davinci is the best.

    • @curtskelton
      @curtskelton  2 months ago +7

      I like Davinci but there's a reason I chose these programs. Copycat is the greatest rotoscoper out right now. The fact that I can roto an hour long clip in 10 minutes is insane, I literally use it everyday lol. And I like that Davinci can make normal and depth maps but I can't find anything about it also making roughness, metallic, specular, or occlusion maps. I still like Davinci! Don't get me wrong! But it's Nuke everyday for me :)

  • @alekprop
    @alekprop 1 month ago

    You ARE HILARIOUS!!! haha. Amazing tuts mate!!!!

  • @midnoonfilms
    @midnoonfilms 5 days ago

    The Import Images as Planes addon is now integrated in Blender.
    You have to go...
    Shift + A > Image > Mesh Plane
    Took me a while to find that out.

  • @1polygonewithme
    @1polygonewithme 2 months ago

    Bro you are a monster thank you for this video absolutely 100 percent value viewing

  • @edmungbean
    @edmungbean 2 months ago +3

    maybe you could remap the depth map to make a "relative height" map. For it to push the back foot back and the front foot forward, your torso would have to be the midpoint (greypoint) of the heightmap, but the heightmap generated is based on your distance from camera and you're walking towards it. The heightmap is probably putting you on a plane since you are in roughly the same place compared to the treeline 300 metres behind you. You'd need to track the colour on the midpoint (torso, hips) and remap the depth map across your video so your torso and hips remain the greypoint and your feet are changeably darker or lighter
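
The remap described above can be sketched in plain Python. Everything here is illustrative: `to_relative_height` is a hypothetical helper, frames are lists of lists of floats in [0, 1], larger values are assumed to mean farther from camera, and you would re-run it each frame with the tracked torso position.

```python
def to_relative_height(depth, torso_v, torso_u):
    """Remap an absolute depth frame so the tracked anchor pixel
    (torso/hips) becomes the mid-grey point (0.5).

    Pixels nearer than the anchor end up above 0.5 (pushed forward),
    farther pixels below it (pushed back), no matter how close the
    subject is to the camera in that particular frame."""
    anchor = depth[torso_v][torso_u]
    return [
        [min(1.0, max(0.0, 0.5 + (anchor - d))) for d in row]
        for row in depth
    ]

# Toy 2x2 frame: torso at (0, 0); the back foot (farther than the
# torso) lands below mid-grey, the front foot above it.
frame = [
    [0.5, 0.6],   # torso, back foot
    [0.4, 0.5],   # front foot, ground
]
relative = to_relative_height(frame, 0, 0)
```

The clamp to [0, 1] is a simplification; in practice you would add a gain control so the foot-to-torso depth range fills a useful slice of the displacement scale.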

  • @OLIVIA-h6r
    @OLIVIA-h6r 2 months ago +1

    i love you curt skelton your videos are great

  • @arielwollinger
    @arielwollinger 1 month ago +1

    ahahahhah! I loved the burp at the end!

  • @MrDickyTea
    @MrDickyTea 2 months ago

    With the depth, you can probably take your alpha in Nuke and dilate the depth outward so you won't get those depth errors on the edge.
    Regarding displacement, it needs to displace towards the camera's origin and not the normal of the surface.
    Hope that helps.
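
The edge fix suggested here, growing the subject's depth outward past the matte edge, would be a dilate-style operation on the depth channel in Nuke; below is only a pure-Python sketch of the intent (hypothetical `dilate_depth` helper, images as lists of lists, alpha in [0, 1]).

```python
def dilate_depth(depth, alpha, iterations=2, thresh=0.5):
    """Grow subject depth values outward past the alpha edge so the
    displacement never samples the (garbage) background depth right
    at the silhouette. Pixels inside the matte are left untouched."""
    h, w = len(depth), len(depth[0])
    d = [row[:] for row in depth]
    for _ in range(iterations):
        grown = [row[:] for row in d]
        for y in range(h):
            for x in range(w):
                if alpha[y][x] > thresh:
                    continue  # keep the real depth inside the matte
                # outside the matte: take the max of the 3x3 neighbourhood,
                # which pulls the nearest subject depth across the edge
                neigh = [
                    d[yy][xx]
                    for yy in range(max(0, y - 1), min(h, y + 2))
                    for xx in range(max(0, x - 1), min(w, x + 2))
                ]
                grown[y][x] = max(neigh)
        d = grown
    return d
```

Each iteration pushes the subject's depth one more pixel past the matte edge, so a couple of iterations is usually enough to cover the soft edge of the roto.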

  • @TheJan
    @TheJan 1 month ago +2

    wizardry but im willing to learn from you. if a silly goose like you can do it and can teach it to a silly goose like myself... i am learning

  • @shioradagoat
    @shioradagoat 1 month ago

    amazing tutorial man

  • @guno6596
    @guno6596 2 months ago

    This is soo cool! thank you for sharing :)

  • @poormartytoo
    @poormartytoo 2 months ago

    dude. this is clutch

  • @DominicGrzecka
    @DominicGrzecka 2 months ago +3

    keep posting man, every tutorial I get closer to downloading nuke and deleting after effects. AFTER EFFECTS ROTO SUCKS!!! Great tutorial, explained perfect ✅, still entertaining ✅, keep posting

  • @sogoodelectric
    @sogoodelectric 2 months ago +1

    to avoid these displacement glitches i just set the scene, add an orthographic camera, render the relit footage and import it back into the scene as a plane

  • @pinkinins
    @pinkinins 29 days ago

    for displacement, use adaptive subdivision: set the render mode to Experimental, set subdivisions in the render settings to 2 viewport / 1 render, do the same with the Map Range node, and turn down the displacement

  • @taeyoonkang548
    @taeyoonkang548 1 month ago

    48:53 this made my day

  • @meme-lu6lc
    @meme-lu6lc 2 months ago

    Yeah bro you earned a sub ☄️

  • @aakashSharma-s3x
    @aakashSharma-s3x 2 months ago +4

    crazy ❤

  • @exodus_legion7804
    @exodus_legion7804 2 months ago

    broooooooooo i couldnt figure out how to put the footage in front of the camera thank youuuuu

  • @matchdance
    @matchdance 2 months ago +1

    Great tutorial!

  • @Anonymous-hs5nc
    @Anonymous-hs5nc 2 months ago

    Watching it on .75x is perfect for me

  • @HelpingHand48
    @HelpingHand48 2 months ago +1

    @30:24 there’s a note from you about 16-bit not being right, but back when you were choosing 8-bit in Nuke, a second later it was back to 16-bit, so I think you did export at 16?

    • @curtskelton
      @curtskelton  2 months ago

      @@HelpingHand48 lol what the heck that’s funny. That’s just an editing error because I constantly re-do takes I don’t like. My bad.

  • @edh615
    @edh615 2 months ago +3

    I can get substantially more detailed results with Sapiens models + Bilateral Normal Integration, I could share the workflow if interested.

  • @vincelund7203
    @vincelund7203 2 months ago +1

    Since you have the depth, and you have the camera track, I bet you could use geometry nodes to push your mesh away from the camera (along camera normals) by the depth map amount. So, the pixels of that back leg would move diagonally away from the camera. 🤷‍♂️

    • @curtskelton
      @curtskelton  2 months ago +1

      @@vincelund7203 I think you’re right. I can’t get it to displace the way I want it to with modifiers so I really gotta get better at geometry nodes lol. Thank you!
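
The ray-based displacement proposed in this thread can be sketched outside of geometry nodes. This is only a stand-in for the node setup: `displace_along_rays` is a hypothetical helper, vertices and the camera origin are plain 3-tuples, and depth is a per-vertex scalar sampled from the depth map.

```python
import math

def displace_along_rays(verts, cam_origin, depths, scale=1.0):
    """Push each vertex away from the camera origin along its own
    view ray by its sampled depth value, rather than along the
    surface normal. A leg angled away from camera then moves
    diagonally back, instead of straight out of the card."""
    out = []
    for (vx, vy, vz), d in zip(verts, depths):
        # unit view ray from camera through the vertex
        rx, ry, rz = vx - cam_origin[0], vy - cam_origin[1], vz - cam_origin[2]
        n = math.sqrt(rx * rx + ry * ry + rz * rz)
        rx, ry, rz = rx / n, ry / n, rz / n
        out.append((vx + rx * d * scale,
                    vy + ry * d * scale,
                    vz + rz * d * scale))
    return out
```

In geometry nodes the equivalent would be normalizing (vertex position minus camera object position) and using that vector, scaled by the sampled depth, as the offset input of a Set Position node.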

  • @ltescre8
    @ltescre8 2 months ago +1

    Ed Sheeran now teaches 3D!!!!!!!!!! 😁😁😁😁😁😁🫨🫨🫨🤯🤯🤯🤯🤯🤯🤯🤯😲😲😲😲😲😲😱😱😱😱😱😱😱😱

  • @morveuxdanish1822
    @morveuxdanish1822 19 days ago

    Btw I was wondering: do you reckon if we use the free version of Nuke to do a roto in HD, then bring the matte into DaVinci and apply it to the 4K, would it work better than Magic Mask?

  • @seanedits10
    @seanedits10 1 month ago

    Could the masking be done with Resolve Studio's Magic Mask? (I know nothing about 3d rendering :D )

  • @creat10n_art6
    @creat10n_art6 12 days ago

    so i removed my background with the SwitchLight exported png and imported it to extract PBR, but it gives me blank materials. Albedo is like alpha, everything else is just blocks of blanks. WHY? PLS

  • @MarcusGes
    @MarcusGes 1 month ago +1

    Can I rig the phone to my real camera and use the data in a VFX program on the camera footage?

    • @curtskelton
      @curtskelton  1 month ago

      I legit think you can, I was just talking about this same idea with a coworker lol. If you do, let me know how it goes

  • @apatsa_basiteni
    @apatsa_basiteni 2 months ago +1

    Cool tricks.

  • @alazaaaa
    @alazaaaa 2 months ago +1

    what if you used your rotoscoped alpha from Nuke as a mask on the output from Depth Anything and exported it as grayscale? am a noob so this might be waaay off, and if it is i would love it if any of you guys could correct me. wanna see where i went off the rails. awesome vid

    • @curtskelton
      @curtskelton  2 months ago

      You are 1,000 percent on the money lol. That will work, you did great :) My gripe though is that every depth generator I've used, struggles with feet and legs. It bends the legs backwards and the feet fade into the ground too suddenly. So I'm kinda waiting until somebody else fixes this problem or depth anything v3 releases.

    • @alazaaaa
      @alazaaaa 2 months ago

      @@curtskelton thanks for the reply, means a lot. lets hope v3 solves this. But since you seem to be an experimenter like myself (am shit compared to you): what if you photo-scanned yourself into a 3d model with the same outfit, took the original video and extracted the walk animation with the new video-to-animation ai models, cleaned the animation with Cascadeur and matched it almost perfectly to the original video, and superimposed it onto the original video in blender, maybe even parented them? or just use the rigged 3d model's depth output from blender from the waist down and the original from the waist up. this might fix the shadows and maybe the lighting, not sure about the light reflection tho. this might be the equivalent of using an axe to cut a piece of grass, but if it worked the applications might be endless.
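
The grayscale masking idea from this thread is simple to state in code. A minimal sketch (hypothetical `mask_depth` helper, frames as lists of lists, alpha in [0, 1]); note the leg/foot artifacts mentioned in the reply would survive this step, since it only cuts out the background:

```python
def mask_depth(depth, alpha, bg_value=0.0, thresh=0.5):
    """Cut the subject out of a depth frame using the roto alpha,
    filling everything outside the matte with a constant so the
    background can't pollute the displacement."""
    return [
        [d if a > thresh else bg_value for d, a in zip(drow, arow)]
        for drow, arow in zip(depth, alpha)
    ]
```

Exporting the result as a grayscale sequence then gives a subject-only depth pass ready for the displacement setup.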

  • @I3ra
    @I3ra 2 months ago

    This could have been condensed way down for the people who already know Nuke.

    • @curtskelton
      @curtskelton  2 months ago +3

      And if my mother had wheels, she’d have been a bike

  • @alimkotwal5876
    @alimkotwal5876 2 months ago

    Awesome ❤

  • @vlad_ivakhnov
    @vlad_ivakhnov 2 months ago +1

    Is there a program similar to Omniscient for Android?

    • @curtskelton
      @curtskelton  1 month ago

      Maybe CameraTrackAR. That might be on the Android app store but idk

  • @1zymn1
    @1zymn1 2 months ago

    You need to do camera tracking so you stay in the right place while the camera does the moving.

  • @Atsolok
    @Atsolok 2 months ago +2

    Great video bro! I'm not an expert like you, but would it help if you created a depth map with Davinci Resolve?

    • @curtskelton
      @curtskelton  2 months ago +1

      I don't know much about Davinci's depth generator but I'm fairly confident that Depth Anything V2 is currently the top dog rn. And it works really well! But I think V3 will fix my gripes lol. Thank you so much for telling me about Davinci though!! I didn't know that and that motivates me just a little more to ditch Premiere :)

    • @Venous_fx
      @Venous_fx 2 months ago +2

      resolve actually has an auto bg remover. in my experience it is extremely good and it's relatively fast. While you're at it you can generate your normals for your roto and any depth maps you need without jumping programs. Once you finish rendering in blender just drop it back into resolve for grading and final editing. Extremely powerful tool and it cuts down on a lot of time hopping between programs.

  • @choc_blok
    @choc_blok 2 months ago +1

    Ctrl shift t works too 37:30

  • @toapyandfriends
    @toapyandfriends 1 month ago +2

    Holy **** you could rotoscope while your f**kin' I gotta tell my girlfriend!!

  • @edmungbean
    @edmungbean 2 months ago

    Resolve/Fusion can import image sequences, it's just annoying, do it in the Media tab
    Blender 4.2 can import images as planes, it's just integrated: Add>Images>Images As Planes

  • @Enderblade18
    @Enderblade18 2 months ago +1

    Really wish I had an NVidia GPU, can't run Switchlight 😭
    Cool video though!

  • @lonewolf-vw9wf
    @lonewolf-vw9wf 1 month ago

    lost me at that 13 min

  • @brianmolele7264
    @brianmolele7264 1 month ago

    There's another, easier rotoscope method: Davinci Resolve 19, also free.

  • @bearfm
    @bearfm 2 months ago +2

    so sad that surt ckelton got herpes

  • @xanderusa1
    @xanderusa1 1 month ago

    Useless. Not available on Android.

  • @FrancoisRimasson
    @FrancoisRimasson 2 months ago

    😢
    Interesting. But you speak too much