Jed Smith
  • Videos: 6
  • Views: 193,049
Stitching Spherical HDR Panoramas in NukeX with CaraVR
In this video we take an in-depth look at using the CaraVR toolset in NukeX 12.0+ for stitching spherical HDR panoramas.
We talk about:
- Advantages of stitching in Nuke vs alternatives like PTGui and AutoPanoGiga
- Blending multiple bracketed exposures into a single HDR image
- Calibrating color and exposure
- Using the C_CameraSolver node to solve our camera rig, including an in-depth rundown on creating manual feature matches, and a manual approach for getting a solve
- Fixing alignment, ghosting and blending artifacts
- Using C_SphericalTransform to paint out unwanted objects in frame
- Using Blender, we'll test out our final HDR to see how it works!
Links
Debayer tool: github.com/jedypod/deba...
Views: 11,559

Videos

The Math of Color Grading in Nuke
9K views · 4 years ago
In this video we'll take a close look at the simple mathematical functions behind color grading operations in Nuke. We will only discuss operations that affect the brightness, or value, of an image. We'll save the more complex discussion of color and hue for a future video.
00:00 - Introduction
01:00 - Scene-linear and why it's important
02:30 - What is a linear function
04:30 - How the Grade node w...
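For reference, the Grade node boils down to a single linear function followed by a gamma. A minimal sketch, paraphrasing the node's documented math rather than the video itself:

    def grade(x, blackpoint=0.0, whitepoint=1.0, lift=0.0, gain=1.0,
              multiply=1.0, offset=0.0, gamma=1.0):
        # Linear part: A*x + B, with A and B derived from the knob values.
        A = multiply * (gain - lift) / (whitepoint - blackpoint)
        B = offset + lift - A * blackpoint
        y = A * x + B
        # Gamma part: a power function, applied here to positive values only.
        return y ** (1.0 / gamma) if y > 0 else y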
Tears of Steel Plates
21K views · 4 years ago
This video collects all 50 minutes of scene-linear OpenEXR plates shot for the Tears of Steel project created by the Blender Foundation. mango.blender.org/production/4-tb-original-4k-footage-available-as-cc-by/ media.xiph.org/tearsofsteel/tearsofsteel-footage-exr/ Tears of Steel remains one of the few open sources of high-quality scene-linear EXR footage for testing and learning visual effects ...
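For anyone wondering about the batch-download question in the comments below: the plates can be pulled from that media.xiph.org index without clicking each frame. A minimal Python sketch, assuming the server exposes a plain HTML directory listing that links the .exr files directly (wget -r -np -A exr against the same URL does the equivalent job):

    # Sketch: batch-download every .exr linked from the directory index.
    # If the plates sit in per-shot subfolders, recurse into those hrefs the same way.
    import os
    import re
    import urllib.request

    BASE = 'https://media.xiph.org/tearsofsteel/tearsofsteel-footage-exr/'

    def download_all(base=BASE, dest='tos_plates'):
        os.makedirs(dest, exist_ok=True)
        html = urllib.request.urlopen(base).read().decode('utf-8', 'ignore')
        for href in re.findall(r'href="([^"]+\.exr)"', html):
            target = os.path.join(dest, os.path.basename(href))
            if not os.path.exists(target):
                urllib.request.urlretrieve(base + href, target)

    if __name__ == '__main__':
        download_all()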
Simulating Physically Accurate Depth of Field in Nuke
65K views · 7 years ago
A discussion of lenses and optics, and how they affect depth of field behavior in an image. We talk about depth channels, what they are and how they work, and how depth of field can be simulated in Nuke with the ZDefocus tool. We also demonstrate the OpticalZDefocus gizmo, which is a tool for generating physically accurate depth of field. It does this using the depth of field equation, given lens c...
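For reference, a sketch of the standard thin-lens circle-of-confusion equation that this kind of tool is built around (the gizmo's exact implementation may differ):

    def coc_diameter(depth, focus_distance, focal_length, fstop):
        # Circle of confusion diameter for a point at `depth`, lens focused at
        # `focus_distance`. All distances in the same unit as `focal_length` (e.g. mm).
        aperture = focal_length / fstop  # entrance pupil diameter
        return (abs(depth - focus_distance) / depth) * (
            aperture * focal_length / (focus_distance - focal_length))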
Nuke Screen Replacement Tutorial Part 2: Compositing and Integrating the Screen Content
45K views · 12 years ago
This is a tutorial about how to replace the screen in a cell-phone using Nuke. It deals with the challenges of tracking a glossy screen, and getting a good key from a light-emitting screen covered in smudges. I tried to speed through as many of the slow parts as possible, but it is still a bit lengthy. The tutorial is split into two 45 minute parts. Part 1 covers the tracking challenges. Part 2...
Nuke Screen Replacement Tutorial Part 1: Tracking The Screen
40K views · 12 years ago
This is a tutorial about how to replace the screen in a cell-phone using Nuke. It deals with the challenges of tracking a glossy screen, and getting a good key from a light-emitting screen covered in smudges. I tried to speed through as many of the slow parts as possible, but it is still a bit lengthy. The tutorial is split into two 45 minute parts. Part 1 covers the tracking challenges. Part 2...

Comments

  • @jagannadhyellapu7034 · 3 months ago

    Do we need to download each frame individually, or is there a batch download option?

  • @adamghering · 5 months ago

    I can't get the wget command to work.

  • @marcangele · 11 months ago

    Hey Jed, thanks a lot for the amazing tutorial and the tools you shared! I also watched your other tutorials and they are great. Please keep uploading videos! It would definitely be great to see a tutorial about debayering.

  • @reed4109 · 1 year ago

    Do you do any one-on-one mentoring?

  • @SebastianLarsen · 1 year ago

    I get "strptime() argument 1 must be str, not None" when I hit group - What am I doing wrong?

    • @jedsmith · 1 year ago

      Happy to try to help, but I'll need more specific information: in what software / what tool are you hitting "group"? What OS? What version of Nuke (if applicable)? What version of Python (if applicable)?

    • @ivantovaralbaladejo687 · 1 year ago

      @@jedsmith It happens to me too. I'm using Nuke 15 on Windows 11 with Python 3.10. It happens when I select several images and click on "short selected reads".

    • @ivantovaralbaladejo687 · 1 year ago

      @@jedsmith Here is the log:
      [22:04.52] ERROR: ExposureBlend.EqualizeExposure9.value: unexpected end of "exposure/"
      [22:04.52] ERROR: ExposureBlend.EqualizeExposure7.value: unexpected end of "exposure/"
      [22:04.52] ERROR: ExposureBlend.EqualizeExposure8.value: unexpected end of "exposure/"
      [22:04.52] ERROR: ExposureBlend.EqualizeExposure6.value: unexpected end of "exposure/"
      [22:04.52] ERROR: ExposureBlend.EqualizeExposure5.value: unexpected end of "exposure/"
      [22:04.52] ERROR: ExposureBlend.EqualizeExposure4.value: unexpected end of "exposure/"
      [22:04.52] ERROR: ExposureBlend.EqualizeExposure3.value: unexpected end of "exposure/"
      [22:04.52] ERROR: ExposureBlend.EqualizeExposure2.value: unexpected end of "exposure/"
      [22:04.52] ERROR: ExposureBlend.EqualizeExposure1.value: unexpected end of "exposure/"
      [22:04.52] ERROR: ExposureBlend.EqualizeExposure0.value: unexpected end of "exposure/"
      Traceback (most recent call last):
        File "<string>", line 127, in <module>
        File "<string>", line 43, in sort
        File "<string>", line 43, in <dictcomp>
      TypeError: strptime() argument 1 must be str, not None

    • @jedsmith · 1 year ago

      @@ivantovaralbaladejo687 Thanks for the additional info, that gives me something to go on. On line 43 the code calls datetime.strptime() on n.metadata().get('exr/Exif:DateTimeOriginal') -- this is used for the exposure bracket detection. It sounds like maybe that metadata does not exist in the debayered exr image you are loading as your source. Can you confirm? If so, is there other metadata describing the capture time? It's possible Canon cameras use this but not others.

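      A minimal sketch of a more defensive version of that lookup in Nuke's Python (the fallback key and the strptime format string are assumptions, not taken from the actual script):

        from datetime import datetime
        import nuke

        EXIF_FORMAT = '%Y:%m:%d %H:%M:%S'  # typical Exif DateTimeOriginal layout

        def capture_time(read_node):
            # Try the Exif key mentioned above, then a fallback, instead of
            # handing None straight to strptime().
            meta = read_node.metadata()
            for key in ('exr/Exif:DateTimeOriginal', 'exr/capDate'):
                value = meta.get(key)
                if value:
                    try:
                        return datetime.strptime(value, EXIF_FORMAT)
                    except ValueError:
                        continue
            raise ValueError('%s: no usable capture-time metadata' % read_node.name())
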
    • @ivantovaralbaladejo687 · 1 year ago

      @@jedsmith I'm using 16-bit TIFF images converted from RawTherapee. I tested with Lightroom too and it doesn't work either. Could it be the format? In the metadata of my file I see the exposure time for each image though.

  • @veronikashmorhun6089 · 1 year ago

    ❤‍🔥 wow

  • @YashJoshi-uv3sh · 1 year ago

    This is 💎

  • @shorn77777 · 1 year ago

    super smashing color math video!!!

  • @stevemcleod8607 · 1 year ago

    this was pretty cool, thanks for doing this video, cleared up the differences between some of the duplicate functions for me.

  • @tko03 · 1 year ago

    ASMR keyboard tracking sounds to composite/track to

  • @monocore · 1 year ago

    This might be one of the nicest in-depth chats about this topic on YouTube. Thank you again.

  • @sureshsumith7396 · 2 years ago

    Where can I get the CG & BG elements for all this footage?

    • @jedsmith · 2 years ago

      here I believe: studio.blender.org/join/

  • @marccb · 2 years ago

    Great explanations, thank you! Btw, the script did help me, especially looking into the CompressToe... Only thing... I can't find your name anywhere, so I can't see if you have content elsewhere...

  • @libanahmed2912 · 2 years ago

    This is very helpful. Do you have the script? I want to play around with it.

    • @jedsmith · 2 years ago

      Not sure it's gonna help you much, but sure I added it to the description: codeshare.io/zyMb7D

  • @aryankhan494 · 2 years ago

    Can you please upload more tutorials? I liked this one. Thank you!

  • @marioCazares · 2 years ago

    Thank you so much! I've been trying to do this for so long but never took the time to figure it out. This is perfect, and now I can use the absolute values from CG depth passes. Your gizmo will be my go-to for that now. Again, thanks!

    • @InLightVFX · 2 years ago

      I swear I just find you in the most random places on the internet lol

    • @marioCazares · 2 years ago

      @@InLightVFX haha hello Jacob! Glad to see you on this gem of a video x)!

  • @davidmarin6305 · 2 years ago

    It was great. Looking forward to seeing more of these

  • @davidmarin6305 · 2 years ago

    Love it!!

  • @FaitelTech · 2 years ago

    Thanks for the detailed description; you even attached links to the Desmos graphs! Much appreciated 👍

  • @alexandrkrasnovitskii2512 · 2 years ago

    Hey there! What a nice tutorial, thank you so much! I have only one question: which tool are you using to change the AOV passes right in the viewer?

    • @jedsmith · 2 years ago

      Thanks! It's Falk Hoffman's Channel Hotbox www.nukepedia.com/python/ui/channel-hotbox

  • @shahidmunir2846 · 2 years ago

    Really detailed information on the math and calculations. Thank you very much.

  • @shaocaholica · 2 years ago

    Thanks for this. What are your thoughts on using Lightroom for processing the raws? I know it's Win/Mac only, but I already have that anyway. It has very good chromatic fringing correction. If you export as DNG those should be in linear space as well, already debayered and readable in Nuke.

    • @SebastianLarsen · 1 year ago

      Howdy, I am a little late here, but AFAIK Lightroom will always apply its aesthetics to your shots, so it should be avoided in a linear workflow altogether?

  • @Headlantern · 2 years ago

    Thank you so much for the tutorial with the detailed explanation! May I know why you use the 'lightwarp' node on the finger?

  • @danielbrylka2228 · 2 years ago

    Thanks Jed, that is such a great video. On the Histogram node: I use it sometimes to remap let’s say a world depth channel to create a mask channel.

    • @jedsmith · 2 years ago

      Hey Daniel! Thanks for the kind words. In Nuke I do wish we had a nice Remap node. The math in the Histogram is the same as the Grade node, but I agree it could be more intuitive to use for the simple task of remapping a source min and max to a target min and max value.
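      A minimal sketch of the min/max remap described above (not from the video); the same expression could be typed into a Nuke Expression node:

        def remap(x, src_min, src_max, dst_min, dst_max):
            # Normalize against the source range, then scale into the target range.
            return (x - src_min) / (src_max - src_min) * (dst_max - dst_min) + dst_min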

  • @abbyrodebaugh6634 · 3 years ago

    This is golden. Thank you for sharing your knowledge!

  • @GFinVFX · 3 years ago

    Great Breakdown... much appreciated from anyone interested in this topic ;) Thank you

  • @vishalwadhe · 3 years ago

    Can we use this footage in a reel?

    • @jedsmith · 3 years ago

      This comment does not constitute legal advice. According to the readme included with the footage (media.xiph.org/tearsofsteel/tearsofsteel-footage-exr/README.txt), the license of that footage is Creative Commons Attribution 3.0: creativecommons.org/licenses/by/3.0/

  • @martinconstable5911 · 3 years ago

    Great work! Many thanks

  • @bidsandmen · 3 years ago

    I do not have a MergeDivide node (17:09 in the lesson). What should I do?

    • @jedsmith · 3 years ago

      Ah sorry, I missed that -- the built-in divide node doesn't handle values below 0 properly, so I usually use my own, built from a MergeExpression node: Ar/Br

    • @jedsmith · 3 years ago

      Sometimes I will swap the order so it's Br/Ar as well, so that I can disable the Merge and pass through the B pipe to disable the operation instead of passing through the thing I'm dividing out.

    • @djordjeilic9634 · 2 years ago

      @@jedsmith Hey, thank you for the video. Very useful. When you said Ar/Br, you're using the R channel? And why? Thanks

    • @jedsmith · 2 years ago

      @@djordjeilic9634 Yeah sorry that was not very clear. It would be Ar/Br for the red channel, Ag/Bg for the green channel etc
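      A minimal sketch of that per-channel divide as a Nuke Python snippet, assuming the stock MergeExpression node's expr0-expr2 knobs:

        import nuke

        # Per-channel A/B divide built from a MergeExpression, as described above.
        n = nuke.nodes.MergeExpression()
        n['expr0'].setValue('Ar / Br')  # red
        n['expr1'].setValue('Ag / Bg')  # green
        n['expr2'].setValue('Ab / Bb')  # blue
        # Jed mentions sometimes swapping to Br/Ar etc. so that disabling the
        # node passes through the B pipe instead.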

  • @UVtec · 3 years ago

    19:36 Yeah, the bokeh is really cool! =)

  • @joelswfx · 3 years ago

    Hey Jed, those gizmos that you wrote (you used 2 of them) -- how can I obtain them?

    • @jedsmith · 3 years ago

      Hey! It's all in my nuke-config git repo: github.com/jedypod/nuke-config/blob/master/ToolSets/Merge/ExposureBlend.nk github.com/jedypod/nuke-config/blob/master/ToolSets/Color/CalibrateMacbeth.nk

    • @frobles83 · 2 years ago

      @@jedsmith Hello Jed, nice to see you again. Would it be possible to make these links live again?

    • @jedsmith · 2 years ago

      Hey Ms. Robles! Good to hear from you! :D Sorry it looks like I moved those files around since I posted that - I edited the above links so they work again.

    • @frobles83 · 2 years ago

      @@jedsmith Thank you. You are the best!

  • @KenedyTorcatt · 3 years ago

    Hi, I wanted to know if I can use this tool with Nuke 13? It's really awesome.

    • @jedsmith · 3 years ago

      Yes it should work fine in Nuke 13. The only thing that would need to be updated is support for Camera3 class nodes in the "Get Selected Camera" python script button.
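      A minimal sketch of the kind of class check that update implies (the actual button script may be structured differently):

        import nuke

        # Camera3 is the Nuke 13 camera class mentioned above.
        CAMERA_CLASSES = ('Camera', 'Camera2', 'Camera3')

        def get_selected_camera():
            # Return the first selected node whose class is one of the camera classes.
            for node in nuke.selectedNodes():
                if node.Class() in CAMERA_CLASSES:
                    return node
            return None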

  • @conradallan8091 · 3 years ago

    You get my upvote just for that hilariously great intro.

  • @IamJohnnyFan · 3 years ago

    Thank you for this!

  • @boogiekimx · 3 years ago

    Thanks for the tutorial, keep these coming.

  • @teddyarcher3957 · 3 years ago

    very well explained, thx!

  • @nicolaschan3335 · 3 years ago

    Thanks!

  • @ackila1819 · 3 years ago

    Hi Jed, thanks a lot for your detailed tutorial on how to stitch spherical HDRs in CaraVR. It's been very helpful to me. Also very nice to have a workflow within Nuke without the need to use lots of different software packages. Makes things quite a bit easier. Do you think it would be possible to go over your process of debayering raw images to scene linear in more detail? I am also on Linux but I am currently struggling to get your debayer / RawTherapee workflow to work properly.

    • @jedsmith · 3 years ago

      Hi! Yours is the second request for the topic of scene-linear raw debayer workflow... I will definitely make it my next video. Now... just need to get some spare time to work on this :)

    • @ackila1819 · 3 years ago

      @@jedsmith awesome. Thanks a lot. Looking forward to it.

  • @sebastianbrandow914 · 3 years ago

    This is really useful, thank you so much!

  • @davideghirelli4453 · 3 years ago

    Is there any way to get the right defocus edge on the front and on the back of the object? I can't find a way.

  • @姜哈哈-s4w · 3 years ago

    It's hard not to have Chinese

    • @jedsmith · 3 years ago

      我对此感到抱歉 (I'm sorry about that)

  • @DJEast2 · 4 years ago

    Hey Jed, thanks a lot for that great tutorial! Finally I'm close to a solution for processing my raw images into ACES - but I'm not getting it quite right yet. Is there any chance you could get more into detail on how to debayer raw images into scene linear, step by step? I already checked your debayer GitHub site but struggle with it a bit. I'm not a TD and have no clue how to use/install your tool the right way on Windows. Maybe someone could share a link or drop in the right keywords for me to get my head around it? Thanks so much! I've also tried the steps shown with RawTherapee but my exported ACES 32-bit TIFs seem clipped (other formats also just show the same 0-1 values). Any idea? I really appreciate your time - cheers!

    • @jedsmith · 4 years ago

      Hey! Thanks for the nice comment. Debayering into ACES can be tricky. This is the subject of a video I have planned to create, going into a lot more detail than the little hints I put in this video. Getting Debayer to work on Windows can be tricky due to the OpenImageIO and OpenColorIO dependency. Even using the Choco package manager to get them running can be difficult, so I can't say I blame you there. You're actually not the first person to inquire about this, so I think I'll try to work out a how-to for getting Debayer running on Windows as well.

      Short answer for your question: most raw image formats store the raw bayer data in an integer format. Integer implies a 0 to 1 range, with the number of steps in between determined by the bit-depth. Therefore all data that the camera captures is going to be between 0 and 1, and values above 1 will be clipped. If you debayer using the RawTherapee profile I supplied in the Debayer github repo, you will get something resembling scene-linear ACES data, but encoded in a 0 to 1 range. The next step would be to expose your image such that an 18% diffuse reflector would be a value of 0.18 in scene linear. Now you have a proper scene-linear image, maybe with specular highlights above 1.

      Where sensor clip gets mapped to depends on how your image was exposed in the camera and your camera's dynamic range. For example an Alexa will usually clip at around 35 to 40 in scene linear depending on the camera, the DP and the show. My shitty Canon 5Dmk3 will clip at around 12 to 15 if shooting in dualiso using magic lantern. Hope that helps! I'm working on a video about image data formats, and then one on debayering is up next I think. Just need to get some solid time to work on it! :)
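      A minimal sketch of the exposure step described above, assuming you can sample an 18% grey reference from the debayered linear image (the helper name is hypothetical):

        def expose_to_scene_linear(rgb, measured_grey):
            # Scale the 0-1 debayered linear values so the sampled grey reference
            # lands at 0.18 -- equivalent to a Multiply node set to 0.18 / measured_grey.
            scale = 0.18 / measured_grey
            return [c * scale for c in rgb]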

    • @DJEast2 · 4 years ago

      @@jedsmith Thanks a lot for your detailed reply - I appreciate that! Of course there are still a lot of questions but I'm happy to wait for your next video and get my head around it. Maybe I'll also jump on ACEScentral and create a thread to get more information on the workflow. Wish you all the best!

  • @PhotoGuoRichard · 4 years ago

    You are so professional! I am using PTGui and I am not so satisfied with the HDR result. I am new to Nuke and learned a lot from you! Thanks!

  • @PeterJansen · 4 years ago

    Damn. Thank you. I didn't need to know this today, but I damn well learnt it today.

  • @mandrewkano8939 · 4 years ago

    Amazing stuff, that's the way to learn: by understanding the underlying processes, not by moving sliders.

    • @jedsmith · 4 years ago

      My philosophy exactly. Thanks for your nice comment! :)

  • @shivas4831 · 4 years ago

    You solved my problem... thanks, man. Every artist must watch this video!

  • @anniep4442 · 4 years ago

    Radio-announcer mode activated. rofl. Amazing content, caught myself staying for the entire video though I meant to finish it later. Love the chapters breakdown.

  • @СамирМуртазалиев-у7ю

    Super! Waiting for the next tutorial )))

  • @mamama2568 · 4 years ago

    thanks!

  • @mamama2568 · 4 years ago

    thx!