3 LUT myths + Why do Hollywood LUTs work better?

  • Published: 24 Nov 2024

Comments • 123

  • @VernardNuncioFields · A month ago · +20

    I like big LUTS and I cannot lie!

  • @elcasanelles5806 · A month ago · +1

    I've been using the Voyager LUTs for over a year now and I'm very happy with them.

  • @soraaoixxthebluesky · A month ago

    Not gonna lie, I just bought a camera that can handle custom LUTs and the result was insane. I nailed both exposure and white balance on set, unlike when using the camera's built-in gamma assist. It's a night and day difference.
    And yes, you need to feed the camera a LUT built for the right chain: the color space the camera records > the intermediate color space you're going to grade in > the display color space you're going to deliver in. These hybrid LUTs are going to make your life much easier in post.
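
The chain described above (camera color space > intermediate working space > a look > display color space) is exactly what gets sampled into a single "hybrid" LUT. As a minimal sketch of how that baking works, assuming placeholder math for the camera decode, look, and display encode (the three functions below are illustrative stand-ins, not any real camera's transforms), and following the .cube convention of the red index varying fastest:

```python
# Minimal sketch: bake a chain of transforms (camera log -> working space -> look
# -> display) into a single 3D .cube LUT. The three transform functions are
# placeholders; substitute the real math for your camera and delivery spec.
import numpy as np

def camera_log_to_working(rgb):      # placeholder: pretend log decode / gamut map
    return np.clip(rgb, 0, 1) ** 2.2

def creative_look(rgb):              # placeholder: a gentle contrast curve
    return rgb / (rgb + 0.18) * 1.18

def working_to_display(rgb):         # placeholder: 2.4-gamma display encoding
    return np.clip(rgb, 0, 1) ** (1 / 2.4)

def bake_cube(path, size=33):
    grid = np.linspace(0.0, 1.0, size)
    with open(path, "w") as f:
        f.write(f"LUT_3D_SIZE {size}\n")
        # .cube convention: red index varies fastest, then green, then blue
        for b in grid:
            for g in grid:
                for r in grid:
                    rgb = np.array([r, g, b])
                    out = working_to_display(creative_look(camera_log_to_working(rgb)))
                    f.write("{:.6f} {:.6f} {:.6f}\n".format(*np.clip(out, 0, 1)))

bake_cube("hybrid_shooting_lut.cube")
```

The resulting file can be loaded as a monitoring LUT in camera or applied as a single node in post, though, as discussed further down the thread, keeping the color management and look as separate pieces in the grade is usually preferable.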

  • @AdventuresAdam · A month ago

    Thank you! Pro pack purchased, and I'm looking forward to having a more consistent workflow as a solo guy. Cheers.

  • @ema.colorg · A month ago · +34

    I think the problem is the sale of LUTs. People buy them without knowing how they were made and then expect their footage to look identical.

    • A month ago

      Lattice helps a lot to check LUTs…

    • @chevonpetgrave4991 · A month ago · +2

      This! I no longer buy LUTs from anyone who can't, at a minimum, state what color space the LUT expects and outputs.
      Anything else is asking for disappointment.

    • @maurice_morales · A month ago

      But my question is how do we use Lattice to properly evaluate the LUT? There’s very little information on how to use Lattice.

    • A month ago

      @@maurice_morales Watch Yedlin's videos; curves crossing more than once is a quick way to spot an improper LUT.
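
A rough way to automate the check the replies above describe (it says nothing about color spaces, which the file itself cannot tell you): sample the LUT along its neutral axis and flag channels that are non-monotonic or that cross the identity line more than once. This assumes a standard .cube with a 0-1 input domain; the filename is hypothetical.

```python
# Rough sanity check: sample a 3D .cube LUT along its neutral (R=G=B) axis and flag
# channels that are non-monotonic or that cross the identity line more than once.
import numpy as np

def load_cube(path):
    size, rows = None, []
    for line in open(path):
        parts = line.split()
        if not parts or parts[0].startswith("#"):
            continue
        if parts[0] == "LUT_3D_SIZE":
            size = int(parts[1])
        elif parts[0][0] in "-.0123456789":
            rows.append([float(v) for v in parts[:3]])
    # red varies fastest in a .cube, so a C-order reshape gives lut[b, g, r]
    return np.array(rows).reshape(size, size, size, 3), size

def check_neutral_axis(path):
    lut, size = load_cube(path)
    x = np.linspace(0, 1, size)                                    # neutral-axis inputs
    gray = lut[np.arange(size), np.arange(size), np.arange(size)]  # shape (size, 3)
    for ch, name in enumerate("RGB"):
        y = gray[:, ch]
        monotonic = np.all(np.diff(y) >= 0)
        crossings = np.count_nonzero(np.diff(np.sign(y - x)) != 0)
        print(f"{name}: monotonic={monotonic}, identity crossings={crossings}")

check_neutral_axis("some_look.cube")
```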

  • @janispolar5916 · 7 days ago

    Do you ever apply a LUT after the output CST? E.g. a LUT that goes from Rec. 709 to a certain look. Or would you always have the output CST as the last node in the hierarchy? Thanks.

  • @Gordymax · A month ago

    Dinner is served!!!

  • @maurice_morales · A month ago · +1

    Can you do a video on stress testing a LUT?

    • @CullenKelly · A month ago · +2

      I actually don't stress test LUTs! Which would probably be a great topic for a video 😂

    • @maurice_morales · A month ago

      @@CullenKelly It could possibly kill the market for problematic LUT packs being sold online while also teaching users what LUTs are.

  • @AllThingsFilm1 · A month ago · +4

    Thank you for the LUTs and this walkthrough.
    Q: When creating a viewing LUT to install in the camera, do you create a 33 cube or 65 cube LUT?

    • @GeezerStray · A month ago · +6

      Has to be 33, most cameras can't take a 65 cube.

    • @maurice_morales · A month ago · +1

      @@GeezerStray The Alexa 35 can take 65. I believe (not sure) the Venice 2 can take 65 as well.

    • @CullenKelly · A month ago · +1

      Good replies below...generally 33, though the Alexa 35 can do 65

    • @maurice_morales · A month ago

      @@CullenKelly According to the Sony Venice 2 documentation you can import 17-, 33-, and 65-grid LUT files into the camera.
      Side note: the Sony Venice also has another color processing pipeline that is superior to LUTs, called ART (Advanced Rendering Transform). They have a white paper brochure floating around (hard to find) about this workflow. I used it once on set a few years back. Visually it had a slightly different rendering of the image. In their white paper they have a graphic simulating some banding of two colored light bulbs when using the LUT pipeline vs the ART pipeline. Pretty interesting.

  • @danielaleksis · A month ago

    Hey Cullen! Do you have a tutorial on how to use your Voyager LUT pack on HDR videos?

    • @LoremIpsumProd · A month ago

      Probably just output to the requirement?

    • @CullenKelly · A month ago

      No special consideration needed for using Voyager with HDR, just set up your color management as needed and place Voyager within it!

    • @f.d.thdlifestyle6079 · A month ago

      Do your LUTs work in all editing software, or only Resolve?

    • @LoremIpsumProd · A month ago

      @@f.d.thdlifestyle6079 The color space is DWG/Intermediate, so you must convert it with a technical transform for use in other apps or in camera.

  • @EposVox · A month ago

    Silly question: if you have LUTs that add looks that can't be created in Resolve, how were they created? What higher-tier color grading tool was used?

    • @TreyMotes · A month ago

      Who said they can't be created in Resolve?

    • @EposVox · A month ago

      @@TreyMotes He did. Multiple times.

    • @TreyMotes · A month ago

      @@EposVox "Multiple times." Is this you being snarky for no reason whatsoever? All you had to say was that he said it. I hadn't yet finished the video when I read your comment and asked my question.

    • @TreyMotes · A month ago

      @@EposVox Now, to actually answer your question: there are extremely complex film LUTs, for example, that are created analytically with large color charts and other measurements. Generally these won't be created in Resolve or in any traditional color grading tool. Film labs used to provide preview LUTs (and some still do) for grading into, to get an idea of how the grade would look when sent to a print stock.
      Additionally, some LUTs can be created with very different transforms in tools like Nuke that are not available in the traditional Resolve color tools, though some of that functionality could be ported over via a DCTL.

    • @EposVox · A month ago · +1

      @@TreyMotes Wasn't intended to be snarky, but I'm on mobile and don't really feel like grabbing timecodes. At a couple of different points he demos LUTs that he says would be "very difficult to replicate in Resolve or simply impossible to replicate in Resolve", prompting my question.

  • @xplodingminion2029 · A month ago · +5

    Where could I go if I'm interested in creating LUTs that, like you mentioned, are able to do things not possible in Resolve?

    • @Mahdi-ahmadzade · A month ago

      It's also my question!

    • @dukebozikowski3801 · A month ago · +2

      He has an awesome course on TAC Resolve training.

    • @creed3500 · A month ago · +3

      Nuke can manipulate LUTs in some ways that Resolve can't, and it's probably where I would start.

    • @MuhammadAli-ny6ni · A month ago · +4

      You can use the Contour plugin built by Cullen Kelly to create LUTs; other options are the Dehancer or Look Designer plugins.

    • @xplodingminion2029 · A month ago

      @@creed3500 Do you know any good places online to start learning LUT development in Nuke?

  • @lombardy3274 · A month ago

    Always a fountain of knowledge! Question: I purchased some Ravengrade Kharma LUTs (Kodak Vision3 5203 among others). However, even when using the correct colour space for those LUTs (converting from BMD Gen 5 to Arri LogC3), the look out of the box has extremely low saturation, so low that it cannot be what was intended by the LUTs. Have I missed something?

    • @TechMediaLifr · A month ago

      Have you set your output to a Display Colorspace like Rec.709?

    • @lombardy3274 · A month ago

      @@TechMediaLifr Yes, all colour space management is correct as intended.

    • @pavol0 · A month ago

      Which color space does your camera output?

    • @lombardy3274 · A month ago

      @@pavol0 It's BMCC6K BRAW, so you can decode into most mainstream colour spaces and gammas… I've tried decoding into BMD5 with a CST from BMD5 to LogC3, and I've also tried decoding straight into LogC3, but both result in the same desaturated look. Also, the Kharma LUTs are 'hybrid' in that they are both imparting a look and taking the footage from LogC3 into Rec709. Wondering if there's something else under the hood I'm missing that needs changing.

    • @pavol0 · A month ago

      @@lombardy3274 Try downloading some RED log sample footage and see if the issue remains.

  • @MMPHOTOANDVIDEO · A month ago

    Good idea. Making your own fresh LUT every time keeps the footage healthy.

  • @SquashGearReviews · A month ago

    Beginner question - If you're creating a viewing LUT purely for in camera monitoring, but intend on exposing to the right, how do you navigate this? (i.e could you factor in a 2-stop exposure reduction within the viewing LUT to get a more accurate idea of what the final result will look like, or are there drawbacks to this?)

    • @conortychowski · A month ago · +1

      Yeah, you would just reduce the exposure by X stops before generating the camera LUT, and then when you load it into the camera, it'll force you to expose X stops over to get a well-exposed image. There are no drawbacks that I can think of!

    • @OlegUstimenko · A month ago

      I would argue that it's pointless to do this, but it's certainly possible. Either use the HDR wheels' exposure control with a value of -2, or the gain wheel, in the node before your creative look.

    • @SquashGearReviews · A month ago

      @@OlegUstimenko why would it be pointless? If the intention of a monitoring LUT is to see something that represents what my end result’s potentially going to look like, surely it would help to have a representation of this in camera? For example, you might set up a grade to have cooler shadows, but if you’re exposing to the right, you’re not going to see this in action?

    • @OlegUstimenko · A month ago

      @@SquashGearReviews Because exposing to the right isn't super useful anymore with current camera tech, and it can be achieved more quickly by selecting a lower ISO.

    • @SquashGearReviews · A month ago

      ​@@OlegUstimenko I primarily shoot in relatively dark environments where exposing to the right is standard practice to ensure the cleanest shadows. This is using the 2nd base ISO on a Sony FX3. I can tell you first hand that exposing to the right yields significantly cleaner results in my use case. But everyone's needs are different.
      For the record though, lowering your ISO will have little to no effect on the amount of noise in your image unless you can introduce more light to the sensor to raise the exposure (either through opening the aperture or increasing the physical amount of light in your scene).
      Cullen did a fantastic video on exposing to the right - ruclips.net/video/aB8ku9ET-dw/видео.html&ab_channel=CullenKelly - well worth a watch.
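
A sketch of the approach described earlier in this thread (reducing exposure before generating the camera LUT): bake the offset into the monitoring LUT by applying the gain in scene linear before the display rendering, so the camera has to be exposed that many stops over for the monitored image to look normal. The log decode and display rendering below are placeholders rather than any specific camera's math; the resulting transform would be sampled into a .cube the same way as in the earlier sketch.

```python
# Sketch: bake an exposure offset (e.g. -2 stops for an ETTR workflow) into a
# monitoring transform by applying the gain in scene linear before display rendering.
import numpy as np

def log_to_linear(rgb):                 # placeholder camera log decode
    return np.maximum(rgb, 0.0) ** 2.6

def display_render(lin):                # placeholder look + display encoding
    return np.clip(lin, 0, 1) ** (1 / 2.4)

def monitoring_transform(rgb, offset_stops=-2.0):
    lin = log_to_linear(np.asarray(rgb, dtype=float))
    lin *= 2.0 ** offset_stops          # -2 stops: the image must be exposed 2 stops
                                        # over to look "normal" through this LUT
    return display_render(lin)

print(monitoring_transform([0.5, 0.5, 0.5]))
```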

  • @mediaflmcreation · A month ago · +1

    It looks better because people today don't take the TIME AND EFFORT to design their own look and save it, mainly on social media. The majority like to buy their way around it, and although it sometimes looks good, it's not the right look, or it looks out of place, cliché, and/or simply degrades the footage. HOWEVER... some won't believe it, but LENSES make a big difference, and exposure does too. White/color balancing plays a 90% role too. I'm no hypocrite here, I have this problem somewhat too: I need to upgrade some lenses to complement the rest of my good lenses.

  • @NOIRGRADE · A month ago

    Saying "LUTs are bad" is like saying "ArriWG is bad". There is always more to the story. I'd actually like to hear your thoughts on grading in the native capture color space (granted there is only one in a project).

    • @CullenKelly · A month ago

      100% agree. And working camera native can be great when there's just one format in play! Though that's increasingly rare these days...

  • @mohamadali2066 · A month ago

    In one of your very early videos you showed a way to separate the contrast and the color effect of a LUT into two different nodes using the layer mixer and composite modes. Can you please explain it again soon, or send me the video (I couldn't find it)? Also, is there a different method now that BMD has added composite modes per node?

    • @Pazeditions · A month ago

      This may help: "How to separate Color and Contrast from any LUT inside Davinci Resolve"
      ruclips.net/video/kmoik-rdjss/видео.html

    • @CullenKelly · A month ago

      Sure thing! I discuss that technique in this video: ruclips.net/video/Sh9JyQYjEu0/видео.html
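
For readers who just want the principle behind the layer-mixer trick referenced above, here is a rough numpy approximation: swap luminance between the original image and the LUT-graded image. The Rec.709 luma weights and the ratio-based swap are simplifying assumptions, not Resolve's exact composite-mode math.

```python
# Rough approximation of splitting a LUT's effect into "contrast" and "color":
# swap luminance between the original and the LUT-graded image.
import numpy as np

LUMA = np.array([0.2126, 0.7152, 0.0722])

def luma(img):
    return np.maximum(np.sum(img * LUMA, axis=-1, keepdims=True), 1e-6)

def split_lut_effect(original, graded):
    contrast_only = original * (luma(graded) / luma(original))  # graded tone, original color
    color_only    = graded   * (luma(original) / luma(graded))  # graded color, original tone
    return contrast_only, color_only

# toy example: pretend the "LUT" just brightens and warms the image
original = np.random.default_rng(0).random((4, 4, 3))
graded = np.clip(original ** 0.8 * np.array([1.05, 1.0, 0.95]), 0, 1)
contrast_only, color_only = split_lut_effect(original, graded)
```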

  • @artengineeringmedia4183 · A month ago

    Just to clarify, can I purchase a LUT, use it as a viewing LUT on my camera, and also use the same LUT for my grading? Also, are there things I need to take into consideration?

    • @LoremIpsumProd · A month ago

      LUTs are color-space-specific. Most probably, you need to find one that includes a technical transform from your camera's color space to Rec. 709/2.4.

    • @artengineeringmedia4183 · A month ago

      @@LoremIpsumProd ok, thank you

    • @CullenKelly · A month ago

      Yes, it's possible to use a single LUT for production as well as your grade, though ideally you'd start with color management plus a look *prior* to production, cook that full stack into a LUT for shooting purposes, then return to the individual color management + look pieces for the grade. Hope this helps!

  • @kirankiranmishra · A month ago

    So you're saying the look we get from the LUT used to grade a film contains the same colors and ratios as the viewing LUT on set, so the DP can set exposure and do lighting accordingly, and then we use the same LUT to grade in the working color space?

    • @CullenKelly · A month ago

      Not sure I followed all this, but I think you've got the right idea here!

  • @cloudrippr · A month ago · +2

    So I'm shooting S-Log3. Are your LUTs for use in camera or in post-production? Sony gives me the option of viewing using a gamma assist for S-Log3. Would I use one of your LUTs then, or in post? In case you haven't noticed, I'm new at this.

    • @akshaysaiju4615 · A month ago

      His LUTs would come in handy in post-production. They work inside DaVinci Wide Gamut Intermediate, which is a color space inside DaVinci Resolve to which you can convert your S-Log footage and then use his LUTs. So in short, you can use his LUTs regardless of which camera you're using, because you'll be converting your footage to DWG anyway. There are a lot of videos on his channel which explain this process a lot better than I have; I recommend checking them out.

    • @albertorambaudi6055 · A month ago · +1

      I did it on my Sony... if I'm not forgetting something, what I did was: inside DaVinci, a CST in from S-Log3/S-Gamut3.Cine to DWG/Intermediate, then your manual look or the Kodak 2383 C. Kelly LUT in DWG, and then a CST back out to S-Log3. Export the LUT, import it from the camera's SD card, install the LUT in the picture profile, and turn on the gamma assist display for S-Log to Rec.709. VERY IMPORTANT: remember to turn off the option in the Sony camera that bakes the LUT into the video if you want to keep the original recording in S-Log3.

    • @chevonpetgrave4991 · A month ago

      @@albertorambaudi6055 This is an unnecessary workflow. Is it because Sony mirrorless cameras don't have a true LUT preview feature?
      A true LUT preview will always convert from the input color space the LUT specifies to the output color space the LUT specifies… so your LUT should only need to specify that input and output.
      I'm guessing picture profiles don't work that way.

    • @CullenKelly · A month ago

      My Voyager LUTs are designed to slot into a post production workflow, but they can also easily be cooked into viewing LUTs for use on set. I demonstrate this in the free course that comes with the LUTs!

  • @bigsnap5 · A month ago · +1

    Reliable in-camera viewing LUTs are in very short supply. Cullen, would you consider creating LUT packs for in-camera viewing in the future?

    • @MarkRay84 · A month ago · +4

      You should be able to export the same LUTs as 33-point cubes for monitors or cameras.

    • @mattstahley6340 · A month ago · +1

      I've converted most of his Voyager LUTs for my camera in Resolve.

    • @AdventuresAdam · A month ago

      @@mattstahley6340 Could you point to a video that shows this process for a noob? Thank you!

    • @mattstahley6340 · A month ago · +1

      @@AdventuresAdam No, but Cullen has done it a few times in his videos; I'm not exactly sure which ones.

    • @mattstahley6340 · A month ago

      @@AdventuresAdam Actually, I had one of them saved. I think it's around the 40-minute mark or so that he goes over it.
      ruclips.net/user/livewGUj2btvGD0?si=3y11ZjLh2Y49wX6R

  • @TubeSilva · A month ago

    Do productions use more than one viewing LUT, since scenes in a film are not all identical (for example, indoors, outdoors, night, etc.)?

    • @OlegUstimenko · A month ago · +2

      Depends on the project. On the last movie I was a DIT on, there was a separate LUT for night scenes that raised the black levels.

    • @maurice_morales · A month ago

      The idea is to have one LUT for the show and use CDLs for adjustments.

    • @CullenKelly · A month ago · +1

      Sometimes! Check out my interview here on the channel with colorist Jill Bogdanowicz, I believe we discuss this subject: ruclips.net/video/lPu8sQ0C5xA/видео.html
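
The "one show LUT plus CDLs for per-shot trims" workflow mentioned above works because a CDL is just ten numbers per shot: slope, offset, and power per channel, plus saturation. A minimal sketch of applying one (the negative-value handling here is simplified relative to the ASC spec, and the example values are made up):

```python
# Minimal ASC-CDL-style trim: slope, offset, power per channel, then saturation.
import numpy as np

REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])

def apply_cdl(rgb, slope, offset, power, saturation):
    out = np.asarray(rgb, dtype=float) * slope + offset   # slope, then offset
    out = np.sign(out) * np.abs(out) ** power             # power (simplified negatives)
    luma = np.sum(out * REC709_LUMA, axis=-1, keepdims=True)
    return luma + saturation * (out - luma)               # saturation around Rec.709 luma

# made-up trim: slightly warm, a touch more contrast, slightly desaturated
print(apply_cdl([0.30, 0.30, 0.30],
                slope=np.array([1.05, 1.00, 0.95]),
                offset=np.array([0.01, 0.00, -0.01]),
                power=np.array([1.10, 1.10, 1.10]),
                saturation=0.95))
```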

  • @blacktar · A month ago · +1

    It would be neat if LUTs were standardized (ISO? IEC? AMPAS?) as a format with obligatory metadata like the intended input and output color space they were created for, attribution, license, etc. LUTs are super useful (I wouldn't want to live without them), but unfortunately still a very "dumb" data format.

    • @ErrickJackson · A month ago · +2

      LMTs kind of do this, just more specific to ACES, but no one makes LMTs, so it's just kind of an empty space.

    • @maurice_morales · A month ago · +1

      That would be awesome.

    • @maurice_morales · A month ago · +1

      @@ErrickJackson I’m curious to see what ACES 2.0 will be like

  • @frankinblackpool · A month ago · +4

    Using a viewing LUT can have strange results with prosumer kit.
    With my Panasonic camera using a log colour profile, I can and do use a viewing LUT created and distributed by Panasonic to replicate a Rec 709 colour space when I use my camera. The camera knows how to use the LUT and compensates the waveform to give good exposure when filming.
    Contrast that with a field monitor by Atomos Ninja. The Atomos also allows me to use a viewing LUT and lets me use the waveform to expose my shots.
    BUT. And it's a very big but: the field monitor does NOT compensate for the difference between exposing correctly for log and using the viewing LUT, and this results in incorrectly exposed footage, better known as a day wasted.
    I'm not a professional, and am probably using viewing LUTs incorrectly through user stupidity, but I'd rather look at a washed-out log colour profile when filming, knowing that I have the exposure nailed. And, more importantly, I can visualise the end result. I just wish clients could visualise end results too; it would make life much simpler.

    • @mattstahley6340 · A month ago · +1

      I use Lumix cameras, and if you are shooting in V-Log and monitoring with a LUT, the waveform displays the V-Log pre-LUT exposure. Also, most monitors have settings that allow you to use exposure tools pre-LUT. I know my Atomos Shinobi and Blackmagic Video Assist do this.

    • @frankinblackpool · A month ago

      @@mattstahley6340 If you know a way that the Ninja V does this, then I’d be very grateful for the assist.

    • @TechMediaLifr · A month ago

      @@frankinblackpool I'm not using any of those devices, but maybe it's possible to enable the viewing LUT and waveform in camera and send that out to your monitor, so you don't have to rely on the monitor's tools (recording has to be done in camera then).

    • @mattstahley6340 · A month ago

      @@frankinblackpool I've never used a Ninja V, but it has to be somewhat similar to the non-recording Shinobi. Do you have the input source set up to receive V-Log and V gamma? I think it's in the INPUT part of the menu, and there's a section where you choose what your camera is outputting.

    • @CullenKelly · A month ago

      Looks like a great discussion started here! My take is that exposing using your waveform is never ideal, even if you're able to see it without the effects of your output transform applied.

  • @haileykurioreilly9890 · A month ago

    How are the LUTs made if they do things that are impossible in Resolve?

    • @CullenKelly · A month ago

      I build my LUTs using custom tools I've developed for myself (which are now available in the form of my look dev plugin Contour)

  • @mohamedashfaq9167 · A month ago

  • @vybesmedia7176 · A month ago · +2

    Why do you have banding in your video?

    • @nietlullenmaargraven887 · A month ago · +2

      Blame YouTube!

    • @Wildridefilms · A month ago · +2

      8-bit YouTube encoding + dark gradients.

    • @wxjunkie · A month ago · +1

      YouTube is an 8-bit video platform.

    • @CullenKelly · A month ago · +1

      Yep, 8-bit w/ dark gradients is tricky! Still refining our solution for this

  • @IDKOKIDK · A month ago

    LUTs are destructive; that's why they shouldn't be used as the first node. The NLE can't see the destroyed data beyond that node.

    • @CullenKelly · A month ago · +4

      There are other reasons why you may not want to use a LUT in your first node, but none to do with it being destructive. If you’re using the right LUT for the right job in the right way, it’s *never* functionally destructive.

    • @thatcherfreeman · A month ago · +1

      The main destructive thing a LUT does that is unrecoverable is clamping inputs to the 0-1 range. Any other destructive behavior is going to depend on what specifically the LUT is supposed to do and on the sampling resolution of the LUT.
      Ironically, in the first node, to which you're sending the camera's original log footage, you actually have excellent odds of the signal being entirely within the 0-1 range. It is possible to apply a LUT in that context and not lose any information, though you'd have to be careful that the LUT you're using is appropriate for such a task; i.e., log-to-Rec709 LUTs likely would not pass that bar.
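
Following the reply above, a quick way to quantify the clamping risk before dropping a LUT on the first node is to measure how much of a frame falls outside the LUT's input domain (0-1 unless the .cube declares DOMAIN_MIN/DOMAIN_MAX). The frame below is random data standing in for real log footage.

```python
# Check how much of the footage falls outside a LUT's 0-1 input domain and would
# therefore be clipped when the LUT is applied.
import numpy as np

def clamp_report(frame, domain_min=0.0, domain_max=1.0):
    below = np.mean(frame < domain_min)
    above = np.mean(frame > domain_max)
    print(f"{below:.3%} of values below domain, {above:.3%} above; "
          f"these would be clipped by the LUT.")

clamp_report(np.random.default_rng(0).normal(0.4, 0.25, (1080, 1920, 3)))
```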

  • @Mionwang · A month ago

    LUTS WITH BUTTS!
    Any OG Film Riot fans here?

  • @focuspulling · A month ago · +2

    Emphasis on LUTs is misleading; LOG is the fulcrum of any such debate/discussion, of course. "Look Up Table" literally means precise conversion BACK to reality, using one and only one mathematical formula keyed to a specific camera and shooting mode: it's reverse-engineering to reality with only one allowed result. Your notion of a hybrid LUT tries to interpolate the creative tweaking process into the reverse-engineering pipeline. But AFTER conversion, as a discrete stage (not a LUT), we sometimes, but not always, start with a faux LUT file that stacks on a vague look. We proceed from there...

    • @m1nt9reen · A month ago

      Eh?

    • @CullenKelly · A month ago · +1

      Hmm, I think we might be talking about different things...the emphasis on LUTs is definitely intended here, and my points aren't specifically related to log spaces. Since your definition of LUT is one I've never heard before, and I don't share, I think this may just be a matter of using the same term for different things?

    • @focuspulling · A month ago · +1

      @@CullenKelly I can only continue to emphasize that LUT literally means Look Up Table: a precise mathematical formula for looking up the reverse-engineering of conversion from temporary LOG acquisition, to REC.709, on a camera-sensor-specific basis. The file format used for LUTs (used to be more commonly called cube files) happens to work for creative profiles too, but that's the extent of it.

    • @joePiercyTA · A month ago

      @@focuspulling "Eh?", great response! Creative LUTs are useful for defining a look, whether or not you're using log footage. Cullen only converts to Rec.709 here because it's his output color space (super important). The simplest way to put it is that LUTs reassign values. Can this be destructive? Of course, especially if you aren't aware of all the best practices involved in creating a LUT. That's where "$5 LUTs" are dangerous, imo, ultimately compromising the image. A well-designed camera LUT created with the director, assistant directors, DP, production designer, etc. can be absolutely invaluable to ensure they get the image they intend from their camera and lighting, allowing them to apply their creative look on set and make judgement calls right away. This LUT can then be reused as a starting point for the colorist too, providing them valuable information on how the camera data was manipulated and how the look the project leadership intends was achieved.

    • @focuspulling
      @focuspulling Месяц назад +1

      @@joePiercyTA I'd add to this an emphasis that a LUT file designed to tweak actual LOG conversion further creatively, is as you wrote just a starting point: no professional colorist obeys a creative look for every scene anyway. Further: there's practically no such thing as not shooting in LOG (even though some old-timey boutique types argue that it's extra work or bill more for it). Any production by a working professional with grading in the pipeline, will always use LOG footage without exception (otherwise, amateur or lazy). Notably, the past century of color cinematography history is characterized by celluloid camera masters that have pretty much had that flat LOG look requiring further post work ("color timing," which was how long the celluloid copy got dipped into emulsion burning for R, G and B).