Emphasis on LUTs is misleading; LOG is the fulcrum of any such debate/discussion, of course. "Look Up Table" literally means precise conversion BACK to reality, using one and only one mathematical formula keyed to a specific camera and shooting mode: it's reverse-engineering to reality with only one allowed result. Your notion of a hybrid LUT tries to interpolate the creative tweaking process into the reverse-engineering pipeline. But AFTER conversion, as a discrete stage (not a LUT), we sometimes, but not always, start with a faux LUT file that stacks on a vague look. We proceed from there...
Hmm, I think we might be talking about different things...the emphasis on LUTs is definitely intended here, and my points aren't specifically related to log spaces. Since your definition of LUT is one I've never heard before, and I don't share, I think this may just be a matter of using the same term for different things?
@@CullenKelly I can only continue to emphasize that LUT literally means Look Up Table: a precise mathematical formula for looking up the reverse-engineering of conversion from temporary LOG acquisition, to REC.709, on a camera-sensor-specific basis. The file format used for LUTs (used to be more commonly called cube files) happens to work for creative profiles too, but that's the extent of it.
@@focuspulling "Eh?", great response! Creative LUTs are useful for defining a look, whether or not you're using log footage. Cullen only converts to Rec. 709 here because it's his output color space (super important). The simplest way to put it is that LUTs reassign values. Can this be destructive? Of course, especially if you aren't aware of all the best practices involved in creating a LUT. That's where "$5 LUTs" are dangerous, imo, ultimately compromising the image. A well-designed camera LUT created with the director, assistant directors, DP, production designer, etc. can be absolutely invaluable: it ensures they get the image they intend from their camera and lighting, letting them apply their creative look on set and make judgment calls right away. This LUT can then be reused as a starting point for the colorist too, providing valuable information on how the camera data was manipulated and how the look the project leadership intends was achieved.
@@joePiercyTA I'd add to this an emphasis that a LUT file designed to further tweak the actual LOG conversion creatively is, as you wrote, just a starting point: no professional colorist obeys a creative look for every scene anyway. Further: there's practically no such thing as not shooting in LOG (even though some old-timey boutique types argue that it's extra work, or bill more for it). Any production by a working professional with grading in the pipeline will always use LOG footage without exception (otherwise, amateur or lazy). Notably, the past century of color cinematography history is characterized by celluloid camera masters that have pretty much had that flat LOG look requiring further post work ("color timing," which was how long the celluloid copy got dipped into emulsion burning for R, G and B).
I like big LUTS and I cannot lie!
🤣🤣🤣
I've been using the Voyager luts for over a year now and I'm very happy with them.
Not gonna lie, I just bought a camera that can handle custom LUTs and the result was insane. I nailed both exposure and white balance on set, unlike when using the camera's built-in gamma assist. It's a night-and-day difference.
And yes, you need to feed the camera a LUT built for the proper chain: the color space the camera expects > the intermediate color space you're going to grade in > the output display color space you're going to deliver to. These hybrid LUTs are going to make your life much easier in post.
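That camera-space > working-space > display-space chain is exactly what gets "baked" into a hybrid LUT: compose the transforms, then sample the composite onto a lattice. A minimal sketch, where the three stages are toy stand-ins and NOT real camera math:

```python
import numpy as np

def bake_lut(transforms, size=33):
    # Sample the composed chain of transforms onto a size^3 RGB lattice.
    axis = np.linspace(0.0, 1.0, size)
    r, g, b = np.meshgrid(axis, axis, axis, indexing="ij")
    grid = np.stack([r, g, b], axis=-1).reshape(-1, 3)
    for t in transforms:  # camera-log decode -> look -> display encode
        grid = t(grid)
    return grid.reshape(size, size, size, 3)

# Toy stand-ins for the three stages (NOT a real camera pipeline):
log_decode = lambda x: np.clip(x, 0.0, None) ** 2.0          # fake log-to-linear
look       = lambda x: x * np.array([1.05, 1.00, 0.97])      # mild warm cast
display    = lambda x: np.clip(x, 0.0, 1.0) ** (1.0 / 2.4)   # gamma-2.4 encode

cube = bake_lut([log_decode, look, display])
```

The resulting `(33, 33, 33, 3)` array is what a `.cube` file stores; the camera then only has to interpolate it, which is why one file can carry the whole chain.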
Thank you! Pro pack purchased and look forward to having more consistent workflow as a solo guy. Cheers.
I think the problem is the sale of LUTs. People buy them without knowing how they were made and then expect their footage to look identical.
Lattice helps a lot to check LUTs…
This! I no longer buy LUTs from anyone who can't, at the minimum, state what color space the LUT expects and outputs.
That is asking for disappointment.
But my question is how do we use Lattice to properly evaluate the LUT? There’s very little information on how to use Lattice.
@@maurice_morales Watch Yedlin's videos; curves crossing more than once is a quick way to spot an improper LUT.
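In the same spirit as that tip, one simple check you can script yourself: sample the LUT's gray axis (R = G = B) and verify each channel's curve never decreases, since a curve that doubles back will cross any comparison curve more than once. A sketch assuming the LUT is held as an `(n, n, n, 3)` array:

```python
import numpy as np

def gray_axis(cube):
    # Sample the LUT along its neutral axis, returning an (n, 3) curve.
    n = cube.shape[0]
    return np.array([cube[i, i, i] for i in range(n)])

def channels_monotonic(cube):
    # True per channel if the gray-axis response never decreases.
    curve = gray_axis(cube)
    return [bool(np.all(np.diff(curve[:, c]) >= 0)) for c in range(3)]

# An identity LUT passes; one warped with a folded curve fails.
n = 17
axis = np.linspace(0.0, 1.0, n)
ident = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)
folded = np.sin(ident * np.pi)  # rises then falls: an "improper" LUT
```

This is only one sanity check among several (Lattice visualizes much more), but it catches the crossing-curves failure quickly.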
Do you ever apply a LUT after the output CST? E.g. you have a LUT that goes from Rec. 709 to a certain look. Or would you always have the output CST as the last node in the hierarchy? Thanks.
Dinner is served!!!
Can you do a video on stress testing a LUT?
I actually don't stress test LUTs! Which would probably be a great topic for a video 😂
@@CullenKelly It could possibly kill the market of problematic LUT packs being sold online, while also teaching users what LUTs are.
Thank you for the LUTs and this walkthrough.
Q: When creating a viewing LUT to install in the camera, do you create a 33 cube or 65 cube LUT?
Has to be 33, most cameras can't take a 65 cube.
@@GeezerStray The Alexa 35 can take 65. I believe (not sure) the Venice 2 can take 65 as well.
Good replies below...generally 33, though the Alexa 35 can do 65
@@CullenKelly According to the Sony Venice 2 documentation you can import 17, 33, and 65 grid LUT files into the camera.
Side note: the Sony Venice also has another color processing pipeline, superior to LUTs, called ART (Advanced Rendering Transform). They have a white paper brochure floating around (hard to find) about this workflow. I used it once on set a few years back. Visually it had a slightly different rendering of the image. In their white paper they have a graphic simulating the banding of two colored light bulbs when using the LUT pipeline vs. the ART pipeline. Pretty interesting.
Hey Cullen! Do you have a tutorial on how to use your Voyager LUT pack on HDR videos?
Prolly just output to the requirement?
No special consideration needed for using Voyager with HDR, just set up your color management as needed and place Voyager within it!
Do your LUTs work in all editing software, or only Resolve?
@@f.d.thdlifestyle6079 The color space is DWG/Intermediate, and you must convert to it with a technical transform for use in other apps or cameras.
Silly question: if you have LUTS that add looks that can’t be created in Resolve, how were they created? What higher tier color grading tool was used?
Who said they can't be created in Resolve?
@@TreyMotes He did. Multiple times.
@@EposVox "Multiple times." Is this you being snarky for no reason whatsoever? All you had to say was that he said it. I hadn't yet finished the video when I read your comment and asked my question.
@@EposVox Now, to actually answer your question: there are extremely complex film LUTs, for example, that are created analytically with large color charts and other measurements. Generally these won't be created in Resolve or in any traditional color grading tool. Film labs used to provide preview LUTs (and some still do) for grading into, to get an idea of how the grade would look when sent to a print stock.
Additionally, some LUTs can be created with very different transforms in tools like Nuke that are not available in the traditional Resolve color tools, though some of that functionality could be ported over via a DCTL.
@@TreyMotes Wasn't intended to be snarky, but I'm on mobile and don't really feel like grabbing timecodes. At a couple different points he demos LUTs that he says would be "very difficult to replicate in Resolve or simply impossible to replicate in Resolve" prompting my question.
Where could I go if I’m interested in creating LUTs that like you mentioned are able to do things not possible in resolve?
It's also my question!
He makes an awesome course on TAC Resolve training
nuke can manipulate luts in some ways that resolve can't and is probably the area I would start off with
You can use the Contour plugin built by Cullen Kelly to create LUTs; other options are the Dehancer or Look Designer plugins.
@@creed3500Do you know any good places to start online for learning LUT development in Nuke?
Always a fountain of knowledge! Question, I purchased some Ravengrade Kharma LUTs (Kodak Vision3 5203 among others). However, even when using the correct colour space for those LUTs (converting from BMD Gen 5 to Arri LogC3) the look out of the box has extremely low saturation, so low that it cannot be what was intended by the LUTs. Have I missed something?
Have you set your output to a display color space like Rec. 709?
@@TechMediaLifr yes all colour space management is correct as intended
which color space does your camera output
@@pavol0 It's BMCC 6K BRAW, so you can decode into most mainstream colour spaces and gammas… I've tried decoding into BMD Gen 5 with a CST from BMD Gen 5 to LogC3, and I've also tried decoding straight into LogC3, but both result in the same desaturated look. Also, the Kharma LUTs are 'hybrid' in that they both impart a look and take the footage from LogC3 into Rec. 709. Wondering if there's something else under the hood I am missing that needs changing.
@@lombardy3274 Try downloading some RED log sample footage and see if the issue still remains.
Good idea. Making your own fresh LUT every time keeps the footage healthy.
Beginner question - If you're creating a viewing LUT purely for in camera monitoring, but intend on exposing to the right, how do you navigate this? (i.e could you factor in a 2-stop exposure reduction within the viewing LUT to get a more accurate idea of what the final result will look like, or are there drawbacks to this?)
Yeah, you would just reduce the exposure by X stops before generating the camera LUT, and then when you load it into the camera, it'll force you to expose X stops over to get a well-exposed image. There are no drawbacks that I can think of!
I would argue that it's pointless to do this, but it's surely possible. Either use the HDR wheels' exposure tool with a value of -2, or the gain wheel in the node before your creative look.
@@OlegUstimenko Why would it be pointless? If the intention of a monitoring LUT is to see something that represents what my end result's potentially going to look like, surely it would help to have a representation of this in camera? For example, you might set up a grade to have cooler shadows, but if you're exposing to the right, you're not going to see this in action.
@@SquashGearReviews Because exposing to the right isn't super useful anymore with current camera tech, and can be achieved more quickly by selecting a lower ISO.
@@OlegUstimenko I primarily shoot in relatively dark environments where exposing to the right is standard practice to ensure the cleanest shadows. This is using the 2nd base ISO on a Sony FX3. I can tell you first hand that exposing to the right yields significantly cleaner results in my use case. But everyone's needs are different.
For the record though, lowering your ISO will have little to no effect on the amount of noise in your image unless you can introduce more light to the sensor to raise the exposure (either through opening the aperture or increasing the physical amount of light in your scene).
Cullen did a fantastic video on exposing to the right - ruclips.net/video/aB8ku9ET-dw/видео.html&ab_channel=CullenKelly - well worth a watch.
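The exposure-offset idea from this thread is a one-liner once you're in scene-linear: each stop is a factor of two, so a hypothetical 2-stop ETTR compensation baked into the viewing LUT is just a multiply placed between the log decode and the look:

```python
import numpy as np

def exposure_offset(rgb_linear, stops):
    # In scene-linear light, each stop is a factor of 2; negative stops darken.
    return np.asarray(rgb_linear) * (2.0 ** stops)

# Expose 2 stops to the right on set, darken 2 stops inside the LUT,
# and middle gray lands back where it belongs on the monitor:
mid_gray_as_shot = 0.18 * 4.0               # what the over-exposed sensor records
mid_gray_on_monitor = exposure_offset(mid_gray_as_shot, -2)  # back to 0.18
```

The only assumption is that the offset is applied in linear light; applying the same multiply directly to log-encoded values would shift, not scale, the exposure.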
It looks better because people today don't take the TIME AND EFFORT to design their look and save it. Mainly on social media... the majority likes to buy their way around it, and although it sometimes looks good, it's not the right look, or it looks out of place and cliché, and/or simply degrades the footage. HOWEVER... some won't believe it, but LENSES make a big difference, and exposure too. White/color balancing plays a huge role as well. I'm no hypocrite here; I have this problem somewhat too. I need to upgrade lenses to complement the rest of my good lenses that look very good.
Saying "LUTs are bad" is like saying "ArriWG is bad". There is always more to the story. I'd actually like to hear your thoughts on grading in the native capture color space (granted there is only one in a project).
100% agree. And working camera native can be great when there's just one format in play! Though that's increasingly rare these days...
In one of your very early videos you showed a way to separate the contrast and the color effect of a LUT into two different nodes using the layer mixer and composite modes. Can you please explain it again soon, or send me the video, which I couldn't find? Also, is there a different method now that BMD has added a composite mode per node?
This may help: "How to separate Color and Contrast from any LUT inside Davinci Resolve"
ruclips.net/video/kmoik-rdjss/видео.html
Sure thing! I discuss that technique in this video: ruclips.net/video/Sh9JyQYjEu0/видео.html
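The layer-mixer trick discussed in this thread can be approximated numerically: a "luminosity" composite keeps the base image's color while taking the top image's luma, so compositing the LUT's output over the original one way isolates its contrast, and the other way isolates its color. A rough sketch using Rec. 709 luma weights, not Resolve's exact composite math:

```python
import numpy as np

LUMA = np.array([0.2126, 0.7152, 0.0722])  # Rec. 709 luma weights (assumed)

def luminosity_composite(base, top):
    # Keep base's color, take top's luma: shift base until its luma matches top's.
    shift = (top * LUMA).sum(-1, keepdims=True) - (base * LUMA).sum(-1, keepdims=True)
    return base + shift

original   = np.array([[0.20, 0.20, 0.20]])
lut_output = np.array([[0.50, 0.10, 0.30]])

contrast_only = luminosity_composite(original, lut_output)  # LUT's tone, no cast
color_only    = luminosity_composite(lut_output, original)  # LUT's cast, original tone
```

The same idea maps onto two nodes in Resolve: one carrying the LUT with a luminosity-style composite (contrast), one with a color-style composite (cast).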
Just to clarify, can I purchase a LUT, use it as a viewing LUT on my camera, and also use the same LUT for my grading? Also are there things i need to put into consideration?
LUTs are color-space specific. You need to find one that has a technical transform from your camera space to, most probably, 709/2.4.
@@LoremIpsumProd ok, thank you
Yes, it's possible to use a single LUT for production as well as your grade, though ideally you'd start with color management plus a look *prior* to production, cook that full stack into a LUT for shooting purpose, then return to having the individual color management + look pieces for the grade. Hope this helps!
So you're saying the look we get from the LUT when grading a film contains the same colors and ratios as the viewing LUT on set, so the DP can set his exposure and do lighting accordingly, and then we use the same LUT to grade in the working color space?
Not sure I followed all this, but I think you've got the right idea here!
So I'm shooting S-Log3. Are your LUTs for use in camera or in post-production? Sony gives me the option of viewing using an S-Log3 gamma. Would I use one of your LUTs then, or in post? In case you haven't noticed, I'm new at this.
His LUTs would come in handy while you're in post-production. They work inside DaVinci Wide Gamut Intermediate, a color space inside DaVinci Resolve to which you can convert your S-Log footage before using his LUTs. So in short, you can use his LUTs regardless of which camera you're using, because you'll be converting your footage to DWG anyway. There are a lot of videos on his channel that explain this process a lot better than I have; I recommend checking them out.
I did it on my Sony. If I'm not forgetting something, what I did was: inside DaVinci, a CST with input S-Log3/S-Gamut3.Cine and output DWG/Intermediate, then your manual look or LUT (the 2383 Kodak C. Kelly DWG one), and then a CST output back to S-Log3. Export the LUT, import it onto the camera SD card, install it in Picture Profiles, and turn on Gamma Display Assist for S-Log to Rec. 709. VERY IMPORTANT: remember to turn off the option in the Sony camera that bakes the LUT into the video if you want to keep the original S-Log3 recording.
@@albertorambaudi6055 This is an unnecessary workflow. Is it because Sony mirrorless cameras don't have a true LUT preview feature?
A true LUT preview will always convert from the color space the LUT tells it to expect, to the output color space the LUT tells it to produce… so your LUT should only need to specify that input and output.
I'm guessing picture profiles don't work that way.
My Voyager LUTs are designed to slot into a post production workflow, but they can also easily be cooked into viewing LUTs for use on set. I demonstrate this in the free course that comes with the LUTs!
Reliable in-camera viewing LUTs are in very short supply. Cullen, would you consider creating LUT packs for in-camera viewing in the future?
You should be able to export the same LUTs as 33-point versions for monitors or cameras.
I’ve converted most of his voyager luts for my camera in resolve.
@@mattstahley6340 Could you point to a video that shows this process for a noob? Thank you!
@@AdventuresAdam no but Cullen has done it a few times in his videos but I’m not exactly sure which ones.
@@AdventuresAdam actually I had one of them saved. I think It’s around the 40 minute mark or so he goes over it.
ruclips.net/user/livewGUj2btvGD0?si=3y11ZjLh2Y49wX6R
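Converting a grading LUT for camera use, as described in this thread, usually comes down to resampling the lattice. A trilinear sketch that turns an n-point cube into the 33-point cube most cameras and monitors accept, assuming the LUT is held as an `(n, n, n, 3)` array:

```python
import numpy as np

def resample_cube(cube, m):
    # Trilinearly resample an (n, n, n, 3) LUT onto an m^3 lattice (e.g. 65 -> 33).
    n = cube.shape[0]
    pos = np.linspace(0.0, n - 1, m)      # new lattice in old index space
    i0 = np.floor(pos).astype(int)
    i1 = np.minimum(i0 + 1, n - 1)
    f = pos - i0                          # fractional position between old points
    out = np.zeros((m, m, m, 3))
    for ia, wa in ((i0, 1 - f), (i1, f)):
        for ib, wb in ((i0, 1 - f), (i1, f)):
            for ic, wc in ((i0, 1 - f), (i1, f)):
                w = (wa[:, None, None, None]
                     * wb[None, :, None, None]
                     * wc[None, None, :, None])
                out += w * cube[np.ix_(ia, ib, ic)]
    return out

# Sanity check: an identity 5-point cube resampled to 3 points stays identity.
axis = np.linspace(0.0, 1.0, 5)
ident = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)
small = resample_cube(ident, 3)
```

Downsampling loses fine detail in the transform, which is why a 33-point version is fine for monitoring but the 65-point original is preferred for the final grade.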
Do productions use more than one viewing lut since scenes in a film are not all identical. For example, indoors, outdoors, night, etc?
Depends on the project. On the last movie I was a DIT on, there was a separate LUT for night scenes that raised the black levels.
The idea is to have one LUT for the show and use CDLs for adjustments.
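For anyone wondering what "CDLs for adjustments" means concretely: an ASC CDL is just per-channel slope, offset, and power, plus a global saturation, applied upstream of the show LUT. A minimal sketch using Rec. 709 luma weights for the saturation step:

```python
import numpy as np

LUMA = np.array([0.2126, 0.7152, 0.0722])  # Rec. 709 luma weights

def apply_cdl(rgb, slope, offset, power, sat=1.0):
    # ASC CDL: out = clamp(in * slope + offset) ** power, then saturation.
    out = np.clip(np.asarray(rgb) * slope + offset, 0.0, None) ** power
    luma = (out * LUMA).sum(axis=-1, keepdims=True)
    return luma + sat * (out - luma)

shot = np.array([[0.30, 0.40, 0.20]])
# A night-scene trim like the one described above: lift blacks a touch, desaturate slightly.
trimmed = apply_cdl(shot, slope=1.0, offset=0.02, power=1.0, sat=0.9)
```

Because a CDL is only ten numbers, it travels easily from set to the colorist while the show LUT stays fixed.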
Sometimes! Check out my interview here on the channel with colorist Jill Bogdanowicz, I believe we discuss this subject: ruclips.net/video/lPu8sQ0C5xA/видео.html
It would be neat if LUTs would be standardized (ISO? IEC? AMPAS?) as a format with obligatory metadata like the intended input and output color space it was created for, attribution, license, etc. LUTs are super useful (I wouldn't want to live without them), but unfortunately still a very "dumb" data format.
LMTs kinda do this, just more specific to ACES, but no one makes LMTs so it’s just kind of an empty space.
That would be awesome.
@@ErrickJackson I’m curious to see what ACES 2.0 will be like
Using a Viewing LUT can have strange results with prosumer kit.
With my Panasonic camera using a Log colour profile, I can and do use a viewing LUT created and distributed by Panasonic to replicate a Rec. 709 colour space. The camera knows how to use the LUT and compensates the waveform to give good exposure when filming.
Contrast that with a field monitor by Atomos Ninja. The Atomos also allows me to use a viewing LUT and lets me use the Waveform to expose my shots.
BUT. And it's a very big but: the field monitor does NOT compensate for the difference between exposing correctly for LOG and using the viewing LUT, and this results in incorrectly exposed footage, better known as a day wasted.
I'm not a professional, and am probably using viewing LUTs incorrectly through user stupidity, but I'd rather look at a washed-out LOG colour profile when filming, knowing that I have the exposure nailed. And, more importantly, I can visualise the end result. I just wish clients could visualise end results too; it would make life much more simple.
I use Lumix cameras, and if you are shooting in V-Log and monitoring with a LUT, the waveform displays the V-Log pre-LUT exposure. Also, most monitors have settings that allow you to use exposure tools pre-LUT. I know my Atomos Shinobi and Blackmagic Video Assist do this.
@@mattstahley6340 If you know a way that the Ninja V does this, then I’d be very grateful for the assist.
@@frankinblackpool Not using any of those devices, but maybe it's possible to enable the viewing LUT and waveform in camera and send that out to your monitor, so you don't have to rely on the monitor's tools (recording has to be done in camera then).
@@frankinblackpool I've never used a Ninja V, but it has to be somewhat similar to the non-recording Shinobi. Do you have the input source set up to receive V-Log and V-Gamut? I think it's in the INPUT part of the menu, and there's a section where you choose what your camera is outputting.
Looks like a great discussion started here! My take is that exposing using your waveform is never ideal, even if you're able to see it without the effects of your output transform applied.
How are the LUTs made if they do impossible things on Resolve?
I build my LUTs using custom tools I've developed for myself (which are now available in the form of my look dev plugin Contour)
❤
Why do you have banding in your video?
Blame YouTube!
8-bit YouTube encoding + dark gradients
YouTube is an 8-bit video platform.
Yep, 8-bit w/ dark gradients is tricky! Still refining our solution for this
LUTs are destructive; that's why they shouldn't be used as the first node. The NLE can't see the destroyed data beyond that node.
There are other reasons why you may not want to use a LUT in your first node, but none to do with it being destructive. If you’re using the right LUT for the right job in the right way, it’s *never* functionally destructive.
The main way a LUT becomes unrecoverable is that it will clamp inputs to the 0-1 range. Any other destructive behavior depends on what specifically the LUT is supposed to do and on the sampling resolution of the LUT.
Ironically, in the first node, to which you're sending the camera's original log footage, you actually have excellent odds of the signal being entirely within the 0-1 range. It is possible to apply a LUT in that context and not lose any information, though you'd have to be careful that the LUT you're using is appropriate for such a task, i.e. log-to-Rec.709 LUTs likely would not pass that bar.
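The clamping point above can be sketched in a few lines. This is a minimal, hypothetical 1D lookup (not any real camera LUT, and real cube files are 3D), just to show why in-range values survive while out-of-range values are pinned to the table's edges and lost:

```python
# Minimal sketch of LUT clamping. A lookup table is only defined on
# [0, 1], so any input outside that range must be pinned to the edge
# before sampling -- that pinned data is unrecoverable downstream.

def apply_1d_lut(value, table):
    """Clamp input to [0, 1], then linearly interpolate between
    the two nearest table entries (like one channel of a .cube)."""
    v = min(max(value, 0.0), 1.0)       # the clamp: out-of-range data is lost here
    pos = v * (len(table) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(table) - 1)
    frac = pos - lo
    return table[lo] * (1.0 - frac) + table[hi] * frac

# An identity LUT: 17 evenly spaced points, like a 17-grid cube.
identity = [i / 16 for i in range(17)]

print(apply_1d_lut(0.5, identity))    # 0.5 -> in range, passes through exactly
print(apply_1d_lut(1.3, identity))    # 1.0 -> clamped, original value gone
print(apply_1d_lut(-0.2, identity))   # 0.0 -> clamped, original value gone
```

So even with a pure identity table, anything the camera encodes outside 0-1 is flattened to the boundary, which is why a first-node LUT is only safe when the log signal genuinely lives inside that range.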
LUTS WITH BUTTS!
Any og film riot fans here?
Emphasis on LUTs is misleading; LOG is the fulcrum of any such debate/discussion, of course. "Look Up Table" literally means a precise conversion BACK to reality, using one and only one mathematical formula keyed to a specific camera and shooting mode: it's reverse-engineering to reality with only one allowed result. Your notion of a hybrid LUT tries to interpolate the creative tweaking process into the reverse-engineering pipeline. But AFTER conversion, as a discrete stage (not a LUT), we sometimes, but not always, start with a faux LUT file that stacks on a vague look. We proceed from there...
Eh?
Hmm, I think we might be talking about different things...the emphasis on LUTs is definitely intended here, and my points aren't specifically related to log spaces. Since your definition of LUT is one I've never heard before, and I don't share, I think this may just be a matter of using the same term for different things?
@@CullenKelly I can only continue to emphasize that LUT literally means Look Up Table: a precise mathematical formula for looking up the reverse-engineered conversion from temporary LOG acquisition to Rec.709, on a camera-sensor-specific basis. The file format used for LUTs (which used to be more commonly called cube files) happens to work for creative profiles too, but that's the extent of it.
@@focuspulling "Eh?", great response! Creative LUTs are useful for defining a look, whether or not you're using log footage. Cullen only converts to Rec.709 here because it's his output color space (super important). The simplest way to put it is that LUTs reassign values. Can this be destructive? Of course, especially if you aren't aware of all the best practices involved in creating a LUT. That's where "$5 LUTs" are dangerous, imo, ultimately compromising the image. A well-designed camera LUT created with the director, assistant directors, DP, production designer, etc. can be absolutely invaluable: it ensures they get the image they intend from their camera and lighting, letting them apply their creative look on set and make judgement calls right away. This LUT can then be reused as a starting point for the colorist too, providing valuable information on how the camera data was manipulated and how the look the project leadership intends was achieved.
@@joePiercyTA I'd add to this an emphasis that a LUT file designed to tweak actual LOG conversion further creatively, is as you wrote just a starting point: no professional colorist obeys a creative look for every scene anyway. Further: there's practically no such thing as not shooting in LOG (even though some old-timey boutique types argue that it's extra work or bill more for it). Any production by a working professional with grading in the pipeline, will always use LOG footage without exception (otherwise, amateur or lazy). Notably, the past century of color cinematography history is characterized by celluloid camera masters that have pretty much had that flat LOG look requiring further post work ("color timing," which was how long the celluloid copy got dipped into emulsion burning for R, G and B).