Little update: the chromatic adaptation is not needed anymore, it's now built into the Colorspace node (adjust whitepoint button), and Nuke now supports reading BRAW files :) Might do an updated video if people are interested.
Yooo thank you!! I was doing XML and had no clue what I was doing, and it never came back from Nuke the way it was supposed to look. This is very helpful, thank you!
Hi, would you be able to do an updated video, or point me somewhere on a BRAW Nuke - Resolve workflow? Thanks in advance.
@@Applemarko You can use the BRAW workflow I used in this video just fine, but yea, I could do a BRAW-only video, it's pretty simple tbh.
@@FinnJaeger1337 I think I've figured it out with the native BRAW Nuke integration. Not sure if I'm doing it the "correct" way, but I suppose I am, as there's no difference when comparing the export to the raw. In Nuke it says the input transform is invalid for Blackmagic Film Gen 5 when I import the BRAW clip, so I switch to DaVinci Wide Gamut Intermediate and export the same. Working in the ACEScg color space. Does that sound right?
@@Applemarko Yea, I would not use any of the BM spaces, just go directly to ACES or to ARRI LogC or whatever is more common.
Idk where you'd export DaVinci Wide Gamut files... something seems wrong, but I might not fully understand your workflow. I would always use Resolve to make plates for Nuke and not use BRAW in Nuke, same with any other raw format, mostly due to speed, and because there should be one central place for "developing" the raw, not many different places all over your color pipeline.
Thanks Finn, well explained!
Since moving from big facilities to my home office this is becoming much more relevant.
The chromatic adaptation, white point was new to me, really helpful.
Glad it helped! Took me a while to find that, was going completely crazy trying to match stuff...
This is gold! I'm managing a small post team and we had a difficult time when it came down to conforming and kicking footage back and forth between different departments. This video is helpful, yes!
thank you!
Just wanted to let you know Finn, that a huge part of your view count is me going back and forth through your awesome tutorial. It helps me a lot, even when I'm not responsible for DI/plates, just so I can check on where the problem lies. Also I can pretend to be smart ;)
Thx highly appreciated!
FINN! You are my king. I am working on my senior thesis, and having shot on the Venice, there are like no tutorials for this stuff. I was having a panic with my VFX artist trying to figure out the workflow, and without your video we may never have figured it out and would have had to put up with grainy, shitty footage and mismatched color. Thank you!
Finn this was amazing, thank you for making this! Trying to troubleshoot with a colorist right now and your tips saved me probably a good day of just blindly trying to science our roundtrip issues
awesome!!! Glad to hear it helped :-)
Thanks Finn for this video! Keep it up!
Great timing, I was just starting to read into proper conforming with ACES - thank you!!
Hope it helps!
Loved it, hope you do more tutorials in the future :)
@@TheMonkeyMenace thx for the feedback I really appreciate it!
Nicely done. I didn't understand most of it, because I don't work in that area, but it looked and sounded very professional.
Hahah thank you!!!
Thank you 🙏 great tutorial
So helpful, thank you. %timeline_index was also a clue I needed
Great info! Thank you.
super valuable
Nice one!! Really helpful Finn!
Thank you!
@@FinnJaeger1337 Ey Finn! One question about the env variable for ACES. Could you let me know how you add the OCIO var for the pipe? I am trying to leave it as set up for nuke, maya and houdini. Thanks again!!
@@lioMrocks Hey, yea totally! From a pipe perspective you would ideally set up an env variable for each shot you launch, and ideally not use the OCIO variable, as that locks Nuke down to only use that one config... As this is my personal machine I just used the Windows GUI: search for "environment variables", type in OCIO as your key, and set the value to the path of the .ocio file. That's all, then it works in Blender etc. On Linux you can just type export OCIO="path/to/config.ocio" and then start the program from the same terminal, OR add it to your .bashrc. All the options :-)
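To sketch the per-shot idea in a pipeline launcher (a minimal Python sketch; the shot config path is made up, and this is just one way to scope the variable to a single process):

```python
import os
import subprocess

def shot_env(config_path):
    """Environment for one shot: a copy of the current env plus its OCIO config."""
    env = os.environ.copy()
    env["OCIO"] = config_path
    return env

def launch_with_ocio(app_cmd, config_path):
    """Launch a DCC so only that process sees the shot's OCIO config."""
    return subprocess.Popen(app_cmd, env=shot_env(config_path))

# Hypothetical usage with a made-up path:
# launch_with_ocio(["nuke"], "/projects/show/sh010/aces_1.2.ocio")
```

Because the copy is per-process, two shots with different configs can run side by side without fighting over a global variable.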
@@FinnJaeger1337 Many thanks Finn!!
Awesome video thanks! Been working with FX3 raw footage for the first time and scratching my head having to use After Effects to process it unfortunately. Going to try this out but probably need to buy the full version of resolve first for RAW I believe.
you are a god
Great tutorial, Finn, thank you! One thing I'd love to see is setting up shot naming/numbering for VFX in DaVinci. Done a lot of digging and it seems it's not as easy as in Nuke Studio or Flame. But if you have a method, I would love to see it.
Resolve has no concept of shot names. I mean, you can give a shot name token to a source clip, but that's about it. I usually go via Nuke Studio or Flame for that and only use Resolve as a raw-converter type thing for formats the aforementioned can't handle. If it's just one plate per shot it's doable: I usually use the reelName or some other metadata field for my shot names and then use the % tokens during export to set up some logical file naming, but yea, it's cumbersome and annoying. So yea, haven't found anything great in Resolve for that sadly. I have some WIP Python scripts to name clips in a timeline, but that's about it.
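A rough sketch of what such a naming script could look like. The Resolve scripting calls here are from memory and the SH#### numbering scheme is made up, so treat this as a starting point, not a finished tool:

```python
def shot_name(prefix, index, step=10, pad=4):
    """Build a shot name like SH0010, SH0020, ... from a clip's timeline position."""
    return prefix + str(index * step).zfill(pad)

def name_timeline_clips(prefix="SH"):
    """Stamp sequential shot names into each clip's metadata in Resolve."""
    # DaVinciResolveScript ships with Resolve; this only runs inside it.
    import DaVinciResolveScript as dvr
    resolve = dvr.scriptapp("Resolve")
    timeline = resolve.GetProjectManager().GetCurrentProject().GetCurrentTimeline()
    for i, item in enumerate(timeline.GetItemListInTrack("video", 1), start=1):
        mp_item = item.GetMediaPoolItem()
        if mp_item:
            # The Reel Name field can then drive a % token on the Deliver page
            mp_item.SetClipProperty("Reel Name", shot_name(prefix, i))
```

The point of writing into a metadata field is exactly what the comment above describes: once the name lives in reelName, the Deliver page tokens can pick it up for per-shot file naming.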
@@FinnJaeger1337 Ah yeah I had a feeling this was the case. Though can't really complain since it's free... Add it to the wish list, but for now I'll use it the way you showed here and I'm sure it'll do the job!
Hi Finn, great tutorial! I have a Resolve project that contains Alexa, RED, DJI Zenmuse and Sony Alpha 7 footage, and I need to conform to linear EXRs for compositing in Nuke. Would you mind telling me how I can do the Resolve to Nuke to Resolve roundtrip without using the ACES workflow? The colorist insists on not using ACES but grading everything in Alexa LogC. This confused me, because wouldn't that alter the colorspace of the non-ARRI cameras?
Hey, that's a perfectly valid workflow, even though I would argue the colorist needs a better technical understanding, as he could just convert ACES EXRs to LogC in DaVinci on the fly and never touch the ACES mode. I know colorists usually aren't that technical, so it's probably best to feed him LogC/AlexaWideGamut DPX files. You can still use ACES on ingest and export from Nuke: if you follow my ARRI example, you can convert everything to ACES AP0 in Resolve and then export LogC/AlexaWideGamut from Nuke. Yes, for the A7 footage for example you would be converting from S-Log2 to LogC, but that seems to be what he wants. You can also convert everything to linear/AlexaWideGamut and comp on that; it really depends on how the compositor wants to work. I would just go ACES for everything, as it's one click to set up and everything will match, which is nice for compositing, then export LogC/AWG DPX files from Nuke and grade those. It's very typical, and a good idea imho, to convert all cameras to the master camera's gamut and gamma before grading, as you want them to match instead of manually grading them to match. Whether that's ACES or LogC/AWG or RED IPP2 or whatever doesn't actually matter; ACES just gives you a nice framework with transforms that mostly work both ways. Hope this helped a bit?
Hi Finn, great video with lots of information. How would you go about importing iPhone 14 Pro Max footage as a plate to work with CG?
Many thanks
Hei Finn, great video here, thank you for your insights. I'm deep diving into ACES and I'm finding it quite confusing. I have a couple of questions: how would you linearize Blackmagic's ProRes 4444? I'm used to linearizing footage directly in Nuke, is that a bad habit? (Also, the lack of support for Blackmagic cameras in ACES in Nuke is not helpful...)
Hey, newer Blackmagic cameras don't shoot ProRes anymore, so I left it out. Linearizing in Nuke is not a bad habit! If Nuke supports the codec there is no need to do it in Resolve, but there are lots of reasons to do it in Resolve anyway, like better timewarp support from AAFs and a bunch of other things. Now, for the baked log from Blackmagic ProRes: you need to know which log/gamut combination was used during recording and then use an ACES Transform node to go from the appropriate log to ACES, just like what I did with the Alexa ProRes. The catch is that you can't go back to the BM log directly in Nuke, so you have to send ACES EXRs back to Resolve and do the reverse transform to the same log used when recording. It's a bit of a weird workflow due to Blackmagic not giving AMPAS nice IDTs like ARRI and RED etc. have done, so you don't have those in the ACES OCIO config... As I said, I am not happy with how BM is handling this; at least the BRAW workflow works nicely. I would not shoot ProRes on a Blackmagic camera if I could avoid it :/
@@FinnJaeger1337 Cool, that's super clear, thanks. So I guess I'll talk to the DOP and ask for the right log profile :). Next time I'll use BRAW instead. Thank you again!
@@Deaklighter braw is also smaller at the same quality :-)
Does "Highlight recovery" make a difference while outputting? I have DNG footage and I find that lots of data is lost when "Highlight recovery" isn't checked. But I want the whole dynamic range when outputting for Nuke.
Highlight recovery is a sort of hack where it "restores" clipped color channels from their non-clipped counterparts. It can lead to problems (like pink skies) and it's not really giving you more information. I have tried, and succeeded in, recreating a sort of parametric highlight recovery tool in Nuke that worked even better. So I would say it can be effective in certain situations, but it's not a holy grail. Same goes for downscaling: not every scaling filter works well with every source.
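To illustrate the basic idea behind that pink-sky problem (a toy per-pixel sketch in Python, not Resolve's actual algorithm and not the Nuke tool mentioned above): when some channels hit the sensor ceiling and others don't, the pixel takes on a false hue, and one simple "parametric" fix is to desaturate clipped pixels toward their channel mean.

```python
def recover_pixel(rgb, clip=1.0, strength=1.0):
    """Toy highlight 'recovery' for one linear RGB pixel.

    Where any channel has clipped, blend the pixel toward its channel
    mean (i.e. desaturate), which avoids the magenta/pink cast you get
    when only one or two channels hit the ceiling.
    Illustration only -- real tools work on neighborhoods, not pixels.
    """
    if max(rgb) < clip:
        return tuple(rgb)  # nothing clipped, leave untouched
    mean = sum(rgb) / 3.0
    # strength=0 keeps the pixel, strength=1 fully neutralizes it
    return tuple(c + (mean - c) * strength for c in rgb)
```

A sky pixel like (1.0, 0.5, 0.5), where only red clipped, would otherwise read as pink; the blend pulls it back toward neutral.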
Hello! Would you know how to ingest R3D RAW media with DaVinci Resolve and export EXRs for VFX?
Yea absolutely, it's actually extremely easy: set Resolve to ACES mode (which one doesn't matter), set the output transform to ACEScg, and just render EXRs. Make sure "creative" LUTs are off in the R3Ds, otherwise that's really it.
Hello! Followed your workflow - need to convert BRAW and return it the same after VFX - but something strange happens - it messes up the colors and doesn't return the same clip))) Help pls
It should be pretty clear in the video: debayer as AP0/linear, render EXRs for VFX, and they just render back AP0/linear, voila? You can NOT use any of the Blackmagic log options; those don't work for VFX workflows and should be ignored. In my example I even render back out Alexa LogC/AlexaWideGamut from Nuke, so you can be in total control.
Hello, what's the best way to use a custom .dctl in Resolve as an IDT, so I can convert footage to ACES and bring it into Nuke?
You just drop it into the LUT folder and use it like a LUT.
How do you find the camera raw settings shown about two minutes in?
All the way to the left, it's the highlighted symbol in the color tab; the settings are also in the project settings.
hello. thanks for this tutorial. One problem solved led to another. I am getting frame drops when I write back to DR. I followed your steps in writing out of DR and back from Nuke. How do I fix that?
Sounds like you are rendering with the Nuke frame server, which constantly crashes on random frames if your machine doesn't have enough RAM. Try not using the frame server.
@@FinnJaeger1337 I'm rendering straight, no frame server. I just hit render on Nuke's Write node.
@@FinnJaeger1337 I thought it was because of a mismatch in working fps, so I matched the raw video's fps to Nuke's fps, which is 59.94, but same issue. I also changed to 24 fps, same issue.
@@FinnJaeger1337 Does rendering in EXR cause that? Now I'm rendering the same thing in DPX from Nuke to see if it works.
Why not use the color chart to correct the RED footage?
This would be more the next step for plate preparation. You can't reverse the color chart balancing in Resolve, and I'm not even sure you can export it as anything useful. I'd much rather balance/neutral-grade the footage in Nuke using mmColorTarget, which gives me a proper matrix transform that I can use later down the road, reverse, etc.
Hey Finn, great tutorial, very helpful. One question: how do I handle the DNG RAW of the old Ursa Mini - like BRAW, or like DJI DNG? Thanks
Hey, you should get similar options to BRAW when loading in a Blackmagic DNG sequence, so use those settings. There are a plethora of other DNG formats out there, like stills DNG and whatever, and they're all very different.
@@FinnJaeger1337 Ok, I will use the BRAW settings for my BM DNG RAW seq.
Just another question: when exporting EXRs to Nuke with your settings, what are the best Write node settings in Nuke to export the VFX comp with CGI for grading in Resolve? DPX or EXR? Which ACES colorspace (cg, 2065-1, cc...)?
Grateful :)
@@RockandRollRobot You can do whatever you feel like. I would personally export ARRI LogC DPX files and grade those, but that's because I am so used to LogC material as a colorist. ACEScc DPX and ACEScg EXR are all completely fine, it's really up to you! If you want to grade in a proper ACES way, you can set your project to ACES in Resolve and just set the IDT for your clips accordingly. For CG comps, don't forget you can shuffle your mattes into the output multichannel EXR and even use those later in grading.
The "best" would be EXR in ACES 2065-1, but it's more about choosing the correct one every time you do a conversion than about choosing a correct one during export.
@@FinnJaeger1337 Ok, I am a compositor and I wanted to understand more about the ACES color space and what to ask for and export in freelance work. It's nice to find professionals like you who share their knowledge. Thanks for your time and for your well-done tutorial.
Will you be roundtripping that red file?
Ha! I guess I didn't do that, but it's very straightforward. It's just like LogC, except you select which of the RED colorspaces you want; I would go with RED IPP2, Log3G10/RWG.
@@FinnJaeger1337 If the log MOV footage was brought into Nuke and I set the colorspace to Log3G10/RWG, and then do the same in the render settings and render, how do I keep the same results in my render that I get with the viewer? My viewer is set to sRGB (ACES). Thank you for your time.
@@IamJohnnyFan You can then just do the exact reverse of that in Resolve: an ACES Transform node from Log3G10/RWG to sRGB. What you did in Nuke is Log3G10/RWG -> ACEScg -> Viewer (sRGB). A Read node setting is the same as an ACES Transform node in Resolve, except that it always transforms to your working space, which for the default ACES OCIO config is ACEScg. The Write node is the other way around. You can play around with setting everything and the viewer to RAW and doing the same thing with OCIOColorSpace nodes manually; it's great for troubleshooting stuff.
@@FinnJaeger1337 okay thank you.
Super tutorial for a complex topic. Thanks a lot. Saying that, despite understanding the concept you exposed for the roundtrip from DR to Nuke, I'm a bit overwhelmed by all the options available and not sure which path I should choose. In DR18 the ACES Transform and CST nodes have different versions than the ones you mentioned in the video, and I'm a bit confused, not sure of the equivalent in DR18.
I basically want to send canon.cinema/c-log3 to a VFX artist. In my test I used a CST to ACEScc / AP0 - EXR -> Nuke -> ACES / AP0 - EXR -> CST back to canon.cinema/c-log3, then graded as usual.
I read that ACEScg might be a better choice to send to Nuke, and then to send back DPX in ACEScc to DR. ACES Transform versus CST, various ACES versions, AP1 or AP0, I can't find a consistent recommendation. Your POV would be super interesting. Cheers from Montreal.
Hey man! It really doesn't matter much which path you take! That said, Nuke prefers ACEScg EXRs with Zip (1 scanline) compression, so that's usually the safest way to go. You can then request back ACEScg EXRs with PIZ compression, which is good for Resolve. I would avoid DPX as it's uncompressed and pretty ridiculous in file size. If it's RAW footage you can just switch to ACES mode, set the output transform to ACEScg and export EXRs. I am honestly still perplexed by the crazy amount of choice you have when picking an IDT for the Canon stuff; quite interesting that they have different options for things like "Daylight" and such.