I was in between Nuke and Fusion, but as an Unreal Engine creator, this 100% tips me over to Nuke.
Fusion is a powerful software, no doubt about it
Can't you just render the UE scene and composite it in Fusion? Having UE open while compositing is not ideal anyway.
@@Ecceptor Yes you can, which is why I don't understand the point of this program. All this does is take control of the render engine in Unreal, render out the passes in Unreal, and automatically import them into Nuke. It's not as if you cannot do the same exact process by rendering out EXR passes in Unreal and simply dragging those into Fusion or AE or any other compositing software. On some level this might save a tiny bit of time, but as you say it has the disadvantage of having to have two very resource-heavy programs open at the same time.
nuke is too expensive... :)
Nuke tools give you much more flexibility in terms of rendering and also compositing, where you have a lot of limitations doing it directly with the Sequencer. @@landonp629
This is the future!
Good tutorial. Seems like a lot of work just to isolate the pillars... Wouldn't it be easier to just separate these as individual objects?
Hi, I did what you did, but I cannot click anything on the right (UnrealReader-Render-Advanced-Camera-Variables-Node). Everything is greyed out. And of course I cannot click "connect to server". Do you know what the issue might be?
When I align with the point cloud in 3D everything animates perfectly but when I switch to 2D my card is way off. What step am I missing?
Does this workflow also work with Nuke Indie?
yes
@@mostlyharmless88 Is that correct? Foundry's website says it's only for NukeX and Nuke Studio
so good ❤❤❤❤❤
Could the same be applied to Fusion, and if so, how?
Hello! The UnrealReader node is only available in NukeX
This UnrealReader node doesn't do anything special - it just helps automate the import of Unreal renders into Nuke. It does nothing that you can't do on your own with Unreal Engine.
And in fact it has a major drawback - you need two very resource-intensive programs open at the same time, which steals resources from each.
Having both Nuke and Unreal open at the same time is going to be a resource drain, and neither will perform their best in such a situation. It's much better to just render out EXR passes in Unreal, close Unreal, and then open Nuke (or any other program) and composite them. This has been the workflow between programs for decades now.
The main issue with rendering out EXRs from Unreal Engine and passing them to Nuke as I've done is that you completely lack deep EXR output support. Having the UnrealReader pass the scene depth data, along with all the other AOVs/render passes, interactively streamed to the Nuke compositor effectively gives you the deep sample/point cloud information that would otherwise be completely lacking without deep EXR support. There are Octane and V-Ray Next for Unreal Engine, and I believe they support deep EXR output from Unreal, but at that point, with offline render engine plugins, you're giving up the main reason Unreal is used in the first place: realtime rendering!
@@NUCLEARARMAMENT That's definitely one of the biggest tradeoffs. Same with storage consumption from having to re-render out EXRs, or any other file type you may be working with, if you need to make some changes. Learned that the hard way this past year, rendering out essentially a terabyte of files across 5 or so projects trying to fix or change things. Live-linked projects work great in terms of workflow and being able to make snap changes on the fly, but at the cost of resources your computer can pull from with all the programs open; and constantly exporting and re-plugging in saves those resources but ends up being a huge storage hog after some time, especially with larger projects.
Why not use deep compositing for the masks?
Hello! Due to the large file sizes required for deep compositing, we decided not to utilize it in this tutorial.
Amit Bensangi
What is the point? You lose realtime, but you have game quality (not trolling)
What do you mean 'game quality'? Are you saying Unreal Engine only renders out game-quality images? That isn't true. Or are you saying that somehow Nuke's render viewer alters the Unreal Engine render? That is also not the case. The only thing this does is help automate the Unreal rendering and import - Nuke is not actually rendering the scene here; Unreal is.
Don't get me wrong, this is a pretty pointless plugin - but it's not for quality reasons.
1st Bro