Towards the beginning I was mixing up some words a little bit, like saying "camera tracker" instead of "camera renderer". But if you are following along, hopefully it all makes sense. If you have other ideas, or know of plugins or workflows for exporting the camera and point clouds that work, let me know.
** I forgot to mention in the video that if you are using the overscan workflow, you can use the crop method, from the full undistorted plate workflow, to figure out your render size from your 3D software.
No problem. All clear. Many thanks for another great tutorial.
Thanks!
Thanks for the great tutorial! I want to try DaVinci, is this tutorial suitable for the free version of DaVinci Resolve 19?
@@ilyanemihin6029 I believe you need the studio version to use the camera tracker
was looking for a fusion camera tracker to Blender workflow like this for a while. You are a genius. Thanks for your great work!
You're very welcome! Thanks!
Big Thanks, very instructive !!
You are welcome!
Thanks for this!
You're welcome!
Awesome tutorial. Thanks.
You're welcome! Glad you liked it.
very needed tutorial!!!
Thanks! I'm glad you found it helpful!
you're a hero
Great Tutorial!
Thank you!
thanks man
you're really the best at this 😀
You're welcome! Thanks!
Thank you so much! You are very clear in your explanations. More 3D please! Quick question: in Blender, from the point cloud exported from Fusion with your setup (Replicate3D + Shape3D), if I select 4, 5, or 6 points, how do I create a shape from the selected points? The equivalent of Fusion's ImagePlane, where we select our points and create the ImagePlane from them. THANKS!
Thanks! I would think having a local orientation and snapping would work. I'm a little crunched right now but can get back to you if you don't find a solution.
oh thank you so much!!!!
Thanks!!
Great work! I'm trying to export the result of the camera tracker as a transform that could be used in fusion like the planar transform. Any idea how to do this? I have a shot where the camera turned in front of the subject, and I'm trying to paint but with a 3d transform applied to the paint strokes since the perspective changes a lot. Let me know if you can think of anything.
Thanks! Sounds like the 3D locator node would work well for this. Check out this video and let me know if this helps: ruclips.net/video/r6RIwS3FfoM/видео.html
Hey, this video is awesome. Would you also happen to know how to bring the camera track from DaVinci into Unreal Engine 5?
I'm pretty sure it's similar to the problems laid out in this video. Let me open up Unreal later today or tomorrow and get back to you.
@@prophetless okay, thank you. I appreciate it.
As a quick test, what I found was that if you export your camera as FBX using the ASCII format and version 2020, it works. One thing to note is that to get the camera into Unreal, I had to shut off everything except for the camera, meaning only export the camera from Fusion. The camera and animation were then brought into Unreal and looked good. Another option is to use Blender as a go-between; the Blender/Unreal workflow seems pretty solid. I only checked to see if I was getting the animated camera into Unreal and whether it was moving the right way, so not very thorough. Let me know if those settings work for you. If not, maybe I will spend some time putting together a tutorial.
@@prophetless okay, thank you. I'm going to test that now. Before, I was getting an error message that says: "Failed to find any matching camera for CineCameraActor. Importing onto first camera from fbx Camera3D1". Do you know if that has to do with the naming of the camera node in DaVinci Resolve? Also, do you know if the 6.0 versions of the FBX format work too, or should I just stick with the default?
@@prophetless okay, so I tried it and I'm still getting the error message, but the camera is rotating. However, it is not moving in physical space. I can get it to match the rotation of the shot but not the physical displacement so far. Maybe I need to check the import settings in Unreal.
Is there a reason to use the camera tracker in DaVinci Resolve instead of in Blender? You can do all of that inside Blender and export for compositing in DaVinci Resolve. I've only had a few projects using camera tracking in Blender or in DaVinci Resolve.
Nope. I prefer Syntheyes for my 3D tracking but you can use whatever you want.
Hello, I have a problem with the FBX export. I hooked an FBX exporter into the Merge3D node and hit Render All Savers. But there is no file in the destination. It seems that DaVinci is doing something (a progress bar appears), but nothing shows up in the folder.
It sounds like you might have it set up backwards. You need to have the output of the Merge3D go to the input of the FBX exporter. Let me know if that is the problem.
@@prophetless oh man, thanks for this tip. I spent two hours trying to solve this problem. Yes, I had hooked the exporter into the merge, so it went into a scene input. When I put the merge output into the exporter, it works. Thanks again.
Awesome!
VERY detailed tutorial. Great! I'm surprised you chose Fusion to track the footage and not Blender. Even though it has no manual mode in 3D tracking, is it more precise? Otherwise, why did you make this choice?
Fantastic! Well, it will be when my 73-year-old brain catches up ... and stops hurting. Certainly pushing its limits. 🤣
Hahaha. My apologies. I'll make sure the next one is more relaxing, so you can watch it in the morning with your coffee.😀