I watched more than 20 tutorials, but this guy helped me understand a lot in just 2 tutorials. Professional and on point.
That means a lot. Thank you.
Harry Frank, the legend. I've loved his tuts since the old days with Red Giant :D
I would love to see the lens distortion baked into the camera settings for Blender.
That's an interesting feature request. Where would you expect that to be shown? As an Undistortion node in the Compositing node tree? Would it change your mind if that limited the quality of the lens distortion models you could use?
@@BorisFXco Yes. If you track footage in Blender, it automatically adds the distortion to the camera settings, and you can turn it off and on. There is also a distortion node in the compositor that holds the values, which we can use on the footage. I'm not sure what type of calculation Blender uses for distortion, but it's very good. We need to use the original footage in other platforms, and SynthEyes forces us to bake the distortion into the footage; the whole distortion workflow in SynthEyes is actually confusing. Looking forward to that, thank you.
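For reference, the Blender-side setup the commenter describes can be reproduced in a few lines of bpy. This is a minimal sketch, assuming a clip that has already been tracked and solved in Blender's own Movie Clip Editor; the clip file name is a placeholder:

```python
import bpy

# Placeholder clip name; assumes it has already been tracked and solved in Blender.
clip = bpy.data.movieclips["my_shot.mp4"]

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree

# Movie Clip node feeding a Movie Distortion node in the compositor.
clip_node = tree.nodes.new(type="CompositorNodeMovieClip")
clip_node.clip = clip

distort = tree.nodes.new(type="CompositorNodeMovieDistortion")
distort.clip = clip
distort.distortion_type = 'UNDISTORT'  # or 'DISTORT' to re-apply the solved lens distortion

tree.links.new(clip_node.outputs["Image"], distort.inputs["Image"])
```

Toggling `distortion_type` is what lets you work on the undistorted plate and then push the distortion back in at the end, which is the on/off behaviour described above.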
Great new export option to After Effects!
Thanks! The new lens calibration models are really a step up.
Do you have any tutorials on how to track and solve anamorphic footage? It doesn't seem to be very clear in the support documents, and all of the RUclips tutorials seem to be for traditional lenses...
We have a couple here already : borisfx.com/videos/?tags=product:SynthEyes&search=anamorphic
You'll be glad to hear that this is something that we are working on at the moment. Stay tuned.
@@BorisFXco Thank you! That's great to hear.
@@BorisFXco Thank you. It would be really useful to have a tutorial on solving an anamorphic shot once a distortion board has been shot. My solves really aren't working out despite shooting one, and the 'Anamorphic Solve using a lidar scan' workflow doesn't seem to work well either.
Really useful tutorial. 👍
Fantastic. Glad you found it handy. Thanks for the comment.
OK, nice details and information. But which one is better for tracking and compositing, Mocha or SynthEyes? Do I need both applications?
I'm just pasting my answer from the other thread here, so that others can see it on this video too:
There are big differences between them. If you are mainly tracking in 2D and doing common tasks like rotoscoping, stabilization, and mesh warping, with some 3D camera tracking, then Mocha Pro is going to be the one for you.
If you are doing a lot of 3D camera and object solving with often tricky shots, then SynthEyes is the choice.
There is also the Boris FX Suite, which contains everything.
Probably the best option for you is to talk to one of our Sales support people. If you let them know what type of projects you're working on, or want to work on, then they will happily guide you to a more informed choice : support.borisfx.com/hc/en-us/requests/new?ticket_form_id=8863752433421
Hi! I have a question. I'm trying to export to After Effects via jsx Redistort(2), but all I get is "Camera01 has animated distorsion, this is not supported as it would require writing two entire sequences of STMap images." Would you happen to know where I can find this animated camera distortion and remove it?
If you open up the Floating Graph Editor, you can see the lens distortion in there. If you haven't run the Lens Workflow yet, it's under Camera & Objects > Camera # > Lens Distortion. If you have run Lens Workflow, it will be under Shots > Shot # > Lens.
Hope that helps, and sorry for the delay in responding. Your comment was caught up in the holiday break.
@@BorisFXco No problem! Thank you anyway, I'll give it a shot. The real problem was that my distortion model was set to "Classic".
Great tut. I have a question: is there any way I can get rid of the red cache on the timeline? I have 32 GB of RAM and still half of my 4K footage doesn't cache at all.
What format are you using and how long is your clip? Long clips and RAW or uncompressed formats won't be able to cache into 32GB of RAM. SynthEyes should use all available RAM for the cache.
@@BorisFXco I used 4K footage from my phone straight into SynthEyes. It's about 18 seconds. Maybe, yeah, it's too long.
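To put rough numbers on that: a back-of-the-envelope sketch, assuming the cache holds fully decoded RGB frames (the actual internal cache format of SynthEyes isn't documented here, so treat the figures purely as an illustration):

```python
# Rough estimate of RAM needed to cache an 18-second 4K phone clip once decoded.
# Assumes 8-bit RGB frames; a 32-bit float cache would be roughly 4x larger.
width, height = 3840, 2160
channels, bytes_per_channel = 3, 1
fps, seconds = 30, 18

frame_bytes = width * height * channels * bytes_per_channel
total_gb = frame_bytes * fps * seconds / 1024**3
print(f"{total_gb:.1f} GB of decoded frames")  # ~12.5 GB at 8-bit, ~50 GB at float
```

So even before the OS and the application itself take their share of the 32 GB, a higher-precision cache can easily outgrow what's left on a clip that short, which is consistent with only part of the timeline caching.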
Can the same workflow be done in Fusion?
Yes. There is a Fusion composition output that does all the hard work for you for the Undistort and Redistort.
@@BorisFXco Sounds great. Is there any video demonstrating the same?
There's this : ruclips.net/video/KIBeSmKekQ0/видео.htmlsi=GYpvwiZ21t2Gu-_2&t=115
Be sure to turn on Use LensDistort node when doing the export.
I'll add this request to our own list, so we have a Boris FX video for you too.
Does the AE plugin support Apple Silicon now?
Yes, it was updated for Apple Silicon as well as MFR in After Effects.
I have the Boris FX Suite. Is After Effects necessary to use SynthEyes? Because that's the only thing I ever hear about in these tutorials.
Thanks for the question. No, After Effects is not necessary to use SynthEyes, but you will need a different host to do your composite. Here is a list of hosts that we export data for currently : borisfx.com/documentation/syntheyes/syntheyesum_files/exporting_to_your_animation
Let us know where and how you want to use 3D matchmoving data and we may be able to find a solution for you.
@@BorisFXco I use Mocha Pro as a plugin for Vegas. I use it for many tracking jobs. But when there are jobs that are too difficult, I was thinking SynthEyes could be a better option.
Mocha Pro lets you input a picture into the planar tracking. That's all I need to do. I don't need something super fancy. I just need a 2D image input into the track. A very basic, simple task. Can SynthEyes input an image? Or do I have to export to another program?
Is there a way to export STMap only?
Yes! Go to Shot > Write Distortion Maps to create the distortion map sequence. This will generate the STMap for the undistortion or redistortion as required.
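If it helps to see what that map actually does: each pixel of an STMap stores normalized (u, v) lookup coordinates in its red and green channels, so applying it is just a per-pixel resample of the plate. Here's a minimal sketch using OpenCV; the file names are placeholders, reading EXRs may require an OpenCV build with OpenEXR enabled, and the vertical flip depends on whether the map uses a bottom-left origin:

```python
import cv2
import numpy as np

plate = cv2.imread("plate.0001.png")                        # 8-bit source plate (placeholder name)
stmap = cv2.imread("stmap.0001.exr", cv2.IMREAD_UNCHANGED)  # float STMap (placeholder name)

h, w = plate.shape[:2]
map_x = (stmap[..., 2] * (w - 1)).astype(np.float32)         # red = u (OpenCV loads BGR, so red is index 2)
map_y = ((1.0 - stmap[..., 1]) * (h - 1)).astype(np.float32) # green = v, flipped if the map is bottom-up

# Resample the plate through the map to undistort (or redistort) it.
result = cv2.remap(plate, map_x, map_y, interpolation=cv2.INTER_LINEAR)
cv2.imwrite("result.0001.png", result)
```

The same sequence can be dropped into any compositor's STMap/UV-remap node; the per-pixel lookup above is all those nodes are doing.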
I need this for Fusion! 😅
You're in luck! You can export the data directly out to a Fusion comp and it sets it all up for you. Check out the trial here : borisfx.com/products/syntheyes/
@@BorisFXco Is there a tutorial about this workflow inside Fusion? I already own SynthEyes :)
It would be nice if you posted the footage you are working with in the comment section, so we can follow along with you as beginners...
Wherever possible we try to use footage that can be downloaded. There are a few free stock sites available for practice footage. Try this : www.pexels.com/video/exterior-design-of-a-modern-high-rise-buildings-3648257/
It really bothers me that you're not locking your trackers.
Chalk it up to the passion of the screen record. Hopefully this PSA helps: "Kids, always lock your manual trackers when you're finished with them."