I've been in the VFX industry for many years, and seeing this workflow and the new technology you're bringing to the masses is so exciting :)
Can't wait to test Jetset once I finish my new green screen studio (cyclorama).
Amazing job guys!
Thanks! Post some shots when you can!
@eliotmack Sure 👍
Amazing
Where can I get the Overscan addon that you are using for Blender?
In this case, we're not using an overscan add-on, but manually entering the overscan sensor size calculated in SynthEyes. Much simpler.
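For reference, here's a minimal sketch of what that manual entry looks like in Blender's Python console. The numeric values are hypothetical placeholders; use the overscan sensor width and resolution that SynthEyes calculates for your own solve.

```python
# Minimal sketch (Blender Python), assuming a camera object named "Camera".
# The numbers below are hypothetical examples; substitute the overscan
# sensor width and resolution that SynthEyes reports for your shot.
import bpy

cam = bpy.data.objects["Camera"].data
cam.sensor_fit = 'HORIZONTAL'
cam.sensor_width = 37.9  # overscan sensor width in mm, from SynthEyes

# Scale the render resolution by the same overscan factor so the
# rendered frame covers the padded area:
scene = bpy.context.scene
scene.render.resolution_x = 2016  # e.g. 1920 * 1.05 overscan
scene.render.resolution_y = 1134  # e.g. 1080 * 1.05 overscan
```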
Refinement tracking: do you mean that on every project/shot we have to refine the 3D tracking with SynthEyes or an equivalent? I'm confused, because I thought Lightcraft and the accelerometer of the iPhone already did the job of 3D tracking. If that's the case, is the real benefit of Lightcraft the ability to monitor the CG set with live video in real time? Thanks
The standard Jetset tracking is easily good enough for shots that don't have visible ground contact. You can see some videos done by Alden Peters on YouTube that are all straight Jetset tracking. The shots with highly visible ground contact require an additional level of precision; that's what the SynthEyes pipeline is designed to handle.
wow
Hello. I am considering purchasing an iPhone Pro to use the LiDAR feature specifically for virtual production with the LightCraft application. Could you please let me know if there is a significant difference in LiDAR quality and performance between the iPhone models from version 12 up to the upcoming iPhone 16? Are there any major benefits of using the newer models with your application?
We've found remarkable improvements with each new generation of iPhone, especially in GPU capacity and in cooling. The LiDAR hasn't changed much, but I'd still recommend getting the newest iPhone you can, simply for the other performance aspects. It makes a big difference. We're getting Jetset ready for iOS 18 and looking forward to what's in the new hardware coming up soon.
@eliotmack Thank you for the clear answer.
How do I bring them over to Unreal or Blender? I tried FBX and USD, but neither works; the import fails hard, and Unreal sometimes has no camera.
Watch closely at 17:32 -- it goes into detail on the Blender import process.
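If you want to script the import while debugging, here's a rough sketch using Blender's stock importers. The file paths are placeholders, and this assumes Blender 3.x or newer with the built-in FBX and USD importers.

```python
# Rough sketch: importing a tracked camera into Blender via the stock
# importers. Paths are placeholders; assumes Blender 3.x or newer.
import bpy

# FBX import (the built-in FBX add-on is enabled by default):
bpy.ops.import_scene.fbx(filepath="/path/to/take_camera.fbx")

# USD import (built into Blender 3.0+):
bpy.ops.wm.usd_import(filepath="/path/to/take_camera.usd")

# Sanity check: list imported cameras to confirm one came through.
cameras = [obj for obj in bpy.context.scene.objects if obj.type == 'CAMERA']
print("Cameras in scene:", [c.name for c in cameras])
```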
Under what circumstances would you need to refine the live track?
Shots with visible CG and live-action joins. In this case it's the join between the CG railing and the practical floor, but in other shots it might be a high degree of floor contact.
@eliotmack Do you mean that on every project/shot we have to refine the 3D tracking with SynthEyes or an equivalent? I'm confused, because I thought Lightcraft and the accelerometer of the iPhone already did the job of 3D tracking. If that's the case, is the real benefit of Lightcraft the ability to monitor the CG set with live video in real time? Thanks
@stephanec3436 For many shots the standard Jetset tracking is fine. All but one of the shots in ruclips.net/video/s2y2lcsL_Lk/видео.htmlsi=oOkG1VY3s8Q5wM5X are from the Jetset data. For certain shots with very visible ground contact, you may need to do tracking refinement. It's very shot-specific.
You should make more tutorials about SynthEyes.
✨😎😮😵😮😎👍✨