Tracking Refinement with SynthEyes

  • Published: 30 Nov 2024

Comments • 21

  • @LokmanVideo
    @LokmanVideo 2 months ago +3

    I've been in the VFX industry for many years, and seeing this workflow and the new technology you're bringing to the masses is so exciting :)
    Can't wait to test Jetset once I finish my new green screen studio (cyclorama).
    Amazing job guys!

    • @eliotmack
      @eliotmack 2 months ago

      Thanks! Post some shots when you can!

    • @LokmanVideo
      @LokmanVideo 2 months ago

      @@eliotmack Sure 👍

  • @weshootfilms
    @weshootfilms 4 months ago +4

    Amazing

  • @dickie_hrodebert
    @dickie_hrodebert 1 month ago

    Where can I get the Overscan addon that you are using for Blender?

    • @lightcrafttechnology
      @lightcrafttechnology  1 month ago

      In this case, we're not using an overscan add-on, but manually entering the overscan sensor size calculated in SynthEyes. Much simpler.
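
      A minimal sketch of what "manually entering" that value can look like in Blender's Python console, assuming a default scene camera named "Camera". The sensor width and overscan factor below are placeholder numbers, not values from the video; use whatever SynthEyes reports for your shot.

          import bpy

          # Camera data block; assumes the scene camera is named "Camera"
          cam = bpy.data.objects["Camera"].data

          # Overscanned sensor width in mm, as calculated by SynthEyes
          cam.sensor_width = 38.61      # placeholder value
          cam.sensor_fit = 'HORIZONTAL'

          # Scale the render resolution by the same overscan factor so the
          # extra sensor area maps to extra pixels around the original plate
          overscan = 1.05               # placeholder factor
          scene = bpy.context.scene
          scene.render.resolution_x = int(3840 * overscan)  # assumes a UHD plate
          scene.render.resolution_y = int(2160 * overscan)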

  • @stephanec3436
    @stephanec3436 24 days ago

    Tracking refinement: do you mean that on every project/shot we have to refine the 3D tracking with SynthEyes or an equivalent? I'm confused, because I thought Lightcraft and the iPhone's accelerometer already did the job of 3D tracking?? If that's the case, is the real benefit of Lightcraft the ability to monitor a CG set in real time with live video?? Thanks

    • @lightcrafttechnology
      @lightcrafttechnology  24 days ago +1

      The standard Jetset tracking is easily good enough for shots that don't have visible ground contact. You can see some videos done by Alden Peters on YouTube that are all straight Jetset tracking. The shots with highly visible ground contact require an additional level of precision; that's what the SynthEyes pipeline is designed to handle.

  • @momenkhaled99
    @momenkhaled99 3 months ago +1

    wow

  • @michaelounsa5056
    @michaelounsa5056 3 months ago

    Hello. I am considering purchasing an iPhone Pro to use the LiDAR feature specifically for virtual production with the LightCraft application. Could you please let me know if there is a significant difference in LiDAR quality and performance between the iPhone models from version 12 up to the upcoming iPhone 16? Are there any major benefits of using the newer models with your application?

    • @eliotmack
      @eliotmack 3 months ago

      We've found remarkable improvements with each new generation of iPhone, especially in GPU capacity and in cooling. The LiDAR hasn't changed much, but I'd still recommend getting the newest iPhone you can, simply for the other performance aspects. It makes a big difference. We're getting Jetset ready for iOS 18 and looking forward to what's in the new hardware coming up soon.

    • @michaelounsa5056
      @michaelounsa5056 2 months ago

      @@eliotmack thank you for the clear answer.

  • @ApexArtistX
    @ApexArtistX 4 months ago

    How do I bring them over to Unreal or Blender? I tried FBX and USD and it doesn't work.. fails hard.. Unreal has no camera sometimes

    • @eliotmack
      @eliotmack 3 months ago

      Watch closely at 17:32 -- it goes into detail on the Blender import process.
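
      If an FBX import seems to drop the camera, one quick sanity check (a rough sketch, not the exact workflow from the video) is to run Blender's bundled FBX importer from the Python console and list what actually arrived; the file path below is a placeholder.

          import bpy

          # Import the tracked scene via Blender's built-in FBX importer
          bpy.ops.import_scene.fbx(filepath="/path/to/take.fbx")

          # List any cameras that came through; if this prints an empty list,
          # the exporter likely never wrote a camera into the file
          cams = [ob for ob in bpy.context.scene.objects if ob.type == 'CAMERA']
          print("Imported cameras:", [c.name for c in cams])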

  • @jordanthecadby5762
    @jordanthecadby5762 4 months ago

    Under what circumstances would you need to refine the live track?

    • @eliotmack
      @eliotmack 3 months ago +1

      Shots with visible CGI & live action joins. In this case it's the join between the CG railing and the practical floor, but in other shots it might be high degrees of floor contact.

    • @stephanec3436
      @stephanec3436 24 days ago

      @@eliotmack Do you mean that on every project/shot we have to refine the 3D tracking with SynthEyes or an equivalent? I'm confused, because I thought Lightcraft and the iPhone's accelerometer already did the job of 3D tracking?? If that's the case, is the real benefit of Lightcraft the ability to monitor CG sets in real time with live video?? Thanks

    • @lightcrafttechnology
      @lightcrafttechnology  24 days ago

      @@stephanec3436 For many shots the standard Jetset tracking is fine. All but one of the shots in ruclips.net/video/s2y2lcsL_Lk/видео.htmlsi=oOkG1VY3s8Q5wM5X are from the Jetset data. For certain shots with very visible ground contact, you may need to do tracking refinement. It's very shot-specific.

  • @kabalxizt5028
    @kabalxizt5028 3 months ago

    You should make more tutorials about SynthEyes.

  • @manolomaru
    @manolomaru 3 months ago

    ✨😎😮😵😮😎👍✨