Virtual Production in Unreal: Zoom, Focus, and Camera Tracking using 3 Vive Trackers

  • Published: 29 Jan 2025

Comments • 30

  • @jfconanan
    @jfconanan 3 years ago +1

    Awesome work man! Hoping to get my stuff working thru your tutorials. Thanks!

  • @antxionrhykel2699
    @antxionrhykel2699 3 years ago +4

    Hey Jake! This is incredible, great work finding all this stuff and getting it to work.
    In the video where you go over the zoom, you said you would also do a tutorial for the focus. I've been racking my brain trying to figure out how to set the focus distance with absolutely no progress; I would love to know how to set the focus distance so I can

    • @JakeGWater
      @JakeGWater  3 years ago

      I have so many things to discuss, and so little time these days. I probably won't get to any focus tutorials for a while, but I started ignoring focus and just adding it during post-production. Movie Render Queue can export a depth-layer, and you can fake the rest reasonably well in Nuke or possibly Resolve.
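      The depth-driven defocus Jake describes can be sketched in a few lines, independent of any particular compositor. This is a hedged illustration, not his actual Nuke/Resolve setup: the function name and parameters are made up, and it blurs each pixel in proportion to its distance from a chosen focal plane, using an exported depth layer like the one Movie Render Queue produces.

```python
import numpy as np

def fake_defocus(image, depth, focus_dist, aperture=0.1, max_blur=5):
    """Crude depth-of-field fake: blur radius grows with |depth - focus_dist|.

    image:      2D grayscale array
    depth:      2D array of per-pixel scene depth (same shape as image)
    focus_dist: depth of the focal plane; pixels there stay sharp
    """
    h, w = depth.shape
    # Per-pixel "circle of confusion" radius, clamped to max_blur
    coc = np.clip(np.abs(depth - focus_dist) * aperture, 0, max_blur)
    out = image.astype(float).copy()
    # Variable-radius box blur: slow, but shows the idea clearly
    for y in range(h):
        for x in range(w):
            r = int(coc[y, x])
            if r > 0:
                y0, y1 = max(0, y - r), min(h, y + r + 1)
                x0, x1 = max(0, x - r), min(w, x + r + 1)
                out[y, x] = image[y0:y1, x0:x1].mean()
    return out
```

      A real compositing app would use a lens-shaped kernel and handle edge bleed, but the core idea is the same: the depth pass drives a per-pixel blur radius, so focus can be pulled entirely in post.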

  • @rushboardtechuk
    @rushboardtechuk 3 years ago

    It would be useful to know what cage rig and focus/zoom encoder you used.

  • @wywarren
    @wywarren 3 years ago

    Can you share what lens you’re using? Is it a focus-by-wire lens? If so, how are you syncing the focus distance with the focus acceleration? I have a similar rig setup, but focus-by-wire is inconsistent for syncing with the virtual camera. I tried extracting the data via the Sony SDK, but it exposes a bunch of data except, for some reason, focal length and focus distance from the lens.

  • @4865213
    @4865213 2 years ago

    Hello, I'm learning a lot from your high-quality videos. Thank you. I have a question: can I use manual focus and autofocus at the same time? If I disconnect one and use a single input node, each works individually, but when both inputs drive it at the same time it stops working. Is there a good way?
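    For what it's worth, a common way to let manual and autofocus coexist is to put a selector in front of the focus-distance input, so only one source drives it per tick, with smoothing to hide the handover. A hedged sketch (the class and parameter names are invented for illustration; inside Unreal this would roughly be a Select node plus an interpolation feeding the focus-distance pin):

```python
class FocusSelector:
    """Pick one focus-distance source per update and ease toward it,
    so switching between manual and auto doesn't cause a visible jump."""

    def __init__(self, smoothing=0.2):
        self.smoothing = smoothing  # 0..1, fraction of the gap closed per update
        self.current = None

    def update(self, manual_m, auto_m, manual_active):
        # Only one source "wins" each tick; the other is ignored
        target = manual_m if manual_active else auto_m
        if self.current is None:
            self.current = target  # first tick: snap to the active source
        else:
            self.current += self.smoothing * (target - self.current)
        return self.current
```

    The key point is that the two inputs never drive the focus pin simultaneously; a selector arbitrates, which avoids the conflict described above.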

  • @juanperini2097
    @juanperini2097 1 year ago

    Hey Jake, I think your videos are great as they show us what we can do with all these things, but please share a Blueprint screen so we can understand how you managed to do this, please.

  • @artistic2005-m2w
    @artistic2005-m2w 3 years ago +1

    Can you make a tutorial on how to set it up?! Great vid, by the way.

  • @ionutdragan9056
    @ionutdragan9056 7 months ago

    Good tutorial! I don't understand how to connect the sensor to Mars without using the Rover.

  • @FilmArchive-z8b
    @FilmArchive-z8b 4 years ago +1

    Great tutorials! Thanks a lot!

  • @cahayamedia4446
    @cahayamedia4446 3 years ago

    Will those trackers work with 2 base stations? Anyway, your videos always help me a lot. Thanks!

  • @yimmonirak2651
    @yimmonirak2651 3 years ago +1

    I need position and rotation data. Can I use only a Vive tracker, without a base station device?

    • @JakeGWater
      @JakeGWater  3 years ago +1

      You need at least one base station.

    • @yimmonirak2651
      @yimmonirak2651 3 years ago +1

      @@JakeGWater I wonder! So I just have a Vive Tracker 2.0 and a SteamVR Base Station 2.0. I can use that, right?

    • @JakeGWater
      @JakeGWater  3 years ago

      @@yimmonirak2651 Yes, but Unreal has special instructions on how to set it up without the HMD. I have never tried it, but check out the Unreal documentation.

    • @yimmonirak2651
      @yimmonirak2651 3 years ago +1

      @@JakeGWater Ohh, thank you, very helpful! I'm starting with UE production.

    • @JakeGWater
      @JakeGWater  3 years ago +1

      Awesome. Good luck!

  • @piero.massera
    @piero.massera 2 years ago

    Very nice!

  • @mosesbluestudios
    @mosesbluestudios 3 years ago

    Quick question: where do you have your Samsung T7 hooked up? Are you getting a direct camera input?

    • @JakeGWater
      @JakeGWater  3 years ago

      The Samsung is hooked into the camera via USB-C and records the raw output of the camera. I will later sync it up in post-production, but I want to hook up timecode first to ensure Unreal and the Blackmagic footage are in sync.

  • @tokyologist2544
    @tokyologist2544 3 years ago

    I really hope to see an open standard protocol to send all of these parameters in one go.
    But for now, this is a geeeenius solution!

  • @jdvroum
    @jdvroum 3 years ago

    What do you think about the URSA Broadcast for VP? I already use Vive! Nice video.

    • @JakeGWater
      @JakeGWater  3 years ago

      The URSA Broadcast seems to have genlock/SDI, which is good. Are you doing live work or not?

    • @jdvroum
      @jdvroum 3 years ago +1

      @@JakeGWater I finally bought the URSA Mini Pro 4.6K G2. I'm doing R&D for augmented reality and a virtual studio with Unreal.

    • @JakeGWater
      @JakeGWater  3 years ago

      @@jdvroum That is a super solid camera. Are you sharing your progress anywhere?

    • @jdvroum
      @jdvroum 3 years ago

      @@JakeGWater I just got it a few days ago and I still don't have a tripod (and I have two children :-) ). I created a channel on YouTube called JDVP; I will add your channel because I found a lot of good advice. Thanks a lot!

  • @omc7250
    @omc7250 3 years ago

    That's great! Can you do a tutorial on the calibration bit please!? I saw your other video about zoom/focus and timecode/genlock! Great progress!
    By the way, what lens are you using!? It would be interesting if you linked your equipment. Cheers!

  • @shermanmak310
    @shermanmak310 2 years ago

    Why three trackers?

  • @hubert_zatorski
    @hubert_zatorski 3 years ago

    Woah :0