Dual Follow Focus + Dual Vive Tracker

  • Published: 16 Jan 2025

Comments • 127

  • @Megasteakman
    @Megasteakman 4 years ago +3

    Thanks for the shout-out (and you did a great job of the pronunciation of my name)! Yeah, I really like your solution to drive the second tracker with a follow focus: really smart. Looking forward to more, and happy to be in the virtual production group!

  • @DucatiKozak
    @DucatiKozak 4 years ago +16

    Your journey through this new technology is fascinating!
    Keep it up. Can't wait to join you myself.

  • @nenproductions4395
    @nenproductions4395 4 years ago +4

    Thank you, grand wizard, for leading the way for us indie folk

  • @mariomokarram7570
    @mariomokarram7570 3 years ago

    Man, I recognized you because I know your MetaHuman on LinkedIn... Matt, you are one amazing human being, bravo!

  • @robbioe
    @robbioe 4 years ago +8

    so awesome man. answering all the questions I needed to get into this

  • @SanczykLucas
    @SanczykLucas 4 years ago +1

    Man, it's just awesome to see how much you've developed so far. I still remember, only a couple of years ago, when you were thinking about how cool all of this would be in your Cinematography Breakdowns. Congrats Matt, keep doing it!

  • @pauljmanoogian
    @pauljmanoogian 4 years ago

    Love following your progress on this subject. I'm part of the Facebook group and there is so much great stuff in there for folks interested in this process. Keep up the great work!

  • @KwasnikPictures
    @KwasnikPictures 4 years ago +3

    Even though I don't have a VR system, this is so interesting to watch!

  • @TheMonicaProject
    @TheMonicaProject 4 years ago

    I get so excited every time you upload a video! You are a huge inspiration for me

  • @virtualtolive1077
    @virtualtolive1077 4 years ago

    So looking forward to your virtual production tool! Great experiment.

  • @SaschaFrenzer
    @SaschaFrenzer 4 years ago

    Awesome! Sounds like a lot of fun and potential.

  • @raymondchang9098
    @raymondchang9098 4 years ago

    Just bought Cine Tracer; love your virtual production ideas.

  • @jfconanan
    @jfconanan 4 years ago

    Started on this and it looks like a good project to learn. Super thanks, and I hope you are able to make more in-depth tutorials. Thanks!

  • @GregCorson
    @GregCorson 4 years ago +10

    Would really be interested in seeing how this works with the zoom too. Probably not good without lots of calibration, but would be interesting to see just how "off" it looks without it.

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago +3

      It could be done, but I would want to calculate the FOV using checker charts and Nuke. I don’t trust the lens markings for focal length. If it’s just MR, it would be fine, but for AR we would need to be more precise.

    • @dcnguyen28
      @dcnguyen28 3 years ago

      @@CinematographyDatabase Hi! Could you share your calculation? I'm having trouble getting the focal length in Unreal or disguise to match the real lens focal length
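
      A minimal sketch of the usual pinhole relation, assuming the active sensor width is known (the checker-chart solve then refines the effective focal length):

      horizontal FOV = 2 * atan(sensor_width / (2 * focal_length))

      For example, a 24.89 mm wide Super 35 sensor with a nominal 35 mm lens gives 2 * atan(24.89 / 70) ≈ 39°. In UE4 the solved values would go into the CineCamera filmback width and focal length, rather than the barrel markings.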

  • @JoshuaMKerr
    @JoshuaMKerr 3 years ago +3

    Did you make a blueprint available for this? I'd love to have a play with focus mapping.

  • @AnthonyDeCroce
    @AnthonyDeCroce 3 years ago +2

    Is there an updated method for this?

  • @foxy2348
    @foxy2348 4 years ago

    Dude, you're so awesome! My coder friend and I are going to dive into virtual production! So hyped! Thanks for all your work!

  • @goodman-group
    @goodman-group 1 year ago

    awesome, fascinating! we want it :)

  • @Masterstreami
    @Masterstreami 3 years ago

    Thanks for sharing your knowledge!

  • @timparsons3565
    @timparsons3565 4 years ago +9

    Call it "VIP Unreal: Virtual Indie Productions for Unreal Engine"

  • @saemranian
    @saemranian 4 years ago

    Thanks, man, for your sharing
    and for your being

  • @SOAR-OF-AVEM
    @SOAR-OF-AVEM 4 years ago

    Go man Go!

  • @syedshueeb1667
    @syedshueeb1667 4 years ago +1

    Please share the blueprint you used to calibrate the second Vive tracker.

  • @EanMartinTays
    @EanMartinTays 4 years ago

    Very cool test! I'm REALLY happy to hear though that I won't have to do this in my own studio lol

  • @leonardokala
    @leonardokala 4 years ago +3

    It's so interesting to watch you evolve and solve all kinds of problems to conquer a new feature!
    I noticed something that might make this look even more natural. What do you think about blurring the edges of the keyed object as it gets increasingly out of focus? I don't know if this would look strange, because you'd probably have to "shrink" the whole key to actually have data to blur in the first place.

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago +2

      Yeah, I was considering something like that. I haven’t tried to adjust the key live, but it looks like something like that would be necessary.
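
      For anyone who wants to experiment with that, a minimal sketch of the shrink-then-blur idea using OpenCV; the 8-bit alpha matte input and the per-frame defocus amount are assumptions for illustration, not anything shown in the video:

      // Shrink the matte first so there is real edge data to blur into, then blur it
      // so the key's edge falls off roughly like the optical defocus.
      #include <opencv2/opencv.hpp>
      #include <algorithm>
      #include <cmath>

      cv::Mat SoftenMatte(const cv::Mat& matte8u, double defocusPx)
      {
          if (defocusPx <= 0.0)
              return matte8u.clone();                  // in focus: leave the key alone

          int r = std::max(1, (int)std::lround(defocusPx));
          cv::Mat kernel = cv::getStructuringElement(cv::MORPH_ELLIPSE,
                                                     cv::Size(2 * r + 1, 2 * r + 1));
          cv::Mat shrunk, soft;
          cv::erode(matte8u, shrunk, kernel);                          // step 1: choke the key inward
          cv::GaussianBlur(shrunk, soft, cv::Size(0, 0), defocusPx);   // step 2: soften the edge
          return soft;
      }

      Driving defocusPx from the same tracked focus distance (and aperture) that drives the virtual camera would be the part needing per-lens tuning.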

  • @syedshueeb1667
    @syedshueeb1667 4 years ago +1

    Hi,
    Please share the lens mapping you used and also the UE4 blueprints. Thanks in advance

  • @daviddelayat-dnapictures
    @daviddelayat-dnapictures 4 years ago +3

    That's a great hack!
    Here is an idea that popped into my head:
    Would it be possible to do this with an Arduino microcontroller?
    You could have a potentiometer instead of the second Vive tracker and get only a rotation value.
    It would mean parsing the data inside the Arduino and sending it to UE, which I don't really know much about, but I think you would get much quicker and more accurate results. And you could do the same for aperture.
    Thanks for the video! Can't wait to see the next idea :)
    EDIT: The Arduino can create a serial connection, which can be used in C++, so I guess it could work!
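
    A minimal sketch of that potentiometer idea on the Arduino side; the pin, baud rate, and one-value-per-line format are assumptions for illustration:

    const int FOCUS_PIN = A0;            // potentiometer wiper geared to the focus ring

    void setup() {
      Serial.begin(115200);              // USB serial to the PC running Unreal
    }

    void loop() {
      int raw = analogRead(FOCUS_PIN);   // 0..1023 on a classic Arduino
      float normalized = raw / 1023.0f;  // 0.0 = near focus stop, 1.0 = far stop
      Serial.println(normalized, 4);     // one value per line, e.g. "0.7342"
      delay(16);                         // roughly 60 updates per second
    }

    On the PC side, the line-per-value stream could be read with any serial library and pushed through the same rotation-to-distance mapping used for the tracker.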

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago +1

      That definitely works, and there are some people in the community making Arduino/rotary encoders. Getting the data to UE4 is pretty straightforward from there too. We will see if any of them turn into actual products people can buy. That's my primary interest, vs. one-off custom builds.

    • @daviddelayat-dnapictures
      @daviddelayat-dnapictures 4 years ago

      @@CinematographyDatabase Thanks for the reply.
      But once that's done, what are the next steps/constraints to get the best results?

  • @MohammedIdrei
    @MohammedIdrei 4 years ago +2

    Asking you everywhere :) ..
    Please show us how to map the focus distance to the Vive tracker rotation. Please.
    Thanks!
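
    Since several people are asking for the mapping itself, here is a minimal sketch of the idea: measure a handful of (ring angle, marked focus distance) pairs for the lens and interpolate between them at runtime. The table below is made up for illustration; every lens needs its own marks.

    #include <vector>

    struct LensMark { float ringDegrees; float focusCm; };

    // Hypothetical marks for one lens, from the close-focus stop out to "far enough".
    static const std::vector<LensMark> kMarks = {
        {   0.0f,   45.0f },
        {  60.0f,   90.0f },
        { 120.0f,  180.0f },
        { 200.0f,  400.0f },
        { 280.0f, 1500.0f },
    };

    // Piecewise-linear interpolation from follow-focus rotation to focus distance.
    float RotationToFocusCm(float ringDegrees)
    {
        if (ringDegrees <= kMarks.front().ringDegrees) return kMarks.front().focusCm;
        if (ringDegrees >= kMarks.back().ringDegrees)  return kMarks.back().focusCm;

        for (size_t i = 1; i < kMarks.size(); ++i) {
            if (ringDegrees <= kMarks[i].ringDegrees) {
                float t = (ringDegrees - kMarks[i - 1].ringDegrees) /
                          (kMarks[i].ringDegrees - kMarks[i - 1].ringDegrees);
                return kMarks[i - 1].focusCm + t * (kMarks[i].focusCm - kMarks[i - 1].focusCm);
            }
        }
        return kMarks.back().focusCm;
    }

    In Blueprint the same thing could be done with a float Curve asset: the tracker roll (or encoder value) as the curve input and the CineCamera's Manual Focus Distance as the output.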

  • @kenonaidoo2241
    @kenonaidoo2241 4 years ago +1

    Amazing work Matt! Was wondering if you've considered using a rotary encoder and an Arduino Micro to measure the rotation of the focus/zoom rings. Send the info via USB/WiFi, using it as MIDI. No need to customize every single lens. Just set your zero point, then rotate the lens rings to their stop points, and you're able to map any lens very quickly. And it's way smaller than a tracking puck.

    • @kenonaidoo2241
      @kenonaidoo2241 4 years ago

      You could possibly build the rotary encoder directly into the follow focus hardware 🤔

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago +1

      There are several new products coming out that have done just that. I already have the Vanishing Point Viper and will test it soon, and there are more affordable Arduino-based encoders coming as well.

  • @Mautemas
    @Mautemas 3 years ago

    Thanks, this video is enlightening

  • @eddiefilm
    @eddiefilm 1 year ago +1

    please share BP

  • @PROC3699
    @PROC3699 4 years ago +1

    Brilliant! Really!!

  • @cainable2241
    @cainable2241 4 years ago +1

    Great video. Do you need the Vive Pro, or can you just use the base stations without buying the unit? Thanks

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago +1

      It's possible to hack the Vive so you only need the Base Stations and Tracker. But I recommend having a full Vive Pro in case that hack stops working one day.

    • @cainable2241
      @cainable2241 4 years ago

      @@CinematographyDatabase That's awesome thanks!

    • @saish24
      @saish24 4 years ago +1

      It's always good to keep the Vive headset with you, as you can test in VR mode in Unreal Engine. :)

    • @cainable2241
      @cainable2241 4 years ago

      @@saish24 Great, thank you!

  • @PatrickCharpenet
    @PatrickCharpenet 4 years ago

    Super interesting. What lenses do you recommend for this setup?

  • @EthanJohnston-eg9si
    @EthanJohnston-eg9si 1 year ago

    Hey Matt.. What are you mounting your vive trackers with? Also has anything new come to light since this 2 year old video? :)

    • @CinematographyDatabase
      @CinematographyDatabase  1 year ago

      Yeah people use the LOLED lens encoder now at this level. This was just a funny experiment, it wouldn’t be reliable for production.

  • @nialltracey2599
    @nialltracey2599 4 years ago

    How about using OpenCV on a Raspberry Pi and just tracking the focus ring visually...? If done right, you could get it to track both focus and zoom, and possibly even automatically recognise which lens you've mounted.
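
    A minimal sketch of how that could be prototyped: put a small colored dot on the focus ring, segment it by color, and read its angle around the ring center. The camera index, HSV range, and ring center are assumptions that would need calibration per rig.

    #include <opencv2/opencv.hpp>
    #include <cmath>
    #include <cstdio>

    int main()
    {
        cv::VideoCapture cap(0);                       // small USB camera pointed at the lens
        const cv::Point2f ringCenter(320.0f, 240.0f);  // calibrate once for your rig

        cv::Mat frame, hsv, mask;
        while (cap.read(frame)) {
            cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
            cv::inRange(hsv, cv::Scalar(40, 80, 80), cv::Scalar(80, 255, 255), mask);  // green dot

            cv::Moments m = cv::moments(mask, true);
            if (m.m00 > 50.0) {                        // enough pixels to trust the detection
                cv::Point2f dot((float)(m.m10 / m.m00), (float)(m.m01 / m.m00));
                double angleDeg = std::atan2(dot.y - ringCenter.y, dot.x - ringCenter.x) * 180.0 / CV_PI;
                std::printf("ring angle: %.1f deg\n", angleDeg);  // feed this into the lens map
            }
        }
        return 0;
    }

    The reported angle would then feed the same rotation-to-distance mapping as any other encoder.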

  • @miljopp949
    @miljopp949 3 years ago +1

    My question is: how do you connect a DSLR camera to Unreal Engine and the Vive tracker?

  • @苏国昇
    @苏国昇 4 years ago

    I want to know what the focus code looks like. Can you tell me? Thank you very much!

  • @NirajMundra
    @NirajMundra 3 years ago

    Is there an option to use your consulting services for VR production? Thank you.

  • @lonyrecord
    @lonyrecord 3 years ago

    Can you make a tutorial on how you did it, please?

  • @a3vision790
    @a3vision790 4 years ago

    Hello! I really like your videos! Thanks for sharing!! Keep it up! I have a question: to connect the Vive tracker to a camera with Unreal and only capture the movement, is it necessary to connect it with the HTC base stations, or does the Vive tracker do it alone without another accessory? I only need to capture camera movement in Unreal. Thanks!!

  • @LiuKun
    @LiuKun 4 years ago

    Hey, you can set up genlock and make a transform array (cache) to delay the raw Vive transform per frame, then get a perfect sync.

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago

      yeah I'm using that technique in this demo, but it still feels a bit off.
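
      For reference, the cache described above boils down to a small FIFO of tracker samples; a minimal sketch, with the Transform struct and the frame delay as placeholders that have to be measured against the actual video chain:

      #include <deque>

      struct Transform { float pos[3]; float rot[4]; };   // position + quaternion

      class TrackerDelay {
      public:
          explicit TrackerDelay(size_t delayFrames) : delayFrames_(delayFrames) {}

          // Call once per rendered frame with the freshest tracker sample; returns the
          // sample from delayFrames_ frames ago (or the oldest one available so far).
          Transform Push(const Transform& latest)
          {
              buffer_.push_back(latest);
              if (buffer_.size() > delayFrames_ + 1)
                  buffer_.pop_front();
              return buffer_.front();
          }

      private:
          size_t delayFrames_;
          std::deque<Transform> buffer_;
      };

      // Usage: TrackerDelay delay(4);  cameraTransform = delay.Push(rawViveTransform);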

  • @samsonadikrobert3290
    @samsonadikrobert3290 4 years ago

    THANK YOU for this video. You just got a follower and a subscriber too. Looking forward to seeing more videos. I plan to go fully into digital cinematography. #HugeFanFromNigeria

  • @alexandrebachmatiuk3591
    @alexandrebachmatiuk3591 3 years ago

    Hi, what is the name of the product used to mount the HTC tracker on the camera?

  • @Nattodayy
    @Nattodayy 4 years ago

    Most entertaining content during this pandemic. Is there no way to use the Blackmagic camera SDK to get focus data to the computer?

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago

      We are looking for options, none are commercially available that I’m aware of.

  • @archonoff
    @archonoff 4 years ago

    How do you get a good-looking edge when shooting an out-of-focus object in front of a green screen? For example, the one at 10:22 looks unnatural.

  • @watashiwastevedesu6928
    @watashiwastevedesu6928 4 years ago

    Hey Matt have you ever tried doing this in stereoscopic?

  • @bicycleninja1685
    @bicycleninja1685 2 years ago

    So I'm seeing a few videos on this, and all of them chose Unreal. Is there a reason? Can this be done with Unity?

  • @mouxionstudio7929
    @mouxionstudio7929 3 years ago

    Hi man, great video. Can you tell me how to make a green screen background like yours?

  • @PalemanFPV
    @PalemanFPV 4 years ago

    Thanks a lot dude! Really excited to see how far we can get 😂👌 Virtual production tools for UE4 should definitely come out and make you some cash. You have to be rewarded for your work, and I'll be the first to buy it. 'Cause you still haven't found a way to implement 3D objects in Cine Tracer 😂😂😂

  • @djdavewarren
    @djdavewarren 4 years ago

    Why don't you use MIDI to control the focus? Using MIDI CC on a rotary dial attached to the focus ring

  • @AnimatedASMR
    @AnimatedASMR 4 years ago

    Is this only functional for Unreal or can it be used for Blender as well?

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago

      Blender can't do this natively, but someone could technically write all of these plugins and attempt to use Eevee to render it in real time. But it would be a massive rewrite of the core of Blender.

    • @creasedaf1s900
      @creasedaf1s900 4 years ago

      Can you do what Ian Hubert does in Unreal Engine? His way of compositing is really interesting and I can't wrap my head around it just yet

  • @brianwennersten8456
    @brianwennersten8456 4 years ago

    This is totally awesome! I downloaded Unreal today and want to pass the quarantine shooting mixed reality! As an AC, the "hack" that comes to mind is to bind two wireless FF motors to the same channel and put the vive tracker on one that's off-camera. It seems like you could then avoid needing to subtract camera movement from the focus tracker. Do you think that could work Matt?

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago +1

      we will have a lovely follow focus hardware solution in the near future

  • @michaelslemmons
    @michaelslemmons 4 years ago

    Do you need the HTC Vive headset for this? Or could it work with just 2 trackers and 2 base stations?

  • @johnbosley5680
    @johnbosley5680 4 years ago

    What about when productions shoot Virtual Production with LED panels? Isn’t the lens focusing just making the background on the panel in-focus/out-of-focus based on what setting the lens is on?

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago

      focus tracking is less important on an LED VP shoot, but it's normally tracked anyway to account for any lens breathing that would then be reflected in the virtual frustum.

  • @mullinsmediaLLC
    @mullinsmediaLLC 4 years ago +1

    Hello, my biggest question is: would the focus still be synced if I use a Canon C500 Mark II in autofocus?

  • @vitos1k
    @vitos1k 4 years ago +1

    You can add an easy and cheap setup: use a NodeMCU (or ESP32) dev board with a simple rotary encoder, or even a potentiometer, and a simple protocol like OSC, which should have a simple add-on for Unreal. That way you can send data from the encoder to Unreal via WiFi in real time. It would cost you something like $5 per unit, and it's super simple to program. If you need, I can mock up a setup and simple firmware for the units.

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago

      There are some people/companies building similar gear. I’m looking for a commercial grade solution that gets certified and distributed as a proper product. If you make a company and do those steps I’d love to test it.

    • @vitos1k
      @vitos1k 4 years ago

      @@CinematographyDatabase Nah, I'm just suggesting a DIY solution for those who want to mock up something quickly with cheap parts available everywhere
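
      A minimal sketch of that DIY route, assuming the Arduino ESP32 core and the CNMAT OSC library; the pins, WiFi credentials, target address, and OSC path are placeholders:

      #include <WiFi.h>
      #include <WiFiUdp.h>
      #include <OSCMessage.h>

      const int PIN_A = 32, PIN_B = 33;        // quadrature encoder channels
      volatile long encoderCount = 0;

      void IRAM_ATTR onEncoderA() {
        // Simple 1x quadrature decode: direction comes from the state of channel B.
        if (digitalRead(PIN_A) == digitalRead(PIN_B)) encoderCount++; else encoderCount--;
      }

      WiFiUDP Udp;
      const IPAddress pcAddress(192, 168, 1, 50);   // machine running Unreal
      const uint16_t oscPort = 8000;

      void setup() {
        pinMode(PIN_A, INPUT_PULLUP);
        pinMode(PIN_B, INPUT_PULLUP);
        attachInterrupt(digitalPinToInterrupt(PIN_A), onEncoderA, CHANGE);

        WiFi.begin("your-ssid", "your-password");
        while (WiFi.status() != WL_CONNECTED) delay(100);
        Udp.begin(9000);                       // local port for the UDP socket
      }

      void loop() {
        OSCMessage msg("/followfocus/ticks");
        msg.add((int32_t)encoderCount);
        Udp.beginPacket(pcAddress, oscPort);
        msg.send(Udp);
        Udp.endPacket();
        msg.empty();
        delay(16);                             // roughly 60 messages per second
      }

      On the UE4 side, the OSC plugin (or any OSC add-on) could receive the tick count and push it through the same lens-mapping curve as any other encoder.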

  • @RLinares22
    @RLinares22 4 years ago

    Do you think we'll get FPV-style goggles for monitoring and immersion instead of looking at a monitor?

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago

      in Cine Tracer? I'm waiting for GPUs to be a bit more powerful. Maybe 3080ti.

  • @creasedaf1s900
    @creasedaf1s900 4 years ago

    I see you finally took your CDB hat! Lol

  • @georges8408
    @georges8408 4 years ago

    What is the difference between the Vive tracker and the Vive motion controller you have used in other videos? Can both of them be used for tracking the camera?

    • @TheGladScientist
      @TheGladScientist 4 years ago

      they are essentially identical, except the controller has more buttons that could be used for other functions, and the tracker is smaller and lighter (and slightly more finicky)

    • @georges8408
      @georges8408 4 years ago

      @@TheGladScientist Thank you so much... I'm trying to understand how to set up and connect all these things, and what I really need.

  • @andrewsharapko
    @andrewsharapko 4 years ago

    Do these trackers work with USB dongles only, or do you need 2 base stations as well?

    • @fernsehdesign
      @fernsehdesign 4 years ago +1

      Andrew Sharapko each tracker needs its own dongle. The Lighthouse base stations are needed for the whole system anyway.

  • @user-wu9no3mx8q
    @user-wu9no3mx8q 3 years ago

    Instead of a camera, can an iPhone be used?

  • @loveparmar1
    @loveparmar1 4 years ago

    Can you please make a video on how to track the Vive in UE4!
    Thank you.

    • @FergyleMedia
      @FergyleMedia 4 years ago

      Agree. Or how to actually set up Action and Axis inputs in Unreal.

  • @Dazzer1234567
    @Dazzer1234567 4 years ago

    Wouldn't it make waaaaaay more sense to use MIDI (Musical Instrument Digital Interface) for this? I mean, attach a physical MIDI encoder to the focus ring? MIDI is very coarse, but it also has pitch bend, which is 14-bit. And you also have MIDI 2.0, which I imagine has higher resolution. You might also want to look into Eurorack gear, as that could be used for capturing real-world movement of anything on your real stage (as control voltage) and converting that to MIDI to drive stuff in UE.

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago

      Most people have gone Arduino. There are a few coming to market from Vanishing Point and LOLED.
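
      For the 14-bit point above: a pitch-bend message carries two 7-bit data bytes, so combining them gives 0..16383 steps instead of a CC's 0..127. A minimal sketch of the decode, not tied to any particular MIDI library:

      #include <cstdint>

      // lsb and msb are the two data bytes of a pitch-bend message (status 0xE0 | channel).
      float PitchBendToNormalized(uint8_t lsb, uint8_t msb)
      {
          uint16_t value14 = (uint16_t)(((msb & 0x7F) << 7) | (lsb & 0x7F));  // 0..16383
          return value14 / 16383.0f;                                          // 0.0 .. 1.0
      }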

  • @paulpierantozzi
    @paulpierantozzi 4 years ago

    Wouldn't it be possible to use the metadata from the digital lens to drive the lens in Unreal? I'm not a developer, but it would be so cool if there was a way to set that up

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago +1

      Yes, Zeiss/Cooke /i data and Canon/Fuji lenses have data ports. But receiving the data and getting it to the computer requires either custom or very expensive I/O boxes. For example, the Arri encoder box is $20K, and getting the data to UE4 would still require custom coding.

    • @paulpierantozzi
      @paulpierantozzi 4 years ago

      @@CinematographyDatabase Bummer. I wonder if we will see an indie solution come out for autofocus lenses like Canon EF. Sort of like the Aputure DEC, but one that hooks up to your computer

  • @Drahoslav_Lysak
    @Drahoslav_Lysak 4 years ago

    cool 😊

  • @MattysEdits
    @MattysEdits 4 years ago

    These 'diary'-style videos where you explain where you're up to would be great if you ended them with a live demonstration of the progress you've made since last time: not pre-recorded B-roll, but a live demo

  •  4 years ago

    Not working anymore in 4.25; you cannot use HTC tracking at the same time as green-screen keyed live material.

  • @jibi_jibs
    @jibi_jibs 4 years ago

    I am wondering, how can you connect your camera display to Unreal Engine on the PC? Do you use something like a Blackmagic UltraStudio Mini card?

  • @zhangcash7501
    @zhangcash7501 4 years ago

    Hi bro, do I need to get the whole HTC Vive kit to make this work?

  • @FalconWingz88
    @FalconWingz88 4 years ago

    Why don't you gaffer-tape the Vive tracker to the follow focus if it keeps falling off?

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago

      If I was planning on using this, I'd either Command Strip it lol or make a 1/4"-20 to follow focus adapter. But I've decided to move on from this technique.

  • @chuck2501
    @chuck2501 4 years ago

    Could you use this with a UST projector in place of a green screen?

  • @shekiba.646
    @shekiba.646 4 years ago

    I'm curious and want to ask you: how many trackers, 4? And 2 or 4 base station 2.0s for a full room? Plugged into UE4? Can you show me a video so I'll know what to buy? Let me know. Thanks

  • @joshuamccrary9878
    @joshuamccrary9878 4 years ago +1

    Have you looked at the PL-mount Zeiss CP.3s with eXtended Data? They can provide most of the necessary data in real time (focal length, focus distance, T-stop, depth of field/hyperfocal distance, horizontal field of view, entrance pupil position, and some distortion and shading data). I've wondered for a long time if there's a way to feed the data from them into Unreal live, but I haven't seen anyone try it at a small scale. If it's doable, they seem like a really helpful indie-level solution to lens mapping for stuff like this. So far I've only heard of higher-end systems that have integrated them.

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago +1

      The only solution I know of is either custom or the Arri box, which is like $20K. There are some people looking into the best solutions. Canon, Zeiss/Cooke, and Fujinon are all on top of this encoding, but, like you said, it's still expensive to get the data. I'm confident the community will figure something out.

  • @davida.c.9228
    @davida.c.9228 4 years ago

    I need cinematography breakdowns again :(

    • @CinematographyDatabase
      @CinematographyDatabase  4 years ago +1

      Follow Indy Mogul and Buff Needs and Cooke Optics. They do great ones with the actual DP who shot the project 🍻

    • @davida.c.9228
      @davida.c.9228 4 years ago

      @@CinematographyDatabase I am following them, but the breakdowns you used to do were the best :D

  • @henriquecariboni1357
    @henriquecariboni1357 3 years ago

    Can you share the BP?

  • @SumarlinPutra
    @SumarlinPutra 4 years ago

    Would you like to add an UNDO button...

  • @gondezmilenial3669
    @gondezmilenial3669 3 years ago

    You need a big LED monitor for the background; it's so bad with a green screen, which doesn't have ambient light interaction

  • @blender_wiki
    @blender_wiki 4 years ago

    An Arduino with a rotary optical encoder is very easy to build, and it's more precise and much cheaper

  • @zenbauhaus1345
    @zenbauhaus1345 3 years ago

    7:38 you have to map sh*t for every aperture

  • @dilithium72
    @dilithium72 4 years ago

    It's a shame it doesn't put the key out of focus. It really breaks the illusion, seeing the live feed blurred but the key still crisp.

  • @yang5616
    @yang5616 4 years ago

    It would be fun if the number of trackers could be opened up.

  • @dominicdoddy8605
    @dominicdoddy8605 4 years ago

    Microsoft Surface Dial?

  • @ddavidebor
    @ddavidebor 4 years ago +1

    I'm an electronics engineer; I could build something like that pretty easily

  • @damienradymskylanders3421
    @damienradymskylanders3421 2 years ago

    Do you know you might not need to go that route? There's a new solution out that can pair about three

    • @arumugamjega
      @arumugamjega 2 years ago

      What is that tech?

    • @damienradymskylanders3421
      @damienradymskylanders3421 2 years ago

      Tundra Labs created their own dongle that can pair about three Vive trackers (actually I'm thinking about seven), and it also works with their own trackers

  • @ashoakenfold
    @ashoakenfold 4 years ago

    Do any of these cameras support OSC? That would make communication between camera functions and UE4 trivial. opensoundcontrol.org/introduction-osc

  • @odomic
    @odomic 2 years ago

    Talk show

  • @vivektyagi6848
    @vivektyagi6848 4 years ago

    Quite the research.