Thanks for the shout-out (and you did a great job pronouncing my name)! Yeah, I really like your solution to drive the second tracker with a follow focus: really smart. Looking forward to more, and happy to be in the virtual production group!
Your journey through this new technology is fascinating!
Keep it up. Can't wait to join you myself.
Thank you grand wizzard, for leading the way for us indie folk
Man, I recognized you because I know your meta human on Linkedin...Matt you are one amazing human being, bravo!
so awesome man. answering all the questions I needed to get into this
Man, it's just awesome to see how much you've developed so far. I still remember, only a couple of years ago, when you were thinking about how cool all of this would be in your Cinematography Breakdowns. Congrats Matt, keep doing it!
Love following your progress on this subject. I'm part of the Facebook group and there is so much great stuff in there for folks interested in this process. Keep up the great work!
Even though I don't have a VR system, this is so interesting to watch!
I get so excited every time you upload a video! You are a huge inspiration for me.
So looking forward to your virtual production tool! Great experiment.
Awesome! Sounds like a lot of fun and potential.
Just bought Cine Tracer, love your virtual production ideas.
Started on this and it looks like a good project to learn from. Super thanks, and I hope you're able to make more in-depth tutorials!
Would really be interested in seeing how this works with the zoom too. Probably not good without lots of calibration, but would be interesting to see just how "off" it looks without it.
It could be done, but I would want to calculate the FOV using checker charts and Nuke. I don’t trust the lens markings for focal length. If it’s just MR, it would be fine, but for AR we would need to be more precise.
@@CinematographyDatabase Hi! Could you share your calculation? I'm having trouble getting the focal length in Unreal or disguise to match the real lens focal length.
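This isn't the calculation from the video (which relied on checker charts solved in Nuke), but the basic pinhole relation behind any of these matches can be sketched. The sensor width and chart numbers below are example values, not measurements from the setup shown:

```python
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm):
    """Horizontal field of view of an ideal pinhole camera, in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

def effective_focal_mm(sensor_width_mm, target_width_mm, target_distance_mm):
    """Estimate the real focal length from a chart of known width that exactly
    fills the frame horizontally at a known distance. This ignores lens
    distortion, which is why a proper checker-chart solve is more trustworthy
    than the engraved focal length."""
    return sensor_width_mm * target_distance_mm / target_width_mm

# A lens marked "35mm" on a Super 35 sensor (~24.89 mm wide):
fov = horizontal_fov_deg(35.0, 24.89)  # roughly 39 degrees
```

Feeding the CineCamera the measured effective focal length, rather than the engraved one, is what keeps the virtual FOV honest against the real lens.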
Did you make a blueprint available for this? I'd love to have a play with focus mapping.
is there an updated method to this?
Dude, you're so awesome! My coder friend and I are going to dive into virtual production! So hyped! Thanks for all your work!
awesome, fascinating! we want it :)
Thanks for sharing Your knowledge!
Call it "VIP Unreal: Virtual Indie Productions for Unreal Engine"
Thanks man for your sharing
and for your Being
Go man Go!
Please share the blueprint you used to calibrate the second Vive tracker.
Very cool test! I'm REALLY happy to hear though that I won't have to do this in my own studio lol
It's so interesting to watch you evolve and solve all kinds of problems to conquer a new feature!
I noticed something that might make this look even more natural. What do you think about blurring the edges of the keyed object as it gets increasingly out of focus? I don't know if this would look strange, because you'd probably have to "shrink" the whole key to actually have data to blur in the first place.
Yeah, I was considering something like that. I haven’t tried to adjust the key live, but it looks like something like that would be necessary.
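One hedged way to drive that edge blur: compute the thin-lens circle of confusion for the keyed subject's distance and convert it to a pixel radius for blurring (and slightly shrinking) the matte. The function names and the Super 35 defaults below are illustrative, not something from the video:

```python
def coc_mm(f_mm, f_stop, focus_dist_mm, subject_dist_mm):
    """Thin-lens circle-of-confusion diameter on the sensor, in mm.
    Zero when the subject sits exactly on the focus plane."""
    aperture_mm = f_mm / f_stop  # entrance pupil diameter
    return abs(aperture_mm * f_mm * (subject_dist_mm - focus_dist_mm)
               / (subject_dist_mm * (focus_dist_mm - f_mm)))

def key_blur_px(coc, sensor_width_mm=24.89, image_width_px=1920):
    """Convert a CoC on the sensor into a blur radius in output pixels."""
    return coc / sensor_width_mm * image_width_px

# 50mm lens at T2.8 focused at 2 m, subject drifting out to 3 m:
blur = key_blur_px(coc_mm(50, 2.8, 2000, 3000))
```

The focus distance is already being tracked for the virtual camera here, so the same value could in principle feed the matte blur in real time.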
Hi,
Please share the lens mapping you used and also the ue4 blueprints.Thanks in advance
That's a great hack!
Here's an idea that popped into my head:
would it be possible to do this with an Arduino microcontroller?
You could use a potentiometer instead of the second Vive tracker and read only a rotation value.
It would mean parsing the data inside the Arduino and sending it to UE, which I don't really know much about, but I think you would get much quicker and more accurate results. And you could do the same for aperture.
Thanks for the video! Can't wait to see the next idea :)
EDIT: The Arduino can open a serial connection, which can be read from C++, so I guess it could work!
That definitely works, and there are some people in the community making Arduino/rotary encoders. Getting the data to UE4 is pretty straightforward from there too. We will see if any of them turn into actual products people can buy. That's my primary interest, vs. one-off custom builds.
@@CinematographyDatabase thanks for the reply
But once that's done, what are the next steps/constraints to get the best results?
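For the host side of the Arduino/potentiometer idea above, most of the work is parsing the serial stream defensively. The `FOCUS:<adc>` line format here is an assumption, not an existing protocol; a library such as pySerial would supply the raw lines:

```python
def parse_focus_line(line):
    """Parse one line of an assumed Arduino serial protocol, e.g. 'FOCUS:512'
    from a 10-bit ADC, into a normalized 0.0..1.0 focus value.
    Returns None for malformed lines so electrical noise is simply skipped."""
    try:
        key, raw = line.strip().split(":")
    except ValueError:
        return None
    if key != "FOCUS":
        return None
    try:
        return max(0, min(1023, int(raw))) / 1023.0
    except ValueError:
        return None
```

Returning `None` instead of raising keeps a glitchy frame from stalling the whole read loop, which matters when the value drives a live camera.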
Asking you everywhere :) ..
Please show us how to map the focal distance to the Vive tracker rotation. Please.
Thanks !
Amazing work Matt! I was wondering if you've considered using a rotary encoder and an Arduino Micro to measure the rotation of the focus/zoom rings, then send the info via USB/WiFi, using it as MIDI. No need to customize every single lens: just set your zero point, then rotate the lens rings to the stop points, and you're able to map any lens very quickly. And it's way smaller than a tracking puck.
You could possibly build the rotary encoder directly into the follow focus hardware 🤔
There are several new products coming out that have done just that. I already have the Vanishing Point Viper and will test it soon, and there are more affordable Arduino-based encoders coming soon.
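The "set your zero point, stop at the engraved marks" workflow described above boils down to piecewise-linear interpolation between calibration pairs. A minimal sketch; the encoder counts and distances are made up for one hypothetical lens:

```python
from bisect import bisect_right

def make_lens_map(marks):
    """marks: (encoder_value, focus_distance_m) pairs captured by stopping
    the ring at each engraved distance mark. Returns a function that
    interpolates linearly between marks and clamps at both ends."""
    marks = sorted(marks)
    xs = [m[0] for m in marks]
    ys = [m[1] for m in marks]

    def lookup(x):
        if x <= xs[0]:
            return ys[0]
        if x >= xs[-1]:
            return ys[-1]
        i = bisect_right(xs, x)
        t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
        return ys[i - 1] + t * (ys[i] - ys[i - 1])

    return lookup

# Hypothetical 50mm lens: close focus 0.45 m, "infinity" stored as 1000 m.
lens_50mm = make_lens_map([(0, 0.45), (400, 1.0), (700, 2.0), (1023, 1000.0)])
```

More calibration marks between the extremes would capture the non-linear throw of a real focus ring; the clamping means a lens can be swapped without the map ever returning out-of-range distances.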
Thanks, this video is enlightening.
please share BP
Brilliant! Really!!
Great video. Do you need the Vive Pro, or can you just use the base stations without buying the headset? Thanks!
It's possible to hack the Vive so you only need the Base Stations and Tracker. But I recommend having a full Vive Pro in case that hack stops working one day.
@@CinematographyDatabase That's awesome thanks!
It's always good to keep the Vive headset with you, as you can test in VR mode in Unreal Engine. :)
@@saish24 Great, thank you!
Super interesting. What lenses do you recommend for this setup?
Hey Matt, what are you mounting your Vive trackers with? Also, has anything new come to light since this 2-year-old video? :)
Yeah, people use the LOLED lens encoder now at this level. This was just a fun experiment; it wouldn't be reliable for production.
How about using OpenCV on a Raspberry Pi and just tracking the focus ring visually...? If done right, you could get it to track both focus and zoom, and possibly even automatically recognise which lens you've mounted.
My question is: how do you connect a DSLR camera to Unreal Engine and the Vive tracker?
I want to know what the code with focus looks like. Can you tell me? Thank you very much!
Is there an option to use your consultation services for VR production. Thank you.
Can you make a tutorial on how you did it, please?
Hello! I really like your videos! Thanks for sharing!! Keep it up! I have a question: to connect the Vive tracker to a camera with Unreal and only capture the movement, is it necessary to connect it with the HTC base stations, or does the Vive tracker work alone without another accessory? I need to capture only camera movement in Unreal. Thanks!!
Hey, you can set up genlock and make a transform array (cache) to delay the raw Vive transform per frame, then get a perfect sync.
yeah I'm using that technique in this demo, but it still feels a bit off.
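The transform-cache idea above can be sketched as a fixed-length ring buffer: push the newest raw tracker sample each genlocked frame and read back the one from N frames ago. The class and method names are illustrative, not from any engine API:

```python
from collections import deque

class TransformDelay:
    """Delay raw tracker transforms by a fixed number of frames so they
    line up with the later-arriving genlocked video frames."""

    def __init__(self, delay_frames):
        self._buf = deque(maxlen=delay_frames + 1)

    def push(self, transform):
        """Store the newest sample; return the sample from delay_frames ago
        (or the oldest available while the buffer is still filling)."""
        self._buf.append(transform)
        return self._buf[0]
```

With `delay_frames=2`, the output starts trailing the input by exactly two pushes once the buffer fills, which is the per-frame delay the comment describes.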
THANK YOU for this video. Just became a follower and a subscriber too. Looking forward to seeing more videos. I plan to go fully into digital cinematography. #HugeFanFromNigeria
Hi, what is the name of the product used to mount the HTC tracker on the camera?
Most entertaining content during this pandemic. Is there no way to use the Blackmagic camera SDK to get focus data to the computer?
We are looking for options, none are commercially available that I’m aware of.
How do you make a good-looking edge when shooting an out-of-focus object in front of a green screen? For example, this one at 10:22 looks unnatural.
Hey Matt have you ever tried doing this in stereoscopic?
So I'm seeing a few videos on this, and all of them chose Unreal. Is there a reason? Can this be done with Unity?
Hi man, great video. Can you tell me how to make a green screen background like yours?
Thanks a lot dude! Really excited to see how far we can get 😂👌 Virtual production tools for UE4 should definitely come out and make you some cash. You have to be rewarded for your work, and I'll be the first to buy it. Because you still haven't found a way to implement 3D objects in Cine Tracer 😂😂😂
Why don't you use MIDI to control the focus? Using a MIDI CC on a rotary dial attached to the focus ring.
Is this only functional for Unreal or can it be used for Blender as well?
Blender can't do this natively, but someone could technically write all of these plugins and attempt to use Eevee to render it in real time. But it would be a massive rewrite of the core of Blender.
Can you do what Ian Hubert does in Unreal Engine? His way of compositing is really interesting and I can't wrap my head around it just yet.
This is totally awesome! I downloaded Unreal today and want to pass the quarantine shooting mixed reality! As an AC, the "hack" that comes to mind is to bind two wireless FF motors to the same channel and put the vive tracker on one that's off-camera. It seems like you could then avoid needing to subtract camera movement from the focus tracker. Do you think that could work Matt?
we will have a lovely follow focus hardware solution in the near future
Do you need the HTC Vive headset for this? Or could it work with just 2 trackers and 2 base stations?
What about when productions shoot Virtual Production with LED panels? Isn’t the lens focusing just making the background on the panel in-focus/out-of-focus based on what setting the lens is on?
focus tracking is less important on an LED VP shoot, but it's normally tracked anyway to account for any lens breathing that would then be reflected in the virtual frustum.
Hello! My biggest question is: would the focus still be synced if I use a Canon C500 Mark II in autofocus?
You could add an easy and cheap setup using a NodeMCU (or ESP32) dev board with a simple rotary encoder, or even a potentiometer, and a simple protocol like OSC, which has a simple add-on for Unreal. That way you can send data from the encoder to Unreal over WiFi in real time. It would cost something like $5 per unit, and it's super simple to program. If you need, I can mock up a setup and simple firmware for the units.
There are some people/companies building similar gear. I’m looking for a commercial grade solution that gets certified and distributed as a proper product. If you make a company and do those steps I’d love to test it.
@@CinematographyDatabase Nah, I'm just suggesting a DIY solution for those who want to mock up something quickly with cheap parts available everywhere.
You think we'll get goggles for FPV for monitoring and immersion instead of looking at a monitor?
in Cine Tracer? I'm waiting for GPUs to be a bit more powerful. Maybe 3080ti.
I see you finally took your CDB hat ! Lol
What is the difference between the Vive tracker and the Vive motion controller you have used in other videos? Can both of them be used for tracking a camera?
they are essentially identical, except the controller has more buttons that could be used for other functions, and the tracker is smaller and lighter (and slightly more finicky)
@@TheGladScientist Thank you so much... I'm trying to understand how to set up and connect all these things, and what I really need.
Do these trackers work with USB dongles only, or do you need the 2 base stations as well?
Andrew Sharapko each tracker needs its own dongle. The Lighthouse base stations are needed for the whole system anyway.
Instead of a camera, could an iPhone be used?
Can you please make a video on how to track the vive in UE4!
Thank you.
Agree. Or, how to actually setup Action and Axis inputs in Unreal.
Wouldn't it make waaaaaay more sense to use MIDI (Musical Instrument Digital Interface) for this? I mean, attach a physical MIDI encoder to the focus ring? MIDI is very coarse, but it also has pitch bend, which is 14-bit. And there's also MIDI 2.0, which I imagine has higher resolution. You might also want to look into Eurorack gear, as it could be used for capturing real-world movement of anything on your real stage (as control voltage) and converting that to MIDI to drive stuff in UE.
Most people have gone with Arduino. There are a few coming to market from Vanishing Point and LOLED.
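On the MIDI point above: pitch bend really does carry 14 bits, packed as two 7-bit data bytes after the status byte. A sketch of the encoding; the helper name is made up, and actually sending the bytes to a port would need a MIDI library:

```python
def pitch_bend_bytes(value, channel=0):
    """Encode a 14-bit value (0..16383) as a 3-byte MIDI pitch-bend message:
    status byte 0xE0 | channel, then the 7-bit LSB, then the 7-bit MSB."""
    value = max(0, min(16383, value))
    return bytes([0xE0 | (channel & 0x0F), value & 0x7F, (value >> 7) & 0x7F])

# Center position (value 8192) on channel 0 encodes as 0xE0, 0x00, 0x40.
msg = pitch_bend_bytes(8192)
```

16384 steps across a focus ring's throw is far finer than a 7-bit CC's 128 steps, which is the commenter's argument for pitch bend over plain CC messages.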
wouldn't it be possible to use the metadata from the digital lens to drive the lens in unreal? Not a developer, but it would be so cool if there was a way to set that up
Yes, Zeiss/Cooke /i Data and Canon/Fuji lenses have data ports. But receiving the data and getting it to the computer requires either custom or very expensive I/O boxes. The Arri encoder box, for example, is $20K, and getting the data to UE4 would still require custom coding.
@@CinematographyDatabase bummer, I wonder if we will see an indie solution come out for autofocus lenses like canon EF. Sort of like the Aputure DEC but that hooks up to your computer
cool 😊
these 'diary' style videos where you explain where you're up to would be great if you ended them with a live demonstration of the progress you've made since last time, not pre-recorded B-roll but a live demo
Not working anymore in 4.25: I can't use HTC tracking at the same time as green-screen keyed live material.
I am wondering, how can you connect your camera output to Unreal Engine on the PC? Do you use something like a Blackmagic UltraStudio Mini card?
I use a Blackmagic Design Decklink 8K Pro and SDI.
Hi bro, do I need to get the whole HTC Vive to make this work?
Why don't you gaffer-tape the Vive tracker to the follow focus if it keeps falling off?
If I was planning on using this, I'd either Command Strip it lol, or make a 1/4"-20 to follow focus adapter. But I've decided to move on from this technique.
could you use this with a UST projector in place of a green screen?
Yes
I'm curious, let me ask you: would 4 trackers work? And 2 or 4 Base Station 2.0 units for a full room, plugged into UE4? Can you show a video? Then I'll know what to buy. Let me know, thanks.
Have you looked at the PL-mount Zeiss CP.3s with eXtended Data? They can provide most of the necessary data in real time (focal length, focus distance, T-stop, depth of field/hyperfocal distance, horizontal field of view, entrance pupil position, and some distortion and shading data). I've wondered for a long time if there's a way to feed the data from them into Unreal live, but I haven't seen anyone try it at small scale. If it's doable, they seem like a really helpful indie-level solution to lens mapping for stuff like this. So far I've only heard of higher-end systems that have integrated them.
The only solution I know of is either custom or the Arri box, which is like $20K. There are some people looking into the best solutions. Canon, Zeiss/Cooke, and Fujinon are all on top of this encoding, but like you said, it's still expensive to get the data. I'm confident the community will figure something out.
I need cinematography breakdowns again :(
Follow Indy Mogul, Buff Needs, and Cooke Optics. They do great breakdowns with the actual DP who shot the project 🍻
@@CinematographyDatabase I am following them, but the breakdowns you used to do were the best :D
Can you share the BP?
Would you like to add an UNDO button...
You need a big LED monitor for the background; it looks so bad with a green screen since there's no ambient light interaction.
An Arduino with a rotary optical encoder is very easy to build, and it's more precise and much cheaper.
7:38 you have to map sh*t for every aperture
It's a shame it doesn't put the key out of focus. It really breaks the illusion to see the live feed blurred while the key stays crisp.
It would be fun if they opened up the number of trackers you can use.
microsoft surface dial?
I own one, but don’t know how to get it talking to UE4 yet.
I'm an electronic engineer I could build something like that pretty easily
You know, you might not need a dongle; there's a new solution out that can pair about three.
What is that tech?
Tundra Labs created their own dongle that can pair about three Vive trackers (actually, I think about seven), and it also works with their own trackers.
Do any of these these cameras support OSC? That would make communication between camera functions and UE4 trivial. opensoundcontrol.org/introduction-osc
Talk show
Quite the research.