In the final comp you will see a bit of noise/bounce in the footage. I believe this is because the Vive Base Stations didn't both have a clear view of the controller the way I set up the set. On top of that, my Composure settings were a bit rushed and I think that may have added a bit of noise as well.
Also, the Base Stations don't like lights pointed at them, so I need to control the on-set lighting much better in the future. And I can also try adding more Base Stations for better coverage.
Yep! We have an ATEM from Blackmagic and we have timecode set up; after that the sync is much better. HTC trackers have big potential, but the software was made for fast reaction, so the prediction algorithms make this not so accurate. I hope Vive is now working on a new driver for this new virtual production market :)
Cinematography Database 4 Base Stations should cut down on that jitter, and a dedicated "Vive Tracker" would be great: it's easier to mount on a rig and it'll free the controllers up to do other things. Putting a similar setup together with a buddy of mine. Exciting times!
Unfortunately, the Vive trackers do jitter a bit. Even with careful setup you will see some when the camera is stationary. You can see an example here: ruclips.net/video/H6M5PleSJD4/видео.html. There is also a link to a small Unreal project that lets you dump tracker stats to a spreadsheet for analysis. Handy if you are trying to fine-tune your setup and want to see if a change really reduced the jitter.
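For anyone wondering what that spreadsheet analysis might look like: a quick way to see whether a setup change helped is to compute the per-axis standard deviation of a stationary tracker's logged positions. Here's a minimal sketch, assuming a CSV log with hypothetical `x`, `y`, `z` columns (the exact column names will depend on how your project dumps the data):

```python
import csv
import statistics

def jitter_stats(path):
    """Per-axis standard deviation of a stationary tracker's logged
    positions: a simple, comparable measure of positional jitter."""
    cols = {"x": [], "y": [], "z": []}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for axis in cols:
                cols[axis].append(float(row[axis]))
    return {axis: statistics.stdev(vals) for axis, vals in cols.items()}
```

Lower numbers after moving a base station (or shielding it from lights) mean the change actually reduced jitter, rather than just looking better by eye.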
Greg Corson yeah I saw your video. I’m going to keep trying different setups and experimenting. Have you used the Vanishing Point solution?
Yes please do a full tutorial.
YES PLEASE!
Would love to see the nitty gritty. Would also love to see if, say, an OLED monitor could work as a “window”, similar to the Mandalorian method but on a small scale, like a window.
yeah I'm definitely going to try a Mini Mando set with a TV soon.
The screen needs to be far enough away that the camera can't see the pixels/LEDs, so your OLED might not be big enough.
Here in Nebraska, most of my paying work is stagehand related. I have local partners I can potentially rent some pretty nice live event video walls from. So I am super stoked to see how I can potentially incorporate this workflow for live events & music performances!
Especially when I pair it with the projection mapping equipment I am gonna be picking up. OMG I'm so pumped right now.
This absolutely does work. I know a guy at a movie studio who did a setup with 4 regular TVs as a proof of concept before they got a big video wall, and they say it worked great! If you do see pixels you can just set up your depth of field so the background is a bit out of focus to cover it.
@@CinematographyDatabase Yes pls
Wow, this IS a game changer! Full tutorial please!
This is incredibly cool! Full tutorial?? yes please!
Don’t get your hope high.
@@valcriston Hahah, it's been 11 months since this comment, hehe. It's not about hopes, it's about "did it happen?" I'd have to go back and look :-)
Amen, please Matt more... keep em coming you're on fire.
you are the OG
What you are doing here is truly amazing Matt. I love how you are trailblazing the process for us Indie folks 😀
It's crazy how accessible this technology is.
Getting that simulcam to really lock in is, I think, the make or break for the project. Listening to what you were saying about 'where is the camera' and how that position would change based on lens and zoom is a very interesting, complex problem. I feel like lens manufacturers have all the info you would need, but getting that from them is the tricky part. Super exciting to follow your indie approach. I look forward to seeing it progress and learning from your hours of trial and error.
Awesome Matt! Appreciate you sharing with us
YES! Please do a full rundown. My basement is definitely becoming a virtual studio. Thank you so much for pushing the envelope for us indies.
Just about to set this up and found your page. Awesome work, thank you!!!!! Coming from video, so lots to learn.
Yes!! Count me in on the in-depth tutorial!!!🤓
This is so monumental! I wish I had all the equipment to keep up with you. It was easy with C4D, but this is a whole new monster and I can't wait to play catch-up with you and this new technology as soon as I can :)
If you come forward with the right attitude, you'd be amazed at the businesses/studios you can partner with in your region.
Loving the virtual production content! Would absolutely LOVE to see a long and involved step by step process!!!
Your last week of uploads have been insane. You have really blown my mind when it comes to whats possible with virtual production. Keep it up mate!
Yes! Long and involved tutorial please! Awesome videos!!!
Yo! i am in with you! I just ordered my Decklink 8K Pro and HTC Vive Pro. Thank you Mat!!
A full step-by-step setup walkthrough would be wonderful, as there's not much out there on this process. We know that's not an easy thing to put together, but it seems like a lot of people are really hoping to find something like that. Would be great to have it come from someone like you who, like us, is trying to do this on an 'indie' level. :) Thank you for all the content you are putting out!
You’ve opened my eyes to a whole new world for me, or a virtual one I should say... very interesting and inspiring. I’m going to try and dive head first into 3D, hope to survive 😄🤞
Amazing!!! Thought I'd never be able to get into this!
This is revolutionary, yes I would love to see more!
Awesome work. Would love to see a budget projected real-time background like you did with Unreal on the motorcycle shoot
We, the audience, would appreciate a deep, detailed tutorial 🙏🏻 :)
I just found this channel and I have to say your videos are amazing! This is exactly what I was looking for and wanted to know! Thank you very much for this content!
Hi Matt, great tut! You can use Vive trackers for a better experience. Using one, you can measure the distance from the mount position to the camera sensor plane. Add that to your virtual camera to get the best position.
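For what it's worth, the math behind that offset is just rotating the measured mount-to-sensor vector into world space before adding it to the tracked position. A hedged sketch of the idea (names are illustrative, not Unreal's API; in UE4 you would typically express this as a relative transform on the camera component instead):

```python
def camera_from_tracker(tracker_pos, tracker_rot, sensor_offset):
    """World position of the camera sensor, given the tracker's world
    position, its 3x3 world rotation matrix, and the mount-to-sensor
    offset measured on the rig in the tracker's local frame."""
    # Rotate the local offset into world space...
    rotated = [sum(tracker_rot[i][j] * sensor_offset[j] for j in range(3))
               for i in range(3)]
    # ...then translate by the tracker's world position.
    return [tracker_pos[i] + rotated[i] for i in range(3)]
```

The key point is that the offset must rotate with the tracker: a fixed world-space offset would only be correct for one camera orientation.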
Great work!! Would love to see an in-depth tutorial!! At school, we have a Mo-Sys jib and a green studio that was used with the Brainstorm virtual studio (ugh, which is now outdated), but we will be getting some new equipment from Blackmagic Design. Would love to see this working with Unreal Engine!! I know there is a plugin for Mo-Sys in Unreal but would love to see the workflow!!! Keep up the good work!
I would love to see the long version of how you do everything!
This is great! Simulcam however was combining virtual camera and actor mocap. What you're doing here is incredibly impressive at indie levels but it's called virtual set. The bigger term virtual production, usually refers to using all these technologies to break down barriers between previs, shooting and post. This is a big step towards opening this up and bravo!
Ron Fischer oh really? You would know! I may have massively just “re branded” the term to the next generation 😂
@@CinematographyDatabase But we shall not confuse words with deeds 😁. Rock on.
This is brilliant! If you can, please share the details! It would be great to learn the whole stack!
I was on a shoot last week and wished I could do this... now I can try! Thanks for this
I was trained to film on 35MM Film Reel Cameras. I was taught to edit by physically SPLICING actual reels. This...is magic to me.
Awesome work! Looking forward to the full break down :)
You're killing it Matt! Keep it up!
This is great! This is going to be the way many amateurs, schools and small companies are going to break into VP. Please could you put together the longer tutorial that demonstrates how you got the camera tracking working with this test scene. I think a lot of people will find this helpful and maybe even use your example as a proof of concept to justify purchasing the equipment to try this.
This is super clean dude!
Never saw you more excited 😆
You are awesome and this is the future!
Thank you so much for this - I'm working on a streaming series and this is soooooo helpful!
Yes I absolutely want to see a full step tutorial!
This is actually mind-blowing! You are a genius, dude, haha.
Yes, please....a full tutorial!! :)
Long and involved video would be amazing!
We want the long and detailed version!!!!
I just now realized that you say at the beginning "THIS IS THE WAY" xD
Would love to see a full tutorial on how you set this up. I have been wanting to do this since I bought my Vive.
Matt thanks a lot for posting these series, it’s an invaluable resource! I’ve been a longtime follower and the virtual production stuff is very intriguing! I am trying to get started and test some of these ideas and I was wondering if you have any resources for doing a similar setup but with projectors similar to what you had done at that motorcycle shoot?
Yes please do the full tutorial!
"This is the way" 😁
Hi Matt, would appreciate it if we could get a step-by-step video tutorial, as the hardware connectivity seems a bit complicated. Maybe a Blueprint document would also do. Thanks a ton for sharing the pipeline and process behind virtual production. It's a major game changer for the visual effects industry in terms of both cost and time. Thank you once again. Good day!!
A full tutorial would be amazing!!!!!!
Full tutorial would be great! Like really really great
Long involved setup yes!
Very interesting, thanks for sharing... this is what I am looking for... good job!
you're doing great bro! 🙌🏽✊🏾
Do we want to see a detailed tutorial? Sure man, this is amazing and I will appreciate it a lot; I am sure many others will too, so please do it, Matt. You are awesome, thanks for all you do, and Cinetracer as well. You are inspirational, man!
Thank you for your vlog! Do you use 2.0 base stations?
Long and involved steps please! You're awesome!
Show the long and detailed steps on how you do the Composure... I think this is amazing; there are a lot of creative things you could do with this technique.
Yes, please do a step-by-step video or course; I will buy it. A step-by-step guide really makes sense... thanks!
Lux Machina has code they wrote that drives Arri LED panels to change the lights along with the virtual set's lighting. It's trippy to see in person.
Yeah I worked on set with them for a month using it. That is essentially now built into UE4 4.25 and more will be added in future releases. I'll be showing it here.
@@CinematographyDatabase Awesome
amazing work. Part 2: makeshift DIY LED Walls?:)
Hello. Just getting into virtual production at work and have a video wall and Vive kit to test with. I'm basically doing an indie setup, but am struggling to get the basics, like lining up the tracked space and getting the scale and position of the real-world screen set up (fixed lens, no zoom yet!). Are there any tutorials covering these basic essentials yet? I'm using Unity at the moment just because I'm familiar with doing architectural and training visuals, but I'm also happy to use Unreal if it's any easier to get up and running. My scenes never quite seem to match the perspective, and they skate around or look the wrong scale, because I think I have to scale and move the virtual camera and scene for the screen size, etc. My mind is blown right now after a few days doing a rough setup. Absolutely loving it, but there's only me really thinking about it at work, so it's difficult on my own!
This is amazing!!!
thanks for the great video
You can do the same thing without paying a dime.
1. Create a 3D environment in AE with walls. Use a blend mode over the key footage from the 3D walls/lights in a comp. With some little adjustments, when you change the light or background, the key footage will automatically change in real time. So, when you rotate the walls, you will see light bouncing over the key footage in real time. You can make it even more real if you create an emboss of the key footage that picks up the lights from the walls, giving the light some 3D FX. Gotta use the imagination when it comes to FX.
Would LOVE to see every single step broken down. I know that would be a lot of work for you, but would really appreciate it and I would love you forever
I was looking at how to do this with Blender and a cell phone, but this will work well.
I would love to see a long in depth step by step on how to do this!
Hey there, I am loving your series on virtual production! I am new to this and I wanted to have 2 cameras so I can film different angles. Is it possible to have two cameras inside UE4 outputting in real time like I see in your videos, or is it limited to one? Also, how do you feed the real-time output to a different monitor so I can preview it in real time?
Amazing stuff 🤟
nice. thanks!
Thank you for a good clip. I want to know: after finishing this step, how should I use Sequence Recorder or Take Recorder, and what steps must be taken? I'd like you to make a clip on this, please.
Definitely do the step by step.
Thanks!! Two questions, but with the same purpose: can I use the Blackmagic UltraStudio Recorder 3G instead of the DeckLink 8K as the video source input on a laptop? Can the Blackmagic input plugin identify it? Thanks for the knowledge, keep going please!
Hello sir, I loved your videos, they are great and well explained. Please can you also make a tutorial on how to set up the camera and connect it to the PC, the whole setup?
I would love to see a complete compositing tutorial which includes tracking real actors and compositing them into the scene and finally exporting.
Hi Matt, question here. Do you think a 4K laser projector could work as a budget-friendly option? An LED wall at home would cost a lot for a home setup. And you can skip the green screen.
DUDE, I've been watching your videos and this one has me super excited, man! I thought I subscribed a while back (must be on another account), but I had to subscribe after this!!! How well must a DP know Unreal Engine? I'm trying to learn it now on Udemy (Learn C++/Unreal Engine); any advice on the depth of knowledge needed to really be decent at this setup?
Would this be possible through HDMI? I'd be very keen to see if the same thing can be achieved with a camera like an a7 III. Also, I've seen people do this with an iPhone instead of a Vive to record the motion data. Do you think there's potential in that for indie production (even if movements aren't as accurate as with a Vive)?
Just a quick question. Can we use this for Still Photography instead of Film? If it's possible, I would love to know.
I saw a video with a similar system that was very expensive but still used Vive trackers. Even with all the effort and money, they still had issues with the virtual and physical cameras desyncing occasionally, which would look like sliding or bad camera tracking in VFX.
I think a better way to do it would be to make linear motion tracks for the XYZ axes and move the physical camera via the software, the way a CNC machine operates, or with IK and a robotic arm. That should make the tracking more consistent.
Or you could use optical encoders to tell the software the position of the physical camera and move the virtual camera to match.
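The encoder idea above is appealing because the conversion from raw counts to position is deterministic arithmetic, with no prediction involved. A toy sketch under stated assumptions (a rotary encoder on a belt- or screw-driven slider; the function and parameter names are made up for illustration):

```python
def encoder_position_mm(counts, counts_per_rev, mm_per_rev):
    """Linear travel along a motion track, computed from raw rotary
    encoder counts, the encoder's counts per revolution, and the
    drive's millimetres of travel per revolution."""
    return counts * mm_per_rev / counts_per_rev
```

Because the same count always maps to the same position, the virtual camera driven from this value cannot drift the way a predicted pose can.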
Likes likes baby. You are my jam right now over at Posterposterous. I try to have the same values on Postposterous. haha!!
Hey, I'm a beginner. Could you make a video tutorial for us from beginning to end with all the steps, so we can understand it?
Very interesting, how are you dealing with lens distortion?
Very impressive results! Looking forward to a more detailed tutorial. Btw, do you know yet how to deal with the little jitter between FG and BG?
Yeah, I was going to address that. My Vive Base Stations are not in an ideal place to see the Vive controller, so I was getting semi-noisy tracking, and I also believe the Composure key I was pulling was doing something funny; there was a setting I forgot to tweak.
@@CinematographyDatabase Maybe increasing the number of base stations will help improve tracking accuracy.
@@yuppiesband403 yeah I may try that as well.
Is the footage saved (or will it be saved) separately (green clip + background clip), or is it merged into one clip? Separate clips would give the option of later BG replacement or editing.
How do you match the virtual camera position to the CineCamera in the scene? How can we link the virtual camera to the scene camera? I'm using the uremote 2 app for virtual camera tracking and ivcam for the live camera feed. When I hit play, the live composited feed is suspended and the position of the virtual camera is really off, somewhere else in the scene.
Yes, please tell us how this works!
Please do the most detailed stepiest step by step tutorial with sub steps and side steps too
Thank you for this lovely info... but is there a subtle delay between the live camera and the Unreal camera?
In this demo yes, I’ll be testing my new Vive setup soon and hopefully it will have less latency.
@@CinematographyDatabase waiting for your latest updates :)
awesome work sir
Can I do this with a Canon 700D camera and a Vive controller?
Technically yes, assuming that camera has a clean HDMI out.
Insane :)
I wanted to suggest that if people have to go buy additional hardware for it, then so be it. I'd rather have to buy additional hardware than be limited. I'm referring to when you were iffy about how you were going to roll it out. I would say have the best of both worlds: consumer and prosumer in one package.
7:25 and add LED screens to project Unreal Engine on your own Mandalorian set 😜
Hi! Can you save the Unreal camera and then save it to a video, so you can use it in AE later? Also... can you export the camera axes to an AE camera?
Hi, where do I get virtual sets and assets? Which marketplaces?
Amazing! There is one thing that I don't understand: what do I need to connect my camera to a PC and be ready for virtual production?
A Blackmagic capture card or an NDI-output-capable camera (video over Ethernet).
Have you figured out how to record the "COMP" at a high resolution? When I try to use Take Recorder or Sequencer, it only records the CG layer and I never get the media plate to form the full comped shot.
Keep it up! 💪