It’s good to know that one doesn’t need to invest in the $5000 VIVE Mars CamTrack Solution for Virtual Production (The Mars, Rovers x3, VIVE Base Station 2.0 x 2, VIVE Tracker 3.0 x 2)
and instead just get the VIVE Pro 2 Full Kit.
$1,399.00 😊👍
Thank you for the video. Very informative.
Trust me, MARS makes a HUGE difference!!
This was great, thanks - it's one of the only videos that goes through the technical process step by step. One point I would add (I don't think it was available when this video was made) is that HTC now makes a device called 'Mars', which makes tracking much easier with Live Link - and also does not require SteamVR to work.
Great summary. I'm deep-diving into virtual production right now and this helps a lot. UE is super great, but there are so many stepping stones in the way.
This is a great step-by-step on how to get set up. It would be awesome if you could create a tutorial on this process with more depth on why these steps need to be implemented, and perhaps more info on what each of these functions is intended for.
The problem with tutorials in the format you have provided is that they essentially provide us with a fish, but they don’t teach us how to fish. And with a process as intensive as this, learning the how, what, where and why to each step is crucial for troubleshooting.
I am sure many of your viewers would be willing to pay for a more in-depth tutorial (I am including myself here).
Really great tutorial - I have been looking for one like this for quite a while. Would love to see more tutorials from you about how to calibrate, how to record, choosing the right camera settings, and more!
Thank you so much - got through to the end and made it work!
Now I am trying to figure out how to get a 2nd tracker linked to another actor...
Thanks for sharing! Great straightforward video.
You need to activate plugin 'Timed Data Monitor' as well.
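For anyone hunting for where plugins get switched on: they can be enabled in the editor (Edit → Plugins) or directly in the project's .uproject file. A rough sketch of what that might look like for this workflow - the plugin names and engine version below are my own reading of what the video uses, so verify them against your Plugins browser before relying on this:

```json
{
  "FileVersion": 3,
  "EngineAssociation": "5.1",
  "Plugins": [
    { "Name": "TimedDataMonitor", "Enabled": true },
    { "Name": "Composure", "Enabled": true },
    { "Name": "LiveLink", "Enabled": true },
    { "Name": "BlackmagicMedia", "Enabled": true }
  ]
}
```

After editing the .uproject file, the editor will prompt for a restart so the plugins actually load.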
It would be nice if you could start a set of tutorials on how to use UE for virtual sets and 3D broadcast graphics. It seems there are not many tutorials on this for newbies out there.
Thank you for sharing your setup! Did you use a timecode generator and a sync generator to sync the camera and Unreal? I'm struggling to figure out whether I really need an external timecode generator or whether I should just feed the timecode from the camera to Unreal.
Thanks! I'm feeding the timecode from my Blackmagic URSA Mini Pro 12K, so you probably don't need an external generator as long as your camera can send timecode.
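For reference, when the camera feeds timecode over SDI the engine-side settings live under Project Settings → Engine → General Settings (Timecode and Custom TimeStep). A rough DefaultEngine.ini sketch, assuming the Blackmagic Media plugin - the exact class paths here are my best guess, so check them against your plugin version:

```ini
[/Script/Engine.Engine]
; Take timecode from the Blackmagic card's input instead of an external generator
TimecodeProviderClassName=/Script/BlackmagicMedia.BlackmagicTimecodeProvider
; Genlock the engine's frame tick to the incoming video signal
CustomTimeStepClassName=/Script/BlackmagicMedia.BlackmagicCustomTimeStep
```

The provider and time step still need a device/port configured, which is easier to do from the Project Settings UI than from the ini file.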
@@bennettthomaswin Awesome, sounds good. My colleagues and I are using a Blackmagic URSA Mini Pro 12K, too. Are you using Internal or Program for the Reference Source?
@@bennettthomaswin What's the minimum camera model that will work inside Unreal? Do you have any knowledge of older models that can be bought used, besides the URSA Mini 4.6 G2?
@@lizlin6596 Same question here! Did you figure out how to sync them correctly?
@@bennettthomaswin You mentioned External for your output, but you are using camera timecode?
This is great!! Exactly what I needed. Thanks for all your effort. I'm just lacking one step: how to make a garbage mask for the green screen. Any help would be much appreciated.
Great tutorial and easy to follow. I got it all working with the URSA Mini 4.6 - genlock, timecode, video showing inside UE - really exciting.
However, I got stuck at 11:10 at the Composure plugin step; from here I cannot go forward - it does not show up and I'm not sure where to find it. Any help is appreciated. Thanks
Perfect- all inclusive :)
Are you using a MacBook as your computer? Which hardware are you using - possibly a Blackmagic UltraStudio Recorder 3G?
perfectly compact tutorial! thx!
Have you found the nodal point of your lens, and how? I'm struggling with the same lens.
Great tutorial! Do you know if Composure supports Lumen? My output, or even just my CG background layer, looks really bad compared to what my cinema camera sees.
I pulled my hair out over this exact question for a few weeks. Composure does not currently support Lumen. However, the Off World Live camera does. It's $100, but totally worth it if you want the best possible composite from UE.
@@bennettthomaswin What is the Off World Live camera? Is this a plugin?
@@tallpinesstudio it replaces the stock cine cam inside of unreal: offworld.live/unreal-engine-live-streaming-toolkit
@@bennettthomaswin OK, I will still need to composite my Unreal background with my live camera feed. I just need to be able to see the composite with Lumen so I can properly light my scene. Does it work with the same workflow in Composure?
Does Composure support Lumen yet? Also, why do you have two BM 8K cards? I thought one had multiple inputs and outputs? @@bennettthomaswin
Can you add the plugin list to your video description?
Do you know another way of capturing nDisplay? I used to use Movie Render Queue for that, but the Niagara effect (with preview mesh) is not rendered, so I have to find another way!
Hi, may I ask whether this process only supports transmission through an SDI interface, not HDMI?
Works great, but when I hit Play the camera disconnects from MP. Any ideas?
Are you able to move freely in the studio with a matching Unreal world? Or is the Vive tracker not accurate enough?
Did you get the audio routed through the Media Player back out to the DeckLink SDI outputs correctly? In the Media Player I can click on line-in and route the audio to the DeckLink outputs, but then the video drops out, so it's one or the other. What's your audio solution that keeps the timecode? Thanks!
I'm using a Vive tracker too, but when I use it I get some jitter and shaking. Do you have any solution for this?
You may have to adjust the positioning of your base stations and/or trackers. I've noticed bright lights aimed at or near the base stations tend to have a negative effect on tracking accuracy as well.
best tutorial
In the configuration, the media connection is hidden; I can't adjust the frame rate or the other settings.
I want to open the Timecode Provider. It was not in Unreal 5's Developer Tools - where is it?
You are the Luke Smith of Virtual Production.
Can you give a better tutorial? Tell me, can I just use my BMPCC?
Thanks Bennett. Do I need one card for input and one card for output? And why does outputting from the PC let the Ninja see the result? One more thing: I found an Advanced Green Screen Chroma Key in UE, but I can't do exactly the same as the tutorial - can we study together? Here is the video link: ruclips.net/video/0EUVEA7diaI/видео.html
Great video, but those background noises are really annoying. It's hard to make out what you're saying at times. Maybe use a lapel mic?