Around 9:40, the Camera ID should be set to "1" instead of "0". Sorry for the inconvenience!
GREAT JAMIE!!
Wow - great tutorial. MO-SYS should watch this to see how a tutorial should be done!
right
It was amazing to see the Aximmetry software and Vive technology in action at IBC Amsterdam. Looking forward to working with it!
This was a great tutorial. Really appreciate all the time, camera angles, and detail you put into it. Excited to see Aximmetry get compatible with UE5 for even more realistic background sets.
Thank you!
congratulations fam!!!
Thanks for sharing this. Can the Decklink Duo 2 work?
Hello, your video was great and I followed your method step by step. However, I ran into an issue with genlock not taking effect in my Aximmetry. Could I please discuss this issue with you?
Hello, thank you for your message. For further discussions, please post on the official Aximmetry Forum: my.aximmetry.com/posts
Very helpful, solved my problem in 5 minutes.
@3:57 you mentioned a Blackmagic card. Are you referring to a card in the PC that takes in the genlock signal? Also, great video, thank you!
Hey guys, great video. I'm fairly inexperienced in this field. Does the lens in this video output its focus/zoom values, or isn't that needed to run this setup? Is focus pulling possible with this?
Hi guys! Fantastic video! Thanks for detailing the setup from start to finish; it was incredibly helpful for our setup. I was wondering if you experienced any object/environment slippage that wasn't caused by tracking delay. We have followed everything in this video step by step, but when we use the camera along with the trackers, objects within the scene (real objects on green) tend to slip and move around when the camera moves, meaning they aren't in a fixed spot in the virtual environment. Any help would be appreciated. Best, Luca.
We're using the Mars system with the Sony FX9 and we continue to have latency issues with the Unreal footage coming out of the Blackmagic Decklink card.
Great video! Thank you! Just curious as to why you’re using the NTSC frame rate. Mind if I ask?
The video was created by our American ambassador. NTSC is the first American standard for analog television, explaining its use in the video.
@aximmetry Yes, that I understand. But isn't the standard frame rate for HD 23.976?
For movies and TV shows, yes. For broadcast, it is 25 fps (PAL) (or 50 interlaced, to be precise) in Europe, and 29.97 fps (NTSC) in North America.
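A side note on where the odd numbers come from: NTSC color rates are the exact rationals 30000/1001 and 24000/1001, which is what the quoted "29.97" and "23.976" are rounding. A quick check:

```python
from fractions import Fraction

# NTSC rates are exact rationals, not the rounded decimals we quote.
ntsc_video = Fraction(30000, 1001)   # "29.97" broadcast rate
ntsc_film  = Fraction(24000, 1001)   # "23.976" film-transfer rate
pal_video  = Fraction(25, 1)         # European broadcast rate

print(float(ntsc_video))  # ~29.97003
print(float(ntsc_film))   # ~23.97602
```

This is also why mixing a 29.97 tracker feed with a true 30 fps camera slowly drifts out of sync rather than staying locked.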
A great video, indeed! If you have a BM Television Studio Pro 4K, do you still need Mini Sync?
Wow, a brilliant tutorial. Thank you for taking the time to do it.
I'd like to ask: does MARS support only the Aximmetry Broadcast edition, or can it also work with the Professional version?
Thanks! You'll need the Broadcast version.
Nice! I finished setting up my Mars in our new studio last week, so we went through a lot of the troubleshooting you covered here.
One thing I've noticed, though, is that I'm often getting a lot of jitter in the system, especially once it's been set up for a while. I haven't set it up on its own subnet like you have, so I'll give that a go. Otherwise, what are some things I should look out for to get a better track?
Hi! For Mars-related questions, please email Mars-support@vive.com, thank you.
You did a great job showing how to set up the Vive Mars CamTrack. I have one question: did you change any of the switch settings on the Mini Converter Sync Generator?
Hello, can I shoot with an anamorphic lens on set using the VIVE MARS CAMTRACK?
Great video!
Was surprised you used the manual tracking delay offset instead of the Timecode Sync toggle next to it. What was the reasoning for that?
We are very sorry for our late reply. Since the Blackmagic Mini Converter Sync Generator only provides a reference signal, not an external Timecode signal, we cannot use Timecode sync. With further technical questions, please post on our official Forum moderated by Aximmetry consultants: my.aximmetry.com/posts for expert advice.
Hi, my camera is the BM Broadcast G2 (2022) version. Hopefully I won't need an additional genlock sync converter, since these cameras come with built-in genlock. Is that true? Please reply, I'd be glad.
nice
Can you send the lens depth-of-field information through the Mars system to Unreal from the camera? E.g., if I rack focus, will Unreal's virtual camera match the real camera?
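Not an official answer, but some context on the data path: Vive Mars streams its tracking over the FreeD protocol, whose D1 packet includes fields for raw zoom and focus encoder counts, so a rack focus can only reach Unreal if your rig feeds lens encoders into those fields and a lens file maps the counts to real focus distance. A minimal parser sketch, with the field layout and scale factors taken from the commonly cited FreeD conventions (treat them as assumptions for your specific unit):

```python
def _s24(b: bytes) -> int:
    """Sign-extend a big-endian 24-bit value."""
    v = int.from_bytes(b, "big")
    return v - (1 << 24) if v & 0x800000 else v

def parse_freed_d1(pkt: bytes) -> dict:
    """Parse a 29-byte FreeD D1 packet. Angles arrive as degrees * 32768,
    positions as millimetres * 64; zoom/focus are raw encoder counts that
    still need a lens file to become focal length / focus distance."""
    if len(pkt) != 29 or pkt[0] != 0xD1:
        raise ValueError("not a FreeD D1 packet")
    if (0x40 - sum(pkt[:28])) % 256 != pkt[28]:
        raise ValueError("checksum mismatch")
    return {
        "camera_id": pkt[1],
        "pan_deg":   _s24(pkt[2:5]) / 32768,
        "tilt_deg":  _s24(pkt[5:8]) / 32768,
        "roll_deg":  _s24(pkt[8:11]) / 32768,
        "x_mm": _s24(pkt[11:14]) / 64,
        "y_mm": _s24(pkt[14:17]) / 64,
        "z_mm": _s24(pkt[17:20]) / 64,
        "zoom_raw":  int.from_bytes(pkt[20:23], "big"),
        "focus_raw": int.from_bytes(pkt[23:26], "big"),
    }
```

The key takeaway is that `zoom_raw`/`focus_raw` are dimensionless counts: the calibration (lens file) is what lets Aximmetry or Unreal turn them into a matching virtual rack focus.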
Is it necessary to have cameras with timecode for Vive Mars to work?
Really great video! (Is it an issue that your registration details are visible on the Axi boot screens?)
He sounds like a news reporter 😮
What about the zoom? Does this calibration support lens zoom?
Amazing, thank you! My question is about recording: how do we record the video and the audio?
Please note that we do not provide support on RUclips. Please post your technical questions on our official Forum moderated by Aximmetry consultants: my.aximmetry.com/posts for expert advice.
Hi, I don't understand why you put .20 at the end of the IP for the Mars network and .10 for camera tracking. Where does this different .20 IP come from?
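Not an official answer, but a note for other readers: the .10 and .20 are just two distinct static host addresses on the same subnet, one for the PC's tracking NIC and one for the Mars unit. The devices only need to share the subnet; the last octet just has to be unique per device. A quick sketch (the 192.168.1.x values are hypothetical, not the addresses used in the video):

```python
import ipaddress

# Hypothetical /24 subnet dedicated to tracking traffic.
subnet = ipaddress.ip_network("192.168.1.0/24")
pc_nic = ipaddress.ip_address("192.168.1.10")  # PC's tracking NIC (".10")
mars   = ipaddress.ip_address("192.168.1.20")  # Mars unit (".20")

# Both hosts fall inside the same network, so they can talk directly.
print(pc_nic in subnet and mars in subnet)  # True
```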
I don't see how to use the lens file after you calibrate the lens and distortion. Can you show?
Hi,
In the video, the calibrated lens file is added to the mapped tracking device at 14:33.
You can also add the lens files to the tracked camera compounds in the Input panel, like it is shown here: aximmetry.com/learn/tutorials/for-studio-operators/camera-calibrator/#applying-calibration-results-in-aximmetry-composer
With further questions, please post to our official forum ( my.aximmetry.com/posts ), as we do not provide support on RUclips.
I use mirrorless cameras. Could I use a NanoLockit device for the timecode? Or does this work without sync?
For Mars-related questions, please email Mars-support@vive.com, thank you.
How do you get 29.97 from the Blackmagic sync generator? I don't seem to have that option.
You'll want to set the switches to the "NTSC" setting for 29.97 fps.
I keep getting the error message in camera calibration "the tracking device was lost" after we complete the lens calibration. Do you know why this is happening and how to resolve it?
There might be some issues with the configuration of your system. Please post your questions on our official Forum: my.aximmetry.com/posts. The Forum is designed to be community-based and monitored by our technical consultants, where they will be able to answer within 15 working days, unless someone from the community answers before.
@aximmetry I found the solution to the issue; it's now the pinned comment: the Camera ID should be set to "1" instead of "0".
@poalaburr 💯 Thank you.
Looks like you want camera movement so people can't focus on the added background.
Any Mac User here using Aximmetry?
Phew, the calibration process doesn't look very production-ready :-/
Way too much work and money, still, for results like that. The lighting is usually off and doesn't move or match the scene, so in this case it seems like it would be easier to track the person's motion in After Effects with background markers, send the tracking data into UE5, and render a scene where the virtual camera moves to match the original green-screen video. One of the best features of an LED wall is that it lights the subject as well, so everything looks natural; if we can't get that bonus, why spend all the money on the Mars gear instead of just going through software?
Never rely on LED panel lighting
Mars costs $5,000; an LED wall that size will be $350,000 easy. That's why.
Add a 3rd Rover to the lighting fixture and use it in the Unreal project. I don't know if it's technically possible, but I would try to use DMX to sync the virtual light with the fixture. I also think they didn't want to spend additional resources/time on a proper gaffing setup for this quick demo, like using a softbox... Take a look at this "old" setup with the Vive Pro (before Mars was available): ruclips.net/video/cOI-cjdULmk/видео.html
What if the REF INPUT is showing up as locked?
It means that your capture card's output is locked to the reference signal (it is genlocked). Please note, though, that we do not provide support on RUclips. Please post your technical questions on our official Forum moderated by Aximmetry consultants: my.aximmetry.com/posts for expert advice.