This is f**** amazing, I need an iPhone now.
Can the genlocked output of Unreal just be sent to an LED wall/projector for an ambient lighting preview, with the final result captured in-camera?
We've seen some early examples of this, but don't have anything we can post.
@eliotmack could this method be combined with Jetset Cine? ruclips.net/video/6PYYvBORXvI/видео.htmlsi=fmmQ5g32VngImsZl
Thank you for this video. I'm stuck at the Live Link configuration. I have Jetset running on the iPhone and the IP address in the clipboard; when adding the Live Link source, I see a source machine "receiving", but there is no "Jetset" inside.
I have the same problem as you :(
Can you join an Office Hours session? We do them Mon/Tues/Fri at 9am Pacific time. It's easier to debug live.
@@EliotMack-z3v Thank you. My problem was resolved by changing my network infrastructure: a dedicated AP with multicast activated.
@@EliotMack-z3v Anyway, thank you for your reply; I've been trying to contact you. We are using the Cine license on Jetset and have a few questions: how do we get the comp at 25i/s in After Effects without stretching / manually interpolating it in AE, and why do we need to import images from the iPhone when we only need the tracking data? In live sessions in Unreal, no matter the timecode from the Tentacle on SDI and the iPhone, we always have to resync manually by a frame or 1.5 frames against the SDI input (almost real-time from the BM card / ARRI Mini). We're looking for something a bit more reliable (less manual).
@@c3ribrium Are you talking about rendering 25fps interlaced footage? It would be easier to shoot 50fps and then alternate your line processing (see the sketch after this reply).
We import the iPhone images as we use them for precise sync of tracking data with the cine footage using the flashing codes at the start of a take. Timecode by itself isn't usually precise enough.
We're doing our own tests with live sync and a Blackmagic card. As you note, it can be tricky. We'll have more info after we close the loop a few times reliably.
Thank you! So there is no workflow where a Tentacle Sync is not required? For example, having Unreal follow the built-in timecode coming from the camera itself? (Even if the camera's speed drifts ever so slightly over time)
If you want the live action and rendered CG signal to stay in sync, you'll need the Tentacle Sync. There may be some way to hack Timed Data Monitor to make that work, but frankly the Tentacle works great and is inexpensive. We also plan to use the timecode data to assist post production take syncing.
@eliotmack Sounds good :) I ask because I try to keep my camera rigs with as little extra battery-powered hardware as possible. Less hassle and fewer points of failure for a rig that's already full of lens encoders, a tracking system, a monitor, etc. I was also able to get a non-Jetset VP rig to "appear" to be in sync with Unreal without a Tentacle Sync, so I was wondering if your script might be able to do the same. Good point on organizing the takes, though. I'll go with the TS on the rig.