Lightcraft Technology
  • Videos 45
  • Views 51,424
Fixing a Misaligned Shot
Shot courtesy of Micaiah Chau.
Multi-Peel Script courtesy of Matt Merkovich: @MatthewMerkovich
00:00 Fixing a Misaligned Shot
00:35 Importing Shot Script
01:08 Viewing Camera Motion Graphs
01:46 Correcting Camera Angles
02:48 Installing Multi-Peel SynthEyes Script
04:34 Using Alternative AI Rotomattes
05:13 Running Multi-Peel Script
06:28 Picking Initial Survey Frame
07:03 Picking Survey Markers
07:59 Solving and Refinement
10:15 Solving for Lens Distortion
Views: 239

Videos

In-Phone Re-Rendering
Views 2.5K • 28 days ago
lightcraft.pro/ Superior Keying with Delta Keyer: ruclips.net/video/QN4UxWnRZEI/видео.htmlsi=gHAuuOjoYK4aCtdk 00:00 In-Phone Re-Rendering 00:24 Self-Contained Re-Rendering 01:46 Capturing Tracked Takes 01:59 Capturing the Source Take 02:36 Re-Rendering Settings 03:07 Re-Matching Previous Takes 03:20 Rendering Clean Tracked Background Clip 04:13 Transferring Takes 04:50 Loading Takes Into Resolv...
Gaussian Splat Setup
Views 573 • 2 months ago
This video covers the new Gaussian Splat authoring workflow in Jetset Pro and Jetset Cine. 00:00 Gaussian Splat Setup 00:31 Installing Gaussian Splat Add-on 01:24 Importing Gaussian Splat 02:01 Viewing Splat 02:45 Adding Splat Locator 03:46 Parenting Gaussian Splat to splatloc 04:25 Creating 3D Reference Geo 06:23 Rough Splat Location 07:22 Setting Wireframe Display 07:53 Fixing Splat Orientati...
Jetset Cine drives SynthEyes sub-pixel tracking refinement in seconds!
Views 667 • 2 months ago
This video shows the Autoshot-scripted Jetset Cine shot import and refinement workflow with SynthEyes in less than 2 minutes! Starting with AI rotomattes to keep trackers off actors, and rapidly setting survey points using the Jetset stage geo scan, we refine the solve to a sub-pixel level while maintaining the original on-set scale, orientation, and position. This workflow solves the 'tracking...
Synchronized Rendering in Unreal
Views 681 • 3 months ago
Thanks to Bennet Win for his excellent walkthrough! ruclips.net/video/NvcFtDYupfc/видео.htmlsi=hiKZ15bKH_HcsAJr lightcraft.pro/ 00:00 Synchronized Rendering in Unreal 00:11 Hardware Used 00:42 Setting up Unreal Plugins 01:38 Running Live Render Preview Script 02:23 Configuring LONET2 on Jetset 03:07 Adding Live Link Source and Preset 03:59 Creating AJA Media Bundle 05:20 Dragging Media Bundle I...
Tracking Refinement with SynthEyes
Views 3.1K • 3 months ago
SynthEyes Essentials: ruclips.net/video/IIF1Htbog_o/видео.htmlsi=lpwg3I0HsOrYGP7A Auto Tracking Deep Dive: ruclips.net/video/iu3Ils6GlD4/видео.htmlsi=4gipnS-uduTZUduZ InSpyReNet Download: lightcraft.pro/downloads/ 0:00 Tracking Refinement with SynthEyes 00:28 Installing InSpyReNet 01:33 Verifying Installation 02:04 Take Color Space 02:44 Generating AI Roto Mattes 03:49 Blender Render File 04:20...
Unreal Live Render Preview v2
Views 800 • 4 months ago
This is a video about Live Render Preview with LONET v2 0:00 Live Render Preview with LONET v2 01:47 24 fps Required 02:24 Jetset Setup 03:30 Installing LONET Plugins 04:18 Enabling Plugins and UI 04:56 Unreal Live Link Setup 05:08 Live Render Preview Script 05:33 Composure Layers 06:18 LC Media Player and Texture 06:39 Correcting Video Player UR 07:24 Live Media Layer 07:51 Keyer Settings 09:0...
Jetset Cine with Blackmagic BRAW
Views 2.6K • 4 months ago
lightcraft.pro/ 0:00 Jetset Cine with Blackmagic BRAW 00:35 Accsoon See App Installation 00:52 Connecting Accsoon SeeMo to iPhone 01:41 Tentacle Sync App Setup 02:04 Project Folder Setup 02:45 Tentacle to Jetset Connection 03:18 Cine Calibration Setup 03:39 Cine Calibration Capture 06:02 Exiting Calibration Capture and Red Reticule 06:36 Autoshot Calibration Panel 06:47 Cine Camera Sensor Width...
JetsetCineRigging
Views 3.5K • 4 months ago
Jetset Cine rigging components: lightcraft.pro/docs/jetset-cine-rigging/ Accsoon SeeMo components: lightcraft.pro/docs/which-accsoon-seemo/ 0:00 JetsetCineRigging 00:20 Cine Camera Cage 01:31 iPhone Cage 03:14 Rails on Camera Cage 04:05 Mounting iPhone to Rails 04:53 Accsoon SeeMo 05:17 MagSafe Cooler 06:10 Video Cable to iPhone 06:29 NP-F Battery 06:38 HDMI to SeeMo 07:26 Tentacle Sync 08:41 E...
Autoshot Unreal Round Trip
Views 2.6K • 4 months ago
Fantasy Castle on Unreal Marketplace: www.unrealengine.com/marketplace/en-US/product/infinity-blade-castle nVidia Omniverse: www.nvidia.com/en-us/omniverse/ Lightcraft Downloads: lightcraft.pro/downloads/ 0:00 Autoshot Unreal Round Trip 00:20 Creating Jetset Project 01:00 Installing Autoshot Tools & Autoshot 01:56 Create Matching Autoshot Project Folder 02:29 Project Folder Organization 03:11 n...
Autoshot Blender Round Trip
Views 1.1K • 5 months ago
lightcraft.pro/downloads/ 0:00 Autoshot Blender Round Trip 00:42 Downloads 01:27 Installing Autoshot and Tools 01:54 Installing Blender Add-On 02:32 Setting Autoshot Project Folder 03:16 Project Folder File Structure 03:57 Importing Polycam Scan 05:13 Setting Materials to Emissive 06:19 Decimating Mesh 07:11 Adding Scene Locators 07:46 sceneloc_* prefix 09:52 Scene Locator Orientation In Jetset...
Setting Cine Offset Manually
Views 951 • 6 months ago
lightcraft.pro 0:00 Setting Cine Offset Manually 00:11 Create Timeline at Cine Frame Rate 00:45 Set Cine Marker 01:00 Set Jetset Marker 01:14 Aligning Clips 01:27 Checking Time Alignment 01:52 Calculating Frame Offset 02:29 Entering Offset in Autoshot
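The manual offset in the chapters above comes down to simple timecode arithmetic: convert each marker's timecode to an absolute frame count at the timeline frame rate, then subtract. A minimal sketch (function names and the sign convention are my own illustration, not Autoshot's actual API):

```python
def tc_to_frames(tc: str, fps: int) -> int:
    """Convert a non-drop HH:MM:SS:FF timecode string to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frame_offset(cine_tc: str, jetset_tc: str, fps: int) -> int:
    """Frame offset between the cine marker and the Jetset marker (cine minus Jetset)."""
    return tc_to_frames(cine_tc, fps) - tc_to_frames(jetset_tc, fps)

# Example: markers set on the same event in both clips, timeline at 24 fps.
print(frame_offset("01:00:10:05", "01:00:10:01", 24))  # → 4
```

This assumes non-drop-frame timecode; drop-frame rates (29.97/59.94) need the standard drop-frame correction before subtracting.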
Importing Animated Takes
Views 491 • 9 months ago
www.lightcraft.pro/ 0:00 Importing Animated Takes 00:19 Autoshot Settings 01:05 Creating the Unreal Sequence 01:27 Adding the Elevator Subsequence 01:46 Adding Scene Locator to Elevator Sequence 01:59 Adding Camera Attach Track 02:26 Fixing Animated Texture Display 02:45 Fixing Texture Brightness 03:33 Disabling Image Plate 04:11 Command Line Encoder Shortcut 04:50 Rendering with Movie Render Q...
Exporting Animated Scenes
Views 385 • 9 months ago
www.lightcraft.pro/ Get nVidia Omniverse FREE at: www.nvidia.com/en-us/omniverse/ 0:00 Exporting Animated Scenes with Omniverse 00:23 Omniverse Setup 00:55 Select Export Layer Objects 01:31 Adding the Level Sequence Animation 01:44 Exporting to Omniverse 02:33 Downloading USD Zip from Nucleus 03:43 Converting USD to USDZ in Autoshot 04:18 Loading Scene and 360 Pano in Jetset 05:07 Synchronizing...
Rendering 360 Panoramic Backgrounds
Views 380 • 9 months ago
Link to WINBUSH's 360 tutorial: ruclips.net/video/SpkP1XCF_mE/видео.htmlsi=UpzuwXWYFVOc-MLt 0:00 Rendering 360 Panoramic Backgrounds 00:43 USD Export Layer 01:06 Placing 360 Camera 01:33 Setting Foreground Objects to Hidden in Game 02:02 Creating 360 Level Sequence 02:36 Movie Render Queue Panoramic Setup 03:40 Reverting Scene and MRQ Settings 04:12 Transferring 360 Images to Jetset
Animation Timeline Control
Views 716 • 10 months ago
Animation Timeline Control
Exporting Animated Characters
Views 826 • 11 months ago
Exporting Animated Characters
Convert Unreal Scenes to Blender
Views 3.2K • 1 year ago
Convert Unreal Scenes to Blender
Setting Tracking Origin with Markers
Views 767 • 1 year ago
Setting Tracking Origin with Markers
2D Backgrounds
Views 819 • 1 year ago
2D Backgrounds
Browser Slate and Remote
Views 658 • 1 year ago
Browser Slate and Remote
Project Pick and Take Storage
Views 654 • 1 year ago
Project Pick and Take Storage
AI Matte Paint Fixes
Views 174 • 1 year ago
AI Matte Paint Fixes
Export Animated USDZ
Views 9K • 1 year ago
Export Animated USDZ
Baking Motion and Scene Locators
Views 137 • 1 year ago
Baking Motion and Scene Locators
Baking Motion and Scene Locators
Views 77 • 1 year ago
Baking Motion and Scene Locators
Linking and Animating Vehicle
Views 315 • 1 year ago
Linking and Animating Vehicle
Vehicle Armature
Views 61 • 1 year ago
Vehicle Armature
Vehicle File Organization
Views 206 • 1 year ago
Vehicle File Organization
Render Scene and Viewport Compositor
Views 345 • 1 year ago
Render Scene and Viewport Compositor

Comments

  • @Makeyourmarken671
    @Makeyourmarken671 10 hours ago

    Hey, thanks for all these videos. In engine, once I paste into the command line and the sequencer loads, I only have a black image on my image plate. I can see the video from my cinema camera in the engine separately, but it remains black as an image plane, and in the exported render it isn't even included.

    • @lightcrafttechnology
      @lightcrafttechnology 3 hours ago

      OK -- can you post your question on the forums at forums.lightcraft.pro/? Then we can get a link with the take and test the behavior.

  • @stephanec3436
    @stephanec3436 9 days ago

    Refinement tracking: do you mean that on every project/shot we have to refine the 3D tracking with SynthEyes or equivalent? I am confused because I thought Lightcraft and the accelerometer of the iPhone already did the job of 3D tracking. If that's the case, is the real benefit of Lightcraft the ability to monitor a CG set with live video in real time? Thanks

    • @lightcrafttechnology
      @lightcrafttechnology 9 days ago

      The standard Jetset tracking is easily good enough for shots that don't have visible ground contact. You can see some videos done by Alden Peters on YouTube that are all straight Jetset tracking. The shots with highly visible ground contact require an additional level of precision; that's what the SynthEyes pipeline is designed to handle.

  • @Kayserjp
    @Kayserjp 10 days ago

    Can I use the Atomos HDMI output to sync the phone?

    • @lightcrafttechnology
      @lightcrafttechnology 10 days ago

      Are you using the Atomos device to record ProRes RAW? If it can output an HDMI signal that the Accsoon SeeMo can read, it should be fine.

  • @dickie_hrodebert
      @dickie_hrodebert 18 days ago

    Where can I get the Overscan addon that you are using for Blender?

    • @lightcrafttechnology
      @lightcrafttechnology 18 days ago

      In this case, we're not using an overscan add-on, but manually entering the overscan sensor size calculated in SynthEyes. Much simpler.

  • @chiya_mohammed
    @chiya_mohammed 1 month ago

    Cool. thanks please after Not Hidden this video

  • @galaxyexpress998
    @galaxyexpress998 1 month ago

    Can you do scale adjustments too?

    • @lightcrafttechnology
      @lightcrafttechnology 1 month ago

      If you create a scene locator in Blender or Unreal with a non-zero scale, Jetset will use that value to scale the tracking motion. This will break live-action tracking, but is really useful when doing CG-only previs shots, as you can make a 25x multiplier and fly around your scene quickly.

  • @davidspiers6638
    @davidspiers6638 1 month ago

    Does the auto lens calibrate work with anamorphic lenses?

    • @lightcrafttechnology
      @lightcrafttechnology 1 month ago

      I'm interested in testing this. It should calibrate the offsets and overall field of view correctly with an unsqueezed image. Then to get the more accurate anamorphic calculations we would want to use the SynthEyes post refinement pipeline. If you want to test with us we are interested.

  • @Shuyunliu-j9q
    @Shuyunliu-j9q 1 month ago

    26:46 Hi, Lightcraft Technology. Why does clicking the Sync button show 'Autoshot Tools not installed. toolsdir is None'?

    • @lightcrafttechnology
      @lightcrafttechnology 1 month ago

      You'll need to install the Autoshot Tools download from the lightcraft.pro/downloads site. Autoshot Tools contains the big AI models and executables that don't change very much, whereas Autoshot changes quickly, so on Windows we have 2 separate downloads.

  • @Shuyunliu-j9q
    @Shuyunliu-j9q 1 month ago

    Hello, Lightcraft Technology. Does the SeeMo already synchronize timecode between an iPhone and a camera, or is it necessary to purchase a Tentacle Sync or similar device for timecode synchronization?

    • @lightcrafttechnology
      @lightcrafttechnology 1 month ago

      You'll need the Tentacle Sync to get timecode matched between the iPhone and the cine footage. The SeeMo uses HDMI to capture & transfer video, and there is no standard timecode embedding in HDMI like there is in SDI (although different manufacturers do put timecode in HDMI in different ways.)

  • @c3ribrium
    @c3ribrium 1 month ago

    Thank you for this video. I'm stuck at the Live Link configuration. I have Jetset running on the iPhone and the IP address in the clipboard; when adding the live source, I see a source machine "receiving" but there is no "Jetset" inside.

    • @Shuyunliu-j9q
      @Shuyunliu-j9q 1 month ago

      The same problem as u :(

    • @EliotMack-z3v
      @EliotMack-z3v 1 month ago

      Can you join an Office Hours session? We do them Mon/Tues/Fri at 9am Pacific time. Easier to debug live.

    • @c3ribrium
      @c3ribrium 1 month ago

      @@EliotMack-z3v Thank you. My problem was resolved by changing my network infrastructure, with a dedicated AP and multicast activated.

    • @c3ribrium
      @c3ribrium 1 month ago

      @@EliotMack-z3v However, thank you for your reply. I'm trying to contact you; we are using the Cine license on Jetset and have a few questions: How do we get the comp at 25 fps in After Effects without stretching/lerping it manually in AE? And why the need to import images from the iPhone, as we only need the tracking data? In live sessions in Unreal, no matter the timecode from the Tentacle on SDI and the iPhone, we always need to resync manually by a frame or 1.5 frames with the SDI input (almost real time from the BM card / ARRI Mini). We're looking for something a bit more reliable (less manual).

    • @lightcrafttechnology
      @lightcrafttechnology 1 month ago

      @@c3ribrium Are you talking about rendering 25fps interlaced footage? It would be easier to shoot 50fps and then alternate your line processing. We import the iPhone images as we use them for precise sync of tracking data with the cine footage using the flashing codes at the start of a take. Timecode by itself isn't usually precise enough. We're doing our own tests with live sync and a Blackmagic card. As you note it can be tricky. We'll have more info after we close the loop a few times reliably.

  • @Utsab_Giri
    @Utsab_Giri 2 months ago

    So, the phone sits on top of the lens, not the sensor? Why is that?

    • @eliotmack
      @eliotmack 2 months ago

      The Jetset Cine lens calibration system calculates the offset between the iPhone lens and the cine camera lens wherever they are. The lens entrance pupils are (usually) close to the front of the glass, so it's a good idea to keep the iPhone close to the front (which also gives the phone a clear line of sight).

  • @jemsophia
    @jemsophia 2 months ago

    this is soooooo cool! thank you!

  • @Utsab_Giri
    @Utsab_Giri 2 months ago

    I'm wondering what Jetset's performance will be like in low-light conditions. Thoughts? Thank you!

    • @eliotmack
      @eliotmack 2 months ago

      When using Jetset Cine with an external cine camera, the iPhone can operate at a different exposure than the main camera (and in fact we run the iPhone at auto exposure during Cine shots for exactly this reason).

  • @timdoubleday4627
    @timdoubleday4627 2 months ago

    Great video as always, I'm guessing we could do a similar workflow but use Unreal Engine?

    • @eliotmack
      @eliotmack 2 months ago

      I haven't used any of the Gaussian implementations in Unreal so I don't know how they behave. The basic splatloc concept should work, however.

  • @duchmais7120
    @duchmais7120 2 months ago

    Hello Lightcraft Technology. Thanks for sharing the video. Which lens on the BMPCC are you using in this demonstration?

    • @eliotmack
      @eliotmack 2 months ago

      This is a standard Canon 24-105 zoom.

  • @LokmanVideo
    @LokmanVideo 2 months ago

    I've been in the VFX industry for many years, and seeing this workflow and the new technology you're bringing to the masses is so exciting :) Can't wait to test Jetset once I finish my new green screen studio (cyclorama). Amazing job, guys!

    • @eliotmack
      @eliotmack 2 months ago

      Thanks! Post some shots when you can!

    • @LokmanVideo
      @LokmanVideo 2 months ago

      @@eliotmack Sure 👍

  • @lnproductions7958
    @lnproductions7958 2 months ago

    Has anyone had issues with the proxy step? I'm not sure if it's the amount of clips but it doesn't even load for me

  • @michaelounsa5056
    @michaelounsa5056 2 months ago

    Hello. I am considering purchasing an iPhone Pro to use the LiDAR feature specifically for virtual production with the LightCraft application. Could you please let me know if there is a significant difference in LiDAR quality and performance between the iPhone models from version 12 up to the upcoming iPhone 16? Are there any major benefits of using the newer models with your application?

    • @eliotmack
      @eliotmack 2 months ago

      We've found remarkable improvements with each new generation of iPhone, especially in GPU capacity and in cooling. The LiDAR hasn't changed much, but I'd still recommend getting the newest iPhone you can, simply for the other performance aspects. It makes a big difference. We're getting Jetset ready for iOS 18 and looking forward to what is in the new hardware coming up soon.

    • @michaelounsa5056
      @michaelounsa5056 2 months ago

      @@eliotmack Thank you for the clear answer.

  • @Kavouhn
    @Kavouhn 2 months ago

    Ensure your camera is focused on the object you're scanning, to get more tracking points.

  • @codydobie
    @codydobie 2 months ago

    Would this workflow with Jetset Cine and Syntheyes be useful outside of virtual production applications? Footage shot with the Jetset Cine rig mounted on your camera is basically able to capture a pretty solid camera track and attach some tracked lidar scanned geo that can represent the set? I could definitely see a huge time savings being able to bring a shot into syntheyes with the camera and geo already tracked for reference to integrate CG and VFX. Or am I misunderstanding what it does? Also, does it only work with solid backgrounds/green screen?

    • @eliotmack
      @eliotmack 2 months ago

      Yes -- we think this will be very useful for 'normal' non-greenscreen projects! The AI rotomattes are very good, and the set scanning + post tracking technique will work on any shot that needs VFX added. In fact, the iPhone uses a natural feature tracking algorithm that will work better in a 'normal' environment, since there are many more trackable corner features than on a greenscreen.

  • @MarkStefenelli
    @MarkStefenelli 2 months ago

    And now you just buy the Ultimatte keyer from BMD and you have a professional virtual studio setup. I was waiting for this; now I will go for Jetset Cine... awesome. I worked many years ago with a Lightcraft Tech Previzion system, which was extremely expensive, but now this is way better for an indie producer. Congrats, guys!!

  • @guillaumewalle
    @guillaumewalle 2 months ago

    this is f**** amazing, i need an iphone now

  • @joezohar
    @joezohar 2 months ago

    Thank you so much for this! When using the USD viewer I noticed you loaded Castle_3 but didn't show what those export settings were. When I load the _2 version I get the d3d error that crashes Unreal. Any help would be greatly appreciated!

    • @bradballew3037
      @bradballew3037 2 months ago

      I thought I was crazy because I didn't see a _3 version and was wondering when that was supposed to be created. It seems like a part of the tutorial is missing?

    • @lightcrafttechnology
      @lightcrafttechnology 1 month ago

      If you're running into a problem, please post on the forums at forums.lightcraft.pro. Then we can follow up on the details. USD model exports are usually easy to fix.

  • @Kumarswamy_Hosmath
    @Kumarswamy_Hosmath 2 months ago

    I am interested in only gathering: 1. Camera tracking data 2. Camera track plus respective lens data. Can we please have a simple tutorial on how to gather these inputs and use them with Unreal in post? And please share shoot data so that one can try it first-hand.

  • @Kumarswamy_Hosmath
    @Kumarswamy_Hosmath 2 months ago

    Have you tested this with full-blown rigs with a matte box? What is the limit of distance between the cine lens and the iPhone lens? In most movie shoots we have matte boxes; why not demonstrate such a scenario?

    • @eliotmack
      @eliotmack 2 months ago

      Good suggestion. I'm working on an update of the rigging that better handles an underslung camera rig by mounting the iPhone to the side of the main lens instead of above it. It's also fine to raise up the iPhone a bit to clear the matte box.

  • @kabalxizt5028
    @kabalxizt5028 2 months ago

    You should make more tutorials about SynthEyes.

  • @pinkuzaimas
    @pinkuzaimas 3 months ago

    Is there a link to InSpyReNet? I found it on GitHub but not sure what to download

  • @brettcameratraveler
    @brettcameratraveler 3 months ago

    Thank you! So there is no workflow where a Tentacle Sync is not required? For example, having Unreal follow the built-in timecode coming from the camera itself? (Even if the camera's speed drifts ever so slightly over time)

    • @eliotmack
      @eliotmack 3 months ago

      If you want the live action and rendered CG signal to stay in sync, you'll need the Tentacle Sync. There may be some way to hack Timed Data Monitor to make that work, but frankly the Tentacle works great and is inexpensive. We also plan to use the timecode data to assist post production take syncing.

    • @brettcameratraveler
      @brettcameratraveler 3 months ago

      @eliotmack Sounds good :) I ask because I try to keep my camera rigs with as little extra battery-powered hardware as possible. Less hassle and fewer points of failure for a rig that is already full of lens encoders, tracking system, monitor, etc. I was also able to get a non-Jetset VP rig to "appear" to be in sync with Unreal without a Tentacle Sync, so was wondering if your script might be able to do the same. Good point on organizing the takes, though. I'll go with the TS on the rig.

  • @momenkhaled99
    @momenkhaled99 3 months ago

    wow

  • @zykoman825
    @zykoman825 3 months ago

    Hey. Can we get a video on the cine-camera pipeline to Blender? We only have one with the iPhone, and that makes me confused...

    • @eliotmack
      @eliotmack 3 months ago

      The base version of Jetset is iPhone/iPad only. Jetset Cine (the version in this video) connects to external cine cameras and provides the lens calibration and post production footage processing.

  • @Kumarswamy_Hosmath
    @Kumarswamy_Hosmath 3 months ago

    Is Tentacle Sync a must?

    • @eliotmack
      @eliotmack 3 months ago

      Required for synchronized real time rendering in Unreal. Highly recommended for general use as then the tracking data has the same timecode as the cine video takes.

    • @Kumarswamy_Hosmath
      @Kumarswamy_Hosmath 3 months ago

      @@eliotmack Can I get away without it for just camera-track data to be used in post?

  • @shawnhuang-m3x
    @shawnhuang-m3x 3 months ago

    Why, when I set up the LONET2 Live Link, does the subject name show nothing?

  • @manolomaru
    @manolomaru 3 months ago

    ✨😎😮😵😮😎👍✨

  • @jordanthecadby5762
    @jordanthecadby5762 3 months ago

    Under what circumstances would you need to refine the live track?

    • @eliotmack
      @eliotmack 3 months ago

      Shots with visible CGI & live action joins. In this case it's the join between the CG railing and the practical floor, but in other shots it might be high degrees of floor contact.

    • @stephanec3436
      @stephanec3436 9 days ago

      @@eliotmack Do you mean that on every project/shot we have to refine the 3D tracking with SynthEyes or equivalent? I am confused because I thought Lightcraft and the accelerometer of the iPhone already did the job of 3D tracking. If that's the case, is the real benefit of Lightcraft the ability to monitor CG sets with live video in real time? Thanks

    • @lightcrafttechnology
      @lightcrafttechnology 9 days ago

      @@stephanec3436 For many shots the standard Jetset tracking is fine. All but one of the shots in ruclips.net/video/s2y2lcsL_Lk/видео.htmlsi=oOkG1VY3s8Q5wM5X are from the Jetset data. For certain shots with very visible ground contact, you may need to do tracking refinement. It's very shot-specific.

  • @ApexArtistX
    @ApexArtistX 3 months ago

    How do I bring them over to Unreal or Blender? I tried FBX and USD and it doesn't work; it fails hard, and Unreal sometimes has no camera.

    • @eliotmack
      @eliotmack 3 months ago

      Watch closely at 17:32 -- it goes into detail on the Blender import process.

  • @weshootfilms
    @weshootfilms 3 months ago

    Amazing

  • @ApexArtistX
    @ApexArtistX 3 months ago

    How do you do foreground occlusion compositing?

  • @ApexArtistX
    @ApexArtistX 3 months ago

    What happens if there are duplicated and dropped frames?

    • @lightcrafttechnology
      @lightcrafttechnology 1 month ago

      Too many dropped frames can cause the tracking quality to suffer. A cooler is highly recommended especially when rendering some of the heavier scenes that Unreal generates.

  • @ApexArtistX
    @ApexArtistX 3 months ago

    The USD importer is still beta, damn.

  • @sanjimanga8923
    @sanjimanga8923 3 months ago

    The top view does not work for me; it's just grey space even after pressing F. I'm not sure what the reason is.

    • @eliotmack
      @eliotmack 3 months ago

      Make sure you have something in the Unreal scene selected before hitting F; the F command just frames the selected object in the viewport.

  • @mustachefilms7949
    @mustachefilms7949 3 months ago

    OK, so I've got the USD working, but for some reason the model I am using is always lying on the ground. I've tried rotating the scene-loc and I've tried switching the z-axis to the y-axis, and it still doesn't work. Also, does the tracking work without the full preview? For example, filming with the door messed up and just adjusting the camera after the fact in Unreal. Thanks!

    • @lightcrafttechnology
      @lightcrafttechnology 1 month ago

      Please feel free to join our Office Hours sessions if you want to interactively fix problems. lightcraft.pro/office-hours/

  • @mustachefilms7949
    @mustachefilms7949 3 months ago

    Is there a Mac workflow for this? Because the Omniverse launcher only has Windows and Linux.

    • @mustachefilms7949
      @mustachefilms7949 3 months ago

      I also can't export the layer as a USD.

    • @eliotmack
      @eliotmack 3 months ago

      Unreal has an integrated USD exporter that works on Mac. It works but doesn't handle everything that the Omniverse exporter does. We'll do a tutorial on it at some point.

    • @mustachefilms7949
      @mustachefilms7949 3 months ago

      @@eliotmack Thank you so much

  • @somasekharkari8364
    @somasekharkari8364 4 months ago

    Hello team, I am looking to use my Sony FX3 for virtual production. Do Jetset and its software integrate well with this specific camera? If not, how do I go ahead using my FX3 for virtual production?

  • @raphieljoeroeja9406
    @raphieljoeroeja9406 4 months ago

    Can you save the hdri that has been captured?

    • @eliotmack
      @eliotmack 4 months ago

      Yes, but the quality of the HDRI wasn't that high, so we haven't pursued this heavily for VFX integration.

    • @raphieljoeroeja9406
      @raphieljoeroeja9406 4 months ago

      @@eliotmack Oh that’s too bad. It would be awesome if that was possible somehow. Thank you for the answer!

  • @darkodj4131
    @darkodj4131 4 months ago

    Thanks for this video; I've been following and am keen to get on board with Cine.

  • @abelarabian3895
    @abelarabian3895 4 months ago

    In my case it doesn't create the level sequence when copy-pasting the code into Unreal! It's having issues everywhere...

    • @knstntn.g
      @knstntn.g 4 months ago

      Check if the Image Plate plugin is enabled; that worked for me.

    • @abelarabian3895
      @abelarabian3895 4 months ago

      @@knstntn.g Cool, thanks! I will try it out!

    • @lightcrafttechnology
      @lightcrafttechnology 1 month ago

      If you run into problems, please post on the forums at forums.lightcraft.pro/. Much easier to track down details there.

  • @TrueButFictional
    @TrueButFictional 4 months ago

    Hi, I have the Seemo Pro 1080p version and was wondering if that will work?

    • @eliotmack
      @eliotmack 4 months ago

      Yes -- all of the SeeMo devices will work with Jetset Cine.

    • @TrueButFictional
      @TrueButFictional 4 months ago

      @@eliotmack thank you! Any limitations with the non-4K version?

    • @eliotmack
      @eliotmack 4 months ago

      @@TrueButFictional 1080p is fine for calibration. The SeeMo 4K has an SD card reader, so you will be able to record takes directly to it if desired. Very useful for productions that don't want anything near iCloud.

    • @eliotmack
      @eliotmack 4 months ago

      @@TrueButFictional The SeeMo 4K has a SD card slot that Jetset will soon be able to record to. Other than that no limitations.

  • @StudioWerkz
    @StudioWerkz 4 months ago

    Can this be, or has it been, calibrated with an anamorphic lens with a round trip into Blender? Or is it best to stick to spherical glass?

    • @lightcrafttechnology
      @lightcrafttechnology 4 months ago

      We've tested spherical extensively. Anamorphic calibration in Jetset should be fine for getting a basic track. For sub-pixel refinement we're developing a SynthEyes workflow that can be extended to anamorphic solves.

    • @StudioWerkz
      @StudioWerkz 4 months ago

      @@lightcrafttechnology Thank you, looking forward to it