How I Align Photogrammetry to 3D Tracks in SynthEyes.

  • Published: Sep 29, 2024

Comments • 61

  • @macker202
    @macker202 1 year ago +8

    Please never take this tutorial down. I don't use syntheyes often enough to remember all of this stuff - so it's like learning from scratch every time I open it!

  • @benwatne4593
    @benwatne4593 3 years ago +1

    Thank you thank you thank you for this! Saved me what would've been hours of trial and error. Awesome.

  • @MatthewMerkovich
    @MatthewMerkovich  3 years ago +5

    I was just talking to someone else about this tutorial and said the following: "This is why I say things like 'This is how ***I*** do ... [fill in the blank].' I'm wary of any instructional materials that intimate there is only one way to do anything. All I see there is someone with no imagination. LOL!"
    Keep that in mind. I know there are many ways to approach a problem, and I may approach the same problem differently depending on the day of the week. None of the above should be understood as ***my*** way of doing every shot like this, as though there's only ***one*** way to approach it. Think of this as just one more tool for your tracking toolbox, so go forth with my blessing, adapt it, and make it your own. -MM

  • @warrenography
    @warrenography 3 years ago +7

    That was very generous of you; we learned a boatload. Thank you!

  • @1chrissirhc1
    @1chrissirhc1 3 years ago +3

    Thank you for sharing this! Now I just need to film something and practice it!

  • @madlove2339
    @madlove2339 1 year ago +1

    I really need this tutorial. Amazing, Matt!

  • @lucho3612
    @lucho3612 4 months ago +1

    Saving this tutorial to save my life

  • @danilodelucio
    @danilodelucio 3 years ago +1

    You helped a lot again man, you are my hero! Thanks!

  • @jamaljamalwaziat1002
    @jamaljamalwaziat1002 3 years ago +1

    Advanced, good-quality tutorials. Thank you.

  • @clonkex
    @clonkex 3 years ago +2

    You deserve waaaaay more subs! Not only is this video incredibly helpful, it's also supremely well-edited and funny.

  • @andybclarke1465
    @andybclarke1465 4 months ago +1

    Excellent, thank you for this. Very helpful. I'd been using KeenTools' pin tool in Nuke, not realizing it already existed in here.

  • @millolab
    @millolab 3 years ago +1

    Liked before watching. :D

  • @alinazaranimation557
    @alinazaranimation557 10 months ago

    Thanks Matt for the incredible info.
    And now I have an unrelated question about SynthEyes: when I solved my carefully supervised trackers, the solve message said there were 53 trackers unsolved, which is half of all the trackers I have in the scene. So the solved camera in 3D doesn't move for half of the shot, then starts moving after the solved trackers; my unsolved trackers turned red, which indicates unsolved trackers. Please help and tell me what I did wrong, and why this is happening?
    And please make a video about the kinds of issues beginners, intermediates, and pros may face.
    Thanks in advance.

    • @MatthewMerkovich
      @MatthewMerkovich  10 months ago

      Without actually seeing your scene, there's really no way to troubleshoot, I'm sorry to say.

  • @joachimkarstens9769
    @joachimkarstens9769 1 year ago

    Great video, as always!!
    I'm wondering if you could do an overview video that shows ways to align 3D objects with a track? I always have trouble with it. For example, a toy car being pulled across the floor, with a fixed camera and with a moving camera, and then easily aligning a corresponding 3D object and the floor.

    • @thematchmovingcompany5739
      @thematchmovingcompany5739 1 year ago

      It's really about the same process, just pinning 2D trackers to the same features on a 3D object. The main problem with doing such a video is having the geometry and the footage. I can't really use studio assets from big movies on this little YouTube channel, so I'm a bit constrained.

  • @ReinOwader
    @ReinOwader 3 years ago

    I have an off-topic question - I want to buy a lens for my new camera, which I intend to use for VFX. Should I avoid IS lenses, or do they make my life easier while camera tracking? I have noticed that I can't track my Android camera footage because of ugly distortions in the footage (auto stabilization). So does an IS lens do the same thing (distort the image to make it look smooth), or does IS treat the footage differently?

    • @MatthewMerkovich
      @MatthewMerkovich  3 years ago +1

      Definitely go back and watch my last video where I warn against these terrible IS lenses! 🤣 IS not only moves your lens center, it animates that center. Image stabilization is a bad idea for anything to do with tracking.

  • @macker202
    @macker202 1 year ago

    Another question - sorry for being a pain.
    Could this technique also be used to align still images? It seems better than just trying to eyeball it and guess the focal length in 3ds Max.

    • @thematchmovingcompany5739
      @thematchmovingcompany5739 1 year ago

      Absolutely, just as long as there are identifiable features in common between the photos and enough parallax.

  • @mx472000
    @mx472000 3 years ago +1

    Excellent tutorial. Thank you.

  • @clonkex
    @clonkex 1 year ago

    Would I be right in assuming a solid track is critical to having the geometry line up correctly throughout the shot? I might re-track my shot and try to do a better job. It's got a couple of areas of high blur that make it difficult to track accurately.

  • @mikegentile13
    @mikegentile13 3 years ago

    Great workflow!
    Can I get a Discord invite?! Thanks.

  • @teresaleong1887
    @teresaleong1887 3 years ago +1

    Hey Matt, I've gotten SO MUCH value from your tutorials! They've literally saved me at work a few times. I'm curious about this Discord channel... how do I get access?

  • @MrMcnamex
    @MrMcnamex 3 months ago

    I can't seem to get the mesh to stay throughout the entire timeline after pinning. There must be some steps missing after pinning. Help!

    • @MatthewMerkovich
      @MatthewMerkovich  2 months ago

      I recommend doing a video screen capture of your own to demonstrate what you are seeing.

    • @MrMcnamex
      @MrMcnamex 2 months ago

      @@MatthewMerkovich All good - I finally understand the importance of locking the seed. It's so hard to get right, especially with less accurate geometry. Much appreciated, Matt.

  • @glmstudiogh
    @glmstudiogh 2 months ago

    You said not to remove lens distortion before tracking. Does that rule apply everywhere else, or just in SynthEyes, if I may ask? Because that is something that has been bugging me.

    • @MatthewMerkovich
      @MatthewMerkovich  2 months ago +1

      I'd say mostly SynthEyes, because the lens distortion calculations are so accurate. Beyond that, depending on the apps you are using, that assertion may or may not apply. But if you are 3D tracking in SynthEyes, you typically want to try letting SynthEyes solve for the distortion first, and only then moving on to other distortion management approaches.

    • @glmstudiogh
      @glmstudiogh 2 months ago

      @@MatthewMerkovich I love SynthEyes, but sometimes time constraints don't permit me to jump between programs. And I'm a comp artist, so I do the 3D camera tracking in Nuke. Now, I noticed something about Nuke: when you're about to solve, you can check the undistort lens button, but that button isn't needed when the plate is already undistorted.

    • @glmstudiogh
      @glmstudiogh 2 months ago

      @@MatthewMerkovich So when I'm in SynthEyes, I shouldn't bother undistorting beforehand - that would be wrong. And true, SynthEyes' undistortion is very accurate.

    • @MatthewMerkovich
      @MatthewMerkovich  2 months ago

      @@glmstudiogh We all have to play to our strengths, so if you are getting good results staying in Nuke, I say fabulous! But for me, when I was compositing in Nuke (or any other app), SynthEyes was much faster for me. The trick there is having good pipeline and workflow tools to make those transitions easier. For example, the new SynthEyes to Flame workflow makes switching back and forth much, much faster.

  • @Robinbobin
    @Robinbobin 2 years ago

    awesome!

  • @snahfu
    @snahfu 3 years ago

    Big legend

  • @hernanllano9711
    @hernanllano9711 2 years ago

    Hey Matthew, thanks for the video. Noob question here: why should we not undistort the footage before tracking? What would be your workflow if you were given lens and camera info, plus a lens grid from the client, or even for a personal project? Thanks!

    • @MatthewMerkovich
      @MatthewMerkovich  2 years ago +1

      Typically SynthEyes can calculate more accurate lens distortion maps than lens grids. There are instances where lens grids are very helpful, but that really only applies to very distorted anamorphic lenses. My workflow with spherical lenses and on-set notes is to ignore the notes. Just remember, a 35mm focal length lens isn't actually a 35mm focal length lens.

    • @hernanllano9711
      @hernanllano9711 2 years ago

      @@MatthewMerkovich Thanks for your very informative answer. One last question: in this workflow, how do you pass the lens distortion information to the compers and CG artists so they can distort their renders to match the original plate? I'm asking because the workflow I usually see is this: lens distortion is calculated from a grid, so we have a node in Nuke to distort all the CG renders, and an undistorted plate with overscan is generated for both comp and CG. The final submission, of course, is the original plate with the CG integrated, distorted, and comped, which at the end goes to color grading. At least that's how it worked at Framestore and DNEG, the companies I have worked for where I saw an actual workflow for these things. I am an FX Supervisor, but many years ago I worked as a generalist and did my fair bit of tracking, though never at the level of a big production.

    • @MatthewMerkovich
      @MatthewMerkovich  2 years ago

      @@hernanllano9711 Here are my typical, boilerplate Export Notes
      Undistorted plates: If your original footage was linear, the accompanying undistorted plates have been created with a neutral color correction in an RGB color space as JPEG images. If your original footage was in a log color space, the undistorted footage is coming back to you in that color space. Feel free to create your own undistorted plates using the provided undistort nodes in your preferred color space and file format supported by your compositing app.
      Nuke: Lens distortion is done via ST maps. The first will undistort your footage, though you shouldn't need it; it's mostly there for illustrative purposes. I've provided an undistorted JPEG sequence, though you could use the first ST map in the Nuke script to generate undistorted BG images of your own, for your 3D team, that are to your color specs. For best results in the final comp, render your 3D elements at the overscan resolution and redistort back to the plate resolution using the second ST map. This leaves your original plate unchanged and only downsamples your 3D render when re-adding lens distortion.
      Fusion: Fusion works the same as Nuke.
      After Effects: The scene has been exported as an After Effects Javascript. Just run it from within After Effects and your comp setup will be created. Also, an export message from SynthEyes: "SynthEyes Distortion node generated. You must have installed the SyAFXLens.plugin into the (new) folder /Applications/Adobe After Effects CC 20XX/Plug-ins/Effects/SynthEyes. See the Exporting section of the manual." Once your comp is set up, use only the redistort node to add distortion to the VFX elements rendered at the overscan resolution.
      FBX: The FBX file format is the most robust output for modern Autodesk 3D animation pipelines. This shot has been exported in the most recent FBX version. (Older FBX versions can be exported for pipelines running older versions of Autodesk software.)
      Export Geometry: The triangulated mesh named "TrkMesh" is provided only as a byproduct of the tracking QC process. The point cloud has been roughly triangulated to show how the camera or object track lines up to the photography. Feel free to use it for reprojections or whatever you'd like, but you may need actual modeling artists to build out geometry using specialized 3D modeling tools like Maya, 3ds Max, Modo, etc.
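      [Editor's note] The ST-map redistortion described in the export notes above can be sketched outside Nuke. This is a minimal, illustrative resampler in Python/NumPy, assuming the common convention that an ST map stores, per output pixel, a normalized (s, t) lookup coordinate with s running left-to-right and t bottom-to-top; it uses nearest-neighbour sampling for brevity, whereas production tools use filtered sampling and handle overscan padding:

      ```python
      import numpy as np

      def apply_st_map(image, st_map):
          """Resample `image` through an ST map.

          `st_map` has shape (H, W, 2); channel 0 is s (left->right),
          channel 1 is t (bottom->top). For each output pixel we look up
          the source pixel that the normalized (s, t) coordinate points at.
          """
          src_h, src_w = image.shape[:2]
          # Convert normalized s/t to integer source-pixel indices.
          xs = np.clip((st_map[..., 0] * (src_w - 1)).round().astype(int), 0, src_w - 1)
          ys = np.clip(((1.0 - st_map[..., 1]) * (src_h - 1)).round().astype(int), 0, src_h - 1)
          return image[ys, xs]

      # Sanity check: an identity ST map leaves the image unchanged.
      h, w = 4, 6
      s, t = np.meshgrid(np.linspace(0.0, 1.0, w), np.linspace(1.0, 0.0, h))
      identity = np.dstack([s, t])
      img = np.arange(h * w).reshape(h, w)
      assert np.array_equal(apply_st_map(img, identity), img)
      ```

      A real undistort/redistort pair would simply be two such maps: one that maps distorted plate pixels to undistorted positions, and one that maps the overscan render back to plate resolution.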

    • @hernanllano9711
      @hernanllano9711 2 years ago

      @@MatthewMerkovich incredibly useful and detailed information, thanks!

  • @macker202
    @macker202 1 year ago

    Hi - this is fascinating. I've been looking at purchasing some tracking software so that I can comp my 3D models (large-scale architecture stuff) into drone footage - this seems perfect. Once you've set up this track, can you export the camera into 3ds Max for rendering? I assume that's possible?

    • @MatthewMerkovich
      @MatthewMerkovich  1 year ago +1

      Absolutely. SynthEyes is very Autodesk pipeline friendly. There are direct 3DS Max exporters, but the best way to do it in 2023 is to just use .fbx.

    • @macker202
      @macker202 1 year ago

      @@MatthewMerkovich Fantastic. Thank you for the quick response.
      I'm just giving the demo a try - and I've imported my 3D site, but it's massive (about 2 km long). Any idea how to scale up the camera track by a factor of, say, 100 to allow me to start pinning the geometry?

    • @thematchmovingcompany5739
      @thematchmovingcompany5739 1 year ago +1

      @@macker202 You need not scale anything. If your geometry was exported at the scale you want (which is highly recommended), after pinning, the solve will then conform to that scale. If you want to change the mesh's size (not recommended in terms of best practices), just scale it in the 3D room and your camera track will scale to that.

    • @macker202
      @macker202 1 year ago

      @@thematchmovingcompany5739 Thanks. My mesh is to scale (which is definitely what I want).
      I am almost certainly doing something wrong here then - because once I go to pin, the mesh is so large that I can't actually see any features that I want to move into shot.
      [Edit] I used the 3D view to move the camera/trackpoints to the rough location where they needed to be - then I could see the necessary points to start pinning. I've just achieved a way, way better camera track than I've ever been able to achieve in after effects (which isn't really built for this stuff). I think I'll be buying the software!

    • @thematchmovingcompany5739
      @thematchmovingcompany5739 1 year ago +1

      @@macker202 You probably just need to zoom out. One of the weaker aspects of SynthEyes is the 3D viewport navigation. I may have to do a video about that at some point.

  • @michik2379
    @michik2379 3 years ago

    AliceVision can export the reconstructed camera poses created during photogrammetry as an animated camera in an FBX file, I believe.

  • @hatemhelmy820
    @hatemhelmy820 3 years ago

    constraining unsolved trackers! My mind is officially blown :D ... Great tutorial!

  • @MattScottVisuals
    @MattScottVisuals 3 years ago

    Thank you so much for taking the time to create / share this. Very enjoyable and informative!

  • @samuelconlogue
    @samuelconlogue 2 years ago

    You couldn't have made it clearer Matt! Excellent and helpful as always.

  • @therealmpaquin
    @therealmpaquin 3 years ago

    That was amazing! Thank you! I have Revit models and lidar models I need to match to drone footage and this answered so many questions.

  • @MortenKvale
    @MortenKvale 3 years ago

    Great tutorial as always! Now I don't have to show juniors how to work with photogrammetry, I can just send them to this video. :D
    Though it is pretty easy to check which trackers are pinned in the Constrained Points view, I like to give them a special color as well, just for easier readability. Great when inspecting other artists' scenes.

    • @MatthewMerkovich
      @MatthewMerkovich  3 years ago +1

      Absolutely agree on coloring trackers. I often give mine custom names as well for clarity. And regarding that constrained points view, I think you are going to really love the new version of SynthEyes when it hits. ;-)

  • @jamaljamalwaziat1002
    @jamaljamalwaziat1002 3 years ago

    I think if I were to do it, I would use the Place tool and a textured model to place points. Unfortunately, I didn't experiment much with it.

    • @MatthewMerkovich
      @MatthewMerkovich  3 years ago +1

      It would be great if you got textured geometry from a client, but that isn't always possible. We often have to patch things together from what's available, and it's good to have many tools in your toolbox to solve different problems. 😉

  • @jcnwillemsen
    @jcnwillemsen 2 years ago

    Hi, what is your Discord? Thanks and greetings, J from Holland.