Meshroom | Photoscan to Camera Track (Matchmove)

  • Published: 26 Jan 2019
  • Today we use free photoscan software to obtain a camera track (and mesh!)
    Final Shot: vimeo.com/341470345
    Software:
    Meshroom
    alicevision.github.io/#meshroom
    Blender
    www.blender.org/download/
    Maya
    www.autodesk.com/education/fr...
    Music:
    -TeknoAXE - Ambient Intermission
    -TeknoAXE - Synthwave C
    Join my Discord!
    / discord

Comments • 81

  • @UncleBurrito15 · 4 years ago +61

    With Blender 2.8 you no longer need to fix the Alembic file.

    • @marioCazares · 4 years ago +5

      You're right, thank you for letting me know!

    • @jeremybot · 4 years ago +1

      Ya beat me to it. Great to give Meshroom another try, knowing I can bring in my cameras too. Plus this node workflow... !!

    • @killianpreston4125 · 4 years ago +1

      Blender still crashed when I tried it on 2.8.1.

    • @SuperDao · 2 years ago

      Hello! When I import the Alembic file in Blender it doesn't crash, but I get these components: mvgRoot > mvgCameras; mvgCamerasUndefined; mvgCloud, and I can't see the camera itself. What should I do?
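
Recent Blender versions (2.8+) can import the Meshroom Alembic directly, and the names listed above (mvgRoot, mvgCameras, mvgCloud) are just the empties Meshroom nests its objects under. A minimal sketch, assuming a hypothetical file path, that imports the .abc and picks out the animated camera:

```python
import bpy

# Hypothetical path to the StructureFromMotion Alembic exported by Meshroom.
bpy.ops.wm.alembic_import(filepath="/path/to/sfm.abc")

# Meshroom nests its objects under an "mvgRoot" empty (mvgCameras, mvgCloud, ...).
# The import usually leaves the new objects selected, so grab any camera among
# them and make it the active scene camera.
cams = [ob for ob in bpy.context.selected_objects if ob.type == 'CAMERA']
if cams:
    bpy.context.scene.camera = cams[0]
    print("Using imported camera:", cams[0].name)
else:
    print("No camera found; check the mvgRoot > mvgCameras hierarchy in the Outliner.")
```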

  • @ikbenikHD · 4 years ago +4

    Really smart, never thought to do a camera track this way and you've explained it nicely!

  • @stevesloan6775 · 4 years ago +1

    Great delivery!!!
    Sharp and sweet with just enough information. Please produce more Meshroom videos. 🤜🏼🤛🏼🇦🇺🍀🤓

  • @LimeProd · 5 years ago +2

    What a great tutorial... never thought Meshroom could do that... brilliant.

  • @mrbob92679 · 3 years ago +1

    Excellent. Been trying to learn Meshroom. Can't get a scan to come out. I now know it's all in the camera. Going to try this method. You did a great job and kept it to the point! Looking forward to learning more from you.
    Thanks

  • @gamerloud7416 · 3 years ago +1

    You just made my life easier. Thanks a lot 👍

  • @sideeffectsstudios · 2 years ago +1

    That’s very informative. Thank you mate 🤟

  • @throstur_thor · 5 years ago +5

    This tutorial absolutely nails it! I would love to know if there are any special considerations for film gates/resolution in post-production?

  • @mawztv · 5 years ago +2

    Great tutorial, thanks

  • @woolenwoods665 · 4 years ago +1

    Thanks for sharing, man!

  • @raquel.reigns · 4 years ago +2

    You're awesome, Mario!!!

  • @khalatelomara · 3 years ago +2

    You saved my life :-D

  • @enriquebaeza9949 · 4 years ago +1

    Thanks Mario, I will try it

  • @Bruets · 4 years ago +1

    Did you have to do any sort of retopo to get this to render out/preview faster? I've just created my first stunt double using Meshroom and rigged it within Blender, however the topology is horrendous (obviously). Great tut mate. For moving objects (a person) I guess you could get them to stand still in the video, then separate them from the mesh afterwards and rig them. You'd need to remove them from the original video, but AE content-aware fill does a pretty good job if they're against a plain background.

    • @marioCazares · 4 years ago

      Hey, sorry for just getting to you now. I did not do any retopo, but I did use Decimate on the mesh in Blender to remove a lot of geometry. Glad you could use it for a stunt double; I tried to scan people twice and failed, haha. If you ever upload anything with the scan, feel free to share a link!
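
The Decimate step mentioned here can also be scripted. A minimal sketch, assuming a hypothetical object name for the imported scan and an arbitrary ratio:

```python
import bpy

# Hypothetical name for the Meshroom scan imported as an OBJ.
scan = bpy.data.objects["meshroomScan"]

# Add a Decimate modifier to throw away most of the dense photoscan geometry.
mod = scan.modifiers.new(name="Decimate", type='DECIMATE')
mod.ratio = 0.1  # keep roughly 10% of the faces; tune to taste

# Apply the modifier so the lighter mesh is what gets rigged and rendered.
bpy.context.view_layer.objects.active = scan
bpy.ops.object.modifier_apply(modifier="Decimate")
```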

  • @mikealbert728 · 4 years ago +1

    Thanks for this tutorial. Is there any way to get the camera with just a few points to determine a ground plane, to speed up the process?

    • @marioCazares · 4 years ago

      I don't know which settings will make the process faster, but I do know they exist. I don't mess around with this software enough to know, but others might be able to help you get faster scans, or you could just lower the settings in each node.

  • @funny1048youtube · 4 years ago +1

    Is there a way to get lens distortion data from this program into Blender, to give the render the same lens distortion values as the footage and improve the matchmove?

    • @marioCazares · 4 years ago

      At the time of this video I wondered the same thing, but didn't put in enough time/research to find out. Meshroom has been updated since this video, so it may have tools for exporting lens distortion. I'll let you know if I find out!

  • @RusticRaver · 1 year ago +1

    ace

  • @no-trick-pony · 4 years ago +2

    Wow, how did you do the water person? This is super impressive! I only know a little bit about fluid simulation in Blender but I would have no idea how to achieve something like this. Any hints? ^^

    • @marioCazares · 4 years ago

      In short, I used a fluid sim tool in Maya called Bifrost. However, the same should work in Blender with Mantaflow (which is now included in the main branch!). What I did was animate a very low-poly model. Then I filled the model with fluid (by duplicating and shrinking the main model and making it a fluid object). Lastly, I used the original model as a collider. This makes the liquid follow the shape of whatever model you use. Sometimes water would spill out of the model because the object doesn't have thickness; however, it added a cool effect, so I didn't try to fix it. If you have any more questions let me know, I'd be happy to help!
      Here's the final shot if you're interested!
      vimeo.com/341470345
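
A rough sketch of the equivalent Mantaflow setup in Blender (2.82+), assuming hypothetical names for the domain, the shrunk fluid source, and the animated character used as a collider:

```python
import bpy

# Hypothetical object names: a domain box, a shrunk duplicate of the
# character used as the liquid source, and the animated character mesh.
domain = bpy.data.objects["FluidDomain"]
source = bpy.data.objects["FluidSource"]    # duplicated + shrunk character
character = bpy.data.objects["Character"]   # original animated character

# Domain: the container that runs the liquid simulation.
mod = domain.modifiers.new(name="Fluid", type='FLUID')
mod.fluid_type = 'DOMAIN'
mod.domain_settings.domain_type = 'LIQUID'

# Flow: the shrunk copy provides the liquid inside the character.
mod = source.modifiers.new(name="Fluid", type='FLUID')
mod.fluid_type = 'FLOW'
mod.flow_settings.flow_type = 'LIQUID'

# Effector: the animated character is a collider, so the liquid follows
# its shape (it can spill where the mesh has no thickness, as noted above).
mod = character.modifiers.new(name="Fluid", type='FLUID')
mod.fluid_type = 'EFFECTOR'
mod.effector_settings.effector_type = 'COLLISION'
```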

  • @nathancreates · 5 years ago +2

    Awesome tutorial mate, this is a great option for camera tracking!
    Is there any other software you can use to convert the .abc file where you don't give away your commercial rights?

    • @marioCazares · 5 years ago

      Hello, thanks for watching! I looked for a few days at the time and didn't find anything. I still haven't found an option, but perhaps one will come up. Sorry, I hope you run across one!

    • @MrMargaretScratcher · 4 years ago +1

      Now working in Blender 2.8, apparently...

  • @Waffle4569 · 1 year ago +3

    Blender can import the Alembic file fine now, so there's no need to do any other conversions.

    • @marioCazares · 1 year ago

      You're right, thanks for letting me know!

  • @roberthinde6577 · 3 years ago

    Hey, nice work. Any way of exporting the camera for use in Maya (yes, I know) rather than just using Maya to fix the file? Like, any way of importing to Maya and fixing the orientation issues, to then throw into something like Nuke or Natron?

    • @marioCazares · 3 years ago

      For Maya, you'll just import the Meshroom camera and OBJ straight into Maya and skip all the Blender stuff. To reorient it, add a locator in the middle of the scene > parent the mesh and the camera to the locator > then translate, rotate, and scale the locator as you please to orient the scene in a more practical way. Then you can just select the camera and mesh and export them as Alembic for use in Nuke or Natron (you don't have to select the locator on export).
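
A minimal maya.cmds sketch of that locator workflow; the node names, frame range, and output path are hypothetical, and the AbcExport plugin must be available:

```python
import maya.cmds as cmds

# Hypothetical node names for the imported Meshroom camera and OBJ mesh.
cam, mesh = "mvgCamera1", "meshroomMesh"

# Group both under a locator so the whole scene can be reoriented at once.
loc = cmds.spaceLocator(name="sceneOrient")[0]
cmds.parent(cam, mesh, loc)

# Translate/rotate/scale the locator to put the scene in a practical orientation.
cmds.setAttr(loc + ".rotateX", -90)
for axis in "XYZ":
    cmds.setAttr(loc + ".scale" + axis, 0.1)

# Export only the camera and mesh as Alembic for Nuke/Natron; -worldSpace bakes
# the locator's reorientation into the exported roots, so the locator itself
# does not need to be exported.
cmds.loadPlugin("AbcExport", quiet=True)
cam_path = cmds.ls(cam, long=True)[0]
mesh_path = cmds.ls(mesh, long=True)[0]
cmds.AbcExport(j="-frameRange 1 120 -worldSpace -root {} -root {} -file /path/to/track.abc".format(cam_path, mesh_path))
```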

  • @mrtjackson · 8 months ago

    How does this tutorial match up to the more recent versions of Meshroom?

  • @MilanKarakas · 4 years ago +1

    I need help because I am a newbie in Blender (2.8). How did you import the video alongside the camera track in Blender? Where do I import it, and how? I need to overlay the video (from .png files) onto the mesh made in Meshroom. Thanks.

    • @MilanKarakas · 4 years ago +1

      Oh, I was wrong in my other reply (it will be deleted to avoid confusion). In Blender 2.80, first delete the cube and the camera. After importing the .obj and .abc files, select the camera from 'mvgRoot' and check 'Background Images', click Add Image, click Open, and select the first image that was used for Meshroom. Change Source from Single Image to Image Sequence, change Frames from 1 to the number of frames you can see on the playback track +1 (if it says End: 62, then there are 63 frames in total because it includes frame 0), change Start to 0 and Offset to -1. If those numbers are wrong, you may notice the mesh and background image are mismatched. I also enabled Cyclic, but I can't see the difference; it's just for convenience, as the author of this video did.
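
The same background-image setup can be scripted. A minimal sketch, assuming Blender 2.8x, a hypothetical camera name and frame path, and the frame numbers from the comment above:

```python
import bpy

# Hypothetical name for the camera that came in under mvgRoot/mvgCameras.
cam = bpy.data.objects["camera_1"]

cam.data.show_background_images = True
bg = cam.data.background_images.new()
bg.source = 'IMAGE'
bg.image = bpy.data.images.load("/path/to/frames/frame_0000.png")
bg.image.source = 'SEQUENCE'

# Frame numbers from the comment above: End of 62 means 63 frames in
# total (frame 0 included), starting at 0 with an offset of -1.
bg.image_user.frame_duration = 63
bg.image_user.frame_start = 0
bg.image_user.frame_offset = -1
bg.image_user.use_cyclic = True
```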

    • @MilanKarakas · 4 years ago +1

      Additional info: in order to correct the orientation of the object while keeping the camera motion in sync, first select both the camera and the mesh before rotating and moving. Otherwise they will be mismatched.

    • @marioCazares · 4 years ago +1

      You are correct, you follow the steps you stated with "Background Images". "Background Images" is in a different place in 2.8 than in 2.79 (I use 2.79 in this video), but it seems you found it. Hope you found all the answers you needed, and sorry for only now getting to you (I was away from YouTube for some months).

    • @MilanKarakas · 4 years ago +1

      @marioCazares Yes. Everything is slightly different, but eventually anyone can find the solution. No worries about being late in answering. In the past two months I have learned a lot about Blender, and I like the software even more.

  • @TheGladScientist · 3 years ago

    Tried this out, but the camera animation seems super choppy... is it possible to convert the Alembic camera to an FBX to better control FPS and/or keyframing?

    • @marioCazares · 3 years ago

      I think Meshroom only supports Alembic for now. You should be able to Bake Action on the Alembic camera and then manipulate the keyframes.
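
A minimal sketch of that bake, assuming the Alembic-driven camera is the selected and active object:

```python
import bpy

scene = bpy.context.scene

# With the Alembic camera selected and active, bake its cache-driven motion
# into ordinary keyframes so they can be edited or exported (e.g. as FBX).
bpy.ops.nla.bake(
    frame_start=scene.frame_start,
    frame_end=scene.frame_end,
    visual_keying=True,       # capture the transform coming from the Alembic cache
    clear_constraints=True,   # drop the Transform Cache constraint afterwards
    bake_types={'OBJECT'},
)
```

Running this from the Scripting workspace with the imported camera active should leave plain per-frame keyframes on the camera object.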

  • @jellykid · 3 years ago

    Lol, you skipped the one thing I needed to know: how do you add the original footage into the shot in Blender (in my case it would be C4D)?

    • @Gringottone · 3 years ago

      Place your image or video as a "projection" or "screen" on the background of the scene.

  • @SuperDao · 2 years ago +1

    Hello! I hope you're doing well. Meshroom and Blender have both had many updates; is it possible to do an updated tutorial too, please?
    Have a good day!

    • @marioCazares · 2 years ago

      Possibly, however it may be a while because life is pretty chaotic right now. In the meantime, a really good tutorial out there currently for scanning is here: ruclips.net/video/j3lhPKF8qjU/видео.html

    • @SuperDao · 2 years ago +1

      @marioCazares Thank you a lot for your answer. I hope you're going to be well. Tell yourself it's temporary! (I'm not good at comforting, and my English isn't that good, sorry :/ )
      I'll follow the course, thank you again!

  • @blackpinkkpop2048 · 3 years ago

    How did you import the video into the background in Blender?
    Thank you

    • @marioCazares · 3 years ago +1

      In Blender 2.79 I just opened the side panel with the "N" key and checked the "Background Images" box. Select "Add Image" > Open, and then select your video or image sequence. In Blender 2.8 it's the same steps, except it will be in the Camera settings and NOT the "N" panel.

  • @800pieds · 3 years ago

    Have you tried aligning with the original footage in Blender? Do you observe that it's shifted? Do you know how to fix it?

    • @marioCazares · 3 years ago

      Usually this happens when your footage in Blender is a frame off. Try using the frame offset on the background footage and see if going a frame or two forward or backward works.

    • @800pieds · 3 years ago

      @marioCazares Could have been, but I knew that one and it was not that. I used a photogrammetry import add-on and it aligns much better now. Still no idea why it didn't when I did it manually. Maybe something to do with lens distortion? Could it be that the model is based on undistorted shots and thus doesn't fit the original footage if it was distorted?

  • @rickyferdianto6335 · 4 years ago

    How long did it take to scan this with the default settings?

  • @Sylfa · 4 years ago +1

    Why not use Blender's camera tracking, though?
    Okay, so if it works without any issues you can just as well use the one from Meshroom to save a bit of time, but if it requires fixing first you might as well just load the image sequence (which you probably exported from Blender in the first place) and do a quick camera track.
    It'll probably be faster than fixing the Alembic file.

    • @marioCazares · 4 years ago +2

      The advantage is that you get a camera and a full scanned mesh of your scene that are already aligned. If you camera track in Blender, you would need to manually match your photoscan geometry to a new camera, which would be very difficult in organic scenes like this one.

    • @Sylfa · 4 years ago +2

      @marioCazares Fair point, though as long as you have something to align with in the scene it wouldn't be too hard either.
      It just seemed like most of the video was about how to fix the problems that arose when trying to import Meshroom's track into Blender. Considering how good Blender is at tracking it seemed roundabout, but as you say, you get the alignment with the mesh for free that way.

    • @SuperDao · 2 years ago +1

      @marioCazares Hello! Firstly, thank you for the tutorial, and secondly, is there any other way to align the photoscan + a new camera (without using Meshroom tracking)? If not, what's the best way to manually match a camera track with the 3D environment (from photogrammetry)? By the way, for video footage with a "human moving" inside it, is there a way to "mask" the human so that you can still get a camera track + some mesh?
      Sorry for my English :s it's not my primary language

    • @marioCazares · 2 years ago +1

      @SuperDao Hello there. The way I match photoscans and new cameras is actually by eye. I look through the camera and find features in the photoscan that match my image sequence, and I align them while in camera view.
      I have a video that talks exactly about this here: ruclips.net/video/4Ga_CrUPwbU/видео.html
      Start at 8:36 and it will explain exactly what I do in a real shot I've already done. I hope it will help!
      Also, for a "human" in tracking you would just disable trackers as they get covered, and for scanning I actually don't know. Maybe you can roto the "human" out before bringing the images to Meshroom? I've never tried that before, so you might have to experiment with that one.

  • @npc.artist · 4 years ago +1

    My camera didn't match with the footage, but this is a great tutorial btw.

    • @marioCazares · 4 years ago +1

      Sometimes the camera doesn't solve correctly, or the footage could be offset by one frame forward or back.

    • @npc.artist · 4 years ago

      @marioCazares The camera angle is incorrect too ; w ;

    • @joserendon1025 · 3 years ago

      @npc.artist Make sure your camera settings in Blender match the settings of the camera you took the video with.
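
A quick sketch of matching those settings via Blender's Python API; the camera name is hypothetical and the focal length, sensor width, and resolution are example values only:

```python
import bpy

# Hypothetical camera name; the numbers are example values only.
cam = bpy.data.objects["camera_1"].data

cam.lens = 24.0           # focal length of the real camera, in mm
cam.sensor_fit = 'HORIZONTAL'
cam.sensor_width = 36.0   # physical sensor width of the real camera, in mm

# The render resolution should also match the source footage.
scene = bpy.context.scene
scene.render.resolution_x = 1920
scene.render.resolution_y = 1080
```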

  • @constantianossborn4628 · 4 years ago

    Is there a Mac version? :-)

    • @marioCazares · 4 years ago

      Unfortunately, there is not :( This is the only information about getting it on a Mac: github.com/alicevision/meshroom/wiki/MacOS

  • @HeroSnowman · 5 years ago

    How long did it take to compute the whole mesh?

    • @marioCazares · 5 years ago

      It's been a while so I don't remember exactly, but I believe the whole process was a couple of hours on my laptop.

    • @HeroSnowman · 5 years ago

      @marioCazares Did you change any node settings?

    • @marioCazares · 5 years ago

      I went back to the file to check, and no, I left everything at default for this shot, mostly because I didn't understand most of the settings at the time.

    • @HeroSnowman · 5 years ago

      @marioCazares Though from the logs of each node you can calculate the total time it took.

    • @marioCazares · 5 years ago

      @HeroSnowman Thanks for the tip. Okay, so wow, this took a lot longer than I remember. The total was 9.2 hours! Guess I left it overnight.

  • @altamiradorable · 3 years ago

    Can you talk faster?

    • @marioCazares · 2 years ago

      My speed is capped unfortunately ;_: