The SynthEyes king.
And he uses Blender too. Quite a wonderful combination.
Maaaan... With all due respect... Russ Andersson is a genius at making 3D match moving software, but quite frankly the worst at explaining how it works. Your videos are what I've been looking for, for years! Thank you!
Part 3, part 3!
Really looking forward to Part 3. Learned so much from watching your videos.
This is just gold, no, more than gold. Really priceless, thank you for sharing knowledge and wisdom. Now I know how to lock down the orientation/scale, what a time saver!
Great to see you here! The follow-up video is on the way. ;-)
Apologies for peppering your vids with comments, but I would LOVE to see part 3.
You and everyone else (me included)! Truth is, I have recorded it and it just needs to be edited. Trouble is, I have been crushed with work for over a year. ¯\_(ツ)_/¯
@@MatthewMerkovich No problem, Mr. Merkovich, I'm glad to hear you're doing well (at least in the biz-is-booming sense). I'll keep my eyes *peeled*!! Thanks for the response! You're great.
I love that series. Bob Odenkirk plays this mad tracker so convincingly.
Super excited to see part 3!
That had to be delayed, but the good news is that there are new features in the newest version of SynthEyes that just came out that really make this process easier.
@@MatthewMerkovich Please do that part 3 with the new features :) we need it
I’m excited for part 3
Ok, watched it! Deserves a second like! 😉😉
LOL!
Your videos are so helpful! I'm really curious now how this data can be used for other shots. Looking forward to part 3 😌🙏
Like before watching
I can't wait for part 3, thanks Matt!
No matter how much I've learned about SynthEyes since 2006, I still continuously find bits I never knew about in your tutorials. Locking the tracking to keep a solved scene put when re-solving is something I will now use religiously moving forward. Lol, I just got used to placing the scene as part of my own personal checklist and never sought out a solution. Thanks for the clear explanations of your process. Love it all.
Every single video of yours is a world of wisdom. I used MatchMover back in 2002 and later C4D's own tracker, and I can't tell you how much I learn from every single video of yours. I am super excited that you are recording videos for BorisFX; as far as tracking goes, it doesn't get any better than that!
From the bottom of my heart, thank you Matthew!
Matchmover! Now that's a blast from the past. I've tracked in just about every app imaginable and I've never found anything better than SynthEyes. Thanks for following along. More videos are on the way!
@@MatthewMerkovich Thank you very much and I can't wait, really excited for your next videos 🤩🤩
Great work as always Matt, thank you for creating and sharing!
There is always a gem in your videos, sometimes a full bag! :)
Thank you again for putting out such high end content!
Thanks so much! I have been working in 3D for about 10 years now. I have been wanting to learn the VFX/compositing side more, and I knew I needed to figure out the tracking parts of it. I am not at this level yet; the reason for things is lost on me right now, but I know eventually it will make sense. Thank you for sharing your knowledge, this is super generous of you! I have been holding off on buying SynthEyes for a while, but I think it's time now.
Great stuff as always Matthew, thank you so much. Can't wait to watch Part 3!
You're like, better than anything on Netflix. Is it normal that I find match moving / solving cameras more fun than actually creating the effect that I matched it for?
I've often said my favorite thing about tracking is that I get absolutely zero creative notes.
"The track feels like it has too much of an ephemeral quality and it feels like it wants to be more solid without feeling too contrived." Anyone who has worked in VFX knows exactly what I am talking about! 🤣
Fantastic series!
In the Resolve Deliver page, you can right-click the thumbnail of the clip, and pick "Render This Clip" to automatically set the in and out points to a single timeline clip.
Absolutely, Jon! I just didn't want to get too deep on the Resolve side of things. My initial string-out of all the media for this tutorial was over two hours long! I should really have spent a lot more time editing the SCRIPT! But I love and appreciate comments like this.
I believe all 3D tracking departments need an editorial department of their own. Minimally, every animator should have access to editing software for QC and more.
@@MatthewMerkovich Sweet. I guess that's how you pack so much knowledge into such a concise video. Much appreciated.
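For anyone who would rather script that step than click through the Deliver page, here is a minimal sketch of the same render-just-this-clip idea using Resolve's Python scripting API. It assumes Resolve is running with external scripting enabled and that the clip you care about is the first item on video track 1; treat it as a starting point, not a drop-in tool.

```python
# Hypothetical sketch: set the render range to a single timeline clip via
# the DaVinci Resolve scripting API, mirroring the "Render This Clip" tip above.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
timeline = project.GetCurrentTimeline()

clip = timeline.GetItemListInTrack("video", 1)[0]  # assumes the clip is first on V1
project.SetRenderSettings({
    "MarkIn": clip.GetStart(),     # timeline frame where the clip begins
    "MarkOut": clip.GetEnd() - 1,  # GetEnd() is usually exclusive; drop the -1 if yours isn't
})
project.AddRenderJob()
project.StartRendering()
```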
Thanks Matt.
Scaling has been my biggest problem. I'm looking for the fastest method. Doing a couple of distance constraints and then manual alignment seems like the way to go. Right? lol
SET SEED!! Even better.
6:02 It's for this reason I wish all NLEs used the same current time indicator as Avid... or the same concept, with two lines that surround the current frame. Premiere at least has a little line at the top showing you the whole frame, but it's not as good, and Resolve just completely ignores it lol.
Yeah, one of the few things I actually like about Avid. 😆 It's a totally valid criticism. Believe me, I have a laundry list of my own complaints about Resolve. But as I regularly say: "All software sucks. You just have to find the application with which you can tolerate having an unhealthy co-dependent relationship."
@@MatthewMerkovich That is so true
I've been practicing making site surveys to generate point clouds using your process. Can you give us an early hint of how you use site survey tracking with tracking done on the production footage? I don't want to steal your thunder on the next video, but I understand it takes time to make tutorials.
Really like your tutorials. It feels like they are real-world problems and not footage shot just to do a tutorial.
Yes please, did you find a solution for this?
Hi Matthew,
Thank you so much for making these incredible tutorials!
I have a few questions, if you don't mind.
It's about scanning green screen survey footage.
-Do you need to scan the entire studio? Meaning, can you shoot just the cyc and have enough tracking data?
-Would the process be exactly the same? After all, a green screen cyc wouldn't have as much parallax as this footage, no? I noticed that in your previous video there was only one c-stand in the foreground. Would it be better to have more stuff spread out in the mid-ground and the foreground, or would it just get in the way?
-If the soundstage/studio is really big, the opposite side to the cyc is really dark. And if the footage has too much noise on the studio side, would you denoise the two halves separately?
Thank you again!
The more parallax the better. Having things in the foreground, mid-ground, and background is always best.
I'm very fond of your tutorial videos.
Are there plans to continue the Set Survey series?
Yeah, I definitely want to finish it. That's the problem with these expansive tutorials: they suck up a lot of time that could be used for smaller, but very useful, other tutorials. After this one, I'll focus on doing just that.
@@MatthewMerkovich Cool.
I'm looking forward to them. 🙂
I feel like I should be paying for this
I've gone through the second part, but I didn't see how you recorded the actual camera data used in the field. Did I miss something, or does SynthEyes not require recording the camera parameters during actual shooting?
No, SynthEyes requires no information about the camera. The solve results will always be more accurate than anything written down during production.
Hi, do you think part 3 will be up anytime soon, or will it be delayed for a looong time? Thanks in advance for creating this amazing content.
It's been killing me that I haven't been able to do it. Actual tracking work has saturated my entire schedule. I guess the good news is that I am hiring?
@@MatthewMerkovich Yes please, we need to know that last part! I can't find anything like this anywhere else; your SynthEyes content is amazing. And BTW, happy your work is going amazing. You need to hire so you can make time for the tuts! 🙏
@@MatthewMerkovich any news on part 3 please! 🙏🙏🙏
@@KenedyTorcatt Once I have a few more pieces in place and can hire some more people, I will free up some time to do it hopefully.
@@MatthewMerkovich hopefully part 3 will be ready! Please 😅
Matthew, I got a quick question:
Back in the day, when we were working with RealViz / Autodesk Matchmover, we could lower the sensitivity of the 2D tracker. By default, the sensitivity of the 2D trackers in Matchmover was set to 1/32 of a pixel, and if the footage was too noisy or blurred we would lower it to 1/16 of a pixel, which would make the trackers stick better.
Is there a way in SynthEyes to do something similar? Once I place a supervised 2D tracker, is there a way to lower or raise its sensitivity before I let it track the X and Y info?
Thanks in Advance.
Are you talking about sub-pixel tracking? Sounds like it. Matchmover was something I never used. To "lower" the sensitivity of the tracker, you might consider the rez tools in the Image Preprocessor. Blur in the Image Preprocessor might also eliminate some of the high-frequency noise affecting your 2D tracker.
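Not a SynthEyes feature, but to make that blur idea concrete outside the app, here is a tiny OpenCV sketch of the same trick: knock down high-frequency noise before running a 2D pattern match. The file names are placeholders and the kernel size is just a starting guess.

```python
# Illustrative only: pre-blur to suppress sensor noise before 2D template matching.
# This shows the general principle behind blurring in an image preprocessor,
# not SynthEyes's actual implementation.
import cv2

frame = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)         # placeholder path
pattern = cv2.imread("tracker_pattern.png", cv2.IMREAD_GRAYSCALE)  # placeholder path

# A light Gaussian blur removes high-frequency noise that makes trackers jitter;
# too much blur smears the feature itself, so keep the kernel small.
frame_blur = cv2.GaussianBlur(frame, (5, 5), 0)
pattern_blur = cv2.GaussianBlur(pattern, (5, 5), 0)

# Normalized cross-correlation: the peak location is the pattern's best 2D position.
result = cv2.matchTemplate(frame_blur, pattern_blur, cv2.TM_CCOEFF_NORMED)
_, score, _, (x, y) = cv2.minMaxLoc(result)
print(f"Best match at ({x}, {y}) with score {score:.3f}")
```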
Can we track long shots like CHILDREN OF MEN, in which no single point is visible for even 100 frames (lots of camera movement and more than 1,000 frames)? If yes, then how do we track and matchmove it?
Pretty much any shot is trackable. There are, of course, exceptions: tracking a polar bear's body, at the North Pole, in a blizzard, at midnight, all photographed out of focus.
It's also worth discussing the terms "track" and "match-move." I prefer more specific language: camera track, object track, or rotomation. I've heard people use the term match-move to mean any of the above.
How do you track shots like CHILDREN OF MEN? My 2D tracking series is how. I've tracked shots that were minutes long using those techniques. Here's the playlist: ruclips.net/p/PLg9owbm1ts8Q_gMY96dwJHTgo9wiQbAD1
@@MatthewMerkovich thanks a lot sir
@@uchihaitach545 You are very welcome. And I thank **you** for writing "a lot" and not "alot." 😉 Nothing goes better with good tracking than good grammar! 😁