Hey man, I love the tracking, it's super easy to set up and the results turn out awesome, but there are constant minor flicks when I render the video. Do you think you can give me a solution to my problem? Thanks!
Thanks! Select the camera, go into the Graph Editor, select all rotation axes and then use the Euler filter: Graph Editor > Key > Discontinuity (Euler) Filter. Working on a fix, but this should do the trick for now.
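For anyone curious what the Euler filter actually does: it "unwraps" each rotation channel so neighbouring keys never jump by a full turn, which is what causes the visible flicks. A rough sketch of the idea in plain Python (not the add-on's actual code):

```python
import math

def euler_unwrap(angles):
    """Shift each angle by multiples of 2*pi so consecutive
    values never jump by more than pi (half a turn)."""
    out = [angles[0]]
    for a in angles[1:]:
        prev = out[-1]
        # pick the representation of `a` closest to the previous key
        while a - prev > math.pi:
            a -= 2 * math.pi
        while prev - a > math.pi:
            a += 2 * math.pi
        out.append(a)
    return out

# a rotation channel that flips from +179 deg to -179 deg (a visual flick)
chan = [math.radians(d) for d in (170, 179, -179, -170)]
smooth = euler_unwrap(chan)
# after unwrapping, neighbouring keys differ by only a few degrees
```

Blender's Discontinuity (Euler) Filter does essentially this per rotation f-curve.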
Anytime I try to import the zip file it says "BadZipFile: file is not a zip file". I checked that it's not because I exported multiple recordings; I unzipped it and it's all the JSON and MOV files, but none of those import independently either. What should I do?
Update: I got frustrated earlier because it wouldn't work, but I was just lying in bed and thought: what if it's because I used Google Drive? I emailed it to myself instead, downloaded it, and it worked like a charm. Fair warning for anyone having troubles: don't upload it to Google Drive, it does something funky to the file. Cheers, so glad I went back and tried a different approach. Much love from Washington DC.
Hm.. did you ensure you are importing a zip containing .json files? If you exported multiple files, you've got to unpack the outer .zip first; it contains the .zips which hold the data to import..
Whenever I import my data, the video is perfectly tracked but it plays at 2x speed! I don't know why this happens, but if anyone knows the solution please let me know, thanks!
Guess the issue is the scene's frame rate. If you are shooting with a 30 Hz device (Android), make sure Blender runs at 30 fps. If you are shooting on a 60 Hz device (iOS), use 60 fps.
That's not possible with the app. At the moment I wouldn't know how to develop something like that; I would use Blender's regular motion tracking system instead. It's pretty strong and fast; however, it involves more steps.
@@calmzar1 this actually could work :D but it's hard to get exactly the same lens. I actually want to test that in the future. Btw, maybe there is a "lens correction" video tool around that can use the mobile device's video as a solver and distort the DSLR video :o
Dude this is hands down the easiest and best way to get realistic handheld motion in blender. You're a real legend!
I haven't fully tested this yet. But this is amazing.
I spent two days trying to get a tracking solution in Blender for my project, shooting all sorts of weird angles with lots of parallax and movement; the scene just didn't work, but I couldn't change location to something better. Then I found out about this app, installed it, and had the shot I wanted loaded into Blender in 15 minutes with perfect tracking 😳 Amazing app. It completely changes the types of tracking shots that can be recorded. Most of my shot was even pointed at the sky, and it keeps perfect reference to all my ground markers even when it's looking away from the ground for over a minute. I don't know how this isn't the norm for camera phone tracking shots (maybe it is and I'm late to the party here? 🤔)
The norm for camera tracking shots would be CamTrackAR; that's what most people use. However, it's iOS only, and this app works for both iOS and Android. Also, I've heard this app is just fundamentally better than CamTrackAR.
This app + add-on combo truly changed video tracking. I love it. Thanks for your showcase!
Ok, so after troubleshooting with this for about an hour I got it working, and it is so, so good and very helpful. Thank you for this.
Thank you so much for this. It's lowering so many barriers for creativity and filmmaking 🍻
This is phenomenal !!! You are amazing. Thanks so much :)
Damn the possibilities this brings..thanks will try it and bring feedback
You are doing amazing job!!! Please keep it up!
Great work, a big help for blender community
Hey, your app is amazing very easy way to make tracking in blender .Good job!
what a legend!
Thank you bro. Great.
man you are amazing Thanks for sharing the tutorial you are really a true gem for animation community. you are making animation and vfx affordable for everyone. ThankYou So Much 👐
Really great! Thank you!!
that's really awesome
AWESOME DUDE!!
This plugin is awesome bro
amazing tutorial, thanks
Very nice.Thx! 👍👍
Subbed 😉
perfect thank you
wow this is nice, thanks, and it's free, you really did great
Thank you so much brother ❤
The background video is not showing when I switch to rendered view; it only shows in viewport shading. How do I fix that?
Thanks man, what wonderful work you've done, it's truly incredible.
I have a small question, my dear friend.
How can I place the floating camera points exactly onto the 3D ground grid?
When I move or scale everything, the camera goes back to the same place it was. Is there any way to do that?
Thanks man for your great work!
Glad you like it!
The easiest approach is an empty object as parent of all the objects you want to change.
You can scale, rotate and translate the parent object as you want (and with it the child objects it contains).
You can also just add a parent to the camera, or whatever else you would like to change. The parent is necessary because the objects are already keyframed - keyframed objects will always jump back to their original keyframed position.
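For the curious, the reason the parent trick works: the keyframes overwrite each object's local transform on every frame, but the parent's transform is applied on top of that and is free to edit. A tiny plain-Python sketch of the composition (just uniform scale plus translation, the usual "fit the track to the scene" case; all names are made up for illustration):

```python
def world_position(parent_scale, parent_offset, keyed_local):
    """Child world position = parent transform applied to the
    keyframed local position (uniform scale, then translation)."""
    return tuple(parent_scale * c + o for c, o in zip(keyed_local, parent_offset))

# keyframed camera positions (these snap back if you edit the camera directly)
keys = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.5)]

# scaling the empty 2x and moving it to (10, 0, 0) moves the whole take,
# while the keyframed local motion stays untouched underneath
moved = [world_position(2.0, (10.0, 0.0, 0.0), k) for k in keys]
```

The keyframes never change; only the outer transform does, which is why the empty can be moved and scaled freely.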
WHAT A LEGEND!!
Awesome plugin! :) Would it be possible to use this app to track, and then apply the tracked data (with an offset) to other video footage? :)
Don't think that's possible
That's a great piece of software, BRAVO. I am struggling to put the result on the same scale as my modeling. Am I missing an option?
You can parent the camera, markers etc. to an empty object, then just transform the empty however you like (e.g. scale it up, move it around till everything matches).
AWWWWWEEESOME
Sweeet! Thanks! 👍
Thank you! I have one more question: what resolution does the app record at?
np ;)
The options are VGA, HD and FHD. But it depends on the device which resolution is possible without major frame drops.
This is awesome! But what is the minimum phone requirement for this to work? (Android/iphone)
ARCore / ARKit support is required. As usual, the more performant the phone, the better the results!
Can I use this app to get realistic camera animation without the background, just the camera keyframes?
Sure, you can easily detach the video file in Blender.
You could also extract the .zip and import the ".._cam.json" file while having an object selected. It will add the location and rotation to the selected object without importing other things - but it will throw an error, as the add-on is not supposed to be used that way (but it works and shouldn't matter..).
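If you'd rather script that step, the idea is just to read the JSON and key location/rotation onto the selected object. A rough sketch of the parsing half in plain Python; note that the field names (`frame`, `location`, `rotation`) are hypothetical here, the actual _cam.json layout may differ:

```python
import json

def read_cam_keys(path):
    """Parse a (hypothetical) camera JSON into (frame, location, rotation)
    tuples, ready to be keyframed onto a selected object."""
    with open(path) as f:
        data = json.load(f)
    keys = []
    for entry in data:
        keys.append((entry["frame"],
                     tuple(entry["location"]),
                     tuple(entry["rotation"])))
    return keys

# inside Blender you would then loop over the keys and do roughly:
#   obj.location = loc; obj.keyframe_insert("location", frame=frame)
#   obj.rotation_euler = rot; obj.keyframe_insert("rotation_euler", frame=frame)
```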
Thanks, good tip, congrats.
Hi, can you suggest the best way to get shape keys onto your face mesh so I can use it to retarget them onto a separate face mesh (e.g. from MB-Lab)? Happy to pay for this if you put it on Blender Market and provide some support. Great job! Also, can you set it up to record audio so it can be imported into Blender and synced up as part of the process?
Hey Ian, currently I am not working with shape keys. At the moment I could only make it an option on iOS (sadly not yet on Android...). I'll probably develop a separate iOS app in the future. As soon as it's possible on Android I'll implement it for sure; I'd personally prefer that option.
@@cgtinker Sir, can you please make a video on how to use DSLR video instead of mobile footage in Blender for the background video?
Hello,
Please help me, I can't download the blendartrack app. My issue:
"Your device isn't compatible with this version."
Any solution?
Not really; ARCore or ARKit support is required to run the application..
When I render, the tracking video doesn't appear. Could you show how to render this cool stuff?
I use a big background plane with the video as a texture on it to get the video into the render. The texture coordinate on it needs to be "Window", and the material has to be shadeless so it does not cast or receive shadows.
it works perfectly now
great :) enjoy!
Love every bit of it! I just have a problem with floating markers until I get the phone right down on the ground, then it works. What am I doing wrong?
Guess that's a tracking issue:
the app tries to recognize floors and walls - errors may occur which aren't solved instantly. If the marker sticks and doesn't float, it shouldn't affect the tracking result (besides the location of the marker).
There are probably differences from device to device; I'll make some updates soon and a video where I go more in depth.
@@cgtinker thanks for your answer :)
I love your app. However, there is one thing that bothers me: under settings you can set the video resolution to 1080p, but the video is still only 640x480 after recording. I hope you can still fix that.
That's the only issue with the app. It's a bit of a pain tbh.
Boss af
Amazing🎉🎉❤
Hi CG Tinker, thanks for your great app. I have some trouble getting the video data with the tracking.
When exported to Blender the files are very light and contain no video (previewing on the phone shows only
a black screen with the trackers). I'm using an iPhone 7.
Is there something I'm missing in the parameters?
Also, the face track doesn't turn the camera towards the user; I guess I'm missing something there too..
Thanks for your help!
Well, I think I found what's going on: not sure the iPhone 7 has the AR tech needed for the app to work correctly...
Can we import this tracking data into Cinema 4D or After Effects without using Blender?
no.. I don't even own that software :/
I'm considering a port to Unity / Unreal / DaVinci in the future, but can't call it planned.
I have a doubt, sir. While capturing the video, I found that the final recorded video seems to be in time lapse, or you could say at a very low fps. It's like it's at 2x speed, but with a lower fps. This happens every time we export the file. Is this my problem? What could I do to rectify it? Anything to do with bit rate or something? I didn't find any options to correct it in the settings. Please help...
I believe I had the same issue where it would kinda lag; it worked better if I just stayed still for the first part and moved very slowly. Not sure if it's the same problem as yours.
Hi cg tinker, your work is wonderful!! Unfortunately I have a Samsung Note 10; my Android version is newer and blendartrack is not compatible... Congrats!
Hello! How do I move the start of the animation to another location in Blender? I can't move the camera from coordinates x=0, y=0; when playing, everything returns to the beginning :(
Create an empty, select the camera object, then shift-select the empty -> press Ctrl + P and choose "Object". Then move the parent around ;)
@@cgtinker oh.. thank you so much! I have already deployed the whole map for this camera :))
Is there any guidance on settings for better results? How about video bitrate and point density? Is there any documentation available?
Hey!
Sadly I don't have documentation, as I can only test on two devices and it's phone dependent. However, the less you record / track, the better the quality of the recorded data.
I'll add the following information soon to my website - cgtinker.com
Motion tracking:
Camera motion tracking requires at least one reference point which is always visible in frame while recording. I tend to get the best results when I only use one reference point which is always in frame. However, most of the time I use multiple reference points, as I want to record footage in a large environment. While placing reference points doesn't slow down the tracking, they don't stack, and they only improve results while they are in frame.
Video recording:
Recording a video takes up lots of resources - the higher the video resolution and bit rate settings, the slower your phone gets (the available video resolution settings can vary depending on the device). I tend to only use what I need, never more (in terms of resolution / bitrate).
Recording dot clouds:
Recording the dot cloud can be performance heavy. I usually only use 1-5% in the settings if I need some more environmental information. I use high values (50%-100%) if I want to rebuild an object and use it as a reference; in that case I keep the recordings pretty short. Personally, most of the time I don't record dot clouds, as the reference points provide enough information for me.
Hope that helps!
@@cgtinker great! Thank you
Hi cg tinker. I love the app... now I don't have to camera track manually in Blender. I just have one problem: why is the video moving so fast? Even when I record the video moving very slowly and change the fps, it's still moving very fast.
On iOS the frame rate is 60 fps, on Android usually 30 fps. In the render settings you've got to match the fps ;)
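For the folks above seeing the footage play at the wrong speed: since one recorded frame lands on one scene frame, the apparent playback speed is just the ratio between the scene's fps and the recording's fps. A quick sanity check in plain Python:

```python
def apparent_speed(scene_fps, recording_fps):
    """One recorded frame is mapped to one scene frame, so playback
    speed is simply the ratio of the two frame rates."""
    return scene_fps / recording_fps

# a 30 fps Android recording in a 60 fps Blender scene plays at 2x
assert apparent_speed(60, 30) == 2.0
# matching the scene fps to the recording restores real-time playback
assert apparent_speed(30, 30) == 1.0
```

In Blender the setting lives under Output Properties > Frame Rate (scriptable as `bpy.context.scene.render.fps`).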
@@cgtinker I'm using Android.. I turned my fps down to 17 and now it's working perfectly... Thanks😀
Hi mate, thanks for the tool (big time saver). Hope you will develop it further (for example, the possibility to choose a lens on phones with more than one?).
Thanks for the tutorial!
Great, but why is the footage so shaky? Is it due to your movement, the app, or both?
Phone footage without stabilization is quite shaky, that's true. Currently neither ARCore nor ARKit supports video stabilization.
@@cgtinker hmm, that’s unfortunate. So I guess we continue doing camera tracking the hard way then, till stabilisers are available.
Having trouble getting the tracker markers to NOT float in the air... any tricks/tips on how to refine the recording and markers?
Do they drift or just float?
If drifting occurs, that's maybe because of reflections or insufficiently textured surfaces.
Markers in the air instead of on the ground can happen because of bad plane recognition. I guess that should not affect the final tracking result, though it is an inconvenience. It's possible to reset the session and try to get better recognition with another attempt.
Not sure if I'm just using the app wrong but on my iPhone when I try to use the front facing camera for face tracking it doesn't switch
Doesn't the camera switch at all?
Currently the setup is: back-facing camera for camera motion tracking; front-facing camera for face tracking.
This is intended due to the physical location of the TrueDepth camera on iOS (it sits on the front).
Edit: an iPhone X or newer is required for face tracking on iOS - if you use an iPhone older than the iPhone X and see a visible switch button, let me know; this shouldn't happen.
Hello dude, it's a nice tool. Can you make it so the smartphone acts as a guide tracker and we can connect it to an actual camera, like a Vive for Unreal?
Sorry I'm late. The camera track works when I move, but the camera in Blender stays in place when I walk around. How do I fix it?
And one more question about the shader: I followed your steps but couldn't get the shadow at 7:50. My b3d version is 2.92...
I solved the problem by using a Color Ramp instead of Map Range, then adjusted the black and white.
@@Fancy.languages1 great job :)
Color Ramp works basically the same way. The benefit of the Map Range is that the values get normalized between 0 and 1; besides that, it does the same thing.
Is it possible to use this for just making a moving camera? Without the background footage.
I'm waiting for an answer too.
@@kocu3633 I found out the answer myself, and the answer is yes. All you need to do is put your camera into your scene, and as long as it's not seeing the Blender background (grey), your video won't be displayed. You can also just delete the video from the file if you don't want it present at all.
How to make it transparent? You're blocking the screen.
Your camera movement is shaky. How do you make it track smoothly regardless of camera movement?
Buy a camera stabilizer, or just hand-animate your camera in Blender if you want smooth camera motion.
Does anyone know what cheaper Android phones this may be compatible with? It is not compatible with the A12, and I doubt my past A20 would have been compatible either, but I may be wrong. I ruined that phone: shattered the screen and it bled all over itself internally, hence the newer A12. Can't get this app on my phone; it's unlisted. The A12 costs around 100-130 at retail stores. Cheap to me would be
What if you want to use a more advanced camera for the video recording and the Android phone only for tracking? Will it work if you fix them together firmly, next to each other?
Don't think that will work :/
Time sync / distortion will most likely screw it up.
@@cgtinker Too bad :(
Great application by the way!
March 20, 2024: can't find the app on iOS, and can't download it from the Android Play Store because it was made for an older version of Android.
Ok. This might be what I'm looking for on Android. Do you know CamTrackAR from FXhome (Hitfilm)? It records the motion tracking and then lets us export that motion to Blender, get the camera in a 3D environment, and then export that to Hitfilm for compositing.
For example, I can shoot a scene on green screen, then export the tracking to Blender to get the environment with the same camera for the background, and then export that 3D background environment, in the matching perspective tracking, to Hitfilm to use in a composite shot together with the green screen footage for keying.
Can I do this with this app?
That would be a game changer, because their app is iOS only.
I don't have a large green screen so I didn't test that case. It could work; give it a try and let me know. Seeing some results would be interesting for me! :)
Most likely it will work if you have some of the floor in the footage (and crop later on)... requires some testing!
@@cgtinker I'll try to make the Matrix bullet time, as FXhome released a tutorial for it with CamTrackAR. I have a green screen that covers the floor. I'll just try to get a 3D scene into Blender, record the green screen footage with the BlendAR app, then import it into Blender and try to mix their tutorial with yours about data importing. Let's see if I can figure it out.
Thank you.
@@mariomachado1020 sounds great, good luck! If you need help let me know (via Discord, for example).
Would you be able to add FBX export to the app so that we can export camera moves to Unreal?
@@dpredie the camera moves yes, kinda; however, the camera projection matrix cannot be packed into an FBX file - making it somewhat useless :(
Sadly I don't know of any format that exports OpenGL projection matrices across engines..
I have Android 11 but still can't download the app...
It says it's not compatible with my device...
If I want to attach the iPhone to a mirrorless camera and use the tracking motion from the iPhone but the image from the camera, do you think that's possible? I just want the iPhone movement tracking with the good image quality from the camera.
sadly it's not possible.. would be really cool though. for dslr footage traditional motion tracking seems to be required atm
@@cgtinker so, there is no way the iPhone accelerometer saves movement data that could be translated by Blender into camera motion? 😣
@@unsergioromero sadly it's not only the motion - otherwise you could just capture the motion using the phone and then use the footage from the DSLR (e.g. by mounting the phone on top of the DSLR).
however, each lens has its own distortion, the camera frustums would have to match, and the timing has to match too (so some kind of timecode communication would be required). So it's not just the movement - the mobile device would have to cover exactly the same frame as the DSLR.
One way to go about it would be to create a fake camera input stream using the DSLR image - which is probably possible when rooting the mobile device. This approach could work; however, the same issues from above could still occur...
I cannot even test that. The big issue is that every camera has different systems / firmware and there isn't much help out there for doing this. So while it's maybe possible for an individual to make it work for one specific camera / phone setup, making it work for a variety of cameras seems really tough and would most likely require a large team of devs.
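To make the frustum-matching problem concrete: the horizontal field of view follows from focal length and sensor width via the standard pinhole-camera relation, so a phone and a DSLR only cover the same frame when these ratios agree. A sketch with hypothetical values (a full-frame body at 35mm vs. a typical phone main camera - not measured specs):

```python
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm):
    """Horizontal field of view from focal length and sensor width
    (pinhole model: fov = 2 * atan(sensor_width / (2 * focal_length)))."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# Hypothetical values:
dslr_fov = horizontal_fov_deg(35.0, 36.0)   # full-frame at 35mm, ~54 degrees
phone_fov = horizontal_fov_deg(4.25, 5.6)   # phone main camera, ~67 degrees
```

Even before lens distortion and timecode sync, the two cameras would be recording noticeably different framings of the scene.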
@@cgtinker and how I can do that ?
@@unsergioromero it's tough, as mentioned - basically you would have to research your specific device and camera and develop it yourself... And I'm not 100% sure it would work. You're way better off using regular camera tracking in Blender. It's very good.
Can you please add locked focus and exposure soon? There's something annoying on the iPhone X where it keeps autofocusing every time I move the camera
Thanks for the info. Will check it soon
could someone help? I've done the compositing exactly how he's done it and for some reason the 3D model I placed in the video won't show up in the compositing viewer
on my phone it says the app is made for an older version of Android - is there a way to fix this?
your app is really useful, but Blender won't read my zip file and I don't know how to fix it - could I ask you for a hand?
did you export multiple recordings? if so, they are stored in one .zip which has to be unzipped first. the contained .zips should be importable
I've used blendartrack a few times, but it suddenly stopped importing data when I click import data. Do you know what's wrong?
Okay it started working again for some reason yay
@@Roroprin oh great :D
I can just imagine you were trying to import a .zip which doesn't contain tracking data (e.g. you exported multiple .zips in one .zip, forgot to unpack it and tried to import it directly.. happened to me in the past as well)
does blendartrack export files to Prisma 3D?
With the new iPhone 13 Cinematic camera that will be useful. I wish there was an addon that could work with the gyro data and lens metadata of the Sony A7S III / FX6
Is there a way I can do regular motion tracking using my front camera?
Currently not supported - I'm not sure ARCore / ARKit even supports that at the moment. Maybe...
Can you add more features to the camera, like footage stabilization?
Hey man, I love the tracking - it's super easy to set up and the results turn out awesome, but there are constant minor flicks when I render the video. Do you think you can give me a solution? Thanks!
Thanks! Select the camera, go into the graph editor, select all rotation axes and then apply the Euler filter:
Graph Editor > Key > Discontinuity (Euler) Filter
Working on a fix, but this should do the trick for now
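The Euler filter fixes exactly those one-frame flicks: consecutive rotation keys that jump by (nearly) a full turn get unwrapped so the curve stays continuous. A minimal sketch of the idea in plain Python - this is the general technique, not the add-on's or Blender's actual code:

```python
import math

def euler_filter(angles):
    """Unwrap a single rotation channel (radians) so consecutive keys
    never differ by more than pi - the cause of one-frame 'flicks'."""
    out = [angles[0]]
    for a in angles[1:]:
        prev = out[-1]
        # shift by full turns until within half a turn of the previous key
        while a - prev > math.pi:
            a -= 2.0 * math.pi
        while a - prev < -math.pi:
            a += 2.0 * math.pi
        out.append(a)
    return out

# A channel that wraps from just below pi to just above -pi:
keys = [3.0, 3.1, -3.1, -3.0]
smooth = euler_filter(keys)  # the wrap becomes a small step past pi
```

After filtering, the renderer interpolates a small rotation step instead of spinning the camera almost a full turn between two frames.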
@@cgtinker ah alright thank you, I'll try it out when I get home, keep up the awesome work!
I'd appreciate it if you could post a link to where to download blendartrack.
You can find download links in the video description
hi, is the blendartrack software available on mac?
it's for iOS and android
I'm having an issue with compositing. I have copied what the video's done, but it doesn't show the 'Movie Clip'
Blender 3.1?
Will upload an update tomorrow
@@cgtinker Oh ok thanks!
I'm really struggling here - the file shows up on my end, but when I search for it I can't find it
anytime I try to import the zip file it says "BadZipFile: file is not a zip file". I checked, and it's not because I imported multiple - I unzipped it and it's all the JSON and MOV files, but none of those import independently either. What should I do?
update: I got frustrated earlier because it wouldn't work, but I was just lying in bed and thought: what if it's because I used Google Drive? I emailed it to myself, downloaded it, and it worked like a charm. Fair warning for anyone having trouble: make sure not to upload it to Google Drive, it'll do something funky to the file. Cheers, so glad I went back and tried a different approach. Much love from Washington DC
When trying to import data into blender it doesn't seem to do anything for some reason, makes me sad.
Hm.. did you make sure you are importing a zip containing .json files? If you exported multiple recordings, you've got to unpack the .zip containing the .zips which hold the data to import..
Whenever I import my data, the video is perfectly tracked but it plays at 2x speed! I don't know why this happens, but if anyone knows the solution please let me know, thanks!
guess the issue is the scene's frame rate. If you are shooting with a 30Hz device (Android), make sure Blender runs at 30fps. If you are shooting on a 60Hz device (iOS), use 60fps.
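If you'd rather fix an already-imported take than change the scene frame rate, the keyframe numbers can be rescaled by the fps ratio. A sketch of the arithmetic in plain Python (not using bpy - in Blender you'd apply the same ratio to the F-curve keyframe x-coordinates):

```python
def rescale_frames(frames, source_fps, scene_fps):
    """Map keyframe numbers recorded at source_fps onto a scene running
    at scene_fps, so real-time playback speed is preserved."""
    ratio = scene_fps / source_fps
    return [round(f * ratio) for f in frames]

# Keys recorded at 60Hz, retimed for a 30fps scene:
fixed = rescale_frames([0, 60, 120], source_fps=60, scene_fps=30)  # [0, 30, 60]
```

One second of recorded motion (frames 0-60 at 60Hz) lands on frames 0-30 of a 30fps scene, so it still plays back in one second.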
After effects?
8:06 why not use a shadow catcher?
Everything is cool, but the video resolution is kind of bad
I loved it, but it's not working on my Samsung A21. I guess I need to buy a new phone.
you are the best, man - it's a great app for Blender
Can I add camera settings and shoot 4K video?
is there an Android version requirement? it says unavailable on my device
yea, Google's ARCore support is required on Android
I keep getting an error when I click import and not all the data comes in ( only the camera is imported - no points ) ... Anyone know why?
can you copy-paste the error message and post it here?
Is there an option to just use the motion of the camera not the footage
sure, just delete / unlink the movie file
@@cgtinker ty so much
Help! when I try importing footage it gives a Python traceback error
Can you post the error message?
@@cgtinker I fixed it by downloading the newer version, no need for help anymore :) great app by the way!
what are the benefits of the shader?
is it possible to import only the motion path without the video?
well, just select the camera in Blender and remove the movie clip from it
the camera quality gets so compressed. sad.
This app isn't working on my Samsung J5. Only black video.
Did you shoot in a bright environment? :P
Just kidding, hope the next update fixes this issue
@@cgtinker I'm sorry - my fault for not checking! I updated the OS and everything worked! thanks for the answer
how do I get the file from iOS to a PC?
I would e-mail it :p
How can I use the camera tracker as an object tracker?
Sir, can you please make a video on how to use DSLR video instead of mobile footage in Blender?
that's not possible with the app.
at the moment I wouldn't know how to develop something like that and would use Blender's regular motion tracking system instead. It's pretty strong and fast; however, more steps are involved
just mount your phone directly to the DSLR, works absolutely great! using the same lens is important
@@calmzar1 this actually could work :D but it's hard to get exactly the same lens.
I actually want to test that in the future. Btw, I don't know for sure, but maybe there's a "lens correction" video tool around which could use the mobile device video as the solver and distort the DSLR video to match :o
It doesn't work on my Android