Can I use this plugin to instead capture the movement to a character and trigger an event to play something? Like when I touch a button in the real world it triggers an event inside unreal.
Can you use the Kinect in conjunction with other solutions, i.e. Leap Motion or Rokoko gloves for hands, and an iPhone with ARKit via iClone or some other option for the face? I’ve been thinking of this sort of setup, and actually the old Kinects for whatever reason capture better than the new ones. Any ideas on the above?
Hey. Yes you can; there are different ways to do it. Right now I'm working on a scene from a movie, where I'm combining Kinect motion tracking with Live Link facial capture and some simple hand animations. Probably the most tedious way is to record all 3 separately and then put all the animations together in Unreal. I unfortunately don't have Rokoko gloves, so I can't tell if you can use them at the same time as you record your motions with the Kinect, but you can do face capture and Kinect motion capture at the same time if you have some kind of head mount for your phone.
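The "record separately, combine in Unreal" approach above can be sketched in data terms: each capture produces its own set of bone/curve tracks, and combining them is just overlaying the sets by name. Everything below (track names, key format) is hypothetical and only illustrative; in Unreal itself you would do this with layered blends per bone or by baking both captures into one Level Sequence.

```python
# Hypothetical sketch: merging a separately recorded body track and face
# track into one animation, keyed by bone/curve name. Names and keyframe
# data are invented for illustration.

def merge_tracks(body_track, face_track):
    """Combine per-bone keyframe dicts; face curves win on any overlap."""
    merged = dict(body_track)   # start from the body capture
    merged.update(face_track)   # overlay the facial curves on top
    return merged

body = {"pelvis": [(0.0, (0, 0, 90))], "spine_01": [(0.0, (0, 5, 0))]}
face = {"jaw_open": [(0.0, 0.2), (0.5, 0.8)]}

combined = merge_tracks(body, face)
print(sorted(combined))  # ['jaw_open', 'pelvis', 'spine_01']
```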
I found that there is kind of workaround to fix finder problem. I was right, it don't track fingers, but you can manually change fingers from straight to fist and back. So If you want to animate fighting scene, or want your character to grab something, there is an option.
I'm a total newbie in Unreal and everything gamedev related. Your tutorial Maris Freimanis was the stepping stone that gave me confidence. During this weekend I was able to setup my kinect and record a silly dance with the help of my daughter. Now I have just applied that animation sequence to the UE4 mannequin and also to my first Metahuman into Unreal! Thanks a LOT! Thank you for your clear and simple explanation! Your video deserves so many more views! I hope you are having a great day. Take care! Greetings from México!
Hey! I'm glad it worked out for you. Kids love UE Kinect projects :D Welcome to Unreal :)
Thank you, appreciate it.
this is much better than I expected :)
Yeah, me too. Considering that I don't have any trackers on my body, it's not bad. I'm sure with some posture practice I will be able to get better results.
Damn those are some fluid animations, nice trick!
Thanks!
Can you use the Kinect in conjunction with other solutions, i.e. Leap Motion or Rokoko gloves for hands, and an iPhone with ARKit via iClone or some other option for the face? I’ve been thinking of this sort of setup, and actually the old Kinects for whatever reason capture better than the new ones. Any ideas on the above?
ruclips.net/video/fsTwVYKs6j8/видео.html
Yes, you can assign an iPhone for facial capture, the Kinect for body mocap, and, for example, Rokoko gloves for hand capture. My question is: in what way do the old Kinects work better? I have a Kinect v2 and this broke my heart.
@@urbnctrl I’ve done no head-to-head tests. A filmmaker friend in Australia just told me that the Kinects get better motion capture than many other solutions. I don’t know how this applies to the Azure or the other versions. It could also just be that he was referring to the low cost. I believe any combination of old and new can work, and the key is having a lot of sensors.
@@urbnctrl Also, to be clear, it’s the Kinect for Xbox One (so version 2), which looks like the same thing I have here. That’s the one my friend was referring to.
@@WakingUniverseTV Oh yes, I have done some more research in the meantime, and the Kinect One (so v2) indeed has way better results. Honestly, I have watched a ton of demos, and it seems that a dual-Kinect setup is at times more accurate and smooth than an actual Rokoko mocap suit, which is amazing considering the price difference.
good work man
Thank you!
Thank you for sharing your knowledge!! I've watched a lot of videos and yours was the best!! =D
Awesome! I'm glad to hear that. This week has been crazy, but I will soon try to make an updated tutorial for using this setup with MetaHumans.
Great work !
Thank you!
Great tutorial! Are you going to make an update for it, i.e. UE5 and MetaHumans?
Thank you and have a great day.
Is it like Nuitrack?
Hey, I don't know, since I have not used Nuitrack.
How do you make the movements so smooth? I have made some tests and the movement is not so smooth. Do you make any changes in the Animation Blueprint?
Hey. Make sure that after you track your moves, you use jitter removal to smooth out the animations: move the Trajectory filter to five. You can see it at 6:18 in the video. Besides that, everything else comes from the way you do your movements; with a bit of practice I got smoother movements than the first time I tried it. For these I didn't make any changes in the Animation BP. As you can see in the video, I just drop them into UE4.
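To give an intuition for what a jitter-removal / trajectory filter does, here is a minimal moving-average sketch over one joint coordinate. The window size plays a role loosely similar to the "Trajectory filter = 5" setting mentioned above; iPi's actual filter is more sophisticated, and the sample data here is invented.

```python
# A minimal sketch of jitter smoothing: a trailing moving average over a
# 1-D stream of joint positions. Larger windows smooth more but lag more.

def smooth(samples, window=5):
    """Trailing moving average over a list of joint positions."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)     # clamp at the start of the stream
        chunk = samples[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

noisy = [0.0, 1.0, 0.2, 1.1, 0.1, 1.2]  # jittery raw positions
print(smooth(noisy, window=3))           # much smaller frame-to-frame swings
```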
@@racsstudio7425 An invitation for a test? Looks interesting; I would like to take a look.
Thanks for the video. Just some unsolicited advice: cleaning up the unnecessary 2-3 minutes of you in front of the green screen would help. That didn't add much. Also the levels of your background music are a bit loud. Look into lowering them and/or sidechain compression against your voice so the music automatically ducks out when you're speaking.
You can use Kinect 4 Unreal by Opaque Media. They have stopped development, but it's stable.
Yeah, the latest supported version is 4.20. I tried to compile it a while ago for 4.26, but it will require some work. It would be great if someone with more knowledge could convert it to 4.25 or 4.26 so we can use it with MetaHumans.
I have developed a plugin for integrating Unreal and Kinect: ruclips.net/video/dOFzeLJV3-8/видео.html
Have you compared the mocap quality between the Kinect v1 and the v2? Someone mentioned that the Xbox One version works less well for mocap.
This is amazing! I use Live Link Face, any chance I can combine the two?
Yes, you can. Face and body are separate skeletal meshes, so you can record them separately. I used my Kinect setup + Live Link Face when I made the water-jump scene video from the movie Extraction.
Great videos! I've watched almost all the videos on your channel. I do virtual production, and I'm trying to build a feature where, based on real-time trackerless motion capture, some particles and effects would be triggered. I have a Kinect and some webcams; which solution would you recommend, iPi or Mocap for All?
Hey. Both of these options come with their own pros and cons. iPi is easier to set up and has been around longer, so the animations at the moment are better than with MCA. But the Kinect's problem, unless you use at least 2 Kinect devices, is rotation: if you rotate too much, the animation will go wild. Here comes the strength of MCA: you set up multiple cameras and it will not lose tracking. But once again, it's too early to tell, and I haven't done enough tests with MCA to say how useful it will be.
@@MarisFreimanis But iPi Soft isn't real-time, correct?
@@mikegoesnike You can do both.
@@MarisFreimanis Have you tried using it in Unreal 5? It should be the same, right?
How do I animate the movements of my body and my face at the same time?
Isn't there any way to use this in real time with Unreal, and use a Kinect as a motion-capture suit for live VTubing?
You can use it in real time.
What happens if you turn your back? Can I make a fighting animation?
If you have only one camera, it will break. For more complex movements, like turning etc., you will need 2 to 3 Kinect cameras; 3 would be really good.
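The multi-camera idea above can be sketched as a per-frame pick of whichever sensor reports the higher tracking confidence, so when the actor turns away from one camera the other takes over. The sensor names, pose labels, and confidence values below are invented for illustration; iPi's actual multi-sensor fusion is more sophisticated than a hard switch.

```python
# Hedged sketch of why a second Kinect helps with turns: per frame, keep
# the skeleton from the sensor that is more confident about what it sees.

def fuse(frames_front, frames_side):
    """Each frame is (confidence, skeleton). Keep the more confident one."""
    fused = []
    for front, side in zip(frames_front, frames_side):
        fused.append(front[1] if front[0] >= side[0] else side[1])
    return fused

front = [(0.9, "front_pose_0"), (0.3, "front_pose_1")]  # actor turns away
side  = [(0.4, "side_pose_0"),  (0.8, "side_pose_1")]   # side cam takes over

print(fuse(front, side))  # ['front_pose_0', 'side_pose_1']
```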
Does it work using the first-version Kinect? Thanks for sharing, good job.
Hey! If I remember correctly, iPi supports all Kinect versions.
@@MarisFreimanis Thank you bro, I'll be trying this right now. 🙏🙏🙏 Congrats
Awesome! Let me know how it goes
I've been using iPi Soft for almost 1 year with 2 Kinects v2.0 (I also have 3 PS Moves :D), but I've been a bit short on time. So I can easily record my body and put it as an animation clip into Unreal Engine? I don't prefer streaming because of jittering and other problems. If we record, we can clean up our clip perfectly :)
wow
I wonder if you could put this into iClone. You could easily tweak every part of the body, and you could use the array of hand gestures they have... it would be interesting to see if that's possible :)
If iClone turns out to be useful for any of my projects, I will look into it.
@@MarisFreimanis Yes, please take a look. You can tweak everything; no need to guess. Precision in fingers, shoulders, ALL joints. I've been playing around with the facial stuff as well. Really excellent workflow. You can download a 30-day trial. There are export options with Unreal FBX settings, plus iClone Live Link, where you control the character 100% from iClone but it appears in Unreal... lots of real juicy options, my friend ;)
They have a paid Kinect plugin for iClone
Yeah
Can I use this plugin to capture the movement to a character and also trigger an event to play something? Like when I touch a button in the real world, it triggers an event inside Unreal.
If you are good with Blueprints, then yes, you could build logic that does exactly that.
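One way to picture that logic: poll a tracked hand joint each frame and fire an event when it enters a region around the real-world "button". The joint positions and trigger region below are hypothetical; in Unreal this check would live in a Blueprint or C++ tick that reads the plugin's joint data.

```python
# Rough sketch of a real-world trigger: fire when the tracked hand joint
# comes within `radius` metres of a known button position in sensor space.

def hand_in_region(hand, center, radius=0.15):
    """True if the hand joint is within `radius` metres of the button."""
    dist_sq = sum((h - c) ** 2 for h, c in zip(hand, center))
    return dist_sq <= radius ** 2

button = (0.5, 1.0, 2.0)                        # button position (metres)
frames = [(0.0, 1.0, 2.0), (0.45, 1.05, 2.05)]  # hand position per frame

for hand in frames:
    if hand_in_region(hand, button):
        print("button pressed")  # here you'd fire the Unreal event
```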
What value do you get using a green screen since the Kinect is depth based?
That's just my studio setup. But with iPi I have noticed that good contrast between the subject and the background improves the tracking results.
Use 2 sensors
Could you give me lessons on how to do something similar, please?
Does this work with the Kinect 360?
Yes
Any idea of when we can use the kinect v2 with UE5?
With iPi Soft? Once they update the plugin to be compatible with UE5. I doubt it will happen before UE5 comes out of Early Access.
Can you use that in Unity too?
I think there are Unity Kinect plugins, but I can't tell you much about them, as I haven't used Unity in years.
Does this work with 5?
It has support for UE5.
What plugin did you use to make the Kinect work?
The iPi Soft plugin.
@@MarisFreimanis thank you.
will this work with the new Azure Kinect?
Yes
Brother, iPi Soft is paid software. Is there any free alternative?
Depends on which Kinect you have. For the Kinect Azure, yes; for the Kinect V1 and V2, as far as I know, no :/
Does it also work with the Kinect v1?
You can test the trial version, but according to the documentation it works only with the Kinect Azure and V2.
This animation doesn't work correctly with a MetaHuman, only with the base skeleton.
It works. I will try to make a MetaHuman tutorial next week. It took me some time to figure out how to do it for MetaHumans.
@@MarisFreimanis ruclips.net/video/VS2qOzIr5js/видео.html - it works like this
@@Kozlov_Production I have a fix for that. Will make a tutorial.
@@MarisFreimanis ok, thanks. Waiting for it
@@Kozlov_Production I made a tutorial on how to fix that glitch when you try to use MetaHumans with iPi Soft.
Which Kinect did you use, Xbox 360 or Xbox One?
The Xbox One one.
@@MarisFreimanis xbox 360 will be work??
@@jerryohhh Hard to tell; I don't have an Xbox 360 Kinect, so I haven't tested it. You can check the iPi Soft documentation to see if they support the Xbox 360 Kinect.
Can you do spins?
If you have two sensors, yes. With one, the hands will go wild when you are not facing the camera.
Is it true that the old Kinect is better?
Can I use the Kinect 360?
Yes