Hi everyone. I hope you've enjoyed the video. If you have any questions, feel free to ask them here :)
hey bro, how can I use this XR Animator for live streaming to other platforms like Facebook, YouTube, Twitch, etc. using OBS?
@SatishKumar-vo6cd Unfortunately you can't really do livestreaming directly from XR Animator, but you can broadcast the data to other apps that are more suitable for live streaming with the VMC protocol function. And of course you can also record your screen with OBS. But I'm afraid I can't really help you a lot with that, because I've never livestreamed before
@@PULPANIME how??
@SatishKumar-vo6cd How do I use the VMC protocol or OBS?
@@PULPANIME So with this, are we able to do live streaming?
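To make the "broadcast the data" part a bit more concrete: the VMC protocol is just OSC messages sent over UDP (the usual default port is 39539). The sketch below is not an XR Animator feature or official sample, only the python-osc library used to peek at the bone data a receiving app like VSeeFace would get; the address and argument layout follow the public VMC spec, and the IP/port are assumptions you'd match to your own setup.

```python
# Minimal sketch of a VMC protocol listener (assumes python-osc is installed).
# Not XR Animator code; it just shows that "broadcasting" means plain OSC/UDP
# packets that any compatible app can pick up.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_bone(address, name, px, py, pz, qx, qy, qz, qw):
    # /VMC/Ext/Bone/Pos carries a bone name, a position and a rotation quaternion
    print(f"{name}: pos=({px:.2f}, {py:.2f}, {pz:.2f})")

dispatcher = Dispatcher()
dispatcher.map("/VMC/Ext/Bone/Pos", on_bone)

# Listen on the common VMC default port and point the VMC sender at this address
server = BlockingOSCUDPServer(("127.0.0.1", 39539), dispatcher)
server.serve_forever()
```

For actual streaming you wouldn't write this yourself: you'd point XR Animator's VMC output at VSeeFace or DSS and capture that window with OBS. The sketch is only there to show what kind of data is being broadcast.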
You’re completely changing the game. Keep doing what you’re doing… I have a feeling you’re gonna get big soon. More people need to see this
thank you so much!
04:37 = best tip ever, thank you so much
Awesome! I was about to use this software but didn't know where to download it, and then YouTube put this in front of me. Bravo!
Thank you! I'm glad my video helped you :)
Great info! You just helped me level up my own videos for real! I didn't realize how easy it is to do more complex animations with all these techniques - So cool :D
I'm so glad to hear that!
Thank you for the video, very useful information, keep going!
this was very helpful! thx smm!
Outstanding tutorial, it's going to save me hours of pulling my hair out! Is it possible to lip-sync audio in XR Animator?
Thank you so much for your comment. I've asked the developer of XR Animator the exact same thing, and unfortunately it doesn't look like it's going to be added anytime soon. What you can do instead is broadcast the data from XR Animator to other software that has this function, like DSS or VSeeFace, using the VMC protocol. At least, that's what I'm doing for my animations.
The fact that the animation is not always good may have to do with the model. What you showed, the model going through itself, is clipping. Using a different model may solve the problem. If you make your own model you can fix it as well.
Yeah, I think you are right. But it is amazing how good the results are even without doing that. Especially if you are using the more advanced AI mocap of XR Animator!
@@PULPANIME Yes indeed. One of the best free ones I have found yet.
@Fleuritje I know, right? I am genuinely amazed by how underrated XR Animator is!
Nice information. You've earned a new subscriber.
Thanks for this -seems pretty cool and should be helpful. 😃
Hello, I want to ask a question about something. I decided to use XR Animator with the Dan Sing Sing app. I turn on explorer mode in XR Animator and connect it to Dan Sing Sing, but when I press forward, my character goes backwards in Dan Sing Sing instead, and when I move backwards, somehow my character moves forward in Dan Sing Sing. Please tell me a way to solve this problem.
Yeah, it can happen. But it's super easy to fix! Just turn on the "send camera data" function (sorry if it's not the exact name, I'm not near my computer), and if that doesn't fix it, you can play around with the software settings. It doesn't have DSS as a preset, but I think the VSeeFace preset can work pretty well. I hope that helps :)
@@PULPANIME Thank you, bro. it works.
@@RenDoStuff That's great to hear! I'm glad I was able to help :)
Very helpful, I'm trying to do a guitar solo for a music video, thank you
Sounds awesome! XR Animator has actually improved a lot recently with tracking for musical instruments!
You should check out the YouTube channel of the developer. He shows a lot of the latest features that were added recently. I think you're gonna like it :)
www.youtube.com/@AnimeThemeGadget
Thanks that was interesting!
@@zhappy Thank you :)
Thank you for making this video!
I had to redownload some stuff, XR Animator included, and now I can't seem to find Electron
It should be there. I would suggest downloading the latest version again from the original GitHub page.
Here is the link :
github.com/ButzYung/SystemAnimatorOnline/releases/tag/XR-Animator_v0.26.0
Your video is amazing. Thank you for showcasing XR animator. I got one question: How do we enable "local files" videos to be tracked and replayed by the avatar?
Thank you so much! In order for the video input to work, first you need to click on the webcam/media button and click yes, then just drag and drop the video reference (mp4 only); when you click on "local media file" it will play as a reference. Then, in order to make your avatar perform it, click on the motion capture button. I recommend choosing the "Legacy Holistic" option, and that's it :)
And if that sounds too complicated, I recommend watching my latest video about exporting VMD files. I also show how to track the motion of a video reference over there as well :)
How do you make an invisible 3D plane? And what software did you use?
Well... When I made this video I was using Adobe Dimension to create the plane and to make it invisible. It is paid software, but you can easily do the same thing in Blender as well! Just add a plane to your scene, switch the viewport to "shading", then add a new material to the plane, drag the alpha value to 0 and change the blend mode to "alpha clip". Then just export it as a glb file and import it into XR Animator like I showed in the video.
I know it's kind of hard to understand what to do if you're new to Blender, so if you are interested, feel free to message me via my email and I can send you an image sequence showing exactly how to do it :)
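And for anyone who prefers scripting over clicking through the UI, here's a minimal Blender Python sketch of the same steps (assuming a recent Blender build with the bundled glTF exporter; the material name and output path are just placeholders):

```python
# Run in Blender's Scripting tab: creates a plane with a fully transparent
# material (Alpha = 0, blend mode "Alpha Clip") and exports it as a .glb
# that can be dropped into XR Animator.
import bpy

# Add a plane to the scene
bpy.ops.mesh.primitive_plane_add(size=2)
plane = bpy.context.active_object

# New node-based material with the alpha value driven to 0
mat = bpy.data.materials.new(name="InvisiblePlane")
mat.use_nodes = True
bsdf = mat.node_tree.nodes["Principled BSDF"]
bsdf.inputs["Alpha"].default_value = 0.0
mat.blend_method = 'CLIP'  # "Alpha Clip" in the material settings
plane.data.materials.append(mat)

# Export only the plane as a GLB (placeholder path)
bpy.ops.object.select_all(action='DESELECT')
plane.select_set(True)
bpy.ops.export_scene.gltf(
    filepath="invisible_plane.glb",
    export_format='GLB',
    use_selection=True,
)
```

This is just the scripted version of the manual steps above, so the result should import into XR Animator the same way.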
Thank you so much!!!❤
Subscribed
It doesn't want to open my MMD/PMX model, and I wanted to use a YouTube video to get motion
Yeah, PMX models don't always work well in XR Animator. What I suggest is converting your model to VRM; then you'll be able to use it in XR Animator and a lot of other software.
I've made a tutorial about it, showing two methods of converting models to VRM, if you're interested :)
Here's the link :
ruclips.net/video/ieqmaNdNTCw/видео.htmlsi=okBAxf8_h86cdEnR
And if you want to capture motions from a YouTube video, I suggest using free software to record your screen (like Clipchamp) while playing the video, and then uploading the video reference to XR Animator :)
@@PULPANIME Sadly I don't like VRM models and I mostly use PMX. I got it to work, kind of, I just need to fix the hands and facials lol
@@SoraTaeminProductions Alright, I'm glad to hear it worked out :)
I have a chibi character, how well would it work with this software?? And could Mixamo animations be applied, or will it not work at all?
I never tried it, but I believe it should work. Mixamo has some chibi-like characters as well, so I think you should export your motions from there using one of those characters, instead of one of the regular Mixamo bots. Then just upload your chibi character into XR Animator and DSS, and I think you should be fine :)
@@PULPANIME Wow, thanks so much, I will definitely try it. I also looked around and you're the only one to answer this, so thanks!!
@@conservativeliberal you're welcome. I'm glad I was able to help :)
Dropping an FBX file from Mixamo doesn't do anything for me. It accepts the file, but the character doesn't move at all.
It only works with Mixamo files. Unfortunately, it does the same for me with other FBX files as well
Can you please make more videos about this program? For example, how we do motion capture using a webcam, step by step. With my thanks
Hi, thank you for commenting. I have a video that I'm planning to do about exporting VMD files from XR Animator. I can definitely expand on the webcam mocap function in that video with no problem :)
OK, thank you... I am waiting 🙂
So I'm trying to add an FBX file to XR Animator and I've only gotten it to work once by accident. Every time I try to drop an FBX file in there it just changes to some image of a dude in a dress looking at a piece of paper. Like what does that even mean?
Hahaha, I have no idea what that is. It happened to me once too, when I tried to import an FBX file that was not from Mixamo. I've messaged the developer of XR Animator about it, so I really hope it will be fixed in a future update. I would suggest you try to import a Mixamo FBX file, and if the software still does this weird stuff, then try to delete and reinstall it. I hope that helps :)
Good video. A question: I already exported the animation in BVH to Daz, but it doesn't go anywhere, it stays in the same place. How do I export it so that it moves on the stage?
There could be a lot of reasons. First of all, make sure that the Mixamo animation isn't set to "in place" before even downloading it. I can't really help you with Daz, because I've never used it before, but DSS has a feature called leg/toe IK that helps the characters' feet stay in place and eliminates the floaty look, but it also prevents the characters from moving in 3D space. If there is anything similar to that in Daz, make sure to disable it first
I can't open the mixamo website
@A1TA_chan That's weird. It's supposed to be pretty simple if you just sign in with Google. That way, it's just a couple of clicks and you're in
Hello, great as always, but do you know how to download items like guns from VRM Posing Desktop into another file on your computer?
Unfortunately, you can't really export items out of VRM Posing Desktop. You can only import 3D props into the software. This is why I recommend using VRM Parts Adder, because the attached props will transfer to a lot of other software with your model
@@PULPANIME Thank you for the advice as always, looking forward to new tutorials
Can I do face mocap from a recorded video and export it to Blender?
I'm afraid not. As far as I know, there isn't an option to use XR Animator for facial mocap, just for body movements
@@PULPANIME What about feet/toe animation staying on the ground? Also, what's the difference between Full body MediaPipeVision and Full body Legacy Holistic? Which gives better, more accurate tracking results from recorded video, and how is the finger tracking? Also, is 30fps the highest, so I can't track a 60fps video?
@@Astarrycloud-x2p Haha, that's a lot of questions, so I'll try to answer all of them... except the first one, because I didn't get exactly what you mean. If you can elaborate more on what you'd like to know about feet/toe animation, I'd be glad to help.
Now, about MediaPipeVision vs. Legacy Holistic: "MediaPipeVision" is the most accurate full-body mocap that you can use in XR Animator, but it's a bit more taxing and might not work very well on some computers. This is why they also have the "Legacy Holistic" option, which uses a more advanced AI to help with the tracking and estimating the movements. It's a bit less accurate but can work faster on some computers. I think you should try both of them and see which one works best for you! The finger tracking in both options is pretty good for a free webcam mocap application. I never tried using LeapMotion or anything like that, so I'm afraid I'm not the right person to tell you if that's better than other mocap solutions.
Now, I'm not sure what you mean by 30fps. I believe you can import videos at any frame rate into XR Animator. I prefer recording 30fps videos inside XR Animator, but you can also change the frame rate to "unlimited" before exporting your videos.
I really hope I understood your questions correctly. If not, please feel free to clarify what you meant, and I will try my best to help :)
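For anyone curious what the "MediaPipe" options actually track: this isn't XR Animator's own code, just the public MediaPipe Pose API run on a recorded video in Python (assuming the mediapipe and opencv-python packages, with a placeholder video path), to show the kind of per-frame landmarks this style of tracking is built on.

```python
# Minimal sketch of MediaPipe pose tracking on a recorded video.
# NOT XR Animator's internal code; only an illustration of the kind of
# per-frame landmark data that webcam/video mocap options rely on.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

cap = cv2.VideoCapture("reference.mp4")  # placeholder path
with mp_pose.Pose(model_complexity=1) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV delivers BGR frames
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # 33 body landmarks per frame, each with x/y/z and a visibility score
            nose = results.pose_landmarks.landmark[mp_pose.PoseLandmark.NOSE]
            print(f"nose: x={nose.x:.2f} y={nose.y:.2f} vis={nose.visibility:.2f}")
cap.release()
```

Roughly speaking, a full-body mocap option turns streams of landmarks like these into bone rotations for the avatar; the options in XR Animator differ in which models they run and how demanding they are.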
How do you export an animation?
Do you mean exporting an animation file, or exporting a video of your animation? Both are pretty easy.
To export a VMD animation file, click on the UI/options button below, and then click on "5. export motion to file".
To export a video file, just press your F9 key to start recording, and then F10 to stop recording.
I hope it helped :)
@@PULPANIME thnx
@@PULPANIME I managed to export a VMD file for the animation. Can you possibly point me in the right direction to help me retarget the animation to be used in Unreal Engine?
@spitfirefps2552 Unfortunately, I can't help you with Unreal because I don't have any experience with it. But I think it might be possible to use VMD files with a VRM avatar by using a plugin called "VRM4U".
ruclips.net/video/epcQ-uU6tfU/видео.htmlsi=sbts20xe7bFyJvNr
Also, you should look into the new demo that connects DSS to Unreal Engine
ruclips.net/video/E5TTHjD5lV0/видео.htmlsi=0v-q5doYpFZgrx6i
interesting