Hey guys, quick update after 7 months:
The current Unreal Engine version is 5.3.2, and the workflow here doesn't support anything above 5.0.3. Updates apply not only to Unreal Engine but also to Metahuman versions, which can cause issues in older Unreal Engine versions when using updated Metahumans. I tried reaching out to the VMCtoMOP plugin developer but got no response. Maybe he missed my message, I don't know. Honestly, I love this workflow and the video: it's a good, free option, but it's slowly losing its freshness.
While figuring out how to adapt this workflow to the latest Unreal Engine versions, I found another piece of software called Dollars MONO. It offers real-time, full-body (face, fingers, body) Metahuman mocap, so it's an all-in-one solution with no extra programs needed, but you can also combine it with Live Link if you want. The catch is it's not free; there's a one-time fee of $99, but you can try out a trial version to see if it meets your expectations.
Just to clarify, I don't have any deal with the company, like an affiliate link or a percentage of sales. I got access for free, which is the reason for the 'paid promotion' label.
Check out my video:
ruclips.net/video/ayTm856fDBQ/видео.html
By the way, this was my first tutorial video, and the response was beyond my expectations. Thanks a bunch for that!
I somehow have it working on UE 5.3.2 on my laptop, but I can't get it to run on my PC's UE 5.3.2. The problem is the plugin folder: when I copy it into my project folder, it doesn't appear, but when I copy it into the Content folder it appears, yet I can't set the metahuman on Mop_receiver XO
Alright, I have it working on UE 5.3.2. The problem was I had to enable "Show Plugin Content" in the content browser settings. Thank God for that small YouTube channel that made an extra tutorial for that! And thanks to you! :)
I also changed the folder permissions to writable, you know?
was able to get this to work in 5.4 with Webcam Motion Capture, works great!
Thank you i was searching for a good live motion capture
At 06:56 you find BP_MopReceiver, but in my case there is none. I'm using Unreal Engine 5.2, maybe that's the reason??
Hi, the Mop plugin doesn't work with UE 5.2 (take a look at 1:40). You'll need to downgrade your project to a 5.0.x version; in the tutorial, I'm using 5.0.3.
It's just a possibility, but if this video gets enough views, the developer might update these plugins for versions 5.1 and 5.2, which everyone would welcome.
@@duz_gen Thank you. Yes, I noticed the UE 5.0.3 requirement, but still hoped that it might work with 5.2 :)
@@TigranAivazian Hey, if you were still wondering, it does work. You have to enable "Show Plugin Content"
@@itsjustview5759 Thank you!!!!! I was struggling to find the file even in 5.0. CHEERS!
@@thegoaks how TF did u find it??? I'm on 5.0.3!!!
Great vid, great to meet you! Hope to see many more of your videos!
Thank you, stay tuned for more exciting content! I’m glad you enjoyed the video and it was a pleasure to meet you too Gabriel!
Can't find mopr, please help me
Here's your 271st sub! Thanks for the video
Thanks for the sub, welcome! 🙌
Hey! Thank you for this video, it's been really useful for some projects. But I realized that the head is not tracked. I mean, the tracker does it, but the MetaHuman doesn't seem to receive that info. I'm not able to tell whether it's the same in your case or not. Are you having the same issue, or is it something that only happened to me?
It's been like a month now, does it work in UE 5.2 yet?
If not, is it coming soon?
UE 5.2 is not supported yet but if there is an update I'll comment here and let you know.
Hello, please make a tutorial on iPhone face tracking and your body mocap app tracking at the same time, so we can complete the whole system, body and face.
Planning to cover that, but for now you can follow the body mocap steps in this video and then set up Live Link as usual. Both should work seamlessly
The program ThreeDPoseTracker has hand tracking from what I see, but how do we get it in real time?
It might be possible to achieve this by tweaking the metahuman finger skeletons and then retargeting them. I have a basic idea of how it might work, but I'm not sure about the exact steps to get it done. If you're looking for an easy-to-use full-body mocap system using just a webcam, be sure to check out my latest video. ruclips.net/video/ayTm856fDBQ/видео.html
Excellent tutorial! Thanks a lot! Can you make a video on how you can create a multi-person online game with this?
What about facial animation? Not just facial animation on its own, but facial animation + body mocap from the video
Take a look at my latest video at ruclips.net/video/ayTm856fDBQ/видео.html. You can use Live Link for facial mocap and Dollars mocap for body animations. What you mentioned is possible
Hi, I couldn't find "mopr" after compiling and saving. Any idea?
Hi, make sure that you search 'mopr' in 'ALL' folders, not just in 'content'
@@duz_gen Hi, I did search in the "All" parent folder. I'm using UE 5.1.1 and it doesn't show plugin folders. So I tried again by copying the BP files from the plugin folders into the Content folder, and now I can drag and drop the Mop Receiver into the world. But after setting the port, in the metahuman BP, I couldn't find the Mop Receiver. It's simply showing None. Also tried restarting.
probably I will test with 5.0
@@vivekanandhan555 Sorry, but UE 5.1.1 isn't compatible with the Mop plugin; only 5.0.x versions work, as mentioned at 1:40. Convert your project to version 5.0.3
@@duz_gen Oh, you did mention it! Sorry, my bad. Thanks!
At 06:56 you find BP_MopReceiver, but in my case there is none. When I copy the BP_MopReceiver file to the Content folder I can find it, but in the last step at 7:22 I can't find it. I am using Unreal Engine 5.0.3.
Make sure that you search for 'mopr' in 'ALL' folders, not just in 'content'
@@duz_gen Yes, I searched in the All folder. I did all the same steps as you, but when it comes to adding the MopReceiver file, it doesn't appear
@@vuhung4934 After reading your comment, I started a new project and it worked as expected. In the final step, I select the metahuman and search for 'mop' in the details section. You might have missed a tiny detail. I'd be happy to help, so feel free to email me. I can send you this project as a reference
In the content browser click the settings menu and make sure "Show Plugin Content" is checked. This fixed it for me
@@dennismeelis3452 thankkkkkkkyyyyyyooooouuuuu!!!!!!!!!
Can this work with a multicam rig, exported to any 3d animation software, and what about facial capture with high accuracy?
Thanks!
Not sure about the multicam rig but in the desktop version, there's an option to save motion data in BVH format. Just click on the 'Record BVH' button in the menu. On the iPhone version you can save in both BVH and VMD formats and export them as you like. Both formats can be converted to FBX, so you can easily import these FBX files into most 3D software
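In case it helps anyone deciding whether their target software can ingest the export: BVH is a plain-text format, so it's easy to open and inspect. Here's a toy sketch (pure Python, made-up joint name and offsets, not what the app actually writes) of the basic HIERARCHY/MOTION layout a BVH file follows:

```python
def make_minimal_bvh(frames, frame_time=1 / 30):
    """Build a toy single-joint BVH file as a string.

    `frames` is a list of (x, y, z, zrot, xrot, yrot) tuples,
    one per frame, matching the six channels declared below.
    """
    header = "\n".join([
        "HIERARCHY",
        "ROOT Hips",  # hypothetical joint name for illustration
        "{",
        "  OFFSET 0.0 0.0 0.0",
        "  CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation",
        "  End Site",
        "  {",
        "    OFFSET 0.0 10.0 0.0",
        "  }",
        "}",
        "MOTION",
        f"Frames: {len(frames)}",
        f"Frame Time: {frame_time:.6f}",
    ])
    # one whitespace-separated line of channel values per frame
    body = "\n".join(" ".join(f"{v:.4f}" for v in f) for f in frames)
    return header + "\n" + body + "\n"

bvh = make_minimal_bvh([(0, 90, 0, 0, 0, 0), (0, 91, 0, 5, 0, 0)])
print(bvh.splitlines()[0])  # HIERARCHY
```

A real export will have a full joint tree and far more channels, but the overall skeleton-then-frames shape is the same, which is why so many tools can convert it to FBX.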
Would it be possible to get the tracked webcam head location from the Evaluate LiveLink node (or any other way to get the data) to track the head location 2d to world location in Unreal Engine.
What I am trying to achieve is to have a virtual avatar to always look towards the person on the webcam.
So basically, are you thinking about having the camera follow your avatar's eyes all the time, making sure they always maintain eye contact with the camera?
@@duz_gen I found a way :-)
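For anyone attempting the same thing: whichever node the head position comes from, the mapping itself is just geometry. Here's a rough pure-math sketch of one way to do it (all plane sizes and positions are made-up illustration values with Y-forward/Z-up axes assumed; these are not actual UE or Live Link API calls):

```python
import math

def screen_to_world(u, v, cam=(0.0, 0.0, 150.0), depth=100.0,
                    plane_w=80.0, plane_h=45.0):
    """Map a normalized 2D head position (u, v in 0..1, origin top-left)
    to a 3D point on a virtual plane `depth` units in front of the camera."""
    x = (u - 0.5) * plane_w   # left/right offset on the plane
    z = (0.5 - v) * plane_h   # up/down offset (v grows downward on screen)
    return (cam[0] + x, cam[1] + depth, cam[2] + z)

def look_at_yaw_pitch(eye, target):
    """Yaw/pitch in degrees that orient `eye` toward `target`
    (Y = forward, Z = up convention assumed here)."""
    dx, dy, dz = (t - e for t, e in zip(target, eye))
    yaw = math.degrees(math.atan2(dx, dy))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch

# head centered on the webcam image -> avatar looks straight ahead
yaw, pitch = look_at_yaw_pitch((0.0, 0.0, 150.0), screen_to_world(0.5, 0.5))
```

Feed the resulting yaw/pitch (or the world point itself) into whatever look-at setup your avatar's anim blueprint uses.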
Good luck on your channel!!
Thank you very much, very detailed. But I have a question: I want to swap the character for my own child model. How do I do that?
Glad you liked it! You need to learn how to use VRM models in Unreal Engine. Please search for 'How to use VRM models in Unreal Engine'
Thank you so much for the plugins but, after I dragged my mop receiver into the scene it says bad on the blueprint icon and nothing is working , can you help me out?
Still dealing with that issue or did you manage to fix it?
@@duz_gen I have the same issue :(
Great video. If I use this method to livestream, is it better to have a streaming pc and a gaming pc separately?
Yes, it would definitely be better because this method puts a load on the graphics card.
Do you know a way to make the fingers movement work?
Recently replied to an e-mail about this:
Wrist movements can be transferred quite easily. As for finger data, I believe it might be possible to achieve this by tweaking the Metahuman finger skeletons and then retargeting them. However, I should clarify that I'm not an expert on retargeting. I have a general idea of how it could work, but I don't know the exact steps to accomplish it.
That said, the general idea would be to align the finger bone structure of your motion capture data with that of the Metahuman model in Unreal Engine. Once the skeletons are aligned, you could then retarget the motion capture data to the Metahuman model. This would involve mapping each bone in your motion capture data to its corresponding bone in the Metahuman model.
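To make that bone-mapping idea concrete, here's a tiny Python sketch. The source bone names are hypothetical, and the target names only follow the usual UE mannequin-style finger naming; check your actual skeletons before relying on any of them:

```python
# Hypothetical mapping from a mocap source skeleton to
# mannequin-style finger bone names; real names depend on your rig.
FINGER_MAP = {
    "LeftIndex1": "index_01_l",
    "LeftIndex2": "index_02_l",
    "LeftIndex3": "index_03_l",
    "LeftThumb1": "thumb_01_l",
}

def retarget_rotations(source_pose, bone_map=FINGER_MAP):
    """Copy per-bone rotations onto the mapped target bones.

    `source_pose` maps source bone name -> (x, y, z) Euler degrees;
    bones with no entry in `bone_map` are simply dropped.
    """
    return {bone_map[b]: rot for b, rot in source_pose.items() if b in bone_map}

pose = {"LeftIndex1": (10.0, 0.0, 0.0), "Spine": (0.0, 5.0, 0.0)}
target = retarget_rotations(pose)  # only the mapped finger bone survives
```

A real retargeter also has to account for differing rest poses and bone orientations, which is exactly the part I'd experiment with inside Unreal's retargeting tools rather than by hand.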
Could I possibly export the animation to something like Source Filmmaker?
In the desktop version, you have the option to save motion data in the BVH format. Just look for the 'Record BVH' button in the menu. On the iPhone version, you have even more flexibility, as you can save motion data in both BVH and VMD formats and export them as needed. I googled it, and it seems that both BVH and VMD formats can be converted to the FBX format. If Source Filmmaker supports the BVH or FBX format, you should try it; I don't see any reason why it wouldn't work.
@@duz_gen Thank you. Also, could I still do animations, like putting them in SFM or Unreal, without premium?
@@Goofy_Music The desktop version is completely free, and there is no premium. However, if you are using the iPhone version, you will need a premium subscription to export motion data.
Awesome! Could I apply the same concept, but not with TDPT? I want to use it with the (VMC) Virtual Motion Capture app. Does it work with the same workflow? Thanks.
That should actually work, because Unreal doesn't directly interact with TDPT; it connects to vmctomop, and as long as you can run vmctomop you should be able to use any software with Unreal. However, keep in mind that the software you put in place of TDPT may have different settings, and you will need to configure them properly.
@@duz_gen Thanks, I have tried it and it works well 🙂
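For anyone swapping in a different VMC sender: apps in this ecosystem talk the OSC-based VMC protocol over UDP, which is what vmctomop listens for. As a rough idea of what travels on the wire, here's a simplified pure-Python encoder for one bone message (assuming the common `/VMC/Ext/Bone/Pos` address and the usual 39539 port; verify the details against your sender's docs):

```python
import struct

def _osc_str(s):
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b = s.encode() + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def vmc_bone_message(bone, pos, rot):
    """Encode a VMC-style OSC message: bone name, 3-float position,
    4-float quaternion (simplified sketch, big-endian floats per OSC)."""
    args = list(pos) + list(rot)
    return (_osc_str("/VMC/Ext/Bone/Pos")
            + _osc_str("," + "s" + "f" * len(args))  # OSC type tag string
            + _osc_str(bone)
            + b"".join(struct.pack(">f", a) for a in args))

msg = vmc_bone_message("Hips", (0.0, 1.0, 0.0), (0.0, 0.0, 0.0, 1.0))
# would be sent with e.g.:
#   socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("127.0.0.1", 39539))
```

The point is just that the transport is generic: any app emitting these messages should be interchangeable in front of vmctomop.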
Hi! Is it available for version 5.2 now?
Not yet but if there is an update I'll comment here and let you know.
Is it true that you can only do real-time mocap if you have an Xbox 360 Kinect?
No, that's not true. I'm using just a basic $20 webcam here and can still do real-time mocap. The Kinect can also be used for real-time mocap and I think it will give better results.
5.0.3 doesn't want to load the metahumans, only 5.3 does. What can I do? It says they are newer......
3D Pose Tracker to iClone, then animate any dummy avatar
Then the iClone animation to an Unreal Engine metahuman
What if I paint some green dots on my wall? Can I mock up a video on that wall in real time, like I'm on a news broadcast YouTube channel or a live weather report? Can Unreal Engine track green dots on the wall, so I don't have to paint the entire wall green?
What we're talking about here is 'real-time compositing.' A few green dots won't be enough, so you're going to need a full green screen in the background. Instead of painting your wall, you could consider buying a portable green screen. I have one myself, and it's really handy. You should check it out. (Neewer collapsible green screen)
@@duz_gen BUT WHY, OR HOW, DO THEY USE DOTS ON THE BODY AND IT WORKS? ISN'T THAT POSSIBLE IN UNREAL ENGINE?
Those dots are usually used for "tracking". Something different than what you're planning. Painting the wall or putting up a green screen behind is a completely traditional method but they will 100% work. Nowadays, there are a lot of AI tools coming out, maybe you can find what you're looking for in different tools.
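A quick illustration of why scattered dots can't replace a full green screen: chroma keying decides per pixel, so every background pixel has to be "green enough" to be removed. A toy sketch of the idea (hand-made pixel values and threshold, nothing like a production keyer):

```python
def chroma_key(pixels, threshold=60):
    """Return an alpha mask for a list of (r, g, b) pixels:
    0 where the pixel keys out as 'green screen', 255 where it is kept.
    A pixel keys out when green dominates both red and blue by `threshold`."""
    return [0 if (g - max(r, b)) > threshold else 255 for r, g, b in pixels]

frame = [(20, 200, 30),    # strong green -> removed
         (180, 170, 160)]  # skin-ish     -> kept
mask = chroma_key(frame)   # [0, 255]
```

With only a few painted dots, most of the wall's pixels fail the green test and stay in the shot, which is why the whole background needs to be green.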
Good video bro, love from China 😀 hope to see more videos from you
Thank you, there will be more videos! 👨💻
Going to try this out with XR Animator to see if I can get hand tracking working along with the IOS live link face
Oof no dice on getting hand tracking working but if anyone knows how to get it working I'd be very interested
@@Demsnacks I hadn't heard of XR Animator until now. I just checked it out, and it'd be great if there's a way to make it fully work with Unreal Engine. I'll do some research and see if it's possible
Thank you for the nice video. Is it possible to use a mirrorless camera like the Sony A6000? I currently have 2 cameras, a Lumix G7 and an A6000. I also have a Samsung Note 10+. Could I use any one of these with this technique?
Absolutely, you can use it. Just connect your camera to your PC and use OBS's virtual camera function. If you have a capture card, even better. If you plan to use your smartphone, I recommend the IVCam app
Hey, any idea how I can assign the mop receiver to a game pawn?
Hi, what's your goal with using the mop receiver in that way? Its only function is simply to keep the program running, nothing else.
This is unbelievable cool!
THERE'S NO BP_MOPRECEIVER IN \ALL\, WTH (UE 5.3.1)
ALSO DOESN'T EXIST IN 5.0.3 EITHER, WTH??????
You're a genius!!!! Thanks from Argentina
I can't connect the metahuman MOP to the Mop receiver:
it's all red. I'm so close to the last step
Can't find mop receiver?
@@duz_gen Yes. What I did was manually place the BP_MopReceiver.uasset in the Content folder. But as I drag the MopReceiver to the MOP slot of the metahuman, it reds out
@@duz_gen drive.google.com/file/d/14DqatoKwezElX4j9m2AsTsfcV9Zuux4t/view?usp=sharing I'm stuck on this part
Since typing 'mopr' won't make it appear, even if I click on the topmost folder. I did downgrade to 5.0.3 and changed my metahuman to the 5.0 version
How can we pair this with Live Link to get both? Also, a Live Link tutorial please
This workflow (TDPT + Unreal Engine) allows you to set up a Live Link connection. It's a standard Live Link setup with no differences, and it works as expected. While I don't have a tutorial video yet, I may create one soon
Won't work for me😭😭😭😭😭😭
which version of unreal are you using?
Wow! Vlad The Impaler sure knows his UE5 stuff!
Do I really look like Vlad the Impaler? 😄 I've heard he was vicious, but I'm not too familiar with his history
Can it capture detail, ie finger movement?
TDPT can capture finger movements but it can't be transferred to Unreal Engine. However, with some adjustments to the metahuman skeleton, it might be possible.
Can it work with kinect??
Not sure, I don't know much about Kinect
Sorry to disturb, but does anyone know if it is available for UE 5.2?
UE 5.1, 5.2, and 5.3 aren't compatible yet
can this be used for live tracking?
This is for mocap, I don't think tracking would be possible.
Good Video
We do notice the little guys ;)
Thank you! 🙌
New subscriber added respect to you brother ❤
Thanks and welcome!
brilliant, Thanks for sharing
What's the best motion capture with iClone to UE5…?
Sorry, I don't know much about iClone
Thx Man, big love!
Thank you very much, very detailed, I have subscribed. If possible, I hope to see you record an installment of how to capture with custom characters next time!
If you're asking about custom characters other than metahuman, yes, you can definitely do real-time mocap with them as long as the skeleton matches.
Very useful, thank you!
It would be cool if the result could be exported to Blender, or just to some 3D format like FBX
In the desktop version, you have the option to save motion data in the BVH format. Just look for the 'Record BVH' button in the menu. On the iPhone version, you have even more flexibility as you can save motion data in both BVH and VMD formats and export them as needed. I googled it, and it seems that both BVH and VMD formats can be converted to the FBX format. I hope it works!
Blender and Unreal engine?
Wow, but nothing for Mac, that's upsetting! Rokoko is expensive :(
Sorry, maybe a Mac version will be released in the future.
Yes thats much better than sitting in a room and talking to yourself😂😂😂
Where can I find 5.0.3?
Does it work for 5.2 now?
Open the Epic Games Launcher, find the Unreal Engine tab, click on the plus sign next to 'Engine Versions.' From there, you can install the version you prefer.
Currently, the 5.1 and 5.2 versions don't work, but they might become available soon.
Very good content, I learned a lot from your video, you gained another subscriber to your channel, you certainly didn't talk to yourself! Soon more subscribers will appear, anxiously waiting for the next video. Success and I will follow whenever I post new content on your channel. Hugs from São Paulo Brazil.
Wow, thank you so much for your kind words, buddy! It's encouraging to know that my work is reaching and helping people like you. Your support means a lot to me, and I'm excited to have you as part of our growing community. Thank you from the heart!
@@duz_gen You can count on it, my friend! I've already subscribed and now I'm one more follower of your channel. Topics involving this area of animation and virtual production with Unreal Engine are always interesting, it's fantastic!
@@duz_gen❤
It would be cool to combine this with the Live Link Face app on another phone: a low-cost full-body mocap solution.
This combination is possible; I've used it in a project and it worked well.
Do you have to have premium?
Can it capture lip sync?
Nope, check this: ruclips.net/video/ayTm856fDBQ/видео.html
Thank you, but how I do this for VRM model ?? any video because no one explain this .. can please
You can do it with TDPT but it's kinda complex. There's simpler software options out there ruclips.net/video/Sc2sk1hXN9c/видео.html
And for the software in this vid, I've got a detailed tutorial. It's for metahuman but it'll give you the idea. ruclips.net/video/ayTm856fDBQ/видео.html
The sound comes across a bit odd; the mic at the throat seems to pick up the airflow more than the voice from the larynx, like the sound is coming through a vent, a constant "hıhhı hıııhıı hıııhıı" kind of thing
Very good, thanks for the content
subbed good channel
I'll program my own thanks...
That's great thank you
Thnx Sir
Good luck, brother. We're also waiting for the Turkish version.
Thank you, I'll add subtitles soon
Video starts after two minutes of yapping, skip to save time
Welp… I guess I gotta move a Source engine model to Unreal
And then most likely rig it
Hi, let me know how it goes, if you've got any questions feel free to ask
Why do some of the best/most useful pieces of software always look like they come from a Notepad tutorial on how to pirate your friend's MSN account if you enter your credentials?
Great tutorial btw, shame it doesn't support 5.1/2 but only a matter of time I hope!
Your comment made me laugh 😂 but I don't think it's that bad. However, there is room for improvement in the interface. Since these programs are provided for free, we can assume they don't make much profit. Still, let's give thanks to the developers who make this workflow possible!
Works in 5.3.2 @@duz_gen
keep working....
Slide
You should've talked in Russian instead, brother
Are you suggesting that I add Russian subtitles?
You'd better learn English, vatnik!😊
Um, no one needs to use this; if you have an iPhone, there's a Live Link app made just for that lmaooo