Love to see the continued Blender support. Very nice, Rokoko.
We are doing our best for you!
Can you import Rokoko data from your software directly into Daz?
Would be awesome
You can, but import into Genesis 1 first, then 3, then 8, or the animation will be messed up.
@@a.m.studios6126 sure about this?
@@RuniDjurhuus Yes, check out the DAZ forums.
@@a.m.studios6126 do you know which thread? Thank you
That's awesome! Any news on the Android app for facial mocap?
Unfortunately, the company we collaborated with (Hyprsense) got acquired by Epic Games just prior to the launch. So we've started building our own instead. But I'm afraid we won't be able to release anything until the middle of next year at the earliest :(
Great demo. Can this data be recorded then brought back into DAZ Studio for renders?
Well, some motion capture files (BVH) that you download online don't start on a T-pose or A-pose in the first frame but are already in an action pose. It would be nice to automatically generate a T-pose or A-pose to make retargeting easier.
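(Not answered in the thread, but here is a minimal Blender Python sketch of one way to add a rest-pose frame at the start of an imported BVH action yourself. It assumes the armature's rest pose is already a T- or A-pose; the object name "Armature" is just the BVH importer's default, so adjust it to your scene.)

```python
import bpy

# Sketch: keyframe the armature's rest pose on the first frame of an imported BVH
# action so retargeting can start from a T/A-pose. Assumes the rest pose IS a T/A-pose.
arm = bpy.data.objects["Armature"]  # assumed name - the BVH importer's default
bpy.context.view_layer.objects.active = arm
bpy.ops.object.mode_set(mode='POSE')

scene = bpy.context.scene
# Tip: shift the existing keyframes forward a few frames first so this rest-pose
# key does not overwrite the original first frame of the capture.
scene.frame_set(scene.frame_start)

for pbone in arm.pose.bones:
    # Clear the captured transforms so every bone returns to its rest pose...
    pbone.location = (0.0, 0.0, 0.0)
    pbone.rotation_quaternion = (1.0, 0.0, 0.0, 0.0)
    pbone.rotation_euler = (0.0, 0.0, 0.0)
    pbone.scale = (1.0, 1.0, 1.0)
    # ...and keyframe that rest pose on the current (first) frame.
    pbone.keyframe_insert(data_path="location")
    if pbone.rotation_mode == 'QUATERNION':
        pbone.keyframe_insert(data_path="rotation_quaternion")
    else:
        pbone.keyframe_insert(data_path="rotation_euler")
    pbone.keyframe_insert(data_path="scale")
```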
Very good tutorial, thank you. Just one question: What do I have to do with the bones of the FK rig if it has more bones than the mocap rig? My DAZ figure has twice as many bones.
That's OK! Just make sure to use the relevant bones - usually you want to avoid anything that is a twist bone or secondary bone. Your arms should just use the Shoulder, UpperArm, LowerArm, and Wrist bones, and you can ignore all the other ones. Same for the legs - Upper Leg, Lower Leg, Foot - and then just ignore the others. Hope that helps!
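(To make that concrete, a minimal mapping might look roughly like the sketch below. The bone names are only examples of typical mocap-rig and Daz Genesis names - check what your figure and the retargeting panel actually call them.)

```python
# Illustrative only: map just the main bones and leave twist/secondary bones out.
bone_map = {
    # mocap bone     : Daz figure bone (example names)
    "LeftShoulder"   : "lCollar",
    "LeftUpperArm"   : "lShldrBend",
    "LeftLowerArm"   : "lForearmBend",
    "LeftHand"       : "lHand",
    "LeftUpperLeg"   : "lThighBend",
    "LeftLowerLeg"   : "lShin",
    "LeftFoot"       : "lFoot",
    # ...same pattern for the right side; twist bones such as "lShldrTwist"
    # or "lForearmTwist" are simply not included in the mapping.
}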
The raw capture data is looking a lot smoother and cleaner - have you guys done any recent updates? Corridor Crew's recent "Avatar 2" video had me concerned due to the amount of jank (that may have been their artistic choice, or they missed some critical clean-up), but this gives me some hope back.
Hi Zane! Our raw capture is excellent, and you can get some really great results with the livestreaming workflow too. Corridor was trying to do a lot in that video, but the workflows for livestreaming into UE are super solid - we'll be putting out more videos showing that. And when you retarget the data you record instead of livestreaming it, the results are always reliable. Check out Loacher Films or Cinematic Captures on YouTube to see some Unreal projects that will blow your mind!
Can you show us your final connection? What would LipsLowerDown, LowerOPEN, and Lower CLOSE be on the ARKit list?
Nice, what's the final result animation of the orc character in this video rendered in, Cycles or Eevee?
Great tutorial! I would love for the next tutorial in this series to be about the Daz to Unity Bridge. When do you think we will get this?
Hi! Thanks a lot for the feedback. We shared it with our Content Team and we will take it into consideration for our next tutorials. If you are experiencing issues, you can always check out our Help & Community page, ask our users in our Facebook Group here facebook.com/groups/252832679299596, or reach out to our Support Team at support@rokoko.com.
Hi! Thanks a lot for the feedback. We shared it with our Content Team and we will take it into consideration for our next tutorials. If you are experiencing issues, you can always check out our Help & Community page, ask our users in our Facebook Group here facebook.com/groups/252832679299596, or reach out to our Support Team at support@rokoko.com.
Hello - I need help! How do I map/match the InTheFlesh facial morphs from DAZ to the equivalent Apple ARKit blendshapes in Blender? I'm using the retargeting workflow (body + face) and NOT the livestreaming workflow.
I'm following this video, but when I get to copying the keyframes from the facial animation, it doesn't work!
Thank you for any help!
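(One possible way to script that mapping yourself, sketched in Blender Python - this is not an official Rokoko workflow, and the object names, shape-key names, and name pairs below are assumptions to replace with whatever is actually in your scene.)

```python
import bpy

# Sketch: bake ARKit-named shape-key animation onto differently named Daz morphs
# by sampling every frame and keyframing the matching target shape key.
src = bpy.data.objects["RokokoFace"]      # assumed: mesh carrying the ARKit shape-key animation
dst = bpy.data.objects["Genesis8Female"]  # assumed: Daz mesh with the InTheFlesh morphs

# ARKit blendshape name -> Daz/InTheFlesh morph name (example pairs; fill in the rest)
name_map = {
    "jawOpen"        : "Jaw Open",
    "mouthSmileLeft" : "Mouth Smile L",
    "browInnerUp"    : "Brow Inner Up",
}

scene = bpy.context.scene
src_keys = src.data.shape_keys.key_blocks
dst_keys = dst.data.shape_keys.key_blocks

for frame in range(scene.frame_start, scene.frame_end + 1):
    scene.frame_set(frame)
    for arkit_name, daz_name in name_map.items():
        if arkit_name in src_keys and daz_name in dst_keys:
            # Copy the current ARKit value onto the Daz morph and keyframe it.
            dst_keys[daz_name].value = src_keys[arkit_name].value
            dst_keys[daz_name].keyframe_insert(data_path="value", frame=frame)
```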
In addition to the ARKit vendor you mentioned, there are two other paid ($$$) options for Daz characters: Facemotion 3D from Dazney.com and Face Mojo from @laylo3d, available in the Daz store. The latter has YouTube tutorials about setting up the exported Daz character in UE4 and using LiveLink for the recording. I'm guessing this would then work with Rokoko's face recording plugin.
We use Face Mojo all the time - @laylo3d is awesome! We have a whole tutorial on that workflow on the channel as well - need to check out the Dazney.com one!
@@RokokoMotion where is that tutorial? I can't find it
@@RokokoMotion Are you... Are you telling us that face mojo is better than your own face capture tool ?
Is it possible to get Rokoko mocap data from Blender back into Daz again?
That's awesome, man.
That's great! Can you please provide a workflow for how to import facial animation into Houdini?
Hi! Thanks a lot for the feedback. We shared it with our Content Team and we will take it into consideration for our next tutorials. If you are experiencing issues, you can always check out our Help & Community page, ask our users in our Facebook Group here facebook.com/groups/252832679299596, or reach out to our Support Team at support@rokoko.com.
Very good! I could make a crocodile one so I can use it in my live streams.
Something about Rokoko mocap that you guys never seem to address... or maybe you have and I've never seen it. But the thumbs are always pointing outward in an awkward, unnatural position in EVERY Rokoko mocap transfer I see. Is that something that's ever going to be fixed?
Funny thing - the add-on never showed up in my Blender preferences. I did everything: exported the model to Blender, opened Blender, and I don't even see it. Yup, I scrolled - nada! What is going on here? Can someone help me, please!
Same thing here
Go into the Blender addon folder in "%AppData%\Blender Foundation\Blender\" and you will see a folder called "BLENDER_VERSION". Rename this folder to your current Blender version and then the plugin should work.
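(If you prefer to script that fix, here is a rough Python sketch of the same rename - Windows paths, and the version number is an assumption, so use whatever Blender version you actually run.)

```python
import os

# Sketch: rename the misnamed "BLENDER_VERSION" folder so Blender finds the add-on.
appdata = os.path.expandvars(r"%AppData%\Blender Foundation\Blender")
current_version = "2.91"  # assumed - set this to your installed Blender version

src = os.path.join(appdata, "BLENDER_VERSION")
dst = os.path.join(appdata, current_version)

if os.path.isdir(src) and not os.path.exists(dst):
    os.rename(src, dst)
# If a folder for your version already exists, copy the add-on out of
# BLENDER_VERSION\scripts\addons into <your version>\scripts\addons instead.
```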
@@MrDarkblader1 I found it - it was in my 29.0 folder. I copied it into my 29.1 folder and it worked.
@@MrDarkblader1 Thanks man, that was super helpful!
Newbie here: so how would I go from having it all set up for 'livestreaming mode' to actually using the model in a livestream (like vtubing) aka on twitch and stuff? How do I get the character into OBS?