Special thanks to Great Moon Aroma!
I asked this before, but how do you add the avatar's face at the bottom to check previews in Unity? Thank you
@@MAMAvsGOD ruclips.net/video/2C5pJl7Y9cM/видео.html
12 hours of stream time converted to 10 minutes. Fofa's editing skillz are over 9000.
more like Patience Skill
Jesus
talented smart amazing fofa
Ah yes
This will be useful
Edit: *cries in android*
@Bird I literally just switched from my Sony phone to the iPhone 12 for this. I still use Google Chrome and Spotify; nothing really changed, except now I can use real-time face tracking models
@Bird Like I said, same here. Unplug it in the morning to go to work and plug it back in at night, like any other phone?
ah noo!! don't be sad, the Android equivalent of ARKit is ARCore (they're both built into Unity). So there probably is a way to do this with Android, you'll just have to do a bit more searching
You can always just buy a used old iPhone off eBay for this purpose. It needs to be a 6s or up to work with ARKit. 6s's run like ~$50, which is honestly cheaper than a decent webcam.
@@qumefox sadly only the new iPhones have the right camera
I eventually got my hands on one tho
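(Not from the video, just a side note for this ARKit/ARCore thread: what the iPhone is actually handing these apps is a set of ~52 named blendshape coefficients per frame. A rough Swift sketch of reading them, assuming a TrueDepth-equipped iPhone; the FaceTracker class name and the printed shapes are just for illustration:)

```swift
import ARKit

// Rough sketch: reading ARKit's face-tracking blendshape coefficients.
// Requires a TrueDepth camera (iPhone X or newer); names here are made up for illustration.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // This check is why older iPhones won't work for face tracking.
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking not supported on this device")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // blendShapes is a dictionary of ~52 named coefficients, each 0.0–1.0,
            // which tracking apps forward to the matching shape keys on your avatar.
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            let cheekPuff = face.blendShapes[.cheekPuff]?.floatValue ?? 0
            print("jawOpen: \(jawOpen), cheekPuff: \(cheekPuff)")
        }
    }
}
```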
I finally ordered my iPhone! While waiting for the phone to ship, I'm brushing up on your videos about Face ID.
PS: I don't know why, but you laughing at yourself (like in the intro) always gets me laughing!
A nice surprise to see Moon's model included! Thank you for the video~
Fofamit, you're the best! So informative, entertaining, and enjoyable.
how do you get SO MANY shape keys in the shape key menu list in Unity? i only get the 5 default expressions T-T
Fofa, you had me at, “What do I do with my life!”
I love your background Q~Q did you make this?
Link for the Apple shape key names and guide? It isn't in the description, or I can't find it.
Thank you for this tutorial! I’ve been wanting to know how to create new shapekeys for FOREVER! I’m new to using Blender, I’ve tried before and it gave me such a headache... xD
Took me 5 listens to have the Luppet and VMagicMirror part register in my brain haha
Thankfully that gave me time to set up everything else
Thanks fofamit, you are heaven sent xD I bought an iPhone for this 😂
This is incredibly impressive!
Thank you! 🥰
Just wondering about your thoughts on the new HTC Vive mouth-tracking and eye-tracking headset combo. Would that be a viable face-tracking option for VTubers, or is the iPhone the only viable way? Mainly asking cuz I don't have an iPhone and I'd rather buy a whole Vive setup than get one.
I wish I had an iPhone for just this purpose :/ uhh xD amazing video!
8:19 - omg! the amount of subdivided vertices! Doesn't VRChat / Android stream capture get slowed down by them? I imagine 3 of those avatars in a room would make the room laggy... What's the recommended amount of polys for an avatar? 50k? 80k? 250k?
Oh, it's cause X-Ray is enabled, so it's showing the verts from behind too.
Hello Fofamit! I'm super baby newish to the VTuber scene and have just finished having my model drawn and rigged to be used with VTube Studio. Currently I'm using a Logitech Brio for my capturing, but my rigging/design person suggested going with either an iOS or Android phone as they have the IR depth thing you speak of. My question, which I feel is dumb but I tend to overthink everything, is: does the iPhone XR have to have service even though I'm just using the camera feature and probably having it hooked straight up to my PC? And do you maybe have a good tutorial video that goes through setting this all up?
Hihi! Not sure if you'd know the answer/cause of this-- but when I add blendshapes for brows up/down, they work fine in VSeeFace (using VTubeStudio on iPhone to track my face). However, when I add things like Cheek Puff, Tongue out, etc. they all work fine BUT then the eyebrows don't work anymore. Do you know what I could be doing wrong or how to fix it? Please help if you can!
Do VRoid models have enough shape keys for this process?
Thank you so much for so many amazing tutorials. We don’t even know each other but u really helped me overcome my fear of streaming.
Not only that your tutorials helped me start my own journey on twitch and for that I will be forever thankful❤️
I have a question- I have a vrm model as well (made from a vrchat unity package model)
At 6:00 I can see that your avatar on the screen has a ton of different face expression sliders.
(I’m not really good with unity or blender)
Is it possible to get face tracking to work with a VRChat -> VRM model? (for some reason it feels like I have fewer face expressions than your model on the screen)
Thank you so much for your help ❤️
Does anyone know a good channel for learning how to use Blender? I am kinda brain dead currently on using character creation programs.
Blender Guru
Dikko is another great one, once you have the basics down!
I don't understand the language! would it be possible to have subtitles in Portuguese?
Hello! Would really love some help if possible, I have exported my avatar with all the apple blendshapes set with no export issues, but when I put it into waidayo it doesn't seem like it's able to use any of them, I can't find any solutions. Any input would be greatly appreciated! Thank you
late reply but I've had the same issue
the only one that's visibly working is sticking my tongue out... but it's still kinda bad tracking
i think i'm gonna try switching to iFacialMocap instead
Hey!
I love this content, very helpful!
I was wondering, though, is this a good way to go about in-video face replacement?
I'm trying to learn how to just swap out my face with a 3d model of a mask, which tracks my facial expressions into the mask itself.
I want to leave the rest of my real life body in the video. Only swap out my face with the 3d mask model.
Hello, how did you make your environment? Please make a tutorial.. what app did you use?
My main issue is that I don't know what half of these blend shapes mean. Like, what does "MouthRollUpper" mean? The guide is in Japanese, so the text is basically unintelligible after Google Translate, like "Cover the upper teeth with the upper lip and roll it up. The shape when mogumogu is included in the mouth."
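(Side note, not from the video: those names come from ARKit's ARFaceAnchor.BlendShapeLocation list. Here are my own plain-English glosses for a few of the confusing ones, as a rough Swift sketch — paraphrases, not Apple's official descriptions:)

```swift
import ARKit

// My own plain-English glosses for a few ARKit blendshape names — paraphrases,
// not Apple's official wording, so treat them as a rough guide only.
let blendShapeNotes: [ARFaceAnchor.BlendShapeLocation: String] = [
    .mouthRollUpper: "upper lip rolled inward over the upper teeth",
    .mouthRollLower: "lower lip rolled inward over the lower teeth",
    .mouthPucker: "lips pressed together and pushed forward, like a kiss",
    .mouthFunnel: "lips open and pushed forward into an 'O' / funnel shape",
    .cheekPuff: "both cheeks puffed out with air",
    .jawOpen: "jaw dropped open, independent of what the lips are doing"
]
```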
Does anyone happen to know what she's using for hand tracking? Is this also arkit?
I can't seem to change the face no matter what :/ idk what I'm doing wrong, I can't get the left eye to move by itself
hey fofa! i'm trying VMagicMirror with iFacialMocap and it's not working? all the blendshapes and keys are named correctly. and with waidayo it works fine. i'm not sure what's going on. any idea?
Did you link vmagicmirror with ifacialmocap?
@@Fofamit yes and it says
"sending to computer" when i click connect so the connection is there i think...
Thanks so much for the tutorial, I’ll try the hard way.💙
Hello. Could you make a guide on how to start streaming and so on after setting up the avatar? I want to set everything up just like yours.
Fofamit? Where's the Codemiko tutorial for Unity part 2? I'm so sad
yo have some patience
Hey Fofa, thanks for the tutorials!
Are you doing commissions? How much can it possibly cost to create those blendshapes?
The best way to keep up to date would be to check my Twitter page: twitter.com/fofamit
Can you do it without a computer
my blendshape proxy script does not have an avatar set to it :(
the avatar with the blue hair is clearly 3d but kinda looks 2d. HOW?
I notice that your "real-time face tracking" is not in real time but the audio is slightly off sync. What causes the delay?
This one I actually know the answer to. It's because processing what the camera sees your face doing takes a little bit of time. This slight lag is very easily overcome, though, either in post with a slight audio offset, or, if streaming/recording with OBS, with a very slight delay on the mic audio source, which OBS can do natively.
Please please please more in depth I have no idea where to get the arkit shapes or anything 😭
Does anyone know if any of these apps work with VRoid models??
What about us Android users? Can you make a video on Android? It would be greatly appreciated amongst us Android users. I'm just starting out vtubing but haven't posted a video yet, and I'm looking for an app where I can use an FBX model because I have one already made. (I love your work, keep it up Fofamit, you're the best vtubing guide, that's why I'm subscribed)
I can do a video addressing Android face tracking.
@@Fofamit That would be helpful, thank you. You're the one who inspired me to become a vtuber, and your guides are so helpful, that's why I'm subscribed.
@@Fofamit that would be very helpful, as I kinda hate Apple with a burning passion and was wondering if ARCore is a feasible replacement for ARKit.
What is the oldest iPhone model that will give you the better streaming stuff?
iPhone X and up
You are really smart and funny
When the cooler face tracking might take longer than the creation of the whole avatar. 😥
Thank you for this awesome tutorial, I definitely will get back into it because I just want this tracking to work. Problem is, I will need to manually create a lot of them, since my model is based on parts from models creators made mainly for MMD. So it's just the basics, and I need to add the rest, like... 42 blendshape keys. 🤣
UR MODEL IT LOOKS SO REAL STOP! btw I’m on my old account LOL
Hey there,
I can create 52 blendshapes on a custom character, like the ones on the metahuman. Hit me up if you need any.
Is there a link to ARKit or the reference sheet?
are you doing something with your voice? not sure if you are just getting more comf on the mic or practicing subtle things like enunciation. but your vocal quality seems to be a bit more consistent, so if you are doing something, it shows :D otherwise you seem to be getting more comf on the mic
♥️
Thanks in advance :)
I just saw Great Moon Aroma
woW niceee
Is ARKit on iPhone?
*screams in iphone 8*
Hard to follow T^T
Your mouth is TINY