How to make AI NPC MetaHumans in Unreal Engine
- Published: Sep 30, 2024
- NOTE: OVRLipsync does not need to be manually added anymore as lipsync is now incorporated into our latest version of the SDK.
Combining high-fidelity character models with Inworld AI is simple thanks to our integration with Unreal Engine and MetaHumans. Using this Advanced Guide, learn how to integrate Inworld AI into your Unreal NPCs.
Join Inworld Dev Rel consultant, Matt Kiernander, as he walks you through all of the steps to get you started today!
Learn how to:
Download the Unreal SDK
Navigate Unreal's MetaHuman Creator
Hook your MetaHuman avatar up to an Inworld AI brain
And more!
🕹 Learn about AI-powered gameplay: bit.ly/49mSJ3V
🛠 Build with dev docs: bit.ly/3VFNeKF
🎮 Join our Discord - talk AI and gaming: / discord
📩 Contact us to get started: bit.ly/3yEZQbO
#gaming #gamedev #gamedevelopment #games #indiegames #artificialintelligence #generativeart #aicharacters #chatbot #chatbotdevelopment #unrealengine #metahumans #unrealmetahumans #oculus #metaverse #avatar #virtualreality #madeinunreal #virtualcharacter #madeinunity #gpt3 #naturallanguageprocessing #chatgpt #globalgamejam #npcsarebecomingsmart #npcs #AINPC #NPCAI
Man, we really are in the future. This is so freaking cool.
Did you work at Computer Lounge in NZ back around 2016? I swear you sold me my PC
*Just FYI...*
Don't be fooled! Matthew Kiernander doesn't actually exist, he's just an AI-generated presenter for the purposes of this generated content... 😅😋
Edit:
That background/set isn't even real... it's all AI generated!
Matt will be really sad to know he doesn't exist. lol
@@inworldai Help, I'm stuck in Baldur's Gate 3 and just rolled a 1 on Charisma.
@@MattKander Oh... gee. Matt. Really sorry we left you in there. We'll try to get you out as soon as we have a second, we promise. Just stay put and don't anger any evil elves.
I really hope you guys have the resources to take this big time. This is awesome.
For anyone doing this tutorial: follow the documentation they provide on their site. Some functionality, like adding components through Inworld Player, seems not to add all the components. Also, check that the character you made is in your scene in the dashboard. Small details like these can make this awesome functionality a headache.
This is an older tutorial and somewhat outdated given the pace of our development. Thanks for adding these comments. We're working on updating our tutorials. Our Discord is a great place to ask questions about new features like Player Profiles until we have those up.
Mind-blowing plugin with lots of features. But can we use our own custom chatbot/knowledge API here?
Yes, I need answers to this too.
That's amazing! Do you have plans to support other languages like Chinese? I would definitely pay for it!
It's not in our near-term product road map but we hope to provide that functionality in the future. There needs to be good LLM models available, text-to-speech with the proper pronunciation trained and more...
So the guy in the video is an AI. How do you make an AI that looks that good into a live chatbot using GPT?
So many bugs in the plugins, however. I think you guys need to stay more on top of it, as you have a better product than the other guys.
We always appreciate bug reports from users and try to tackle them all quickly. If you want to join our Discord, that's where we address bugs users are experiencing!
I can't speak with it. How can I do what you do at the end? Do I have to configure the mic or download some other plugin for my microphone?
If you're still having technical issues, I recommend joining our Discord as we answer all technical questions there.
Hi, everything seems to be working in the map created from the Inworld Studio widget's "Create Dialog Map": the chat starts and the communication is fluid.
But in the main scene where the MetaHuman is, even though the connection is established, there is no dialogue or any communication between the voice input and the MetaHuman.
Another issue I found: if I install the plugin from the Epic Games Marketplace, after activating it in the project, the option to set the MetaHuman as an Inworld MetaHuman is not available. It worked for me only if I downloaded the plugin and manually copied its folder into the project directory, as the video shows.
I have posted these issues in the community forum.
I also failed to get the voice; I don't know what's wrong.
I couldn't start a chat even with Create Dialog Map, and then it's irreversible.
That's a question you'll want to ask in our Discord forum. I encourage you to join. Just go to our website and click on the link.
Nice tutorial. It does work, but you need to create a scene and build the project after installing the plugins.
*It's Coming!*
So exciting and enabling. Thank you Guys! More speed and Effectiveness to You :)
Hi, in the SDK I downloaded there are only 'InworldAI', 'InworldMetahuman', and 'InworldReadyPlayerMe', nothing like your directory structure. What do I do? Thank you.
We're best set up to provide help debugging via our Discord forum. If you go to our website, you can find the link to join.
Hi, thanks for the video. Does this work in Italian too? And can the NPC move in three-dimensional space if the player says "Open the door", etc.?
We don't currently support languages other than English. However, you can create goals and actions for characters so they can take an action when you say something to trigger it.
Does it need to be a C++ project, or does it also work with a Blueprint project? Sorry, noob question.
Our SDK allows you to work with Unreal blueprints. If you have more advanced questions about how that works, the best place to ask them is in our Discord.
How fast things are changing! Just imagine that in games you'll now be able to chat with any NPC, or that storylines will change variably on the fly, the same way ChatGPT can easily pick up and develop any topic. It's simply fantastic. In the end, the AI will look at us, decide it wants nothing to do with our wars and conflicts, send humanity packing, and history will repeat itself like Atlantis :)
My character is not talking and I was following everything here.
If you're still having difficulties, I recommend joining Discord as we provide support through that channel. We're happy to answer your questions there.
Can you train characters with your own information?
Currently you can add a Wikipedia link to the character if one exists, or you can create your own fictional character and feed it its own knowledge.
How can i type to the character without using a mic?
Yes! Our SDK allows both typing and voice chat. However, developers creating experiences with our tech might choose to make only one or the other available to their users.
It doesn't work in Spanish; the tool is useless.
We don't currently support Spanish, but you can use the Azure Text-to-Speech SDK to make the character speak the language you want. It also helps to say that they are "Spanish speakers" or that they "know how to speak Spanish" in their Core Description. On our Discord, many people have built alternatives to what we officially support :)
Can it speak Chinese?
This is incredible but I would like to know if this is completely free to use and if so, will it remain that way?
I don't think it's completely free; I tried it for free, though.
It's free to talk to characters in our studio and via our Arcade. API minutes, like the Unreal usage demonstrated in this video, are not free; they require a monthly plan and carry a per-usage charge. However, our rates for that are constantly going down.
And does it work in Spanish?
For now we don't have Spanish available as a language, but many people use the Azure Text-to-Speech SDK to do instant translation. Check our Discord, where you can see how other devs have found a temporary workaround.
Amazing! What about integrating other languages? I'm willing to pay
It's not in our near-term product road map but we hope to provide that functionality in the future. There needs to be good LLM models available, text-to-speech with the proper pronunciation trained and more...
I’ve tried to hand them an enterprise license on a silver platter. They don’t seem interested.
Nice video 😊. Do you support the French language? If not, when can we expect this feature, please?
We don't currently support French, but we are expanding our supported languages; French isn't on the list of the first languages we're targeting. However, some games have used DeepL to translate into French and other languages for text-based conversations.
@@inworldai Is it possible to configure the personality in English and ask it to respond only in French?
With my ElevenLabs French voice, would it be possible to have my avatar give a French response? It would sound good if I could get a French response.
@@omega-pi So with TTS in different languages there are a few factors: 1) the LLM has to be able to respond via text in that language, which can often be done today, though not perfectly; 2) the text-to-speech model has to get the right pronunciation and emphasis of all the words; 3) the voice has to have the right accent; and 4) the lipsync has to align with the foreign-language pronunciation. You can hack our system to get imperfect responses in languages we don't support. If ElevenLabs does 2 and 3, then it should work; if they do 3 but not 2, then it won't. You might see issues with 4, but they could be minor. I'd recommend joining our Discord and asking questions there for more comprehensive answers.
I downloaded the folder in the link, but it doesn't resemble what you're showing, so I'm stuck at stage 1.
We're best set up to provide help debugging these types of things via our Discord forum. If you go to our website, you can find the link to join.
Well... I followed all the Inworld AI tutorials, but none of them explain how to set up the OVR lip sync playback actor, and it didn't come with the Inworld plugin.
If you're still having difficulties, I recommend joining Discord as we provide support through that channel. We're happy to answer your questions there. I believe it is now incorporated in our SDK but they'll be able to help you there best.
Thank you very much for this great tutorial.
I have a question.
Is it possible to place several MetaHumans, each with a different Inworld AI character assigned, in the same project, or can only one MetaHuman be placed per project?
Each workspace allows you to have 100 characters, so you could have 100 MetaHumans in your project. However, if you need more than that, you can message us and ask.
Hmm... the option to right-click my MetaHuman and set it as "Inworld MetaHuman" doesn't show up; there is only an "Inworld Character" option. Any idea how to fix that?
Did you solve it?
Thanks for the video. Do characters from CC4 work, or just MetaHumans from Epic?
You can use any avatar. We're avatar agnostic.
How are you talking to it?
Remember that you need to give mic permission! That said, if you continue to have issues or don't know how, reach out to us at support@inworld.ai or through our Discord.
Hey, I tried for a while. Can anyone tell me why the 1.1 plugins say they are missing or built with a different engine version on the 5.1.1 release? When I try to rebuild them, they always fail.
We're best set up to provide help debugging via our Discord forum. If you go to our website, you can find the link to join.
May I ask if Inworld can be used in an Unreal VR project?
Yes. Choosing between C++ and Blueprint project types isn't an option there, but you can easily convert your project into a C++ project and follow the instructions in our documentation to implement it for a VR project in Unreal.
Hey there, I'm curious - are you able to hook up this character to a live text feed?
Our characters generate their dialogue live. If you have your own live text feed that you want to hook a character up to, our solution won't work.
Does it work on Linux?
We have a web SDK that should work with a wide variety of applications.
Amazing video, thank you! Can we package this for WebGL?
That's a question you'll want to ask in our Discord forum. I encourage you to join. Just go to our website and click on the link.
Can I make a MetaHuman from a JPEG photo and use it as an AI character in a game?
You can customize a MetaHuman, but I don't think you can do that by uploading a JPEG; I recommend you look at Unreal Engine's documentation on that. And yes, you can use it in games you create or in mods to existing games. Bloc the Worker just published a modding guide for Inworld.
@@inworldai thanks 😊
can the npc brain connect to chatgpt?
We have our own LLMs that we switch between based on conversational context and latency to get you the fastest and best responses.
Why don't I have the Inworld lip sync folder?
Same here...
@@nielslesliepringle3143 We're best set up to provide help debugging via our Discord forum. If you go to our website, you can find the link to join.
I'm so sorry that was the case. We're best set up to provide help debugging via our Discord forum. If you go to our website, you can find the link to join.
I tried for four hours to get this to work. Nothing.
Same
Same, lol. My project was made in Blueprint. I can't for the life of me figure out how to switch it to C++ or how to get this to work in mine.
@@naytbreeze I'm so sorry that was the case. We're best set up to provide help debugging via our Discord forum. If you go to our website, you can find the link to join.
I'm so sorry that was the case. We're best set up to provide help debugging via our Discord forum. If you go to our website, you can find the link to join.
@@dsee I'm so sorry that was the case. We're best set up to provide help debugging via our Discord forum. If you go to our website, you can find the link to join.
Hello! I tried to integrate an Inworld character with a MetaHuman in UE5. I followed the tutorial step by step, but in the end I get this error after trying to communicate with the character:
Unhandled Exception: EXCEPTION_ACCESS_VIOLATION reading address 0x0000000000000000
UnrealEditor_InworldAIIntegration!UInworldCharacterPlaybackAudio::Visit() [D:\Rama_WORK\UE\InWorld_AI\Plugins\unreal-plugin-1.2.3\InworldAI\Source\InworldAIIntegration\Private\InworldCharacterPlaybackAudio.cpp:80]
UnrealEditor_InworldAIIntegration!UInworldCharacterComponent::TickComponent() [D:\Rama_WORK\UE\InWorld_AI\Plugins\unreal-plugin-1.2.3\InworldAI\Source\InworldAIIntegration\Private\InworldCharacterComponent.cpp:116]
UnrealEditor_Engine
UnrealEditor_Engine
UnrealEditor_Engine
UnrealEditor_Engine
UnrealEditor_Core
UnrealEditor_Core
UnrealEditor_Engine
UnrealEditor_Engine
UnrealEditor_Engine
UnrealEditor_Engine
UnrealEditor_UnrealEd
UnrealEditor_UnrealEd
UnrealEditor
UnrealEditor
UnrealEditor
UnrealEditor
UnrealEditor
UnrealEditor
kernel32
ntdll
What might be the problem, and how can I fix it?
We're better set up on Discord for dealing with technical questions because we can create a ticket and ensure we respond to it in a timely manner. I hope you managed to figure this out, but if not, feel free to pop in there and ask for help!