MetaHumanSDK
MetaHumanSDK
  • Videos: 15
  • Views: 107,137
MetaHumanSDK Lip Sync Test with Amazon Web Services’ Zoey
Have you already seen Zoey, an Autonomous Virtual Human, from Amazon Web Services (AWS)? What do you think of their solution? We couldn’t resist testing our lip sync technology with it! Check out the original video here - ruclips.net/video/-umCNWZk9Yg/видео.html.
MetaHumanSDK is a multilingual lip sync plugin. Our technology supports multiple languages and delivers precise speech synchronization for Unreal Engine. We can adapt our lip sync to any case.
By the way, credits in this video also go to 11labs for providing synthetic voice.
Link to our discord: discord.com/invite/kubCAZh37D?_gl=1*3jm5w1*_ga*MTMyODEzMDY0NC4xNjc4MDMxMDAy*_ga_L1M1F0N7HH*MTY4NDI0NDI1NC4xNy4xLjE2ODQyNDYxNjYuMC4wLjA
Link ...
Views: 180

Videos

Making a Realistic MetaHuman in Unreal Engine | Realtime lip sync with MetaHumanSDK
6K views • 6 months ago
MetaHuman SDK is an innovative tool for facial character animation, powered by ML. This plugin for Unreal Engine streamlines the animation process, ensuring real-time facial expressions and lip-syncing thanks to our cloud service. Bring your projects to life with ease and high precision with MetaHuman SDK. Use MetaHuman SDK for your creative projects and enjoy a variety of functions: • Text to ...
User Showcase AI influencer
704 views • 8 months ago
Introducing the first user case in our User Showcases playlist! A huge thank you to our talented member and his team for this video, catching the trend wave #IAmAnEverybodyWantsTo with an AI influencer. What else do you think an AI influencer can do? Share your cases with us in discord discord.com/invite/jHtA39RfwB and perhaps the next video will feature your project!
Tutorial: Personal account. How to get an API token.
2.8K views • 1 year ago
The MetaHumanSDK Team has prepared personal accounts for you. We have put together a tutorial on how to log in and get tokens. Follow these simple steps: Log in to your personal account (space.metahumansdk.io/) 1. Enter your email address 2. Add the tokens that are relevant to you or generate a new one 3. Update the UE MetahumanSDK plugin to the current version (supporting the logic of working with the account) (UE 4...
Tutorial: how to install an improved module Epic pixel streaming
2.8K views • 1 year ago
We have prepared a tutorial on installing an improved Epic pixel streaming module to create a chat with a MetaHuman in the browser, working together with our service, which generates lip sync and facial expressions. Why do you need it? It’s a great starting point for creating your own chat with an avatar on the web. What is included: - customizable web UI with npm and typescript compatibility - tweakable...
MetaHuman SDK: use case
4.1K views • 1 year ago
Check out how cleanly the lip sync is set up! We added our plugin to a MetaHuman from Unreal Engine. See how realistic it turned out! The avatar handles all pauses, and the lips perfectly match the audio track. Download our plugin and try creating your own content. Share your impressions in the comments. Link to our discord: discord.com/invite/kubCAZh37D?_gl=1*3jm5w1*_ga*MTMyODEzMDY0NC4xNjc4MDM...
Tutorial: how to use Azure TTS in Metahuman SDK
4.7K views • 1 year ago
We have prepared a tutorial on how to start using Azure TTS from MetaHuman SDK in blueprints. Azure TTS offers a wide selection of languages and dialects with different accents that sound natural and make the avatar even more human-like. Test it and share the results in the comments. Link to our discord: discord.com/invite/kubCAZh37D?_gl=1*3jm5w1*_ga*MTMyODEzMDY0NC4xNjc4MDMxMDAy*_g...
Tutorial: Metahuman's Head Detachment Fix
6K views • 1 year ago
We have prepared a tutorial on how to avoid the MetaHuman head detaching from the body. Follow all the steps in the video to fix the issue. Watch and share the results in the comments. Link to our discord: discord.com/invite/kubCAZh37D?_gl=1*3jm5w1*_ga*MTMyODEzMDY0NC4xNjc4MDMxMDAy*_ga_L1M1F0N7HH*MTY4NDI0NDI1NC4xNy4xLjE2ODQyNDYxNjYuMC4wLjA. Link to our website: metahumansdk.io/ Get the...
Tutorial: Unreal Engine ChatGPT with Metahuman SDK
21K views • 1 year ago
You can now use the GPT artificial neural network, from the same family as ChatGPT, to communicate with MetaHuman SDK! ChatGPT is a tool that uses artificial intelligence to create a more natural and productive dialogue between the user and the system. In this tutorial you can see how to use ChatGPT with our plugin. Here you can find the blueprint from this tutorial: discord.com/ch...
Tutorials: how to use the plugin
45K views • 1 year ago
MetaHuman SDK is an automated AI solution for generating realistic character animation. This Unreal Engine plugin allows you to create and use lip sync animation generated by our cloud server. We have prepared a detailed tutorial describing how to use our plugin: - integrate TTS - add audio to lip sync - add audio to lip sync streaming - integrate a chat bot - combine everything into a single comb...
Tutorial: How to prepare the Demo scene
14K views • 2 years ago
MetaHuman SDK is an automated AI solution for generating realistic character animation. This Unreal Engine plugin allows you to create and use lip sync animation generated by our cloud server. In this tutorial, we describe in detail how to start using our plugin: add TTS and a voice track, and merge everything into a combo request. Test it and tell us what you got in the comments. Link ...

Comments

  • @dimvolmusic9910
    @dimvolmusic9910 1 day ago

    Hi! Can your avatars be integrated into the website or just into the games?

    • @metahumansdk
      @metahumansdk 1 day ago

      Hi! Absolutely! Our avatars can be interactive not only for games but also for websites, virtual events, and much more. You can check out the video ruclips.net/video/nq4EFu88oXM/видео.html for some great examples! For more details, feel free to reach out at support@metahumansdk.io - we'd love to chat further!

  • @GenesisDominguez-n1c
    @GenesisDominguez-n1c 13 days ago

    Hi! I am having problems with the blueprint, since nothing about MetaHuman SDK appears in the functions. Could you help me with that?

    • @metahumansdk
      @metahumansdk 12 days ago

      Hi! Api Manager has been renamed to Lipsync Api Manager in the latest version of the plugin. Please try calling the plugin functions through this name.

  • @GenesisDominguez-n1c
    @GenesisDominguez-n1c 13 days ago

    MetaHuman SDK does not show in my plugin settings

    • @metahumansdk
      @metahumansdk 12 days ago

      Hi! Look for the plugins section in the project settings and the name Lipsync there. This changed in the latest version, which may cause confusion.

  • @Bakahira
    @Bakahira 13 days ago

    When do you plan to offer MetaHuman fully supported for macOS, with the Mesh to MetaHuman plugin etc.? After all these years since release, I hope you take this part of the community with you as well.

  • @sanjay1994.
    @sanjay1994. 15 days ago

    The lip sync is only working for 5 seconds. It is not working for longer audio files.

    • @metahumansdk
      @metahumansdk 15 days ago

      Hi! The 5-second limit per generated animation applies only to the Trial plan. If you are on a different plan, please email us at support@metahumansdk.io and we will check your account.

  • @ashutoshdongare5370
    @ashutoshdongare5370 21 days ago

    Slow. There is a clear lag...

    • @metahumansdk
      @metahumansdk 21 days ago

      Thank you for your feedback! We would love to understand more about your experience. Could you please clarify what aspect of our service felt "slow" to you? Did you encounter any specific delays or issues while using the plugin? Your input helps us improve.

  • @airforce__ro
    @airforce__ro 25 days ago

    @4:30 ... he seems a little drunk to me 😂

    • @metahumansdk
      @metahumansdk 21 days ago

      It's just that this IDLE was done on a ship during a storm🚢

  • @ccartermices7085
    @ccartermices7085 1 month ago

    Hello! When will the 5.4 version of this plugin be released?

    • @metahumansdk
      @metahumansdk 1 month ago

      Hi! You can already download the 5.4 version from the marketplace www.unrealengine.com/marketplace/en-US/product/digital-avatar-service-link

  • @Jungleroo
    @Jungleroo 1 month ago

    Has anyone got this working with a Reallusion Character Creator rigged model? Did you have to separate the head? Which preset did you use?

    • @metahumansdk
      @metahumansdk 1 month ago

      Hi! Character Creator models support the ARKit blendshape set after version 3.4, so you can just select the ECustom option in the ATL Mapping Mode settings; this should help.

    • @Jungleroo
      @Jungleroo 1 month ago

      @@metahumansdk OK, and under the ECustom option, what mapping asset and bone asset do I select? If I don't select any, the anim it creates is blank.

    • @metahumansdk
      @metahumansdk 1 month ago

      If possible, please share your Unreal Engine version and send us the project log file via discord discord.com/invite/MJmAaqtdN8 or email support@metahumansdk.io. At the moment we can't reproduce the error, and animation is created correctly for custom meshes without additional mapping options.

  • @doomerooy
    @doomerooy 1 month ago

    Does it work with all languages?

    • @metahumansdk
      @metahumansdk 1 month ago

      Hi! Our plugin is language independent and generates animations from audio files. For better results we recommend clean voice recordings or generated voice without side effects or noise. Very fast speech may also cause some artifacts in the animation.

  • @Jungleroo
    @Jungleroo 1 month ago

    How are you making the head and shoulders move along with the speech too?

    • @metahumansdk
      @metahumansdk 1 month ago

      MetaHumans have two skeletons, one for the head and one for the body. You can direct animations to both skeletons at the same time and set them up so that the movements match your wishes.

  • @art3mis241
    @art3mis241 1 month ago

    If only the video quality were 1080p (or higher) and the parameters adjustment section was zoomed in, it would be even better

    • @metahumansdk
      @metahumansdk 1 month ago

      Hi! Thank you! Lately we've only been posting videos in 4K, but the zoom idea is great and we'll try to improve the experience in new videos.

  • @dsk0313
    @dsk0313 2 months ago

    I just created an account, so I should still be within the free period, but it stops at the login screen to begin with. I have tried from various wired and wireless networks but I just can't log in. Does anyone have a solution to this problem?

    • @metahumansdk
      @metahumansdk 2 months ago

      Hello @dsk0313! We are very sorry that you encountered these difficulties. Please contact us at support@metahumansdk.io and we will figure it out.

  • @Silentiumfilms007
    @Silentiumfilms007 2 months ago

    5.4 please

    • @metahumansdk
      @metahumansdk 2 months ago

      Hi! You can find a test build for 5.4 in our discord discord.com/channels/1010548957258186792/1010557901036851240/1253377959700463647

  • @Silentiumfilms007
    @Silentiumfilms007 2 months ago

    I need to know how to copy face reactions and lip syncing via an Android phone, and also how to capture body movements, thank you

    • @metahumansdk
      @metahumansdk 2 months ago

      Hi! Currently, our plugin only supports Windows and Linux operating systems.

    • @Silentiumfilms007
      @Silentiumfilms007 2 months ago

      @@metahumansdk will it work on every metahuman? And is it free?

    • @metahumansdk
      @metahumansdk 1 month ago

      Hi! You can use the plugin for free for two days after registering at space.metahumansdk.io/

  • @rezahasny9036
    @rezahasny9036 2 months ago

    Your blueprint is different from the one you have shared

    • @metahumansdk
      @metahumansdk 2 months ago

      Hi! We posted a finished example project because plugins change over time, so this video has lost relevance, and that's one of the reasons for changing the Blueprints. In the example project from discord we used specific versions of plugins and made Blueprints taking into account the use of specific versions. You can rely on the example project to understand the blueprints logic, but if you update the OpenAI plugin - we can't guarantee that the example project will work correctly.

    • @rezahasny9036
      @rezahasny9036 14 days ago

      @@metahumansdk So you should point to the right tutorial, bro

  • @리저드
    @리저드 2 months ago

    The video level in the way you show is like un3. sorry

  • @mustafayenerkol9958
    @mustafayenerkol9958 3 months ago

    Hi, I am using Unreal Engine 5.3.2. There are 2 projects. I download the same MetaHuman from Quixel and apply the same animation. In one project the head separates, in the other project it does not. I don't know what the difference is. Do you have any possible ideas?

    • @metahumansdk
      @metahumansdk 3 months ago

      Hi! MetaHumans have different skeletons for the body and the face. If you send the animation to the head skeleton directly, the head will take the default position for its coordinate system. Therefore, you need to use the face animation Blueprint and a Blend Per Bone node to specify the root bones that map the head position to the body position.

  • @TheOsirisband
    @TheOsirisband 3 months ago

    I'm stuck here, at 1:10, when importing the MetaHuman into Unreal Engine via Bridge. I already downloaded the MetaHuman preset, but when I add the MetaHuman to UE 5, nothing happens. Can someone help me with this one?

    • @metahumansdk
      @metahumansdk 3 months ago

      Hi! Once you have downloaded the MetaHuman in Quixel Bridge you need to export it to the project. After that, open the content browser in the project and find the MetaHumans folder, which contains the exported MetaHumans.

  • @TheOsirisband
    @TheOsirisband 3 months ago

    Thanks for posting the video, really inspiring. I just want to clarify: is it possible to make the MetaHuman speak in Bahasa Indonesia? I'm having some difficulties developing this kind of product and really need your help. Thanks in advance

    • @metahumansdk
      @metahumansdk 3 months ago

      Hi! Azure and Google TTS standard voices are currently supported. As far as I know, Azure should have the language id-ID, Indonesian (Indonesia). You can also use your own TTS and send the audio to the ATL (Audio To Lip-sync) node.

  • @syedhannaan2974
    @syedhannaan2974 3 months ago

    I am trying to create a virtual voice assistant that is integrated with ChatGPT and talks to me with GPT-based responses. I have created the voice assistant and it works perfectly, generating voice and text output. Could you please tell me how to utilize this response output and convert it to lip-synced voice and animation on MetaHumans? I want to send the text/voice outputs generated by my Python code and convert them to lip sync. What are the communication methods, or is there a tutorial for this?

    • @metahumansdk
      @metahumansdk 3 months ago

      You can use Talk Component > Talk Text for your task; you only need to provide the text to generate the voice and animation. ruclips.net/video/jrpAJDIhCFE/видео.html

  • @mwa8385
    @mwa8385 4 months ago

    Can we have step-by-step screenshots of it, please? It's very hard to follow the steps

    • @metahumansdk
      @metahumansdk 3 months ago

      Please visit our Discord server discord.com/invite/kubCAZh37D or ask for advice by e-mail at support@metahumansdk.io

  • @hernanmartz
    @hernanmartz 4 months ago

    Hey, what about Mac? The SDK is not available for it. ☹

    • @metahumansdk
      @metahumansdk 4 months ago

      At this time we have removed the Mac version due to a virus found for that platform in one of the third-party libraries we used for the plugin on macOS.

    • @hernanmartz
      @hernanmartz 4 months ago

      @@metahumansdk Virus on Mac? Hmm… ok. Hope it can be fixed asap. Thanks. 😃

  • @Djonsing
    @Djonsing 4 months ago

    There isn't even a whiff of realism here

    • @metahumansdk
      @metahumansdk 4 months ago

      We are sorry to hear about this. Could you please send us an email to support@metahumansdk.io about what you expect from a service like ours? It would help us to be better!

    • @Djonsing
      @Djonsing 4 months ago

      @metahumansdk I don't mean the lip synchronization to the speech (it's really at a high level), but the appearance of the characters themselves. The title of the video clearly states: "Creating a *REALISTIC* metahuman", so where is the realism here? Even a schoolchild will immediately understand that these are computer characters... Watch videos about MetaHumans by the blogger JSFILMZ or others like him; that's what I call *REALISM!*

  • @victoraddiamaadorortega7121
    @victoraddiamaadorortega7121 4 months ago

    What is the difference between audio-to-lipsync and the real-time-lipsync service that is listed in the available services in the chatbot subscription?

    • @metahumansdk
      @metahumansdk 4 months ago

      Hi! The chatbot can use streaming requests that generate animations in chunks, so the generated animation becomes available faster than with a normal ATL request.

  • @damncpp5518
    @damncpp5518 4 months ago

    I'm on UE 5.3.2 and the Play Animation node is not found. I only get Play Animation with Finished Event and Play Animation Time Range with Finished Event... They are not compatible with the Get Face node and the MetaHuman SDK combo output animation

    • @metahumansdk
      @metahumansdk 4 months ago

      Hi! If I understand correctly, you have a delay between the start of the animation and the sound. You can try using the Talk Component, which is much easier to use and includes prepared blueprints for all requests at runtime ruclips.net/video/jrpAJDIhCFE/видео.html If you need more advice please visit our discord discord.com/invite/kubCAZh37D or send an e-mail to support@metahumansdk.io

  • @bruninhohenrri
    @bruninhohenrri 5 months ago

    Why the heck did my code stop working? The GetChunk function is returning invalid, and the code was WORKING a few weeks ago. Today I opened the same build and the speech doesn't work anymore. Did you guys change anything? I'm not even using any AI services, just the standard audio-to-lipsync conversion. What the hell happened? I have a deadline on this project; imagine my surprise when I open it and the code doesn't work anymore...

    • @metahumansdk
      @metahumansdk 4 months ago

      Hi! We didn't change anything on our side; it all depends on your tariff plan. For more information please visit our discord discord.com/invite/kubCAZh37D or send an e-mail to support@metahumansdk.io

  • @kreamonz
    @kreamonz 5 months ago

    Hello! I generated a face animation and audio file (the time in the video is 5:08). When I open it, the file is only 125 frames, although the audio lasts much longer. In the sequencer, I add the audio and the generated animation; the animation is much shorter, and when stretching the track, the animation repeats from the beginning. Please tell me how to adjust the number of frames per second.

    • @kreamonz
      @kreamonz 5 months ago

      I mean, how do I edit the number of sampled keys/frames?
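For reference: an animation asset's duration is its frame count divided by its sampling rate, so a 125-frame asset only covers the whole audio track if the fps matches. A tiny illustrative Python helper for this arithmetic (the 30 fps default is an assumption for the example, not a documented plugin value):

```python
import math

def anim_seconds(num_frames: int, fps: float = 30.0) -> float:
    """Duration in seconds of an animation sampled at `fps` frames per second."""
    return num_frames / fps

def frames_needed(audio_seconds: float, fps: float = 30.0) -> int:
    """Frames required to cover an audio track of the given length."""
    return math.ceil(audio_seconds * fps)
```

For example, 125 frames at 25 fps is only 5 seconds of animation, which would explain a longer track appearing to loop in the sequencer.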

  • @kennedykimeu4618
    @kennedykimeu4618 5 months ago

    This tutorial needs to be updated.

  • @mahdibazei7020
    @mahdibazei7020 5 months ago

    Can I use this on Android?

    • @metahumansdk
      @metahumansdk 5 months ago

      Hi! We don't support mobile platforms, but you can try to rebuild our plugin with kubazip for Android. It might work, but I can't guarantee it.

  • @SaadSohail-ug9fl
    @SaadSohail-ug9fl 5 months ago

    Really good tutorial! Can you also tell me how to achieve body and head motion with facial expressions while the MetaHuman is talking? Just like the talking MetaHumans in your video

    • @metahumansdk
      @metahumansdk 5 months ago

      Hi! You can generate animation with emotions from our plugin or use additive blending to add your own emotions directly to selected blend shapes.

  • @inteligenciafutura
    @inteligenciafutura 6 months ago

    You have to pay to use it; it doesn't work

    • @metahumansdk
      @metahumansdk 5 months ago

      Hi! Can you please share more details about your issue? Perhaps this tutorial can help you ruclips.net/video/cC2MrSULg6s/видео.html

  • @inteligenciafutura
    @inteligenciafutura 6 months ago

    Spanish?

    • @metahumansdk
      @metahumansdk 5 months ago

      MetahumanSDK is language independent. We generate animation from the sound, not from visemes.

  • @Relentless_Games
    @Relentless_Games 6 months ago

    Error: "fill api token via project settings". First time using this SDK; how can I fix this?

    • @metahumansdk
      @metahumansdk 6 months ago

      Please contact us through e-mail support@metahumansdk.io and we will help you with the token.

  • @stephanedavi
    @stephanedavi 6 months ago

    Hi. Can I add an audio file to get lip sync?

    • @metahumansdk
      @metahumansdk 6 months ago

      Yes. We recommend using 16-bit PCM WAV for it.
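The reply above recommends 16-bit PCM WAV input. As a minimal sketch of how you might generate or check such a file before feeding it to the plugin (the helper names are ours, not part of the SDK; only the Python standard library is used):

```python
import math
import struct
import wave

def write_pcm16_wav(path, freq_hz=440.0, seconds=1.0, rate=16000):
    """Write a mono 16-bit PCM WAV test tone (illustrative helper, not an SDK API)."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)   # mono
        w.setsampwidth(2)   # 2 bytes per sample = 16-bit PCM
        w.setframerate(rate)
        frames = b"".join(
            struct.pack("<h", int(32767 * 0.5 * math.sin(2 * math.pi * freq_hz * i / rate)))
            for i in range(int(rate * seconds))
        )
        w.writeframes(frames)

def is_pcm16(path):
    """Return True if an existing WAV file uses 16-bit samples."""
    with wave.open(path, "rb") as w:
        return w.getsampwidth() == 2
```

If a source file fails the `is_pcm16` check, a converter such as ffmpeg can resave it as 16-bit PCM before it is sent to the lip sync nodes.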

  • @dreamyprod591
    @dreamyprod591 6 months ago

    Is there any way to integrate this on a website?

    • @metahumansdk
      @metahumansdk 6 months ago

      Sure, you can try to make a Pixel Streaming project, for example.

  • @abelboban4557
    @abelboban4557 6 months ago

    I am not able to register at the site for the token

    • @metahumansdk
      @metahumansdk 6 months ago

      Hi! Please contact us at support@metahumansdk.io. We will help you with the token while our site is under reconstruction 🛠️

  • @bruninhohenrri
    @bruninhohenrri 6 months ago

    Thanks. Finally my code worked!

  • @bruninhohenrri
    @bruninhohenrri 6 months ago

    Hello, how can I use the ATLStream animation with an Animation Blueprint? MetaHumans have a postprocessing AnimBP, so if I run the raw animation it basically messes up the body animations

    • @metahumansdk
      @metahumansdk 6 months ago

      Hi! Please try starting from the Talk Component. This is the easiest way to use the streaming options. Here is a tutorial about it ruclips.net/video/jrpAJDIhCFE/видео.html If you still have some issues please visit our discord discord.gg/MJmAaqtdN8

  • @GrandePlansDigitalSoluti-rv3gp
    @GrandePlansDigitalSoluti-rv3gp 6 months ago

    It shows as unavailable in the Unreal Engine Marketplace.

    • @metahumansdk
      @metahumansdk 6 months ago

      Hi! Please reopen the EGS launcher. That should fix the bug with unavailable content.

  • @jpegodzilla
    @jpegodzilla 6 months ago

    Absolutely stunning!

  • @bahtitashpaev7285
    @bahtitashpaev7285 6 months ago

    After putting the source from GitHub into the Plugins folder, the project is not opening. The error is: "could not be compiled". Can you help please?

    • @metahumansdk
      @metahumansdk 6 months ago

      Hi! You probably need to install VS2019 or higher, because plugins that you put directly into the project are compiled on the first project run. Please use the recommended parameters from the UE devs when setting up Visual Studio dev.epicgames.com/documentation/en-us/unreal-engine/setting-up-visual-studio-development-environment-for-cplusplus-projects-in-unreal-engine?application_version=5.2

    • @bahtitashpaev7285
      @bahtitashpaev7285 6 months ago

      @@metahumansdk Thanks for the quick response. I will try it

  • @sinaasadiyan
    @sinaasadiyan 7 months ago

    Hello, I have developed STT + ChatGPT + TTS in UE5. How can I use your plugin for lip sync?

    • @metahumansdk
      @metahumansdk 7 months ago

      Hi! Sounds great! You can send your sound as 16-bit PCM WAV to the Audio To Lipsync/Combo nodes and it should work fine for you. To generate animation the MetahumanSDK plugin just needs a sound, and you already have TTS, so it should work 😉

    • @sinaasadiyan
      @sinaasadiyan 7 months ago

      @@metahumansdk Great. As I have developed these parts separately, does it still require a token or API key to use MetaHuman SDK for the "audio to lipsync" part?

    • @metahumansdk
      @metahumansdk 6 months ago

      @sinaasadiyan Yes, you need a token

  • @Vegaart777
    @Vegaart777 7 months ago

    It's driving me crazy. I have two avatars in the scene (with face animation created from audio, and I have body animations as well); one avatar is perfect, the other one has his head inside his shoulders when I apply the face animation

    • @Vegaart777
      @Vegaart777 7 months ago

      OK, it was because I had two MetaHumans in the scene, and because I'm not too advanced I was using the same face blueprint for both. I just did them in separate projects. Not a solution, but hey 👁️

    • @metahumansdk
      @metahumansdk 7 months ago

      Hi! We use the same animation blueprints on different MetaHumans and they work fine. If possible, please send us more details about this issue on discord discord.gg/MJmAaqtdN8 or by e-mail support@metahumansdk.io and we will try to help you.

    • @bruninhohenrri
      @bruninhohenrri 6 months ago

      @@metahumansdk I have kind of the same problem. This plugin doesn't work with animation Blueprints for the face

    • @metahumansdk
      @metahumansdk 6 months ago

      @bruninhohenrri This video was created with a modified facial animation blueprint. We also share it with users who ask for it in our discord; you can download the facial animation blueprint that we used for UE 5.3 from this message discord.com/channels/1010548957258186792/1210158174133551174/1211563094808203274

  • @SK-hj1xh
    @SK-hj1xh 7 months ago

    RVC models (Retrieval-based Voice Conversion) support?

    • @metahumansdk
      @metahumansdk 7 months ago

      Hi! You can use any sound source you want if you can convert it to a 16-bit PCM WAV. Our nodes should work correctly if they receive correct sound data.

    • @SK-hj1xh
      @SK-hj1xh 7 months ago

      @@metahumansdk I mean, can I put my own pre-trained RVC voice model into MetaHuman SDK so that my RVC will work for text-to-voice in the project?

    • @metahumansdk
      @metahumansdk 6 months ago

      @SK-hj1xh We have no option for users to integrate external APIs into our plugin, but if you already have your TTS in Unreal Engine you can use it as a source of wave files and send them to MetahumanSDK. You can also try to modify the cpp files of the plugin, which you can find in the Unreal Engine plugins folder, to integrate your own TTS directly into the plugin.

  • @anveegsinha4120
    @anveegsinha4120 7 months ago

    I am getting error 401: no ATL permission

    • @metahumansdk
      @metahumansdk 7 months ago

      Hi! This depends on the tariff plan. If you are using the trial version, you are limited to generating a maximum of 5 seconds per animation. If you are on the Chatbot tariff plan, you need to use ATL Stream rather than regular ATL. Regular ATL is available on the Lite, Standard and Pro tariffs.

    • @BluethunderMUSIC
      @BluethunderMUSIC 7 months ago

      @@metahumansdk That's not really true, because I am getting the SAME error and I tried with sounds ranging from 0.5 seconds to 8 seconds. How do we fix this? It's impossible to do anything now.

    • @metahumansdk
      @metahumansdk 7 months ago

      Can you please send us logs on our discord discord.gg/MJmAaqtdN8 or to support@metahumansdk.io? We will try to help you with this issue, but we need more details about your case.