Hey man, amazing video! 🙏🏼 Another question: when my prompt is something like "This man is walking in this corridor", from time to time he doesn't walk naturally - very slow, or very fast, or cartoonish… the movements aren't natural! How would you recommend fixing it?
@ Yes, both Frames and Elements - sometimes it takes 20+ tries. It used to work better a couple of days ago. And I also face some difficulties with another character's walk, like a 10-year-old girl: she walked quite okay in my previous tries, but now in other scenes she walks in a different way - fashion walking. I tried negative prompts, tried different prompts, but something is wrong… I tried to contact them, but unfortunately the Kling AI team does not answer
Great vid!! And a good idea on how to get different views of your characters. Do you find the upscalers tend to change faces? I've been using FaceFusion to reapply the face, but I've been looking at a few upscale options. I've been using Replicate as a cost-effective option, but I'm looking at Topaz, HitPaw, and Magnific. HitPaw seems like a good balance. Good to get your thoughts. Perhaps a vid on this would be useful.
Yeah, upscalers do change faces, although they sometimes have an option to preserve the face as much as possible. In Magnific I use "soft portrait" mode with resemblance turned all the way up to keep the upscaled image as similar to the original as possible
@@taoprompts Hey, just trying your technique for getting the different angles. With a 5-sec clip on Professional, the character rotated slightly but didn't quite turn enough to a side profile (which is what I'm after), so I tried 'fast rotating lens' but no movement happened. I took a still of the last frame and tried 'rotating lens' from there, but it just seems to be stuck and I don't get any rotation (tried a couple of times, same result). Is it a case of just using more attempts, do you think, or maybe trying a 10-sec clip? Any tips would be appreciated before I burn more credits. Thanks. Update: got some success - I added "the woman rotates around" at the beginning (like you did with the planet).
I am using Kling AI on your advice. When I produce a video from a picture, sometimes the videos are very poor quality. I wonder if I did not enter the correct prompt, or what the reason is. Can you help? The picture I uploaded is a high-quality picture, but the video is very poor quality
@lionelson7098 I'll be the Michael Bay of this if I try it. And yes, facing and moving cams is a simple task - it's cool to try different things, and experimenting is good. Haaa, awesomeness
The Kling AI site is very good, but the quality of the videos decreases when creating them; the video quality on the Hailuo AI site is better in my opinion. Is it possible to produce high-quality videos when creating videos on the Kling AI site?
That's the best way to do it right now, although in Kling version 1.5 there is an option to extend the video that they haven't added to version 1.6 yet
Kling is amazing, just a tad bit expensive. They need to cut credit usage in half at least, otherwise the cost to make minutes-long videos is gonna be too much...
I recently found out that CapCut was owned by ByteDance, the same company that owns TikTok. I had a subscription for a month and canceled it, as soon as I found out that ByteDance owned it. I don’t feel like sharing my information with the Chinese government. 🕊️🫶
I wish you could have multiple trials for AI video and then only pay for the ones you use. The way it is, it costs way too many credits for 5 tries, and if you're lucky you'll get one usable result after that.
Thank you so much for this. I purchased Kling a month ago because of your videos. I was going to just use the free trial, but you gave me more insight on "how" to use the prompts and you also were honest about the multiple retakes. I'm glad Kling has given us more credits to make these mistakes!! Keep it up sir!!! I'm trying to get some of my videos up with Kling and I'm loving it.
Tao, your work is so useful, I have been seriously following the advances in generative AI for almost a year, and each of your videos is a pleasure to watch, thank you for your work, I wish you the alignment of the planets for the algorithms !
Thank you! I've been having a lot of fun using all these AI tools; they've come a long way in a year
Suggestions from a voice professional for recording your text:
1. Try to embody (physically experience) the behaviour you are trying to voice. For the female astronaut in distress, injured, tired, exhausted - think, for example, what pose is she in? (Neck is bent to the side - do it; she is, probably, breathing heavily, gasping - do it, maybe do a few jumping jacks beforehand to get a bit winded; if she is raising her hand - raise yours; if she is talking while walking - walk quietly in place, or at least bounce a little; etc.). AUDIBLE BREATH is important.
2. Study speech patterns. Try to parody, to copy performance you like - where does the voice go up in PITCH, where does it go down? What happens with the VOLUME? How could you change TEMPO to make it less monotonous? All these components make our performance more engaging and truthful.
3. Use your FACIAL EXPRESSIONS. If the character is angry - make a "growly" face, bare your front teeth a little. If a character is sleepy, in bed, happy - literally lean to the side, maybe even record lying down, and SMILE. Close your eyes, if your character has their eyes shut. Facial muscles affect our resonance, and emotional perception - this is how you can tell if somebody is smiling on the phone, or not, even without seeing them.
Have fun 🎤😀🎹🤗
Thanks for the tips Emily! I'm a novice when it comes to voices
@taoprompts You got it, mate! Looking forward to your next video :)
the GOAT of AI tutorials, thank you for all your work 🙌🙌
Keep up the good work, man. You are at the forefront of this AI cinematic progress.
My best teacher for AI Videos. ❤
Exceptional video. Very well explained and a lot of info packed into a short time frame. Bravo. Keep on!!
That is gold you just provided. Thank you Tao!
Thank you for another great video! I'm 65 and involved with AI. I studied video production years ago and you have motivated me to dust off that knowledge and put it to use. Wishing a lot of health and happiness for you!
AI videos are getting better super quickly; it's awesome to hear you're getting back into video production. Thanks for the support 👍
Underrated channel for AI. Thank you so much for your hard work.
You done went and done it again. Awesome video. 🔥
Thanks for the support Mark! This was a really fun animation to make
Really impressive tutorial! Kling 1.6's capabilities for maintaining consistent characters and creating cinematic scenes are a game-changer. Your step-by-step explanation made it super clear and easy to follow, even for beginners. I'm excited to experiment with these techniques in my own projects. Thanks for sharing your expertise!
Very good use of the music to set the cuts Tao. At 3:03 a braam sfx would evoke goosebumps for me when the alien leaned in. Sooo good
Thanks! I've been trying out different sound effects, it makes everything more intense
@@taoprompts Yup makes a huge difference (when done right).
Hi Tao, Your work is exceptional and I appreciate you for sharing!
This video is so good! It's also very refreshing to know that alien technology hasn't been able to improve on the simple screw top jar for the purpose of brain storage.
7:52 Pro tip: after taking screenshots, upres them, then make a collage of them in one pic and upload that for Kling Elements - it knows it's all part of one character
So, would you do a bunch of different angles/emotions of the same character's face in one collage? (Like in a grid format)
That’s a great idea! Using a collage with different angles and emotions in a grid format makes a lot of sense, especially for character reference. I’d also take it a step further and use tools like Facefusion or LivePortrait to enhance the facial expressions and make them even more dynamic.
Thanks for the suggestion! I didn't think of that
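The collage tip in this thread (screenshot, upres, paste into one grid image, upload as a single Elements reference) can be sketched in a few lines. This is not from the video, just an illustrative script; it assumes Pillow (`PIL`) is installed, and the file name is made up:

```python
from PIL import Image

def make_character_collage(frames, cols=2, cell=(512, 512)):
    """Paste character screenshots into a single grid image.

    frames: list of PIL.Image objects (stills of the same character).
    Returns one collage image you could upload as a single reference.
    """
    rows = -(-len(frames) // cols)  # ceiling division
    sheet = Image.new("RGB", (cols * cell[0], rows * cell[1]), "white")
    for i, frame in enumerate(frames):
        tile = frame.resize(cell)  # normalize every still to the cell size
        x, y = (i % cols) * cell[0], (i // cols) * cell[1]
        sheet.paste(tile, (x, y))
    return sheet

# Example: four stills (solid colors stand in for real screenshots) -> one 2x2 sheet
stills = [Image.new("RGB", (960, 540), c) for c in ("red", "green", "blue", "gray")]
collage = make_character_collage(stills, cols=2)
collage.save("character_sheet.png")  # hypothetical output name
```

Different angles and expressions of the same face would go in as the `stills` list, matching the grid idea suggested above.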
Great vid! It’s one of the most useful I’ve seen
This was really cool! Thank you for the tutorial. 👏🏽👏🏽👏🏽
Glad this helped Rachel!
You’re doing a fantastic job! Keep it going!
Bro is the only one in this world whose content makes me hopeful ❤
Thanks man 🙏, I've got lots more guides planned
I love your videos bro, a hug from Colombia, South America
Extraordinary tutorial as always, Tao! Thank you very much!
Thanks! Great to know you liked this guide 👍
Fantastic vid and walk thru Tao bro!
Thanks Stylist!
Here to say your reminder to subscribe actually worked recently to remind me on a previous video, 'cause I was so focused on the actual content. Please keep going, and more tutorials on freeeeee no-costttt generators 🙏
Thanks for subscribing 👍. A lot of AI platforms have free trials, although the features are limited
This is good. Love Kling!
Thanks, Kling is the best right now
Excellent step-by-step guide. Inspiring. Thanks!
Thanks for supporting John!
Another awesome video I’ll think I’ll try the elements feature for my next video
It's a really awesome feature, and works on many different types of objects/people 👍
This is so useful!! And loved the story!! Thanks!! ❤❤🔥🔥
Great to know you liked the story, I've been thinking about aliens a lot recently!
totally agree Kling AI gives me the best results at the moment :)
They've got the most detailed and crisp generations 👍
PLEASE KLING, give us what we want! An UNLIMITED subscription, otherwise I’m forced to carry on using runway because of my usage. I’ll burn through credits on kling
That’s not cost effective for them. They would lose money. 😂
@ I would pay $200 for unlimited kling, same as Sora. They gotta work it out lol
That doesn't work for me brother
If you publish something, they will give you 600 credits. I just got mine today; that's pretty cool...
@ yeah I did that for a while but then it took like a month for each submission to come back lol
Great work!
Oh!! You made the video really well!! That's awesome :)😉
Use English!!!!!!!!!!!!!!!!!
@@USA4EVER-l7b I do not discriminate against any race or language. I believe that in the AI era, language diversity is not a big problem. Especially on Tao's channel, which I like, such diversity is never a bad thing. There is YouTube's auto-translation feature, GPT, and Deepseek as well. If you can provide a reasonable reason why only English should be used, I will try to make an effort.
@@USA4EVER-l7b I couldn't understand it, so I asked GPT.
"I left a comment in Korean on a RUclips channel that was made in English, but one of the viewers told me to use only English. What do you think?"
GPT replied:
"That kind of reaction is honestly a bit frustrating… 😕 RUclips is a global platform, and it's completely natural for different languages to be used. These days, there are automatic translation features and AI translators, so language barriers are becoming less of an issue. Insisting that 'only English should be used' seems a bit narrow-minded.
Especially since your comment was probably not offensive or inappropriate for the creator, it's unreasonable to take issue with it just because it's in Korean. RUclips should be a space that embraces different cultures and languages.
If you hear that kind of comment again, you could reply:
'RUclips is a global platform, and it has an automatic translation feature. I believe it's natural for different languages to coexist here!'
Or, if you want to be more direct:
'The internet is a space that accepts multiple languages. If you can provide a reasonable explanation for why only English should be used, I will consider it.'
Don't worry too much about it - just use whatever language you want when commenting!
Thanks! You can make better stories now with all the new AI character features 👍
What you're proposing is top-notch! Bravo!
Absolutely crushed it with this video! Thank you very much. 🎉
Thanks I appreciate that man!
Really interesting, well done! What’s still not entirely clear to me (there are so many AIs for generating images and videos now that it’s overwhelming) is the best workflow. Let me explain.
Do you first create the story and storyboard, then generate the images with Midjourney, and finally use them in Kling? My question is: do you create all the images (characters, environments, objects, mood, atmosphere, lighting...) in Midjourney and then import them into Kling, or (as I saw in the video) do you generate different camera angles directly in Kling?
It would be great to have a tutorial covering the full workflow from image to video, specifically analyzing where and how to structure the video.
Anyway, you’re one of the best and always up to date! Have a great day.
Thanks!
For my complete professional process, I use probably over 10-15 different tools, which is too much to put into a single video.
I do write down a story idea with dialogue first, though, and have ideas for what the images will look like at each stage of the final video
Thanks for your sharing❤
Good video, very instructive thank you
omg, Hollywood level!
Thanks a lot Tao😊
that's an awesome short movie 😃
I appreciate that man 🙏
I'm your number 1 fan
Thanks for supporting!
SMART SMART SMART TAO!!!!
Another epic share, Tao 👌 I went to your guides trying to find the 'Kling camera angles' you mentioned - or is that another video? The little tips like "rotating lens" make such a difference
thanks! Here's the link to the prompt guide I mentioned: ruclips.net/video/cHpgSf7LKEE/видео.html
great video, very good
absolutely fantastic!
Watching from Brazil. Great video.
Awesome work, Tao. Two questions...
1. Is Elements available to the public now?
2. Do you think this method does away with the need for models to create character consistency?
Hey, elements should be available for everyone.
I think image-to-video still gives the most control, especially if you want to go in and fix small details, color saturation, etc. But elements can be a really useful part of the workflow
@ TY. ☺️
Love the video my friend... the alien voice could be better, haha... but really good explanation. I'm following you now!!!
Good tutorial, thank you!
Great tutorial as always!
Thanks Gabe!
You‘re a G, thanks for the video
Solid.
Nice Job~
Thanks mang
A nice and detailed explanation!
So we learn these things for free from Master TAO. Imagine what we can learn if we take the 1-on-1 60-minute consultation. Congratulations, you deserve all the respect.
Thanks Daniel! Great to know you're finding these guides helpful 👍
wow~~ thk~❤
It would be great to make a PDF of camera movements for Kling. I'll buy it. Love your videos.
Congratulations on your content, it has helped a lot, my friend. What is the best way to make consistent 3D-style characters today? Can you help me? I'm from Brazil and I'm studying how to create children's stories for YouTube. It's been difficult. Which platform or tools do you recommend?
I remember you saved the princess in one of your videos! How's she doing? Did she get into trouble again? Does she need your heroic services once more? I'm terribly worried about her - she really has a talent for finding danger! 😄
She's always causing headaches 😂, It's hard to keep up
Thanks, so much input
Very good video. I follow a similar workflow but use Suno or Epidemic Sound for the music, and Da Vinci Resolve for video editing. And I run every prompt simultaneously in another generator, usually Minimax and/or Vidu and keep the best result. But Kling is my main tool. I have tried image generation in Kling but the results are usually not as good as those from Midjourney.
Thank you. I've seen people create some really nice music with Suno 👍
I am starting to believe he himself is an AI content creator and we just don't know it yet.
Super!
🔥great !
That was a good video. I got scared of your white space monster.
Thanks! He does look creepy
Two things. The guide says you can extend the video up to 3 minutes; do you happen to know where the button for this is? Also, if you are in contact with them, please ask them to let us tag each element when we prompt, so the AI will be more certain about what to do. Thanks!
Hey, I haven't tried extending up to 3 minutes. Video extension is available in version 1.5, so make sure that's the one you're using if you want to extend it.
@ But the paper released with the 1.6 Elements model says in black and white that videos made with Elements in 1.6 can be extended up to 3 minutes by clicking the extension tab in the bottom left corner... something is not adding up here... give it a look for yourself.
Awesome, boss!
Thanks
wow
Thank you Tao, how do you have so much knowledge of Kling and AI, have you studied it at college or do you just learn it as you go?
I worked as an AI researcher for a couple of years
❤❤
💌
I just wish Kling had an unlimited plan.
me too 🙏
Hey, love your tutorial. Just got into this AI world of video making. I made a few videos and they were terrible. Watched your video and followed along with the software. Made a very similar video and storyline just to see if I could do it. In my eyes it's leaps and bounds better than what I was making.
Question: would it be OK to post my video if I give you credit for the original ideas, or put a link to your channel? I'm new and don't know how this stuff works, and I don't want to piss anyone off. Thanks.
Glad to hear this helped you out!
Of course, it would be great to see the video you came up with 👍
Thank you, great job. How can you help me with creating AI commercials for my clients?
So in the near future it will be possible to act alongside a famous actor with AI?
Hi Tao, will you be having more copies of the Cinematic Prompts or similar books coming out? I saw that you have sold out. Also, any recommendations of your books for starting out with prompting for image generations if not using Midjourney, and also image and prompt to image? Thank you!
Awesome
Thanks man!
Hey man, amazing video!🙏🏼 Another question: when my prompt is something like "This man is walking in this corridor," from time to time he walks unnaturally, like very slow or very fast, or cartoonish... the movements aren't natural! How would you recommend fixing it?
Did you try using image-to-video? Sometimes it just takes a few tries to get good results.
@ Yes, both Frames and Elements; sometimes it takes 20+ tries. It used to work better a couple of days ago. And I also face some difficulties with another character's walk, like a 10-year-old girl: she walked okay in my previous tries, but now in other scenes she walks in a different way, a fashion-runway walk. I tried negative prompts and different prompts, but something is wrong. I tried to contact them, but unfortunately the Kling AI team doesn't answer.
How can you generate video faster on Kling? Mine is too slow
Elements is slow by default
AI's next step: action scenes!
That might be possible soon 🙏
Great vid!! And great idea on how to get different views of your characters. Do you find the upscalers tend to change faces? I've been using face fusion to reapply the face, but I've been looking at a few upscale options. I've been using Replicate as a cost-effective option but am looking at Topaz, HitPaw, and Magnific. HitPaw seems like a good balance. Good to get your thoughts. Perhaps a vid on this would be useful.
Yeah, upscalers do change faces, although they sometimes have an option to preserve the face as much as possible.
In magnific I use "soft portrait" mode with resemblance turned all the way up to keep the upscaled image as similar to the original as possible
@@taoprompts Hey, just trying your technique for getting the different angles. With a 5-sec clip on Professional mode, the character rotated slightly but didn't quite turn enough to a side profile (which is what I'm after), so I tried "fast rotating lens" but no movement happened. I took a still of the last frame and tried "rotating lens" from there, but it just seems to be stuck and I don't get any rotation (tried a couple of times, same result). Is it a case of just using more attempts, you think, or maybe trying a 10-sec clip? Any tips would be appreciated before I burn more credits. Thanks.......... Update: got some success. I added "the woman rotates around" at the beginning (like you did with the planet).
How do I remove the AI logo in the corner of the video? I subscribed, but why is the Kling AI logo still attached when I download?
🔥🔥👍👍🔥🔥
Thanks for the support 🔥
How do you get Kling to recognize an alien figure?
I am using Kling AI on your advice. When I produce video from a picture, sometimes the videos are very poor quality. I wonder if I didn't enter the correct prompt, or what the reason is. Can you help? The picture I uploaded is high quality, but the video quality is very poor.
It's awesome. I have a story idea; I'd probably oversaturate it if I tried, hahaha
Well, movies spend millions on test shoots, reshoots, and storyboards... so taking the time to get references right is definitely worth it!
@lionelson7098 I'll be the Michael Bay of this place if I try. And yes, facing and moving cams is a simple task; it's just cool to try different things, and experimenting with it is good, haaaa, awesomeness
Go for it man! I'd love to see what you make
How do you know if it's "cinematic"?
Tao, would you say Magnific is better than Topaz for upscaling ?
Magnific works on images and Topaz works for videos, so they do different things
Can you help? I didn't have a problem at first, but now all my videos are turning out completely different from the image I uploaded
💠 We NEED an - Unlimited - subscription plan from KLING! Seriously, like & share if you agree! ❤
Hi Tao, I tried Elements with a Standard account but got errors. Is that normal, or do I need to get the next-level pack?
As far as I know, Elements should work for any paid account. Maybe try again later?
The Kling AI site is very good, but video quality decreases when generating videos; the video quality on the Hailuo AI site is better in my opinion. Is it possible to produce high-quality videos on the Kling AI site?
Use image-to-video to get the best results
Is it normal that I can't prompt words like "soldier" and "tank" in Kling??
Yeah, some words are censored. If you use image-to-video, though, it will be able to generate those
@@taoprompts Unfortunately not, it was with image-to-video
Do you use upscaling for your videos? I find there is too much video compression on mine, even at 1080p
I use Topaz to upscale sometimes
@@taoprompts Do they have an online service, or is it just an app? Do you feel the results are good?
@@3dafex It's software you download, install, and run locally. I think the results are pretty good.
What if I want the next clip to continue from where the previous one ended? This is my problem
I got the answer from your video... a screenshot
That's the best way to do it right now. Although Kling version 1.5 has an option to extend the video that they haven't added to version 1.6 yet
Kling is amazing, just a tad expensive. They need to cut credit usage in half at least; otherwise the cost of making minutes-long videos is going to be too much...
CapCut isn’t available anymore
I recently found out that CapCut was owned by ByteDance, the same company that owns TikTok. I had a subscription for a month and canceled it, as soon as I found out that ByteDance owned it. I don’t feel like sharing my information with the Chinese government. 🕊️🫶
It's still working for me, I'm on the west coast
I made a clip a full 4 minutes long, it turned out cool
I wish you could have multiple trials for AI video and then only pay for the ones you use. The way it is, it costs way too many credits for 5 tries, and with luck you'll get one usable result after that.
While that would be great, they pay for compute time, which is passed on to the end users.