What if we could make localized content for our favorite foreign media that never got any official releases in our native language?
I noticed a comment you put on "SpongeBob's Original 1999 Crew REUNITES For New SpongeBob Movie" and I recommend you watch "A better schedule for Nickelodeon's main block" by Evan Rosman.
The crossover we never knew we needed
I'm an audio engineer who works on international dubs of English videos every day at work.
This is definitely a fascinating discussion. On the one hand, using AI tools lets the characters retain their tones of voice and feel very in character, at least to me as a non-native speaker. You also skip the difficult parts of trying to match the lip sync and can adapt the script on the fly much more easily, without an actor waiting in the booth while you rework things for them. On the other hand, in addition to AI voices not always sounding right, you do lose some of the qualities that a genuine performance can bring when an actor is called in to dub. A good point of reference is the narrator in "Love is War" in both English and Japanese. The two do entirely different styles of acting, and both are hilarious in their own way, for very different reasons.
Very surprised how different Patrick sounds in the official dub. I was expecting them to do an impersonation but they definitely wanted to put their own spin on the character.
Totally, the accentuated personalities of different actors as they give their own take sometimes make a series more enjoyable to rewatch. I was just playing around with this tool, and I felt like some things are missing in the delivery and performance, the things that make it feel distinctly human. Take laughing or yelling, for example. The outburst at the end is way funnier even in the old, cheap Japanese dub since the actor just really laughed it up. Haha. Performance-wise, going with full professionals is the way to go. But I have heard interesting discussions about whether people would watch famous movies in their own language if the dub actors sounded like the original on-screen actor. I think combining real actor performances in the target language with augmentation via AI could produce interesting results.
It still impresses me what it can do. Considering how wonky dubs can get (was it the Malaysian dub of DBZ with the "Everyone in our race can become a giant gorilla! Haha HA!"?), I see the potential of this mostly in general accessibility. Imagine a touring company being able to localize their videos much more quickly, even for people who speak less common languages.
@@TayoEXE It was the "Big Green" dub that had that line
I like it, it puts more personality into the character, and the AI sounds great.
I wonder if AI will be used for future dubbing. There are clearly limitations with tone/inflection, but that'll likely improve in future iterations.
I'm curious to hear the reverse, using Japanese dubs and converting that to English via AI.
I've had issues with the latter because Eleven Labs seems to give Japanese-speaking characters a very thick, weird "Japanese" accent that doesn't even actually sound Japanese. I could try again at some point though.
What I'm getting at with this discussion is more the question: could AI be leveraged to help fill gaps in localization when human localization is never considered? SpongeBob definitely already has a dub in this case (hence the comparison), but I would be curious about more obscure media, where it wouldn't be replacing actual work.
On top of that, there are obvious limitations that would prevent it from being a completely viable replacement anyway, as we've discussed.
Anyway, I may try experimenting with the reverse. We'll see.
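For anyone curious what that gap-filling experiment looks like in practice, here's a rough sketch of an automated dub request against the ElevenLabs dubbing API, the tool mentioned above. The endpoint paths, form fields, response fields, and status value here are my best recollection of the public API and may have changed, and the API key and file names are placeholders, so treat this as an outline rather than a definitive implementation.

```python
# Sketch of an automated dubbing request, assuming the ElevenLabs dubbing
# endpoints roughly as documented publicly. Field names ("dubbing_id",
# "status" == "dubbed") are assumptions; verify against the current API docs.
import time
import requests

API_KEY = "YOUR_ELEVENLABS_API_KEY"  # placeholder, not a real key
BASE = "https://api.elevenlabs.io/v1"
HEADERS = {"xi-api-key": API_KEY}

def submit_dub(video_path: str, source_lang: str, target_lang: str) -> str:
    """Upload a clip and request a dub from source_lang into target_lang."""
    with open(video_path, "rb") as f:
        resp = requests.post(
            f"{BASE}/dubbing",
            headers=HEADERS,
            data={"source_lang": source_lang, "target_lang": target_lang},
            files={"file": f},
        )
    resp.raise_for_status()
    return resp.json()["dubbing_id"]  # assumed response field

def wait_and_download(dubbing_id: str, target_lang: str, out_path: str) -> None:
    """Poll until the dub finishes, then save the dubbed output."""
    while True:
        meta = requests.get(f"{BASE}/dubbing/{dubbing_id}", headers=HEADERS).json()
        if meta.get("status") == "dubbed":  # assumed status value
            break
        time.sleep(10)  # even a short clip can take a few minutes
    dubbed = requests.get(
        f"{BASE}/dubbing/{dubbing_id}/audio/{target_lang}", headers=HEADERS
    )
    dubbed.raise_for_status()
    with open(out_path, "wb") as out:
        out.write(dubbed.content)

if __name__ == "__main__":
    # e.g. the "reverse" experiment: a Japanese clip dubbed into English
    job = submit_dub("clip_ja.mp4", source_lang="ja", target_lang="en")
    wait_and_download(job, "en", "clip_en_dub.mp4")
```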
@@TayoEXE It would definitely be worthwhile for the more obscure shows that don't get a localised dub, especially with it being more true to the English voice. But I guess this would only be for more obscure or older shows as anything popular would already be dubbed.
I'd be curious to see this done with Batman: TAS. Not sure if there's a Japanese dub, but assuming there is, Kevin Conroy's and Mark Hamill's voices are so iconic for their respective roles that it'd be cool to hear them in Japanese.
This is a very good comparison video! :D If AI is used responsibly, it could do some good. Some of the examples mentioned in the comments would be beneficial, like having certain things more readily available to a wider audience.
As was mentioned in other comments, I also thought the voices of the AI dub sounded closer to the English ones, at least on the surface, but it lost the genuine tone of the other two dubs, done by live, professional voice actors; you can especially tell in the laugh. lol Still, for those who might not ever have the chance to see it in their own language otherwise, that wouldn't matter as much. And, if it's popular enough, it could also encourage professional dubbing in that area. :)
By the way, not only does the official Japanese dub have their own unique spin on the characters, but I also found it interesting that they added echo to the voices to highlight that they were in a cave, which the English dub didn't do.
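That cave-echo touch is easy to reproduce if you're layering an AI dub yourself. Here's a minimal sketch using pydub, overlaying delayed, progressively quieter copies of the voice track; the file names are placeholders and the delay/gain values are guesses, not what the official dub actually used.

```python
# Minimal "cave echo" treatment for a dubbed voice track using pydub.
# Assumed inputs: a mono or stereo voice file; values below are illustrative.
from pydub import AudioSegment

def add_cave_echo(voice_path: str, out_path: str,
                  delays_ms=(180, 360, 540), falloff_db=6) -> None:
    """Overlay delayed, progressively quieter copies of the voice onto itself."""
    voice = AudioSegment.from_file(voice_path)
    # Pad the end so the echo tail isn't truncated by overlay()'s fixed length.
    out = voice + AudioSegment.silent(duration=delays_ms[-1] + 500)
    for i, delay in enumerate(delays_ms, start=1):
        out = out.overlay(voice - falloff_db * i, position=delay)
    out.export(out_path, format="wav")

if __name__ == "__main__":
    add_cave_echo("patrick_line_ja.wav", "patrick_line_ja_cave.wav")
```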
And it's very unique
Yo awesome