Amazing!!! The next gen character animations
Only 2 more papers down the line and their character animations will finally be as good as their HR departments are at maintaining, protecting and concealing workplace toxicity!
Funny enough that this pinned comment has fewer likes than the meme comments below it.
Or something that'll remain under-utilized for years just like Euphoria-powered animations
If Ubisoft is in control of this software they'll downgrade it before it comes out.
@@cupofjoen fuck likes. who gives a shit
I always wonder how much amazing tech these companies are developing that doesn't get published! Glad to see ubisoft sharing their techniques.
There is a whole department at Ubisoft dedicated to publishing their tech called LaForge! This is one of their amazing works!
lots of employees resigning from ubiscam
@@eujin3562 ubi as a company is bad but that doesn't make the developers working inside any less good, I'm always amazed by the work they do on animations and world details, glad they're leaving ubisoft, hopefully we see a new indie company formed by ex-ubi employees or something.
@@dux3644 yeah, like Red Barrels did.
@@YnteryPictures would not be surprised if that puts you higher on Red Barrels' hiring list.
Ubisoft: Good at predicting player's motions, bad at predicting player's emotions.
Lmao nice 1😂😂
Ubisoft is a crap company, I have stopped playing their repetitive soulless games
Ubisoft: let's make an AI that predicts movement and performs various complex move sets
Also Ubisoft: We are gonna put the same old copy-pasted generic open world formula in each of our games.
@@lets_see_777 Or. Did you ever think, like all businesses, they need to push out established games to keep the huge company turning over nicely, which then allows them to push and develop technologies on the side and take risks trying new things.
Kind of like other tech companies already do. Except in games people cry about it.
No one criticizes Tesla for pushing out the same generic electric car each year with minimal changes while they use the profits to develop better tech.
@@espirite Ah yes, I recall that very useful review from IGN: "This game is bland but you should give your money to them anyway so they can develop better tech"
This is great! Sharp textures are nice, but true next gen doesn't begin until we improve animations
Animations are crucial
Since Half-Life 2 I'm of the opinion that the best thing to happen to video game graphics since 3D is realtime physics.
Making a world move more believably is so much more important than making it look good in screenshots.
Shadow of the Colossus (PS2) animations still amaze me.
@@MajorFleshbang Such an insight. I felt like Half Life 2 was a sea change. Definitely felt like the next step forward.
100% agreed. We are so behind on gameplay animations and at times it feels like we moved backwards
Team 1: "We created a program that can improve the motion of virtual characters!"
Team 2: "We created a way to sell more skins using NFTs!"
Executive: "Give Team 2 a gold star! That is the sort of innovation my bottom line needs! I feel like taking a vacation to celebrate!"
Hopefully these new A.I. technologies will allow indies who love making games to replace those dysfunctional corporations.
@@txorimorea3869 That's usually how it happens. The democratization and proliferation of advanced programming techniques can break the log-jam of corporate self-interest. An advanced technique may be considered not worth the time and effort compared to using older techniques and iterating just enough to work a bit better, but give that new tech to a small group and they can afford to play around with ways of implementing new techniques.
The tools and progressing AI techniques are already starting to flip the script (e.g. Unreal Engine's Nanite and lighting) on the game industry. Now individual game designers don't necessarily need big math degrees to make a game at least look AAA since most of the work is already done under the hood. With increasingly freely available 3D asset collections, some dude with time at home can have a brilliant game idea without getting bogged down with the soul-destroying creation of hundreds of assets, or other big jobs like animating characters using traditional motion capture techniques, now that an AI can do much of the heavy lifting.
Most of the games I get the most time out of before getting bored are independently created by small companies or individuals. The AAA games are mostly just railroad movies where you get the illusion of freedom while being largely scripted towards predictable results.
What a time to be alive.
@@txorimorea3869 keep supporting indies and we'll get there one day.
@@txorimorea3869 Wait, isn't this a proprietary AI developed and owned by Ubisoft though? Will they allow this to be used by indie devs?
@@CharveL88 Funny how you mention Unreal Engine considering you need to install literal actual malware that sells your credit card information to china in order to even try it. (epic games)
I am absolutely amazed at how smooth that animation is. The weight and the bounce of all of it are spot on. My mind is blown.
this AI is better at mimicking human movements than i am
better slap tensor cores to your brainstem lol
This comment needs more likes.
No worries, you are flawed, you are human and this AI has god ambitions, anyway keep doing your squats and push ups
@@jonasbuyle1341 squats and pushups don't teach you much balance or cure physical ailments though. Nor does it change the fact that your diet is more important for energy and obesity levels. In my own experience, squats, situps and pushups are mostly good for teaching self-discipline and as a tiny supplementary exercise that can keep you from degrading completely with an otherwise sedentary lifestyle.
@@Muskar2 ...well, that's what I meant, I used "squats and push ups" metaphorically to oppose a sedentary lifestyle. Still, the squat is the base of all exercises, I do them without weights and then balance is very important, all the power your body generates comes from the ground, action reaction...
Imagine if this can be implemented so that as your dexterity and agility stats increase, you actually get a visual representation/change in how the character moves.
it's time to rename the channel from 'two minute papers' to 'what a time to be alive'
what a period of two minutes to be alive
Two Minutes Being Alive
Papers living for two minutes a time
"Two Minutes Alive or What to be - The Paper" ~ Time (2021)
to 5 minute papers
You remember back when mocap first became popular and video games all had that 'mocappey' look- you could kind of tell, 'see' the actor moving around? I feel like this will be the next 'look'- you'll be able to just see that it's this particular system running the animations- and I for one look forward to it :)
Once they advance more you won't be able to tell if it is a person or an AI moving.
Anything that saves people from hand-making animations is a massive win. You can have the nicest looking model, but if you keyframe it by hand; and I don't care if that includes inverse kinematics with sanity constraints on joint angles doing the heavy lifting for you; it's going to have Half-Life 1 levels of jankiness. In Half-Life 1 that was OK, the models looked janky too, but today it really sticks out. If a human animates it, the center of mass motion and angular momentum will change in ways which do not correspond to forces that could plausibly be applied by the muscles with that joint orientation. It's very hard to tell what even is wrong when looking at the animation; you can just tell that it's awful, and if you find out what's wrong it's even harder to actually do something to make it better.
When you physically simulate the characters you get correct looking motion by default, but the hard part that this AI is solving is how to get from one desired location and pose to another desired location and pose in a given amount of time. I'm already looking forward to a new generation of Bethesda bugs where the AI has fallen and can't get back up.
I don't know, I've seen hand-animated stuff that looks better than this, just sounds like a skill issue
Definitely a skill issue
And especially in games that don't need extremely realistic animations, because they benefit a lot from a bit of stylization and personality
On that note, my first thought when seeing this is that a lot of work should be put into recovering, rather than just staying balanced. If a character falls over, it should be able to recover well, otherwise getting stuck in a fallen state through bad luck becomes a highly critical flaw in the game/art. Even a beetle on its back usually finds a way up. Evolution took care of that. So it seems unnatural when a character can't - and obviously frustrating if it's a character controlled by or allied to the player.
@@Shyguy5104 In Video Games or other media?
@@spunkysamuel Yes
"A solid step towards democratising the creation of superb computer animations" - until they patent the s**t out of it that is.
They can patent the code, not the process ;)
@@EnriquePage91 they actually literally can patent the process yes indeed
@@ninseineon The corporative way to earn money
@@EnriquePage91 They... Can, though
I am pretty sure you cannot patent "an idea"; you might be able to patent the specific code you used to create the idea, but if you put the idea out and someone reverse engineers it and implements their own version, I seriously doubt you'd have any legal claim over them. There's another thing called a company using their economic power to destroy a rival afterwards through sketchy, alternative means, and maybe I am very wrong guys idk but I am almost sure they cannot patent anything other than their specific code.
I have to reiterate though maybe I am very wrong 😂
As a 3D artist with ZERO programming knowledge, I really wish I had a tool to make animations using Ubisoft's methods. This is amazing! Their previous paper based on key frames was also great (and open source!) I'm surprised no one has even made a Blender plugin or something.
Look up cascadeur
@@Americanbadashh This looks very promising!
EDIT: Just tried it and it's.... not very user friendly and tends to crash a lot. But! The times I did get to use it without fault, it's definitely a step forward! Though admittedly I'm lazy and I just want to make mocap-level animations with 3 keyframes haha. Still awesome though.
Love your videos... The best part is that at first it does not make a lot of people think that it can be a thing. Suddenly, a few more papers down the road, we can see so many intriguing applications that we could not even imagine before. This channel is definitely about the future (if you are watching this in the '20s)
Imagine what they could achieve if they were even only half as good as this at predicting the reception of their business models.
I can't wait for these simulations to make leaps and bounds in my lifetime. I'm curious how battle tactics would evolve if you gave AIs different kinds of super powers and told them to fight each other. It'd be interesting to see how the "meta" for those powers develops across generations.
These videos are cool but kind of terrifying at the same time. It just blows my mind that this sort of thing is possible. The future is gonna be crazy.
It will if we keep being complacent sheep allowing these companies to "patent" these things.
This is from the company that is trying to sell NFTs as video games. They don't care about the consumers bro.
@@RobotronSage wait, so you're saying scientists or companies shouldn't be able to patent their accomplishments? Do you propose a free for all, where anyone can just freely steal anyone else's work? What is the solution? 🤔 I don't know.
Eventually everything will be very easy and the "Idea guy" will actually be an important person.
I think Karoly missed the main point of this paper, which is that these are *physically simulated* characters.
The whole "predict the future" thing is done in order to achieve this. At 2:50 he makes it sound as if copying the reference motion seems silly, and that what this thing *really* does is predict the future motion.
No, copying the reference motion *is* the point, and the future prediction is made in order to do this. Only, it isn't simply copying - it is *driving the joint motor torques such that the resulting physically simulated pose matches the kinematic (non-simulated) pose*. This is the challenging problem that was tackled. This has been tackled before as well, but the new idea in this paper was to use supervised learning instead of reinforcement learning to do this, which I also didn't see mentioned anywhere.
The summary at 4:12 is also off - this method does not create higher quality animation, it tracks already made (or mocapped) animations with a physics driven character. The character controller is also just a regular character controller that controls the kinematic reference pose - it is not created by this method, it's just that this method allows a physically simulated character to follow the reference pose, and hence be controlled. The controller is likely based off motion matching, since this is Ubisoft we're talking about.
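To make the tracking idea described above concrete, here is a minimal toy sketch of a physically simulated pose being driven toward a kinematic reference with PD torques. It is purely illustrative and not the paper's method: the gains, the three-joint "character", and the sinusoidal reference clip are all made up, and the actual work replaces hand-tuned control like this with a learned, supervised policy and future-state prediction.

```python
import numpy as np

# Toy illustration only: hypothetical gains, unit inertia, no gravity or contacts.
KP, KD = 60.0, 3.0          # PD gains pulling the simulated pose toward the reference
DT = 1.0 / 60.0             # simulation timestep

def reference_pose(t):
    """Stand-in for a mocap / motion-matching clip: three joint angles over time."""
    return np.array([np.sin(t), 0.5 * np.sin(2.0 * t), 0.2 * np.cos(t)])

def simulate(steps=300):
    q = np.zeros(3)          # simulated joint angles
    qd = np.zeros(3)         # simulated joint velocities
    for i in range(steps):
        t = i * DT
        q_ref = reference_pose(t)
        qd_ref = (reference_pose(t + DT) - q_ref) / DT   # finite-difference reference velocity
        # Torques that drive the physically simulated pose toward the kinematic one.
        tau = KP * (q_ref - q) + KD * (qd_ref - qd)
        qd += tau * DT       # integrate toy dynamics (unit inertia, no external forces)
        q += qd * DT
    return q, reference_pose(steps * DT)

if __name__ == "__main__":
    simulated, reference = simulate()
    print("simulated pose:", simulated)
    print("reference pose:", reference)
```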
Thanks, you explained it much better than my comment where I said the same thing. This video completely missed the point of the paper (as sadly already happened in other videos on this channel tackling Tech Animation papers).
But what can you do, Technical Animation and physical animation are not well-known fields, and most people don't even understand the difference between mocap, physics, IK and keyed animation.
@@thearcadefire93 Yep, I've also seen this happen before with two minute papers, especially with animation like you said. It's understandable I guess given that this isn't Karoly's field of expertise, but perhaps most people wouldn't understand the difference anyway
Yeah but considering his channel is supposed to be informative and providing learning content, spreading misinformation is not great. 90% of the people in this comment section think this new system will allow indie devs to create animations from scratch, which is total nonsense. Clearly this is bad. I tried to find a way to reach out to Karoly, since I am sure he cares about the quality of his content and the accuracy of the information he provides, but I could not find a proper way... And I doubt he will read all these comments and find the two from people with more experience in the field criticizing the content.
Remember seeing the GDC talk forever ago. So excited the paper is out there now
Nice, this has been my major gripe with modern games, they look stunning until the characters walk or move. Glad this is being improved upon.
I did something very similar in my bachelor thesis 10 years ago. I pretty much failed and could barely train the AI to do a few steps & simple jumps before falling over xD Cool to see how far this has come now.
This technology will be very important for future VR. In regular games the player character and NPCs play fixed animations independent of the physics simulation. This doesn't work in VR though because you usually control your virtual character directly via your own movement and the game can't just move you to play an animation. In the future these animations will be replaced with physics and an AI controlling the body of an NPC. This is very similar in complexity to controlling a real world humanoid robot.
Not too bright there, are you skippy!! Give yourself a pat on the back for your ability to not use critical thinking. You have been very well indoctrinated by your college & shows.
Ever heard of nanotechnology??
Ever done any research on Graphene Oxide & how it is able to make a human into a Transhuman.
ruclips.net/video/tYBmiwlp3Rg/видео.html
@@truthwarrior7934 smd
@@InternetTree. smh. Looking at your profile, tells me EVERYTHING I need 2 know about you.
This will be a dream come true for many artists. Those countless days spent creating a few minutes of animation will all be history now.
THIS IS WHAT I AM TALKING ABOUT! THIS IS AWESOME!
I am happy to see that physics and animations are catching up to visuals. Things look so great these days but then end up still looking goofy or immersion-breaking when something strange interacts with an object. Take Forza, Project Cars, etc: visuals and cars look almost real, up until you crash into a wall and your Subaru just bounces off like a sponge.
Amazing! Imagine if we could couple it with the NVIDIA paper from 2 weeks ago about 3d character animations from videos!
And we have a controllable parkour player trained on videos!
the sound effects when they fell were gold
The voice over is awesome … felt like I am watching a thriller video
Happy, merry christmas and a happy new year and good luck in 2022 to everyone!
At some point in the future they will look back and think how insane we must be, because we used to animate everything by hand.
Na
Good animation has beautiful moments and good tweening. Great animation needs this kind of properly predicted timing and spacing between beautifully crafted moments. As an animator in training, this paper is exciting!
I'm in love with this field of CS!
Ubisoft will be the biggest studio soon! They have so much talent and ambition it's insane
animators: wow this is super good and cool, even I can't animate that.
wait its gonna take over my job what am i gonna do about my family pls man
Wish Ubisoft could do more of this and less NFT nonsense.
Goodness! love your videos! I am definitely seeing the future by watching your videos!
The thing most exciting about this to me is what indie creators will be able to do. Small scale game development will be able to do so much more with this type of tech in a short amount of time and more cheaply. It allows indie games scope and detail to increase without much trouble
I'm fascinated by their foot model.
Would love to have this in VR...
Now it's time for AI facial animations, because sometimes even motion capture does not look very good!
Thanks for the video
I love your enthusiasm, it's very informative and a perfect size.
Ayyy another Daniel Holden paper! (:: This guy's got all the excellent character locomotion papers. Only a matter of time now before we see stuff of it in games etc
Like a decade ago inverse kinematics in video games was something new. Now we have fully simulated movement that adapts to the environment. Amazing!
this was by far the most amusing animation display on this channel
This is going to make motion tracking a more lucrative thing for a while, and then it'll be gone because A.I.s will do it faster and for significantly cheaper. Companies who make motion tracking animations are going to have to develop their own motion tracking A.I.s to keep their businesses going.
My man said it best, “What A time to be Alive!”
Making a shirt with this quote!
totally in love with this.
my dystopian brain also sees security bots that can predict movements and aggression before they happen lol
This was the breaking point for me, we live in a simulation
Damn, what a progress! I love the future of this.
Happy Holidays!
It was the narrator for me lol good stuff
3:14 is a great analog of Z-fighting! (also called stitching or planefighting)
so excited! video games are about to reach a whole new level
Finally! A studio besides Naughty Dog that can achieve this fluid-like animation of virtual characters.
While this looks super promising for future games _(not just in itself, but as a precedent or potential inspiration for other developers),_ it also has the possibility of being under-utilized by the industry for years just like Euphoria-powered animations
What a time indeed!
I have a request - please consider creating another channel where you show applications of these technologies and how people can play with these cool new tools.
This is great, upcoming games will feel way more realistic, good work ubisoft 🤘🏽
This is actually a game changer for animation, now that the AI can make them physicalized. Before, animations were just movements on a rigged model that had no contact or interaction with the environment. So when a character ran up a flight of stairs, it's just playing out an animation instead of physically moving up each step. This AI development allows stock animations to be recreated in a simulation so that it is actually simulating the movement in a physical environment in real time.
That's actually not true, we've had contextual / physics-based animation for a long time already. The first Assassin's Creed game is a good example of this.
If you want to go further back, the basic ragdoll effects used in most games are an example of this
3:07 looks like King Crimson's ability
what a time to be alive!
I've long thought that this would come, ever since watching "the Incredibles" and thinking it was a video game in GMV format
jaw dropped and liked the video immediately when he said " in real real time"
Now apply this to UE5's matrix demo
I wish the anime industry could start using all those A.I. advancements right now.
imagine GTA5 with each pedestrian having a unique walk.
What a time to be alive ;)
Holy crap, we'll finally get realistic movement animations in all situations, not just the hard-coded ones that feel very forced in changing environments.
I also hope AI will help in optimising the resource use of games. Like a photorealistic huge open world game running in a current gen smartphone natively without breaking a sweat.
Oh god, imagine the parkour for the next Assassin's Creed games....
I fucking love this channel so much...
I was literally talking to my girlfriend about half an hour ago how when I was a kid I'd see tech demo videos for the 360 and ps3 coming out and how I never see ones that actually excite me anymore. This video got me hyped though, I feel like I just watched the tech demo for the Euphoria engine being used in GTA IV for the first time again
Maybe they should have had an AI that predicted that gamers will hate their NFT plans...
Wow, can't wait to see it in game.
Beautiful movements!
Ubisoft 2011: Used to make awesome GOAT Rayman games
Ubisoft 2021:
"These mannequins can be mapped to real characters"
R34 artist: That's everything that I needed to know
I can't wait to see what that new Avatar game's gonna look like. Unless this tech isn't in, then I'll have to wait for that one Star Wars game!
"Please don't pick up that weapon. You have 20 milliseconds to comply." -ED 2029
Super interesting that they tested it with multiple physics engines too
awesome! I love you videos.. thank you!
Can't wait for them to implement this feature into the same copy/pasted Ubisoft games they make this year.
👏 now make it identify energy-minimal deformations of morphs where energy is a higher-order energy based on total change of muscles/bones (taking into account some other higher order object). As in:
1️⃣ Avoid this upper bound while running ~> slides under object then continues running.
2️⃣ Avoid this sword while shuffling left/right in parry stance.
3️⃣ Be like water while avoiding punches from Bruce Lee
Oh sick, maybe they'll release this with hints of game strung in it!
THIS IS UNBELIEVABLE. I HAVE NEVER EVER THOUGHT THAT THE DAY WOULD COME AND I WOULD BE A WITNESS TO WHAT WE HAVE ACHIEVED. BUT TELL ME MY FRIEND, ISN'T IT A BIT SCARY, TOO???
For Honor's animation system seems to utilize some older form of this, it's pretty fantastic.
Motion Matching, it's their major animation tech. Check out Ubisoft Laforge's YT channel!
yeah, this is Learned Motion Matching, the superior version using neural networks; considering Motion Matching made animations in a lot of games far more realistic (TLOU2 and UFC 3), we can have high hopes for this tech once it starts being used in games
Károly, at what point do you think that 3D TV will be possible? I'm imagining watching a sports match where the entire game is rendered in real-time so that you could (with a headset) stand in the middle of the field and watch the game from any perspective. Thanks for such great content. You'll be delighted to know that even my teenage boys enjoy them!
Considering the 3D simulation tech that American football games have, it's not hard to imagine that the only limitation right now is nobody thought of it.
@@punishedkid for it to look like a PS3 game maybe, but for it to look real we are not even close yet, it would need to be able to render realistic-looking models in real time just taking in images from a few cameras
We had 3D TV but it's no longer marketed due to low sales lmao
My mum still has one. They're actually pretty neat.
Can't wait until we start applying this to robotics. Imagine how simple it would be to control a robot using this technique. It would make it super easy for any company to start developing robots instead of just highly technical companies like Boston Dynamics.
Seeing the AI get shot with cubes at 0:22 makes me want to see an AI learn to block, dodge and deflect them in order to stay standing.
A game on Steam called "Will You Snail" has been in development for 3 years. The premise is an AI predicting the player's movement in order to spawn traps.
I'm not sure I understand... how many frames forward is the AI predicting things? ...Basically we're giving it a set of keyframes (taken from mocap data) and it's interpolating (with physics) between them, right? How many keyframes per second are there? Impressive stuff thanks for sharing
"a solid step towards democratizing the creation of superb computer animations"
It would be sick to have a game where the enimies attack your predicted location using this
“This is how a trader predicts a stock’s movement.”
I struggle to see what is new with this technique. Movement prediction like the animations in the video was showcased a while back and is already implemented in major engines like Unity (called Kinematica). Is it the physics calculations, i.e. IK and forces? Is physics the input to the AI and not the animations?
I think Kinematica produces traditional poses and animations, aka it is a system to make creating animations easier. From a physics standpoint the character is just a capsule. It cannot really interact with anything else. The simulation in this video uses an actual physical representation of a body. And it trains an AI to control that body to follow the animation.
Take VR for example: If you punch a traditional animated character your hand moves through the animation without any interaction (unless you specifically program this in). But the physics based animations here would be knocked out of balance (or outright see your movement coming and evade).
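A toy way to see the contrast described in this thread (all numbers hypothetical, not from the paper): a kinematic character just plays its clip and ignores a hit, while a physics-tracked character absorbs the impulse and has to recover through its tracking torques.

```python
import numpy as np

KP, KD, DT = 60.0, 3.0, 1.0 / 60.0   # hypothetical tracking gains and timestep

def clip(t):
    """One joint angle from a pretend animation clip."""
    return np.sin(t)

q, qd = 0.0, 0.0                      # physically simulated joint state
for i in range(120):
    t = i * DT
    # Track the clip with PD torques (toy dynamics: unit inertia, no contacts).
    tau = KP * (clip(t) - q) + KD * (np.cos(t) - qd)
    if i == 60:
        qd += 5.0                     # external impulse: the "punch" hits the physics body
    qd += tau * DT
    q += qd * DT

kinematic_pose = clip(120 * DT)       # pure playback ignores the punch entirely
print("kinematic playback:", kinematic_pose)
print("physics-tracked after punch:", q)
```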
Yes, this is not a new system though.
EA sports games already run a full physical simulation on every player on the pitch to ensure they have proper interactions when they hit each other (think about two football players running into each other in Madden)
So I am not entirely sure what this new system improves upon. I guess mostly stability of the final motion, meaning it would be less risky to run and need less setup?
@@thearcadefire93 I think the EA ones never encountered a hill or any object other than a ball. They are very purpose-built for a single scenario.
I'm confused. So the AI (seen on the left, compared to the green reference on the right) is predicting what the reference will do? It's not just reading the same instructions the reference comes with? Also, the reference clearly must have instructions with it, so why not just give the AI access to those instructions? Why was there not a demonstration of a user input controlling a reference while the AI tries to predict? Are we assuming the reference was user controlled? All I think I am seeing here is predetermined instructions. While the motions are beautiful to see, I'm having trouble finding a visual cue showing that what I'm being told is actually happening.
What a time.
I'm ready to see it in every video game
They act more natural than I do.
Today on two minute papers: Hardcore parkour! 😋
I can see this AI being used to train the interface between pilots and machines. Let's go Gundam Unicorn!
Dynamism of a Dog on a Leash
but it's 2021
*Looks like Rockstar has some competition*