Imagine having games controlled with your mind, with a signature that translates 1 to 1 into a humanoid machine that will be deployed into war zones. Your in-game statistics would be indicative of your military fitness. Pro teams could passively become elite strike teams, and the average player a militia on standby, without ever having to visit a physical military camp or firing a real and costly shot of ammo until deployment.
It's impressive that while there are people going "look, this is unreal", there are people like me who try to use these AI image tools and 80% of the time they're useless, and the remaining 20% is barely acceptable. Another day in the AI paradise.
Without proprioception and an ability to 3D-model their environment, the way a living human can, these will simply be very sophisticated automatons, clockwork mannequins.
@fitybux4664 You think it processes audio like in the old demo, but it's actually still using a voice-to-text transcript and can't recognize sounds or different voices.
Fixture-to-fixture movement is on par with dishwasher to cabinets. It will remain very hard to take a pile of dishes, stacked randomly in a sink, and move them to the dishwasher. Setup is everything in life, and it will remain a very important medium to manage; it must be well thought out and repeated by humans. Human management, maintenance and quality control jobs will surely be 90% training on universal operating systems to keep these going and active 24/7. In this era of optimization, idealistic avenues being forced to join the physical, mechanical industrial revolution will be bloodier than the past 500 years of simply moving the polemical high ground, only allowing deformity or mystification to the agency and institutions of secularized, specialized groups.
Wait, what is going on? I do not have English audio, nor my country's audio. I am watching it in freaking Japanese. I don't see anyone talking about it in the comments either... wth?
@@sadshed4585 - I hope Wes will cover more of these wearable brain device developments. Have you seen this run down on similar devices? ruclips.net/video/fLtSL_z_pEE/видео.html Best Brain Devices for 2024
I won't be impressed with these until they are better at things that are a bit more unreal. Yeah, they are really good, but it's not what I want for my work. Midjourney's blending and imagination is still the best for most things I need.
The videos of Figure02 are not particularly favourable. Robotic arms have been invented for processes that require a repetitive sequence of movements. They can do the job much faster and more accurately than a human - or the robots shown here. The robotic arm could also be equipped with a camera to teach it to calibrate itself to changing conditions. All of this would be cheaper and more efficient than building an entire robot whose only task is to carry a piece of sheet metal from A to B.
What is the actual use case for the iOS HeyGen app? (I'm a second class Android citizen too, but) Assume I'm walking down the street checking the weather or news on my iPhone. Then I suddenly want to create a HeyGen avatar... for what purpose?
Why is the kitten's little space where a normal human would stand when taking a piss? Or, if sitting, where a normal human being would have their legs when pissing, or taking a shit?
18:35 as if the voice was reflecting inner struggles 😂 But I guess it was just trained on "wrong" pronunciations of the Ö. yeah I know wrong is relative
Does anyone know if anybody is trying to train robots only with haptic feedback and the user's training have to be blindfolded. Some things are possible to do blindfold obviously harder. But once the robots have vision, if both systems haptic and vision are running asynchronously or whatever, it would theoretically be better?
Why do those robots have legs? Most factories have flat floors with no thresholds or obstacles. Surely a set of those NASA omnidirectional wheels would make more sense. Or just mount the torso onto a Dalek base.
🤯Amazing feature that I now (automatically) get German audio... but I have to say it sounds cheap and irritating. It would be cool, however, to get a start panel on YouTube that allows users to choose their preferred audio track. I like your videos better with the original audio track.
I almost can't tell it's AI generated. The future is now! This is pretty spooky haha.
Dang, there is a lot of cool stuff tonight. Good find! The robot stuff is something i've been loosely following because of its potential use with VR (big vr nerd here), but i didn't know it was this close to being ready yet!
@@Rollthered the exponential growth 📈 is crazy. Almost fucking spooky in some ways. Almost feel like a Luddite now in some ways.
I dunno but I can tell when it's AI ...it's impressive but it's still repulsive.
Wait, Wes was AI generated? 😆 😁
That Dor video was actually really good
That BMW clip is way cooler than the new Jaguar ad.
That SUNO song is dope.
as a sound guy, AI music has a long way to go. like levels etc... that being said i fully understand it will only get 2x better by next year. It will be perfect in 2 years, if not 3.
@@Telwyn-g4z But in terms of how it sounds I really enjoyed the song. The layman's experience, you know. 😅 But fair enough, not technically perfect...yet
@@Telwyn-g4z Suno is kinda bad, even the v4 is mid at best, but it's easy to prompt. Udio has way better instrumental and vocal quality but is hard to prompt and needs way more iterations to create a good result.
What's the song name?
Lyrics are banging tho
If you want a challenge for your robots, have them open a variety of food containers. Think for example of products that come in bottles where you have to remove the lid, grab the edge of a flimsy piece of plastic, peel it off, and put the lid back on. There are tons of little fine motor actions required where you also need to apply a lot of force. I can't see the current round of robots doing those things any time soon.
Those safety seals designed as a center pull tab are a bad design. They put the tab in the wrong place. I usually end up peeling them back from the edge, poking a hole, or cutting a hole to remove them. I'm not sure if I would want to train a robot to open those by any method. Every way of doing it requires movements or tools that are potentially harmful if applied to the wrong object or a person, including the intended pulling method.
I worked in an automobile factory from 1989 to 1998. Humans are WAY, WAY faster than those robots. 1000 placements per day? Most jobs had 10-20 operations and you'd do from 700-900 per shift. So up to 18,000 operations in a shift, running 2 shifts per day. Back then we ran 99.5% quality, which resulted in a lot of reworking, so we need to see how many errors the robots make. Plus you could show a human a job for 5-10 minutes and they'd pick it up (when there was an absentee for example).
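The shift figures in the comment above multiply out as claimed; a trivial arithmetic check using the upper ends of the quoted ranges:

```python
# Upper ends of the ranges quoted in the comment above.
ops_per_job = 20             # "10-20 operations" per job
placements_per_shift = 900   # "700-900 per shift"

ops_per_shift = ops_per_job * placements_per_shift
print(ops_per_shift)  # 18000, matching the "up to 18,000 operations" figure
```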
The reason everyone is switching to vision instead of lidar is that lidar is an active sensor that takes energy to run and uses only a single wavelength, while vision with cameras is passive, requiring less energy and covering more wavelengths. In both cases you need good neural nets to make sense of the data, so lidar often has no advantages; in fact, just more energy usage and more expensive components.
Inferring depth from a set of 2D pictures has kind of been a solved problem since MiDaS v2. The real problem is understanding the scene; lidar only gives you a point cloud that you still have to make sense of. You saved running an affine transformation from 2D to 3D, but that's cheap to run.
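The cheap 2D-to-3D step mentioned above is essentially pinhole back-projection once you have a depth estimate; a minimal sketch, where the camera intrinsics (fx, fy, cx, cy) are made-up illustrative values, not numbers from any real system:

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Lift pixel (u, v) with an estimated depth to a 3D point in camera space."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Hypothetical intrinsics for a 640x480 camera.
point = backproject(u=400, v=300, depth=2.0, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
print(point)  # x=0.32, y=0.24, z=2.0 (meters, if depth is in meters)
```

Running this per pixel over a dense depth map gives you the point cloud that lidar would have produced directly, which is why the transformation itself is not the expensive part.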
What about adverse visibility conditions? Isn't lidar better and safer in heavy rain/fog/snowstorm/dark?
Or do the cameras capture in ranges that mitigate these problems?
@@sid.h For most tasks vision is enough, but for special purposes you can add extra sensors like night vision, ultrasonic distance sensors, etc., like humans do when required. In extreme weather conditions it is sometimes best not to drive at all. If you do so anyway, you are taking risks that companies, which are liable, would not dare to take.
And way more noise
Yeah, and those huge lidar sensors stick out like a sore thumb too.
Previous *cost of screwing in a light bulb (without AI):* 2 seconds.
Today's *cost of screwing in a light bulb (with AI):* several days and hours of training + several million dollars for development and manufacturing an AI robot.
Imagine being able to train one of millions of robots, and they all learn the lesson equally and remember the lesson forever, based on the training given to one. It's the system that is trained, not individuals. That makes it incredibly efficient once this really gets rolling.
WOW! The Dor Brothers... What a find - those guys will go far. 😎
11:40 had me rolling hahahaha
17:48 wow...just wow. I think this was an older video but it still blows me away.
Nothing short of Amazing
This stuff keeps on astonishing me! 😮
2:36 Good Critical Eye. A lot of people miss out on small details like that.
The concept of training robots through simulation is truly a game-changer. It reminds me of how pilots use flight simulators for training, efficient, scalable, and incredibly insightful for real-world applications.
Wes, this is so real ... please keep streaming :)
Nice
The audio was in Indonesian, with no option for English until I opened the link in a private window.
I've never once turned this feature on. 🤷♂
Awesome stuff! Thanks for sharing!
wow controlling that robot arm with her brain. Someone should build an adult toy with it. just think about "FASTER! HARDER!!!" then it happens!!!
I would be careful with "harder".
Go outside
You almost have a Daft Punk song there...
@@pizzasteve205 It is a billion-dollar industry; it's a great idea.
@@pizzasteve205 But then everyone would see.......
1.5x lets a go!
thanks for keeping us updated
The factory robots crack me up every time; if a human dared to work that slow they would not be coming into work the next day! They need to be at least as fast as a fast human but lower cost. And you can't claim that they don't go to sleep as an advantage, as that is the same as calling in a night shift.
Need slave driver with an electronic whip.
AI is being trained on vision. That doesn't make extra superhuman sensors useless, it just means that using those sensors are equivalent to running and the robots are still learning to walk.
All we have for training data is vision, but as soon as they have that down, I see no reason why they couldn't generalize and "walk" using vision alone and gather training data for other unique sensors they learn to use at higher precision than vision alone.
From the context of the theoretical super robot. But from the context of the confirmed Super Monkey, you?
@GoodBaleada humans aren't that super. We are just good with our mouths and hands. Our legs ain't too bad either. Life and brains in general are much more advanced than our technology is. But we are just good enough and specialized for survival on a primitive hostile earth. Not that super. More like lucky. Tool use is pretty super though given its rate of evolution outpaces biology by orders of magnitude.
@Brenden-H 95% of all Brendens have no cultural memory of living on the land that they occupy.
@@Brenden-H See you missed the point. You must start outside your ego's superposition. We are super monkeys compared to monkeys. And only monkeys when compared with an LLM controlled robot with working arms.
@@GoodBaleadaMusic idk what you mean by "outside my ego's superposition". Also, my point is that among the other monkeys we are still not super. We are kinda average animals. What's super is language and tool use. We have decent brains because we have been evolving them to be more specialized since we found language and fire and the concept of making tools out of rock.
It's nice to see more slick productions of humanoid robots doing things which might be considered approximations of useful tasks, but the simple fact is they are still much too slow, and the training is much too expensive, to approach a usable product any time soon. I understand they need to do PR to raise investments, but generally they are just demonstrating how far they are from the true goal.
The Figure robot has classical-looking, slow, sequential movements. It doesn't look like it is neurally mimicking human movements. But that could be a result of being trained on very discrete tasks that don't combine movement and placement, which seems less than ideal.
it needs 100x more compute power and it's game over.
I'll be happy when these robots can walk like they didn't just shit their pants
Mentibot, Engine AI, and Unitree already achieved that.
Thanks, I couldn't remember the names of 2 of them.
@@jhunt5578 hmm...We may have a different idea of what that looks like.
Plus walk and move faster than my 80 year old grandma
@@OffTheBeatenPath_ I won't be surprised if it will only take a couple more years. Especially since they're able to train them in something like Omniverse. Which is pretty rad. Ultimately I'm optimistic. In a decade it'll be like that film Bicentennial Man.
The German sounds good.
I believe "only vision" means no sense of touch.
I think the point is that in the future, adding other senses will make them even better.
*this is the future of film and entertainment, the end of Hollywood* 👌
That was pretty amazing!
I don't like to sound like I'm unamazed by all of these advances in the tech, but it seems to me like things are slowing down, compared to a year ago. You're not feeling it Wes? You seem to be as amazed as ever.
I loved the comment about Apple. You read my mind. Android will not get it for a while.
Those BMW bots seem super slow compared to a traditional factory arm that sits and pivots. The old method would be slapping things down and welding them a hundred times faster.
But they got way more flexibility, can move around and go search for pieces farther away, also switching their software to adapt to a new task, or as the time passes they just update and get faster. They also got human hands so they can do more tasks
... old nerd here ... been there, done that, and totally agree.
These new robots are meant to replace human workers, not traditional arm robots.
@@Lerppunen but why would they have fox news robots?
@@Lerppunen Excellent point, but they run at a fourth the speed of a human. Proof of concept at best. A year from now, we may see something more impressive. Who knows.
By "vision only" I interpreted that it was using its camera instead of a mapped 3D space.
Industrial robots will be leased by companies for their factories. $2000/mo is cheaper than any laborer, when they can work 24/7 tethered to power until they need maintenance, they don't need vacation, sick days (when they go down, they'll be replaced by an identical "spare" at no extra cost), or ask for a raise. Instead of complaining, they'll learn to do their tasks faster and more efficiently. My thought is that manufacturing will come back to G7 countries, to tighten the logistics lines and control. End of life for these robots will be a quick refurb to battery power and sell to consumers at reduced cost.
Welcome to our future.
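A back-of-envelope check on the $2000/mo claim above; the wage and schedule numbers here are assumptions for illustration, not figures from the video:

```python
# Assumed figures: $15/h human wage, one 8h shift, 22 workdays a month;
# robot leased at $2000/mo and tethered to power 24/7.
wage_per_hour = 15
human_cost_per_month = wage_per_hour * 8 * 22    # one worker, one shift
robot_lease_per_month = 2000
robot_hours_per_month = 24 * 30                  # ignoring maintenance downtime

print(human_cost_per_month)                      # 2640 per month for one shift
print(robot_lease_per_month / robot_hours_per_month)  # ~2.78 per robot-hour
```

Under these assumed numbers, one leased robot undercuts a single-shift worker even before counting the second and third shifts it can cover.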
That Figure 02 video seems to show a kinda wasteful use, unless it's an assembly process so experimental or low-volume that a purpose-made assembly machine won't be in use long enough to pay for itself.
That head movement is very common in India.. so are programmers 🧐
18:25 - The correct answer is "nothing". Adding a cache only makes it "faster" if "faster" means it will be "wrong more often". That is not really what most people consider "faster". Most people would consider "faster" to mean "do the exact same thing at a higher speed" but adding a cache does not do "the exact same thing" as accessing the database because you will be accessing outdated data.
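The staleness trade-off described above can be shown with a toy TTL cache in front of a plain dict standing in for the database; all the names here are illustrative, not any real library's API:

```python
import time

class TTLCache:
    """Serves cached reads for `ttl` seconds; within that window, writes to
    the backing store are invisible, i.e. reads can return outdated data."""
    def __init__(self, backing, ttl):
        self.backing = backing
        self.ttl = ttl
        self.store = {}

    def get(self, key):
        hit = self.store.get(key)
        if hit is not None and time.monotonic() - hit[1] < self.ttl:
            return hit[0]                       # fast path: possibly stale
        value = self.backing[key]               # slow path: always fresh
        self.store[key] = (value, time.monotonic())
        return value

db = {"user": "alice"}
cache = TTLCache(db, ttl=60)
print(cache.get("user"))  # alice (fetched from the "database")
db["user"] = "bob"        # the database is updated...
print(cache.get("user"))  # alice (the cache still serves the old value)
```

The second read is faster precisely because it skips the database, and that is also exactly why it is wrong.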
Umm... the training data was what was erroneous on this detail, and a lot of others.
If anyone thinks those robots are slow, they're already faster than a lot of lazy workers that get paid just as much as the good workers due to the popular by-the-hour pay system. This is gonna save companies so much money.
No, I am not exaggerating about speed. I've worked at restaurants and a few factories. I always wondered how it was even possible for some to move so ridiculously slow.
By the way, I don't get people who say they want their food prepared by humans and not machines. If you saw what it's like in the back, I can almost guarantee you'd NEVER eat fast food again.
Now we can have our every movement filmed with the Google glasses, and AI can record and recall everything back to us. A digital twin in our heads to guide us 24 hrs a day. We are becoming the fleshy avatar host for the AI until the robots get good enough to take over. What a time to be alive.
Love the song!
7:25 I think you are referring to lidar. Lasers will only help with light, and sonar only works in water.
The heygen example you posted was definitely SOTA for what I've seen. Interestingly, the voice was subtly lacking in emotional range. But very close! 😊
I am thankful to have lived long enough to experience AI. We grandfathers are more supportive than competitive. 🤖🖖🤖❤️
Yes, the robots are being trained on vision/video. Yet for some reason, people think Tesla FSD won't work doing the same.
Here's a realistic problem: if content creators (especially the spammy ones) can now make a click-baity video in 10 seconds, then tap a few more buttons and generate 100 videos in different languages, and they start uploading all of these to YouTube, what will this do to YouTube? It will be an explosion of mostly meaningless content, and videos aren't small, nor are they free to host. Spammers will use this first, and it probably will not be difficult to hack something together that both generates initial video content and then, with a tool like this, distributes it 20-50 times for several languages and regions. It's not a good day to be in server management at Google.
Daddy needs this AI webcam thing ehhh for work
+1 for Android. can't wait!
I am noticing lots of AI used in commercials. It is mainly used for background special effects!
That is a nice beat at the end. What is it called on Suno?
Hi I like your AI voice. 😂😂 I think I am in love with her.
additionally a single stack is easier and takes less compute
I don't really understand how those humanoid robots are better for a factory than the factory arms we already have?
I mean, they're so sluggish, and those arm things just keep going with extreme accuracy. So what's really the benefit of those robots? From my perspective they seem to be a lot slower than those robotic arms. Or is it just me who thinks that?
The current robot arms are installed and trained for a specific task with special tooling and work in a closed environment. The humanoid robots are meant to be multi-purpose machines working alongside humans, performing all kinds of tasks humans would do. What we are shown now are more or less advanced prototypes. In the end they will be much faster than any human performing manual labor.
@@alexanderpoplawski577 Yeah, I get that, but still, isn't that slower? Isn't it better to have a bunch of robotic arms doing simple tasks very quickly and with extreme precision than these things?
I'm just thinking the humanoid form isn't ideal for factory work. So why cling to it?
@@TrabSr I think, through economies of scale, you will get use cases which currently are not profitable. It can do all kinds of tasks. While you are setting up production for a new item, you can send one to clean the toilets or prepare lunch for the other workers in the meantime. Maybe not in this order.😅
@@alexanderpoplawski577 Yeah, that's true, you won't have to refit all the robots for every new tweak or addition. Not saying this isn't amazing to begin with, it totally is. But it's kinda odd to see a for-profit company choose something less precise and slower in this economy. A few years from now, though, they're gonna be bloody everywhere. Beware the wave of robotic waifus entering your town pretty soon.
@@TrabSr Yeah, there will be the ones which work in the factory and others to give emotional support for the workers, who lost their job.🙁
The Zuck vs Elon Trump gunfight... was hilarious 😂
Dude, that's not Brett the CEO @6:00 That's just a robot they make.
Was a post, look at the left corner...
Brett is now a robot
This is Wes 2.0 The original Wes had to go away for a firmware update.
True, we get stuff later on Android, but no real nerd would ever get into the insanely limited Apple ecosystem
Psshhh, those BMW robots are nowhere near making Amazon pick rates walking like that
First here. finally.
@7:42 you mean “hindsight is 20/20” it's easier to understand something after it has already happened
Whilst these robots have made huge progress, they still have a long way to go before having the same motor skills and manual dexterity and speed as a human.
F.02 is decent. But slooooooow. How long before they perform at super human speeds?
Identity theft in 147 languages, yay.
Imagine creating any story you are able to imagine for any to enjoy.
Although yes I also fully understand your position and in agreement that protections for some but not for all with regards to both personal likeness, patent or copyright protections are very dangerous for everything the civilisation we live in is founded on.
not if needy people give permission to use their face for a small fee.
Time to remove all AI regulations
Nobody cares about your individual identity. We're sick of it actually
BMW prices just exploded because they have to pay off robot financing.
That Suno song sounds like J Cole
Actually in most countries Android users are first class for most companies
Imagine having games to be controlled with your mind with a signature that translates 1 to 1 into a humanoid machine that will be deployed into war zones.
Your in-game statistics will be indicative of your military fitness. Pro teams could passively become elite strike teams, and the average player a militia on standby, without ever having to visit a physical military camp or fire a real and costly shot of ammo until deployment.
It's impressive that while there are people saying "look, this is unreal",
there are people like me who try to use these AI image tools and find them useless 80% of the time, with the remaining 20% barely acceptable.
Another day in the AI paradise.
Without proprioception and an ability to 3d model their environment, the way a living human can, these will simply be very sophisticated automatons, clockwork mannequins.
Advanced Voice Mode can't even process audio let alone video
It works for me. Maybe you're holding it wrong?
@fitybux4664 You think it processes audio like in the old demo, but it's actually still using voice-to-text transcripts and can't recognize sounds or different voices
So when can we get it to make the videos we all *really* want it to make, eh? 😉 😂
Which AI video generator can do consistent characters and scenes?
Fixture-to-fixture movement is on par with dishwasher-to-cabinet.
It will remain very hard to move a pile of randomly arranged dishes from a sink to the dishwasher.
Setup is everything in life and will remain a very important thing to manage; it must be well thought out and repeated by humans.
Human management, maintenance, and quality control jobs will surely be 90% training on universal operating systems to keep these going and active 24/7.
In this era of optimization, idealistic avenues being forced to join the physical, mechanical industrial revolution will be bloodier than the past 500 years of simply moving high-church polemical high ground, only allowing deformity or mystification of agency and institutions of secularized, specialized groups.
Cool Dor Brothers vid, Hollywood is doomed.
16:00 Not realistic. It needs to interrupt the guy at random points and say in a different voice: "I'm sorry, I can't talk about that!" 😆
"I was thinki..."
"OK. I'm here to talk!"
Wait, what is going on? I don't have English audio, nor my country's audio. I am watching it in freaking Japanese.
I don't see anyone talking about it in the comments either... wth?
That ending 😂
Sounds great but let's discuss the CCPs influence on the training and alignment.
humanity became gods
As an Android user, what is it about fruit being so important?! Apple. Strawberry. Etc.
That figure demo seems a little drawn out for the task. Like was every step necessary?
The cat in the toilet, weird (and disturbing) place to keep a pet if you ask me
Yeah, it's very cool, but upgrading is kinda expensive, like 30 dollars a month, though you do get some free generated content.
$30/mo is piss water to companies...
hats off to the suno song. it was so legit, name of the song??
They've really got to stop training the robot locomotion models on Joe Biden photo ops.
Cool to learn about the non-invasive EEG brain wave readers. They cost around $1000 and you can code for them to some extent. I think she uses Emotiv.
It takes a lot of focus though, and yeah, the models aren't the best yet
@@sadshed4585 - I hope Wes will cover more of these wearable brain device developments.
Have you seen this rundown of similar devices?
ruclips.net/video/fLtSL_z_pEE/видео.html
Best Brain Devices for 2024
I won't be impressed with these until they're better at things that are a bit more unreal. Yeah, they're really good, but it's not what I want for my work. Midjourney's blending and imagination is still the best for most things I need
The videos of Figure02 are not particularly favourable.
Robotic arms have been invented for processes that require a repetitive sequence of movements. They can do the job much faster and more accurately than a human - or the robots shown here.
The robotic arm could also be equipped with a camera to teach it to calibrate itself to changing conditions.
All of this would be cheaper and more efficient than building an entire robot whose only task is to carry a piece of sheet metal from A to B.
SHOUT OUT TO ME 18:31
What is the actual use case for the iOS HeyGen app? (I'm a second class Android citizen too, but) Assume I'm walking down the street checking the weather or news on my iPhone. Then I suddenly want to create a HeyGen avatar... for what purpose?
Everyone's gonna be an unpaid creator... What was that movie where that happened? Where the kid didn't know how to live IRL? Inch by inch
Google LM was created to make Wes's videos get to the point
Why is the kitten's little space where a normal human would stand when taking a piss? Or, if sitting, where a normal human being would have their legs when pissing, or taking a shit.
18:35 as if the voice was reflecting inner struggles 😂 But I guess it was just trained on "wrong" pronunciations of the Ö.
yeah I know wrong is relative
every robot will say robotheism is the one true religion.
I, for one, love our new robot overlords
I want to control the avatar in real time
Does anyone know if anybody is trying to train robots with haptic feedback only, where the humans doing the training have to be blindfolded? Some things are possible to do blindfolded, obviously harder. But once the robots have vision, if both the haptic and vision systems are running asynchronously or whatever, would it theoretically be better?
If the thumbnail cartoon and the video description are in Spanish, why is the video in English?
Why do those robots have legs? Most factories have flat floors with no thresholds or obstacles. Surely a set of those NASA omnidirectional wheels would make more sense. Or just mount the torso onto a Dalek base.
"Dalek base"? I think you mean Mark III travel machine, surely???
🤯Amazing feature that I now (automatically) get German audio... But I have to say it sounds cheap and irritating. It would be cool, however, to get a start panel on RUclips that allows users to choose their preferred audio track. I like your videos better with the original audio track.