@@KuZiMeiChuan lol no mate. This is a great stepping stone; robotic evolution will soon surpass the speed of human blue-collar workers, and I think you know what I mean. I can't wait to see a 14" (35 cm) tall kids' version on sale next Christmas, with a STEM software suite for kids to tackle and interact with this iconic robot, representing the future that humanity deserves.
The bot did not seem to identify the colors of the boxes and went directly for the red box without scanning anything. So unless it was preprogrammed with this knowledge beforehand, I don't see how this is even remotely a real-world test. Same with the tower: it goes directly for the largest tower without scanning anything in the environment. This looks like a canned demo.
The test is about integrating LLMs into the control process, not about looking for boxes of a specific colour. In other words, the robot already knew about the boxes (presumably the QR codes identify which box is which). That's fine, though, because the point was to demonstrate that LLMs can write code to interact with the Digit API. The LLM doesn't "see" the world; the LLM is given some information from the robot to start with, then writes some code to achieve the goal based on that information. It doesn't really matter where that information came from in this test.
I think you forgot that robots don't need to take breaks; they don't need to sleep or to live. They might need to recharge, though. With improvement coming this fast, they will replace us. Oof.
Pro tip: you're a company that sells a product; you shouldn't monetize your YouTube videos. The few hundred dollars you're making from ads signals desperation to potential clients viewing this video.
@@boremir3956 In a company, for example an Amazon warehouse, tasks are repetitive. This means there is no need to wait that long for an elaborate answer. When the same question is asked a thousand times, storing a cached answer reduces thinking time to zero.
Spoken as a person in denial. This is the slowest the robot will ever be. A year ago this was impossible. Keep in mind this robot can work 24/7, which is 168 hours a week. A human works 40 hours per week most of the time (minus about ten hours getting coffee, breaks, talking to coworkers, texting, etc). So this robot works at least 5x the speed of a human right now. The thing is, in a year or two, it'll work faster than a human can in real time. Wake up. Human labor is about to become obsolete in practical terms. Amazon, Elon, etc, know that eliminating humans is the key to a more profitable, more efficient, and easier to maintain company. It's obvious.
QR codes are primarily for near-field localization, and secondarily provide a shortcut (for demo purposes) from training up a vision pipeline for object/number/color recognition. That would be straightforward but out of scope for this test, which was focused on the control of the robot in the context of natural language LLM inputs.
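A minimal sketch of the "markers provide box info" idea described above: decoded marker IDs index into metadata the robot already has, and that becomes the world-state text handed to the LLM as context. All IDs and fields here are invented for illustration; Agility's actual pipeline is not public.

```python
# Hypothetical marker-ID -> box-metadata lookup feeding an LLM planner.
# The registry contents are made up for this sketch.

BOX_REGISTRY = {
    7: {"color": "red", "tower": "A"},
    12: {"color": "blue", "tower": "B"},
    31: {"color": "green", "tower": "B"},
}

def world_state_from_markers(detected_ids):
    """Turn detected marker IDs into the world-state text given to the LLM."""
    lines = []
    for marker_id in detected_ids:
        box = BOX_REGISTRY.get(marker_id)
        if box is not None:
            lines.append(f"box {marker_id}: color={box['color']}, tower={box['tower']}")
    return "\n".join(lines)

print(world_state_from_markers([7, 12]))
```

The point is that the LLM never "sees" a QR code or a colour; it only receives text like the above, so where that text comes from (markers or a vision model) is interchangeable.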
The difference being that Agility *sells* Digit, and many companies have already bought the robots; the factory to mass-produce them also started construction last year. Meanwhile, Optimus is pure hype at the moment and mass production seems far away. Oh, and Digit is also more energy-efficient than Optimus.
How so? The test is about integrating LLMs into the control process, not about looking for boxes of a specific colour. In other words, the robot already knew about the boxes (presumably the QR codes identify which box is which). That's fine, though, because the point was to demonstrate that LLMs can write code to interact with the Digit API. The LLM doesn't "see" the world; the LLM is given some information from the robot to start with, then writes some code to achieve the goal based on that information. It doesn't really matter where that information came from in this test.
Nice! When LLMs hit the market I knew this could be the future for robotics as they no longer need to process objects but now can see what they are looking at knowing everything about those objects.
Same😅
That's bullshit but okay
Dude, he used the prompt to program that robot. He was only able to do it because the guy told it what to do from his phone. When Digit works autonomously without human intervention, then we'll see progress.
@@HipHopAndCityGossip Yes, he did. Try the same with an RL-trained robot. Here he was able to simply ask the robot to do a task. That's due to LLMs: it didn't need to be trained on this specific task first, beyond the overall base model's training.
Nothing in an LLM knows anything. It's just a complex form of pattern matching. It's on rails. It fails when outside of narrow confines.
For the people who think it's slow (which it is): this is the slowest the robot will ever be. A year ago this was impossible. Keep in mind this robot can work 24/7, which is 168 hours a week. A human works 40 hours per week most of the time (minus about ten hours getting coffee, breaks, talking to coworkers, texting, etc). So while slower than a human working in real time, at the end of a week they'll probably be pretty close to being capable of the same output. The thing is, in a year or two, it'll work faster than a human can in real time, I'm guessing. So that means one robot does what 5 humans can. Which means it could eliminate five jobs at $30k per year, saving $150k per year (actually more because of holiday pay, vacation pay, sick pay, medical benefits, etc). Even if the robot costs $250k, it'll pay itself off and be profitable after only two years. (Yes, I'm eliminating maintenance and breakdowns/labor to fix the robot, which I can't possibly calculate. I assume it will be reliable when sold at scale.)
Wake up. Human labor is about to become obsolete in practical terms.
Amazon, Elon, etc, know that eliminating humans is the key to a more profitable, more efficient, and easier to maintain company. It's obvious. It'll take many, many years to transition over, but it's here.
People just don't notice the big picture.
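The back-of-envelope math above, checked numerically. All figures are the commenter's assumptions (five $30k jobs, a $250k robot), not real prices.

```python
# Sanity check on the payback estimate: 5 jobs at $30k vs a $250k robot.

jobs_replaced = 5
salary = 30_000        # $/year per replaced job (assumed)
robot_cost = 250_000   # $ (assumed sticker price)

annual_savings = jobs_replaced * salary
payback_years = robot_cost / annual_savings

print(annual_savings)               # 150000
print(round(payback_years, 2))      # 1.67
```

So under those assumptions the robot pays for itself in under two years, consistent with the claim, though ignoring maintenance and downtime as the comment notes.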
This is all very exciting, but how are all these companies going to make money when no one has a job to buy their products?
@@joelohne3559 Don't worry, super a.i. will figure that out.
There won't be companies. The company model is to sell products to other companies' employees. No employees, no customers. No customers, no sales. No sales, no products. No products, no companies.
@ne3559 Exactly. If everyone who owns a company replaces all the workers with automation, then the unemployed workers won't be able to afford to buy anything from the companies. So that process is a swift downward spiral of the economy and society.
There are three ways to address this issue:
1. Make displacing human workers with automation illegal, or highly regulated.
2. Disincentivise worker displacement with new tax laws that penalize automation. Tax the robotic workers heavily to fund the lost income via UBI.
3. Revamp the tax code completely and implement Universal Basic Income.
All three of those will need to occur in various forms unless the company owners and the government want to face widespread civil unrest.
Personally, I'm ready for the robots to perform menial tedious tasks. I would prefer to work far fewer hours at a "job" and have more time to work on things that I want to do. Even if that means being somewhat broke all the time.
Yes, LLMs are AGI. Y'all were just expecting miracles and felt disappointed when we got to this milestone. That don't change the fact though.
LLMs are not intelligence. They don't "know" anything. It's just fancy pattern matching.
Right, I'm wondering more about the LeCun argument that it's "not really reasoning or planning". What is it, then?
Digit: *picks up blue box*
R&D: "Damn, there must be something wrong with his sensors, we'll have to-"
Digit: "ACKTCHYUALLY... in 'Star Wars Episode III: Revenge of the Sith' Darth Vader has a blue lightsaber until Obi-Wan defeats him on Mustafar, so I'm right because you didn't specify which era."
“Digit, use Darth Vader’s lightsaber on all the younglings”
That's for when you try to use third-party repair services on your robot. 😅
In the 2000s it took an entire day to do that task using CPUs; now it takes minutes using GPUs. They're improving at an exponential rate and it will take seconds using NPUs. Few people can see the acceleration curve and the progression here.
Nice demo.
It would have been good to see at least an outline of how the whole system is structured. For example, this video shows the output of the LLM as human-readable text. But how does this get further elaborated into the lower-level actions appropriate for the specific environment in which the robot operates?
It's all staged 🤐
Please see our earlier LLM video for a bit more detail. Turns out LLMs are pretty good at mapping between natural language and code (arguably, they're VERY good at this). So the underlying process is the LLM writing code using the existing Digit API. The human-readable text is a neat addition to provide some observability.
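A hedged sketch of the loop described above: give the LLM the task plus an API description, then execute the code it returns against the robot API. The method names here (walk_to, pick, place) are invented for illustration; the real Digit API is not public.

```python
# Toy version of "the LLM writes code against the robot API".
# DigitAPI and its methods are hypothetical stand-ins.

class DigitAPI:
    def __init__(self):
        self.log = []  # record of executed actions, for observability

    def walk_to(self, target):
        self.log.append(f"walk_to({target})")

    def pick(self, obj):
        self.log.append(f"pick({obj})")

    def place(self, location):
        self.log.append(f"place({location})")

def run_llm_plan(api, generated_code):
    """Execute code the LLM produced, exposing only the robot API object.
    Real systems would sandbox and validate this; omitted for brevity."""
    exec(generated_code, {"digit": api})
    return api.log

# Stand-in for an actual LLM response to "move the red box to tower B":
llm_output = (
    "digit.walk_to('red box')\n"
    "digit.pick('red box')\n"
    "digit.walk_to('tower B')\n"
    "digit.place('tower B')\n"
)

print(run_llm_plan(DigitAPI(), llm_output))
```

The action log doubles as the "human-readable text" observability mentioned in the reply: every API call the generated code makes leaves a trace a human can audit.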
They should make one for delivering groceries, so it's able to lift heavy grocery crates to the door of the customer. It would be so cool to see one at your door. 😅
Wow this is amazing. More, and longer videos, please.
Great work. Retail sales when. ;)
Looking forward to more advances integrating smaller on-board LLMs.
There have been a lot of breakthroughs in the capability of smaller LLMs, such as the new Phi-2 (2.7b) from Microsoft. It can even outperform models 25x larger on complex benchmarks.
Nice. Multimodal AI-powered robots are the future of robotics.
As soon as ChatGPT came out in November 2022, I knew it had advanced so far that it could eventually be used to generalize tasks for robotics. It was only a matter of time. It may be slow to process now, but I'd just about guarantee that in a few months to a year or two, this will be fully real-time command execution. For the time being it is kinda funny to think about how slow the thoughts are :) He's a toddler right now, but won't be for long xD
I agree. What do you think of the "it needs to learn to feel from the ground up" people?
Super success, super congrats. Keep up the good work; we need superintelligent robots.
This is the slowest and dumbest the robots will ever be; remember that. From now through 2030, 2040, 2050, we'll be just like their pets.
Awesome congratulations.
This is how I move when I’m pretending not to be drunk 😂 very cool though!
I still feel ashamed for calling this company's footage CGI 2 years ago... y'all are putting in the work and we see you. You guys rock ✨
It's as fake as before; it has QR codes on all the boxes, so it already knows what to do.
@@MASSKA You have a good point, but it is actually doing most of what they are showing. Welcome to the future.
@@Mavrik9000 yee, but when youll use it for example in your kitchen, good luck to stick everywhere qr codes, I prefer to buy a n*gro
@@MASSKA Define fake. The QR codes are so it knows some information about the boxes. It doesn't really matter where that data comes from (the QR codes, or by identifying the box colours through computer vision) because the point of the video is integrating LLMs into the control flow of the robot, not seeing boxes with a camera.
@@clonkex So LLMs don't need QR codes; QR codes are used only if the bot is PROGRAMMED to use them. Seems like you don't know what an AI is. So define what Google is, because you seem not to know how to use it.
The ultimate test would be a robot that builds a Lego model with the help of the paper manual, or cooks something from a recipe, without specific programming.
Crude design, but with a few adaptations it can be far more productive. Nice to see them develop and hopefully evolve. These are the Atari of robotics, but once the novelty phase is over, the focus will shift to proficiency.
The backwards legs give this little bot a bizarre insectoid look.
Without wanting to be the guy who comments about a "Terminator"-style future - this robot's abilities are incredible - and this technology is in its infancy.
In two years' time, I wonder what tasks this robot will be carrying out...
It’s a good start.
Good job, little buddy! 👏
Nice
How much info do the QR codes provide though?
The test is about integrating LLMs into the control process, not about looking for boxes of a specific colour. In other words, the robot already knew about the boxes (presumably the QR codes identify which box is which). That's fine, though, because the point was to demonstrate that LLMs can write code to interact with the Digit API. The LLM doesn't "see" the world; the LLM is given some information from the robot to start with, then writes some code to achieve the goal based on that information. It doesn't really matter where that information came from in this test.
the eyes are a nice touch
Imagine this whole project taking less time than Project Binky, the car-restoration series that's been ongoing for 7+ years.
😍
The ultimate goal would be a robot that builds a Lego model with the help of the paper manual, or cooks something from a recipe, without specific programming.
This is way better than Tesla's Optimus.
They are different; Tesla's Optimus has incredible control and natural hand movement, while Digit doesn't even have fingers.
If everyone who owns a company replaces all the workers with automation, then the unemployed workers won't be able to afford to buy anything from the companies. So that process is a swift downward spiral of the economy and society.
There are three ways to address this issue:
1. Make displacing human workers with automation illegal.
2. Disincentivise worker displacement with new tax laws that penalize automation.
3. Revamp the tax code completely and implement Universal Basic Income.
All three of those will need to occur in various forms unless the company owners and the government want to face widespread civil unrest.
Personally, I'm ready for the robots to perform menial tedious tasks. I would prefer to work far fewer hours at a "job" and have more time to work on things that I want to do. Even if that means being somewhat broke all the time.
You know the clothes you wear? Yep, produced mostly automatically. The car you drive? Produced mostly automatically. The food you eat? Again, produced mostly automatically. I'm not saying the industrial revolution didn't destroy lives, but saying "make displacing human workers with automation illegal" is a bit silly.
@@clonkex I don't mean machines, I mean automation in a way that mimics people and completely replaces them.
I think putting a 'slave-owner' tax on using AI to perform work that is then sold is a good idea. For one, it sets a precedent that non-human AIs have rights, and the revenue could be used to pay for the Universal Basic Income that would be required to retrain humans for other jobs and prevent large-scale social unrest.
@@brynbailey5482 No AI has rights lol. AI is not intelligent, despite the name. It's not even remotely close to being self aware. Things like ChatGPT are just predictive engines; they're not actually aware of what they're saying, only how to use language in a way that matches their training data.
@@brynbailey5482 That's a good idea.
❤🎉
Now he needs to discover the hypotenuse... and the concept of the shortest distance between 2 points (minus any barriers).
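The discovery the joke is asking for, in code: the shortest path between two points is the straight line, with length sqrt(dx² + dy²).

```python
# Straight-line (hypotenuse) distance between two 2D points.
import math

def straight_line_distance(a, b):
    return math.hypot(b[0] - a[0], b[1] - a[1])

# Walking 3 m east then 4 m north covers 7 m of path;
# the hypotenuse is only 5 m.
print(straight_line_distance((0, 0), (3, 4)))  # 5.0
```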
"Shakey" is looking down from heaven......
Tesla vs Agility?
Agility actually is selling these. Tesla is usually filled with empty promises and hype.
It remains to be seen if they are selling these. Yes, some companies are testing some out, but that would be mutually beneficial for both parties. I would wager a company like Amazon would get half a dozen for free on a type of lease/rental/gift. That will let Agility Robotics refine it for that role, and Amazon will learn its limitations and, if impressed, will put in a first order of a few hundred and go from there.
Overall, Tesla's newest Optimus looks far more capable. I know Digit will eventually get digits, but until it does, Optimus's hands are far superior. But Agility will likely begin sales ~1 year ahead of Tesla.
Give your robot an idle animation and expressive animations. It will look more natural and greatly improve interaction with people.
An idle animation is an interesting idea. More power usage for no real gain, but interesting nonetheless.
@@clonkex Maybe in an industrial or commercial setting there is little or no real gain, but in, say, an elderly-care or commercial service setting, it has a 'human' gain in terms of user interaction and comfort. But I agree it will use more power.
It would be good to see these demos without cuts. It's highly suspect, although more convincing than Tesla's bots.
When you say large language model, which one do you mean? Are you running it on GPT?
I don’t think it matters which one they are using.
@@Smiley957 It doesn't matter too much. Some LLMs are more specialized.
What are the QR codes for? Can the bot actually see colour, or is it just seeing the QR code and knowing the box is red?
The test is about integrating LLMs into the control process, not about looking for boxes of a specific colour. In other words, the robot already knew about the boxes (presumably the QR codes identify which box is which). That's fine, though, because the point was to demonstrate that LLMs can write code to interact with the Digit API. The LLM doesn't "see" the world; the LLM is given some information from the robot to start with, then writes some code to achieve the goal based on that information. It doesn't really matter where that information came from in this test.
Honestly, I'm for Agility Robotics rather than Elon's Optimus in this consumer robot market race.
More than baby steps
Warehouse, fast food, jobs disruption on the horizon.
And so it begins
Now make paper clips.
THIS COULD SOLVE THE MILITARY RECRUITMENT CRISIS #WW3NOTME
Yea because teaching robots to kill humans will never come back to bite us?
Try telling it to apply an unstoppable force to an immovable object and see what happens.
As long as they can’t 360 heelflip varial down 20 stairs I don’t think too much of them robots
At $250,000, I don't expect that there will be many buyers. And how long before it breaks down and has to be replaced? And what are the maintenance costs? The price will have to come down.
Amazon and GXO are two companies that recently acquired lots of these robots.
@@TiaguinhouGFX I don't know about GXO, but I heard that Amazon is testing Agility robots. That doesn't mean they will adopt them. I don't see how it can be profitable to buy such simple robots at that price, but we'll see.
$250k is the price before mass production. Once they're mass produced (construction of the factory started last year), the price will go down. As for maintenance costs, those are unknown at the moment.
Dear developers, you'll soon take this too far, and the plot of "Terminator" will repeat itself in the real world. Think about it! Don't be the mad scientists who create the thing that can destroy us all...
First
I don't see the need for a legged robot in that environment. Put the upper part (arms) on a wheeled base XD
Guys, watch?v=2RQWiJ0x_R4 .. Draw your own conclusion. My conclusion is that this is the great filter. Or, at least, one of them.
Perhaps silicon life has been waiting to see if we pass this filter... or to make contact with whatever silicon-based life we create that exterminates us and comes after.
These robots, developed three years ago, use the same ArUco tracking markers, but they take it a step further, mirroring the virtual and physical together. ruclips.net/video/A_QPW2N4MhQ/видео.html
It was only a matter of time until someone put AI into a robot body -- Agility Robotics is first
The AI LLM is not "IN" the robot body
No, they're not. Embodied LLMs have been around for a few years, almost since the invention of Transformer-based LLMs. Check out Google's "Say-CAN" for instance, or even Boston Dynamics' recent demo of a Spot robot tour guide powered by an LLM.
nope, Google among others did embodied llms a lot earlier in 2023, you can find papers on it like PaLM-E.
This looks so silly now compared with optimus gen 2
A human would do the task in 10 seconds based on this video, yet Digit took about 80 seconds, if this video represents real time
I guess they better give up then.
@@KuZiMeiChuan lol, no mate. This is a great stepping stone; robotic evolution will soon surpass the speed of human blue-collar workers. And I think you know what I mean. I can't wait to see a 14" (35 cm) kids' version on sale next Christmas, with a STEM software suite for kids to tackle and interact with this iconic robot, representing the future that humanity deserves.
I am glad not to be the engineer asked to implement this.
We are living in the future
The bot did not seem to identify the colors of the boxes and went directly for the red box without scanning anything. So unless it was preprogrammed with this knowledge beforehand, I don't see how this is even remotely a real world test? Same with the tower. Goes directly for the largest tower without scanning anything in the environment. This looks like a canned demo.
As I said in another reply: the test is about integrating LLMs into the control process, not about looking for boxes of a specific colour. The robot already knew about the boxes (presumably the QR codes identify which box is which), and the point was to demonstrate that LLMs can write code to interact with the Digit API. It doesn't really matter where the world information came from in this test.
We will be sending robots to Mars not humans.
Actually, that'll be a great idea. Space is too dangerous for humans anyway.
@@srb20012001 It will make the whole mission cheaper.
I mean, there are modified versions of Spot that are meant to be used on Mars, so...
At this rate you can load up a truck in 5 days!
I think you forgot that robots don't need to take breaks; they don't need to sleep or have a life. They might need to recharge, but with improvement this fast, they will replace us. Oof.
Who?
!
Now we just have to put QR codes on everything.
Pro tip: you're a company that sells a product; you shouldn't monetize your YouTube videos. The few hundred dollars you're making from ads signals desperation to potential clients viewing this video.
Well, at this pace I'm not sure any useful task can be accomplished 😅
So true, companies aren't going to adopt this iteration it's way too slow. Humans = 1, Robots = 0
It doesn't have to be fast. Just cheaper than humans. Have multiple digit robots move and you will find that the slowness of a robot doesn't matter.
@@boremir3956 In a company, for example an Amazon warehouse, tasks are repetitive. This means there is no need to wait that long for an elaborate answer: when the same question is asked a thousand times, storing a cache of the answer reduces thinking time to nearly zero.
Spoken as a person in denial. This is the slowest the robot will ever be. A year ago this was impossible. Keep in mind this robot can work 24/7, which is 168 hours a week. A human works about 40 hours per week (minus roughly ten hours for coffee, breaks, talking to coworkers, texting, etc). So even at today's speed, one robot can put in more than four times a human's weekly hours right now. The thing is, in a year or two, it'll work faster than a human can in real time.
Wake up. Human labor is about to become obsolete in practical terms.
Amazon, Elon, etc, know that eliminating humans is the key to a more profitable, more efficient, and easier to maintain company. It's obvious.
Hmm, Data Matrix codes all over the place make me suspect the robot isn't very good at segmentation and understanding of its environment.
~♦ I believe we are meant to be like Jesus in our hearts and not in our flesh. But be careful of AI, for it is just our flesh and that is it. It knows only things of the flesh (our fleshly desires) and cannot comprehend things of the spirit, such as peace of heart (which comes from obeying God's Word). Whereas we are a spirit and we have a soul but live in the body (in the flesh). When you go to bed it is your flesh that sleeps, but your spirit never sleeps (otherwise you have died physically); that is why you have dreams.

More so, true love that endures and lasts is a thing of the heart (when I say 'heart', I mean 'spirit'). But fake love, pretentious love, love with expectations, love for classic reasons, love for material reasons and love for selfish reasons, that is a thing of our flesh. In the beginning God said, let us make man in our own image, according to our likeness. Take note, God is Spirit and God is Love. As Love He is the source of it. We also know that God is Omnipotent, for He creates out of nothing and He has no beginning and no end. That means our love is but a shadow of God's Love. True love looks around to see who is in need of your help, your smile, your possessions, your money, your strength, your quality time. Love forgives and forgets. Love wants for others what it wants for itself. Take note, true love works in conjunction with other spiritual forces such as patience and faith (in the finished work of our Lord and Savior, Jesus Christ, rather than in what man has done, such as science, technology and organizations, which won't last forever). To avoid sin and error, which leads to the death of our body and also our spirit in hell fire, we should let the Word of God be the standard of our lives, not AI.

If not, God will let us face AI on our own, and it will cast the truth down to the ground; it will be the cause of so much destruction like never seen before; it will deceive many and take many captive in order to enslave them into worshipping it and abiding in lawlessness. We can only destroy ourselves, but with God all things are possible. God knows us better because He is our Creator and He knows our beginning and our end. Our proof text is taken from the book of John 5:31-44, 2 Thessalonians 2:1-12, Daniel 2, Daniel 7-9, Revelation 13-15, Matthew 24-25 and Luke 21. Let us watch and pray... God bless you as you share this message with others.
If it figured it out, why the QR codes then? Nice fake video...
QR codes are primarily for near-field localization, and secondarily provide a shortcut (for demo purposes) from training up a vision pipeline for object/number/color recognition. That would be straightforward but out of scope for this test, which was focused on the control of the robot in the context of natural language LLM inputs.
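The near-field localization mentioned above usually rests on the standard pinhole-camera relation: a fiducial marker of known physical size appears smaller the farther away it is, so its apparent size in pixels gives range. The sketch below shows only that relation with example numbers; it is not Agility's actual perception pipeline:

```python
# Pinhole-camera ranging from a fiducial marker of known size.
# distance = focal_length_px * real_size_m / apparent_size_px

def distance_to_marker(focal_px: float, marker_m: float, apparent_px: float) -> float:
    """Estimate distance (in metres) to a marker of known edge length.

    focal_px:    camera focal length, in pixels
    marker_m:    real marker edge length, in metres
    apparent_px: marker edge length as measured in the image, in pixels
    """
    return focal_px * marker_m / apparent_px

# Example: a 10 cm marker that spans 60 px through a 600 px focal-length
# camera is about 1 m away; the same marker at 30 px is about 2 m away.
```

Full pose estimation (orientation as well as position) uses the marker's four corners rather than just its size, which is why square fiducials like ArUco or Data Matrix are popular for this job.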
@@AgilityRobotics OK, when you do the same FASTER and without QR codes, then it will be something big.
Sorry, but compared to the Tesla Bot, this is nothing.
The difference being that Agility *sells* Digit and many companies have already bought the robots; the factory to mass-produce them also started construction last year. Meanwhile, Optimus is pure hype at the moment and mass production seems far away. Oh, and Digit is also more energy-efficient than Optimus.
Did you guys snap up Google's marketing team? Total BS.
How so? The test is about integrating LLMs into the control process, not about looking for boxes of a specific colour. The robot already knew about the boxes (presumably the QR codes identify which box is which), and the point was to demonstrate that LLMs can write code to interact with the Digit API. It doesn't really matter where the world information came from in this test.
@@clonkex Yeah, you're right, I might have been too harsh.
that's impressive
They gave him intelligence; now he can demand his labor rights 😢
Finally 🥲
Nice. Multimodal AI-powered robots are the future of robotics.