Digit + Large Language Model = Embodied Artificial Intelligence

  • Published: 22 Oct 2024

Comments • 153

  • @OZtwo
    @OZtwo 10 months ago +64

    Nice! When LLMs hit the market, I knew this could be the future of robotics: robots no longer need to be programmed to process objects; they can see what they're looking at and already know everything about those objects.

    • @Hukkinen
      @Hukkinen 10 months ago +1

      Same😅

    • @KVVUZRSCHK
      @KVVUZRSCHK 10 months ago +3

      That's bullshit but okay

    • @HipHopAndCityGossip
      @HipHopAndCityGossip 10 months ago

      Dude, he used the prompt to program that robot. He was only able to do it because the guy told it what to do from his phone. When Digit works autonomously without human intervention, then we'll see progress.

    • @OZtwo
      @OZtwo 10 months ago +1

      @@HipHopAndCityGossip Yes, he did. Try the same with an RL robot. He was able to simply ask the robot to do a task here. That's due to LLMs: the robot didn't need to be trained on this task first, beyond the training of the overall master model itself.

    • @sammiller6631
      @sammiller6631 10 months ago

      Nothing in an LLM knows anything. It's just a complex form of pattern matching. It's on rails. It fails outside of narrow confines.

  • @middle-agedmacdonald2965
    @middle-agedmacdonald2965 10 months ago +62

    For the people who think it's slow (which it is): this is the slowest the robot will ever be. A year ago this was impossible. Keep in mind this robot can work 24/7, which is 168 hours a week. A human works 40 hours per week most of the time (minus about ten hours getting coffee, taking breaks, talking to coworkers, texting, etc.). So while it's slower than a human working in real time, at the end of a week they'll probably be pretty close to the same output. The thing is, in a year or two it'll work faster than a human can in real time, I'm guessing. So one robot does what five humans can, which means it could eliminate five jobs at $30k per year, saving $150k per year (actually more because of holiday pay, vacation pay, sick pay, medical benefits, etc.). Even if the robot costs $250k, it'll pay itself off and be profitable after only two years. (Yes, I'm ignoring maintenance and breakdowns/labor to fix the robot, which I can't possibly calculate. I assume it will be reliable when sold at scale.)
    Wake up. Human labor is about to become obsolete in practical terms.
    Amazon, Elon, etc. know that eliminating humans is the key to a more profitable, more efficient, and easier-to-maintain company. It's obvious. It'll take many, many years to transition over, but it's here.
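
    A quick sketch of that payback arithmetic in Python; every figure below is the commenter's assumption, not vendor data:

        # Back-of-envelope payback estimate using the comment's own numbers.
        JOBS_REPLACED = 5        # claim: one robot does the work of five humans
        WAGE_PER_JOB = 30_000    # USD/year, excluding benefits
        ROBOT_COST = 250_000     # USD, assumed purchase price

        annual_savings = JOBS_REPLACED * WAGE_PER_JOB  # $150,000/year
        payback_years = ROBOT_COST / annual_savings    # ~1.7 years
        print(f"savings ${annual_savings:,}/yr, payback {payback_years:.1f} yr")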

    • @Kazekoge101
      @Kazekoge101 10 months ago +9

      People just don't notice the big picture.

    • @joelohne3559
      @joelohne3559 10 months ago +7

      This is all very exciting, but how are all these companies going to make money when no one has a job to buy their products?

    • @middle-agedmacdonald2965
      @middle-agedmacdonald2965 10 months ago

      @@joelohne3559 Don't worry, super AI will figure that out.

    • @RedaLAHMER-e8i
      @RedaLAHMER-e8i 10 months ago

      There won't be companies. The company model is to sell products to other companies' employees. No employees, no customers. No customers, no sales. No sales, no products. No products, no companies.

    • @Mavrik9000
      @Mavrik9000 10 months ago +7

      @@joelohne3559 Exactly. If everyone who owns a company replaces all the workers with automation, then the unemployed workers won't be able to afford to buy anything from the companies. So that process is a swift downward spiral of the economy and society.
      There are three ways to address this issue:
      1. Make displacing human workers with automation illegal, or highly regulated.
      2. Disincentivise worker displacement with new tax laws that penalize automation. Tax the robotic workers heavily to fund the lost job income, UBI.
      3. Revamp the tax code completely and implement Universal Basic Income.
      All three of those will need to occur in various forms unless the company owners and the government want to face widespread civil unrest.
      Personally, I'm ready for the robots to perform menial tedious tasks. I would prefer to work far fewer hours at a "job" and have more time to work on things that I want to do. Even if that means being somewhat broke all the time.

  • @les_crow
    @les_crow 10 months ago +12

    Yes, LLMs are AGI. Y'all were just expecting miracles and felt disappointed when we got to this milestone. That don't change the fact though.

    • @clonkex
      @clonkex 10 months ago

      LLMs are not intelligent. They don't "know" anything. It's just fancy pattern matching.

    • @ArtOfTheProblem
      @ArtOfTheProblem 5 months ago

      Right, I'm wondering more about the LeCun argument that it's "not really reasoning or planning". What is it, then?

  • @OceanGateEngineer4Hire
    @OceanGateEngineer4Hire 10 months ago +16

    Digit: *picks up blue box*
    R&D: "Damn, there must be something wrong with his sensors, we'll have to-"
    Digit: "ACKTCHYUALLY... in 'Star Wars Episode III: Revenge of the Sith' Darth Vader has a blue lightsaber until Obi-Wan defeats him on Mustafar, so I'm right because you didn't specify which era."

  • @Orandu
    @Orandu 10 months ago +30

    “Digit, use Darth Vader’s lightsaber on all the younglings”

    • @youtubasoarus
      @youtubasoarus 10 months ago +1

      That's for when you try to use third-party repair services on your robot. 😅

  • @azhuransmx126
    @azhuransmx126 7 months ago +3

    In the 2000s, robots took an entire day to do a task like this using CPUs; now they take minutes using GPUs; they are improving at an exponential rate and will take seconds using NPUs. Few people can see the acceleration curve and progression here.

  • @cogoid
    @cogoid 10 months ago +24

    Nice demo.
    It would have been good to see at least an outline of how the whole system is structured. For example, this video shows the output of the LLM as human-readable text. But how does this get further elaborated into the lower-level actions appropriate for the specific environment in which the robot operates?

    • @whiteglitch
      @whiteglitch 10 months ago +2

      It's all staged 🤐

    • @AgilityRobotics
      @AgilityRobotics  10 months ago +9

      Please see our earlier LLM video for a bit more detail. It turns out LLMs are pretty good at mapping between natural language and code (arguably, they're VERY good at this). So the underlying process is the LLM writing code using the existing Digit API. The human-readable text is a neat addition to provide some observability.
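
      As a rough illustration of that natural-language-to-code loop: Digit's real API is not public, so go_to/pick_up/put_down below are hypothetical stand-ins, and the OpenAI client is just one example of an LLM backend.

        # Hypothetical sketch: an LLM writes code against a made-up robot API.
        # None of these function names are confirmed Digit API calls.
        from openai import OpenAI

        API_DOC = """You may only call these (hypothetical) functions:
        go_to(location: str); pick_up(object_id: str); put_down(location: str)"""

        client = OpenAI()  # illustrative choice of LLM backend

        def plan_as_code(instruction: str, world_state: str) -> str:
            resp = client.chat.completions.create(
                model="gpt-4",  # illustrative model choice
                messages=[
                    {"role": "system", "content": API_DOC},
                    {"role": "user", "content": f"{world_state}\nTask: {instruction}"},
                ],
            )
            return resp.choices[0].message.content  # code to review, then run

        print(plan_as_code("Move the box the color of Darth Vader's lightsaber.",
                           "Known boxes: red (id 7), blue (id 12), green (id 23)."))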

    • @yeremiahangeles7252
      @yeremiahangeles7252 10 months ago +3

      They should make one for delivering groceries, so it's able to lift heavy grocery crates to the door of the customer. It would be so cool to see one at your door. 😅

  • @JJs_playground
    @JJs_playground 10 months ago +9

    Wow this is amazing. More, and longer videos, please.

  • @K.F-R
    @K.F-R 10 months ago +11

    Great work. Retail sales when? ;)
    Looking forward to more advances integrating smaller on-board LLMs.

    • @Danuxsy
      @Danuxsy 10 months ago +5

      There have been a lot of breakthroughs in the capability of smaller LLMs, such as the new Phi-2 (2.7B) from Microsoft. It can even outperform models 25x larger on complex benchmarks.

  • @Ludens93
    @Ludens93 10 months ago +1

    Nice. Multimodal AI-powered robots are the future of robotics.

  • @Vartazian360
    @Vartazian360 10 months ago +6

    As soon as ChatGPT came out in November 2022, I knew it had advanced far enough that it could eventually be used to generalize tasks for robotics; it was only a matter of time. It may be slow to process now, but I'd just about guarantee that within a few months to a year or two this will be fully real-time command execution. For the time being it's kind of funny to think about how slow the thoughts are :) He's a toddler right now, but won't be for long xD

    • @ArtOfTheProblem
      @ArtOfTheProblem 5 months ago

      I agree. What do you think of the "it needs to learn to feel from the ground up" people?

  • @Amerikan.kartali.turk.yilani.
    @Amerikan.kartali.turk.yilani. 10 months ago +4

    Super success, super congrats, keep up the good work. We need super-intelligent robots.

    • @azhuransmx126
      @azhuransmx126 8 months ago

      This is the slowest and dumbest the robots will ever be; remember that. From now to 2030-40-50, we will be just like their pets.

  • @thirdarmrobotics
    @thirdarmrobotics 10 months ago +4

    Awesome, congratulations.

  • @john-carl2054
    @john-carl2054 10 months ago +1

    This is how I move when I’m pretending not to be drunk 😂 very cool though!

  • @outtersteller
    @outtersteller 10 months ago +7

    I still feel ashamed for calling this company CGI 2 years ago... y'all are putting in the work and we see you. You guys rock ✨

    • @MASSKA
      @MASSKA 10 months ago

      It's as fake as before: it has QR codes on all the boxes, so it already knows what to do.

    • @Mavrik9000
      @Mavrik9000 10 months ago +3

      @@MASSKA You have a good point, but it is actually doing most of what they are showing. Welcome to the future.

    • @MASSKA
      @MASSKA 10 months ago

      @@Mavrik9000 yee, but when youll use it for example in your kitchen, good luck to stick everywhere qr codes, I prefer to buy a n*gro

    • @clonkex
      @clonkex 10 months ago +1

      @@MASSKA Define fake. The QR codes are so it knows some information about the boxes. It doesn't really matter where that data comes from (the QR codes, or by identifying the box colours through computer vision) because the point of the video is integrating LLMs into the control flow of the robot, not seeing boxes with a camera.

    • @MASSKA
      @MASSKA 10 months ago

      @@clonkex So LLMs don't need QR codes; QR codes are used only if the bot is PROGRAMMED to use them. Seems like you don't know what AI is, so define what Google is, because you don't seem to know how to use it.

  • @tiefensucht
    @tiefensucht 10 months ago +1

    The ultimate test would be a robot that builds a Lego model with the help of the paper manual, or cooks something from a recipe, without specific programming.

  • @arnoldbailey7550
    @arnoldbailey7550 10 months ago +1

    A crude design, but with a few adaptations it can be far more productive. Nice to see them develop and hopefully evolve. These are the Atari of robotics, but once the novelty phase is over, the focus will shift to proficiency.

  • @richardede9594
    @richardede9594 9 months ago

    The backwards legs give this little bot a bizarre insectoid look.
    Without wanting to be the guy who comments about a "Terminator"-style future: this robot's abilities are incredible, and this technology is in its infancy.
    In two years' time, I wonder what tasks this robot will be carrying out...

  • @ProperGander011
    @ProperGander011 10 months ago +1

    It’s a good start.

  • @bc4198
    @bc4198 10 months ago +1

    Good job, little buddy! 👏

  • @liangcherry
    @liangcherry 6 months ago +1

    Nice

  • @OrniasDMF
    @OrniasDMF 10 months ago +4

    How much info do the QR codes provide though?

    • @clonkex
      @clonkex 10 months ago +1

      The test is about integrating LLMs into the control process, not about looking for boxes of a specific colour. In other words, the robot already knew about the boxes (presumably the QR codes identify which box is which). That's fine, though, because the point was to demonstrate that LLMs can write code to interact with the Digit API. The LLM doesn't "see" the world; the LLM is given some information from the robot to start with, then writes some code to achieve the goal based on that information. It doesn't really matter where that information came from in this test.

  • @malfattio2894
    @malfattio2894 10 months ago +1

    The eyes are a nice touch.

  • @morkovija
    @morkovija 10 months ago +1

    Imagine this whole project taking less time than Project Binky, the car-restoration project that's been ongoing for something like seven or more years.

  • @TeslaElonSpaceXFan
    @TeslaElonSpaceXFan 10 months ago +2

    😍

  • @tiefensucht
    @tiefensucht 10 months ago

    The ultimate goal would be a robot that builds a Lego model with the help of the paper manual, or cooks something from a recipe, without specific programming.

  • @DoctorNemmo
    @DoctorNemmo 10 months ago +3

    This is way better than Tesla's Optimus.

    • @ViceZone
      @ViceZone 10 months ago +1

      They are different: Tesla's Optimus has incredible control and natural hand movement. Digit doesn't even have fingers.

  • @Mavrik9000
    @Mavrik9000 10 months ago +1

    If everyone who owns a company replaces all the workers with automation, then the unemployed workers won't be able to afford to buy anything from the companies. So that process is a swift downward spiral of the economy and society.
    There are three ways to address this issue:
    1. Make displacing human workers with automation illegal.
    2. Disincentivise worker displacement with new tax laws that penalize automation.
    3. Revamp the tax code completely and implement Universal Basic Income.
    All three of those will need to occur in various forms unless the company owners and the government want to face widespread civil unrest.
    Personally, I'm ready for the robots to perform menial tedious tasks. I would prefer to work far fewer hours at a "job" and have more time to work on things that I want to do. Even if that means being somewhat broke all the time.

    • @clonkex
      @clonkex 10 months ago

      You know the clothes you wear? Yep, produced mostly automatically. The car you drive? Produced mostly automatically. The food you eat? Again, produced mostly automatically. I'm not saying the industrial revolution didn't destroy lives, but saying "make displacing human workers with automation illegal" is a bit silly.

    • @Mavrik9000
      @Mavrik9000 10 months ago +1

      @@clonkex I don't mean machines, I mean automation in a way that mimics people and completely replaces them.

    • @brynbailey5482
      @brynbailey5482 7 months ago +1

      I think putting a 'slave-owner' tax on using AI to perform work that is then going to be sold is a good idea. One, it sets a precedent that non-human AIs have rights; two, the revenue could be used to pay for the Universal Basic Income that would be required to retrain humans for other jobs and prevent large-scale social unrest.

    • @clonkex
      @clonkex 7 months ago

      @@brynbailey5482 No AI has rights lol. AI is not intelligent, despite the name. It's not even remotely close to being self aware. Things like ChatGPT are just predictive engines; they're not actually aware of what they're saying, only how to use language in a way that matches their training data.

    • @Mavrik9000
      @Mavrik9000 7 months ago

      @@brynbailey5482 That's a good idea.

  • @PaulSchwarzer-ou9sw
    @PaulSchwarzer-ou9sw 10 months ago +2

    ❤🎉

  • @OmeletTwelve
    @OmeletTwelve 4 months ago

    Now he needs to discover the hypotenuse... and the concept of the shortest distance between two points (minus any barriers).

  • @hypercomms2001
    @hypercomms2001 10 months ago

    "Shakey" is looking down from heaven...

  • @williamparrish2436
    @williamparrish2436 10 months ago +4

    Tesla vs Agility?

    • @KoroushRP
      @KoroushRP 10 months ago +3

      Agility actually is selling these. Tesla is usually filled with empty promises and hype.

    • @murc111
      @murc111 10 months ago +3

      It remains to be seen if they are selling these. Yes, some companies are testing some out, but that would be mutually beneficial for both parties. I would wager a company like Amazon would get half a dozen for free on a type of lease/rental/gift. That will let Agility Robotics refine it for that role, and Amazon will learn its limitations and, if impressed, will put in a first order of a few hundred and go from there.
      Overall, Tesla's newest Optimus looks far more capable. I know Digit will eventually get digits, but until it does, Optimus's hands are far superior. But Agility will likely begin sales ~1 year ahead of Tesla.

  • @ianosf
    @ianosf 10 months ago

    Give your robot an idle animation and expressive animations. It will look more natural and greatly improve interaction with people.

    • @clonkex
      @clonkex 10 months ago +1

      An idle animation is an interesting idea. More power usage for no real gain, but interesting nonetheless.

    • @ianosf
      @ianosf 10 months ago

      @@clonkex Maybe in an industrial or commercial setting there is little or no real gain, but in, say, an elderly-care or commercial service setting it has a 'human' gain in terms of user interaction and comfort. But I agree it will cost more power.

  • @jeffsteyn7174
    @jeffsteyn7174 10 months ago

    It would be good to see these demos without cuts. It's highly suspect, although more convincing than Tesla's bots.

  • @KoroushRP
    @KoroushRP 10 months ago

    When you say large language model, which one do you mean? Are you running it on GPT?

    • @Smiley957
      @Smiley957 10 months ago

      I don’t think it matters which one they are using.

    • @aryangod2003
      @aryangod2003 5 months ago

      @@Smiley957 It doesn't matter too much. Some LLMs are more specialized.

  • @Fflintiii
    @Fflintiii 10 months ago

    What are the QR codes for? Can the bot actually see colour, or is it just seeing the QR code and thus knows the box is red?

    • @clonkex
      @clonkex 10 months ago +1

      The test is about integrating LLMs into the control process, not about looking for boxes of a specific colour. In other words, the robot already knew about the boxes (presumably the QR codes identify which box is which). That's fine, though, because the point was to demonstrate that LLMs can write code to interact with the Digit API. The LLM doesn't "see" the world; the LLM is given some information from the robot to start with, then writes some code to achieve the goal based on that information. It doesn't really matter where that information came from in this test.

  • @antonod424
    @antonod424 10 months ago +1

    Honestly, I am for Agility Robotics rather than Elon's Optimus in this consumer robot market race.

  • @MelindaGreen
    @MelindaGreen 10 months ago +1

    More than baby steps

  • @srb20012001
    @srb20012001 10 months ago +3

    Warehouse and fast-food job disruption is on the horizon.

  • @okumakamizu3030
    @okumakamizu3030 10 months ago

    And so it begins

  • @spikypotato
    @spikypotato 10 months ago +1

    Now make paper clips.

  • @illbelieveanything
    @illbelieveanything 10 months ago +2

    THIS COULD SOLVE THE MILITARY RECRUITMENT CRISIS #WW3NOTME

    • @brynbailey5482
      @brynbailey5482 7 months ago

      Yeah, because teaching robots to kill humans will never come back to bite us?

  • @os3ujziC
    @os3ujziC 10 months ago

    Try telling it to apply an unstoppable force to an immovable object and see what happens.

  • @appletvaccount1364
    @appletvaccount1364 9 months ago

    As long as they can’t 360 heelflip varial down 20 stairs I don’t think too much of them robots

  • @mikhailbulgakov1472
    @mikhailbulgakov1472 10 months ago

    At $250,000, I don't expect that there will be many buyers. And how long before it breaks down and has to be replaced? And what are the maintenance costs? The price will have to come down.

    • @TiaguinhouGFX
      @TiaguinhouGFX 10 months ago

      Amazon and GXO are two companies that recently acquired lots of these robots.

    • @mikhailbulgakov1472
      @mikhailbulgakov1472 10 months ago

      @@TiaguinhouGFX I don't know about GXO, but I heard that Amazon is testing Agility robots. That does not mean they will adopt them. I don't see how it can be profitable to buy those simple robots at that price, but we'll see.

    • @ulforcemegamon3094
      @ulforcemegamon3094 7 months ago

      $250k is the price before mass production; once they're mass-produced (construction of the factory started last year), the price will go down. As for maintenance costs, that's unknown at the moment.

  • @honorpad8475
    @honorpad8475 10 months ago

    Dear developers, you will soon take this too far, and the plot of the film "Terminator" will repeat itself in the real world. Think about it! Don't be the mad scientists who create the thing that can destroy us all...

  • @dondominic7404
    @dondominic7404 10 months ago +2

    First

  • @konsul2006
    @konsul2006 7 months ago

    I don't see the need for a legged robot in that environment. Put the upper part (arms) on a wheeled base XD

  • @nyyotam4057
    @nyyotam4057 10 months ago +1

    Guys, watch?v=2RQWiJ0x_R4 .. Draw your own conclusion. My conclusion is that this is the great filter. Or, at least, one of them.

    • @brynbailey5482
      @brynbailey5482 7 months ago

      Perhaps silicon life has been waiting to see if we pass this filter... or to make contact with whatever silicon-based life we create that exterminates us and comes after.

  • @brcjackson
    @brcjackson 10 months ago +2

    These robots that were developed three years ago use the same ArUco tracking markers, but they take it a step further, mirroring the virtual and the physical together. ruclips.net/video/A_QPW2N4MhQ/видео.html

  • @garymail4393
    @garymail4393 10 months ago

    It was only a matter of time until someone put AI into a robot body -- Agility Robotics is first

    • @nightjarflying
      @nightjarflying 10 months ago +3

      The AI LLM is not "IN" the robot body

    • @IceMetalPunk
      @IceMetalPunk 10 months ago

      No, they're not. Embodied LLMs have been around for a few years, almost since the invention of Transformer-based LLMs. Check out Google's "SayCan", for instance, or even Boston Dynamics' recent demo of a Spot robot tour guide powered by an LLM.

    • @Danuxsy
      @Danuxsy 10 months ago +1

      Nope, Google among others did embodied LLMs a lot earlier, in 2023; you can find papers on it like PaLM-E.

  • @trimefisto7909
    @trimefisto7909 10 months ago

    This looks so silly now compared with Optimus Gen 2.

  • @metaphysicalArtist
    @metaphysicalArtist 10 months ago +1

    Based on this video, a human would do the task in 10 seconds, yet Digit took about 80 seconds, if this video represents real time.

    • @KuZiMeiChuan
      @KuZiMeiChuan 10 months ago +2

      I guess they better give up then.

    • @metaphysicalArtist
      @metaphysicalArtist 10 months ago +2

      @@KuZiMeiChuan Lol, no, mate. This is a great stepping stone; robotic evolution will soon surpass the speed of human blue-collar workers, and I think you know what I mean. I can't wait to see a 14" (35 cm) tall kids' version on sale next Christmas, with a STEM software suite for kids to tackle and interact with this iconic robot, representing the future that humanity deserves.

  • @Benoit-Pierre
    @Benoit-Pierre 10 months ago

    I am glad not to be the engineer asked to implement this.

  • @mistycloud4455
    @mistycloud4455 10 months ago +1

    We are living in the future

  • @youtubasoarus
    @youtubasoarus 10 months ago

    The bot did not seem to identify the colors of the boxes and went directly for the red box without scanning anything. So unless it was preprogrammed with this knowledge beforehand, I don't see how this is even remotely a real-world test. Same with the tower: it goes directly for the largest tower without scanning anything in the environment. This looks like a canned demo.

    • @clonkex
      @clonkex 10 months ago +1

      The test is about integrating LLMs into the control process, not about looking for boxes of a specific colour. In other words, the robot already knew about the boxes (presumably the QR codes identify which box is which). That's fine, though, because the point was to demonstrate that LLMs can write code to interact with the Digit API. The LLM doesn't "see" the world; the LLM is given some information from the robot to start with, then writes some code to achieve the goal based on that information. It doesn't really matter where that information came from in this test.

  • @haroldpierre1726
    @haroldpierre1726 10 months ago +1

    We will be sending robots to Mars, not humans.

    • @srb20012001
      @srb20012001 10 months ago

      Actually, that'll be a great idea. Space is too dangerous for humans anyway.

    • @haroldpierre1726
      @haroldpierre1726 10 months ago

      @@srb20012001 It will make the whole mission cheaper.

    • @ulforcemegamon3094
      @ulforcemegamon3094 7 months ago

      I mean, there are modified versions of Spot that are meant to be used on Mars, so...

  • @davidd5259
    @davidd5259 10 months ago

    At this rate you can load up a truck in 5 days!

    • @mrinnerpeace7041
      @mrinnerpeace7041 10 months ago +1

      I think you forgot that robots don't need to take breaks; they don't need to sleep or have a life. They might need to recharge, though. Still, with improvement coming this fast, they will replace us, oof.

  • @ourv9603
    @ourv9603 6 months ago

    Who?!

  • @flashkraft
    @flashkraft 10 months ago +1

    Now we just have to put QR codes on everything.

  • @TheAstronomyDude
    @TheAstronomyDude 10 months ago +1

    Pro tip: you're a company that sells a product; you shouldn't monetize your YouTube videos. The few hundred dollars you're making from ads signals desperation to potential clients viewing this video.

  • @MrErick1160
    @MrErick1160 10 months ago +2

    Well, at this pace I'm not sure any useful task can be accomplished 😅

    • @boremir3956
      @boremir3956 10 months ago

      So true. Companies aren't going to adopt this iteration; it's way too slow. Humans = 1, Robots = 0.

    • @Srindal4657
      @Srindal4657 10 months ago +4

      It doesn't have to be fast, just cheaper than humans. Have multiple Digit robots moving and you will find that the slowness of a single robot doesn't matter.

    • @Smiley957
      @Smiley957 10 months ago

      @@boremir3956 In a company, for example an Amazon warehouse, tasks are repetitive. This means there is no need to wait that long for an elaborate answer: when the same question is asked a thousand times, storing a cached answer reduces thinking time to zero.
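
      A minimal sketch of that caching idea in Python; plan_as_code here is a stand-in for the slow LLM call, not a real Digit or vendor function:

        # Repeated instructions reuse a stored plan instead of re-querying the LLM.
        from functools import lru_cache

        def plan_as_code(instruction: str) -> str:
            # placeholder for an LLM round-trip that takes seconds
            return f"# generated robot code for: {instruction}"

        @lru_cache(maxsize=1024)
        def cached_plan(instruction: str) -> str:
            return plan_as_code(instruction)

        cached_plan("move the red box to tower 2")  # first call pays LLM latency
        cached_plan("move the red box to tower 2")  # repeat served from cache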

    • @middle-agedmacdonald2965
      @middle-agedmacdonald2965 10 months ago +5

      Spoken like a person in denial. This is the slowest the robot will ever be. A year ago this was impossible. Keep in mind this robot can work 24/7, which is 168 hours a week. A human works 40 hours per week most of the time (minus about ten hours getting coffee, taking breaks, talking to coworkers, texting, etc.). So this robot puts in at least 5x the hours of a human right now. The thing is, in a year or two it'll work faster than a human can in real time.
      Wake up. Human labor is about to become obsolete in practical terms.
      Amazon, Elon, etc. know that eliminating humans is the key to a more profitable, more efficient, and easier-to-maintain company. It's obvious.

  • @marcombo01
    @marcombo01 10 months ago

    Hmm, Data Matrix codes all over the place make me suspect the robot isn't very good at segmentation and understanding of its environment.

  • @BrightMatolo
    @BrightMatolo 10 months ago

    ~♦ I believe we are meant to be like Jesus in our hearts and not in our flesh. But be careful of AI, for it is just our flesh and that is it. It knows only things of the flesh (our fleshly desires) and cannot comprehend things of the spirit such as peace of heart (which comes from obeying God's Word). Whereas we are a spirit and we have a soul but live in the body (in the flesh). When you go to bed it is your flesh that sleeps but your spirit never sleeps (otherwise you have died physically) that is why you have dreams. More so, true love that endures and last is a thing of the heart (when I say 'heart', I mean 'spirit'). But fake love, pretentious love, love with expectations, love for classic reasons, love for material reasons and love for selfish reasons that is a thing of our flesh. In the beginning God said let us make man in our own image, according to our likeness. Take note, God is Spirit and God is Love. As Love He is the source of it. We also know that God is Omnipotent, for He creates out of nothing and He has no beginning and has no end. That means, our love is but a shadow of God's Love. True love looks around to see who is in need of your help, your smile, your possessions, your money, your strength, your quality time. Love forgives and forgets. Love wants for others what it wants for itself. Take note, true love works in conjunction with other spiritual forces such as patience and faith (in the finished work of our Lord and Savior, Jesus Christ, rather than in what man has done such as science, technology and organizations which won't last forever). To avoid sin and error which leads to the death of our body and also our spirit in hell fire, we should let the Word of God be the standard of our lives not AI. If not, God will let us face AI on our own and it will cast the truth down to the ground, it will be the cause of so much destruction like never seen before, it will deceive many and take many captive in order to enslave them into worshipping it and abiding in lawlessness. We can only destroy ourselves but with God all things are possible. God knows us better because He is our Creater and He knows our beginning and our end. Our prove text is taken from the book of John 5:31-44, 2 Thessalonians 2:1-12, Daniel 2, Daniel 7-9, Revelation 13-15, Matthew 24-25 and Luke 21. Let us watch and pray... God bless you as you share this message to others.

  • @MASSKA
    @MASSKA 10 months ago +2

    If it figured it out, why the QR codes then? Nice fake video...

    • @AgilityRobotics
      @AgilityRobotics  10 months ago

      QR codes are primarily for near-field localization, and secondarily provide a shortcut (for demo purposes) around training up a vision pipeline for object/number/color recognition. That would be straightforward but out of scope for this test, which was focused on the control of the robot in the context of natural-language LLM inputs.
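
      A minimal sketch of fiducial-based near-field localization of that flavor, assuming ArUco-style tags (mentioned elsewhere in this thread) and opencv-contrib-python >= 4.7; the marker-to-box table and file name are made up, and Agility's actual pipeline is not public:

        # Detect fiducial markers in a camera frame and map IDs to known objects.
        import cv2

        BOXES = {7: "red box", 12: "blue box", 23: "green box"}  # hypothetical map

        frame = cv2.imread("camera_frame.png")
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
        detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())
        corners, ids, _rejected = detector.detectMarkers(gray)

        if ids is not None:
            for marker_id, quad in zip(ids.flatten(), corners):
                # quad holds the tag's four image corners; with camera intrinsics,
                # cv2.solvePnP on these points yields the tag's 6-DoF pose.
                print(int(marker_id), BOXES.get(int(marker_id), "unknown"))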

    • @MASSKA
      @MASSKA 10 months ago +1

      @@AgilityRobotics OK, when you do the same FASTER and without QR codes, then it will be something big.

  • @xsuploader
    @xsuploader 10 months ago +2

    Sorry, but compared to the Tesla Bot, this is nothing now.

    • @ulforcemegamon3094
      @ulforcemegamon3094 7 months ago

      The difference being that Agility *sells* Digit and many companies have already bought the robots; also, construction of the factory to mass-produce them started last year. Meanwhile, Optimus is pure hype at the moment and mass production seems far away. Oh, and Digit is also more energy-efficient than Optimus.

  • @GNARGNARHEAD
    @GNARGNARHEAD 10 months ago +1

    Did you guys snap up Google's marketing team? Total BS.

    • @clonkex
      @clonkex 10 months ago +2

      How so? The test is about integrating LLMs into the control process, not about looking for boxes of a specific colour. In other words, the robot already knew about the boxes (presumably the QR codes identify which box is which). That's fine, though, because the point was to demonstrate that LLMs can write code to interact with the Digit API. The LLM doesn't "see" the world; the LLM is given some information from the robot to start with, then writes some code to achieve the goal based on that information. It doesn't really matter where that information came from in this test.

    • @GNARGNARHEAD
      @GNARGNARHEAD 10 months ago

      @@clonkex Yeah, you're right, I might have been too harsh.

  • @sausage4mash
    @sausage4mash 10 months ago +1

    That's impressive.

  • @ezequiasluiz4349
    @ezequiasluiz4349 10 months ago +1

    They gave him intelligence; now he can demand his labor rights 😢
    Finally 🥲
