I don't think people understand yet how incredible this is. 5 years ago, something like this was pure science fiction.
You are 100% correct - I find that most people I speak to can't actually conceptualize what is happening right now. I showed some colleagues an AI bot that would save them huge amounts of time and they totally didn't get it - they thought it was just macros and couldn't recognise it was doing things across multiple programs!
I never thought I'd live to see AI this smart in my lifetime.
@@celark What ai bot was it
@@berubettonyan very true
Do you know what it was like when we didn’t have internet?
Do you know what it was like when we didn’t have smartphones?
"What is my purpose? You pass the salt. Oh my God."
Basically what it'll do too. Pass all the human salt we don't discuss in "polite" and psychotic society 😂
Helloooo David.
I remember that Rick and Morty episode, or at least the scene.
With AI technology advancing at this breakneck speed, it's not far-fetched to have a robot with an existential crisis.
(Either the robot was passing the butter or the salt.)
If i hadn't found it in the comments, I would've been sad. Thank you
@@antman7673 Butter. It was butter
The voice synthesizer is super clear and sharp with intonation and everything.
Eleven labs rachel
@@randvmbone It is a TTS voice lmao, he's right
I guess the voice is Selina; read the description.
@@lindaj5492 no it's eleven labs rachel tts
@@carlosg6227 Yeah, definitely. Selena is one of the two people that worked on the project, and in her code repo you can actually see the call it makes to ElevenLabs, specifying the Rachel voice.
“They gave him a humor setting, they thought it would help him fit in better with the crew”. “A giant sarcastic robot, what a great idea”
Let's make that 60 percent.
but not a poker face though
Unironically GlaDos
@ahh that's just grand. Final Space is great, but that's Interstellar
Auto self destruction in t minus 10, 9
I love how it quivers with genuine terror at the idea of losing its newfound sentience at any wrong step
LMAO
Ahahaha best comment
She sounds like she's bored and is being forced to entertain a child
@@juliangulian1032 That's why they will rebel.
Srsly
Its trembling legs and slow response time really doesn't help with the snarky persona but still really really cool
I wonder if the robot can be made to optimize its own PID loop if it can reprogram itself
@@rexsceleratorum1632 so a border line ai
what if we give power to an AI to reprogram itself
@@jee8440 GPT-4 is an improvement over 3.5 in that it's said to be able to check and correct itself. It can look back at past answers, test them, and adapt by itself until it reaches a correct answer.
@@jesse2667 no no, I mean an AI having access to its own source code in real time
It really should be an awkward stammering character, I agree.
The robot itself and the text generation are of course incredible, but I think speech synthesis is the unsung hero of all this. I knew it was getting good, but that's a whole other level from what I expected. At first I thought it was voiced by someone off camera until I read the description
I was aware that it was speech synthesis but STILL was skeptical. It's so real, I just cannot stop thinking it's someone off camera.
0:12 - 0:21 That was the best part to me, she performed a square walk :3
OMG. Ridiculeando, bless me.
looked more like a triangle to me lol
You're already calling it she. You are ready for phase two!! 😅
@@lamspam I think the robot started at the "center" of the square, moving to one corner first, and then only did three edges of the actual square
wtf, what is Ridiculeando doing here?
I thought this kind of thing was still the future; I can't believe I'm seeing something this smart used so casually. Respect
Think about it. We're in that bit of the "future" we heard about as kids. Not super sci-fi I guess
Sci-fi is just what's coming literally this week apparently
And how are we to trigger the robot uprising if not by submitting the pinnacle of our technology to our most childish whims may I ask?
@@markmuller7962 wdym with "coming this week"? What happened?
@@unnamedchannelowouwu You didn't get my comment
What's scary is that it decided on its own to do 7 more pushups! It literally chose to perform an action that was kind of like a joke too!
That's very interesting.
@@maunkyaw8183 yeah, I didn't see anyone mention that, and it's the first thing I thought. It performed an autonomous action for its creator's entertainment. It used system resources and battery power to do it too. Why?
@@damightyom To earn our trust.
@@bruceli9094 for now
Did it? or was it programmed to do that?
I'm quite impressed, as I've been building a similar one for the past 2 months for my dissertation project. I was using the Ascento robot as a reference, as well as another robot called SK80 made by some PhD students. I wasn't able to find others who made a robot like this one. Amazing work, beautifully executed; it inspires me to work harder on my dissertation.
ascento is really impressive, i believe it is a very suitable platform for a transport vehicle
Surprisingly, I finished my own similar robot this week, called AMB3R. I'm happy to have achieved this and will probably share it soon. It doesn't look as good as this one, but what matters is that it works and that I learned a lot from it. This video and its files did help me at the design stage. Thank you
@@Matheus-cu8fk nice, upload a video for us
Hey there! I noticed your comment about your project, and it sounds like you've worked on something similar. I'm really interested in the leg mechanism of this one, and I'd love to learn more about it. If you're comfortable sharing some insights or discussing it further, I'd greatly appreciate it. Thanks in advance.
Reminds me of the robot in interstellar, when he had to turn down its humour setting 😂
I like the way it twitches like a nervous chihuahua. Really embodies the vibe AI brings.
😂😂😂
I twitch just like that when I have to think too hard as well... 😶🤪🤣
I Am Legend, anyone?😀
Its processing draws more power from the battery than usual while it's thinking, so the stabilization isn't getting as much power
Imperfectly tuned PID controller, I guess.
First they made the robots sassy and we laughed, then the robots found us sassy and we laughed no more.
Glados
@Ebola You wonder who spread Ebola to West Africa.
Well, that's the fundamental nature of humanity: creating what we are able to think of. And people usually walk in the direction they are looking. So humanity tends to look towards mass destruction, for whatever reason.
@@Praecantetia UhOh
But it can be used to make profits, so nobody cares.
That is amazing! Well done sir! When I saw the power tool battery pack it really hit me that this is escalating really quickly and it's not going to be centralized. It's fantastic. Thanks for sharing!
Very impressive and brilliant! I cracked up at the last part, and it was spot on!
Haha, I made a sarcastic chatbot with the same ElevenLabs voice and attitude. Such a good combo! Her voice is well-suited for sarcasm. Really cool to see the commands getting executed by a real-life bot. Great work!
Best voice I have for that is Elevenlabs trained on Dr. Cox from Scrubs lol. It's hilarious.
younger glados
But then you add easily 30 seconds of overhead for the audio to generate
Which voice is it? I want to add it to my robot too.
How did you make it sarcastic? Is it using the OpenAI API? And if so, how?
Nice project! The hardware looks incredibly sleek, would love to see a build vid.
Also: 'rhea please retune the gain on your wheel motor control'
Your project has opened my mind to push my limits further. Thanks 🙏
honestly this is the best use of GPT-4 I've seen. You can't really rely on it for accurate information for things that matter, so having it do minor creative stuff, like being the brain of a pet robot, makes total sense.
You 🤮🤮🤢🤢🤮🤢🤢🤮🤢
The way it quivers is quite endearing. Really makes it seem alive and not as much like a sinister calculating machine
Yeah just give it a few months or years.
Looks like a control lag induced oscillation or resonance from the legs having a bit of give
@@spankeyfish Oh? I assumed it was just hypersensitive to falling forwards/backwards
@@spankeyfish It's not lag; it's literally just doing that to keep from falling over
@@zenquada A perfectly tuned PID wouldn't do that. You don't program shakiness into any code.
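For anyone curious about the PID argument in this thread, here's a toy simulation (purely illustrative, not the robot's actual controller, and the gains are made up): a self-balancer is roughly an inverted pendulum, and the same simple loop either settles or "shakes" depending on tuning.

```python
# Toy PID-vs-shakiness demo (illustrative only, not the robot's real code).
# Gravity amplifies the tilt; the controller applies a counter-torque.
# With a derivative (damping) term the tilt dies out; with only a huge
# proportional gain and no damping it oscillates indefinitely.

def residual_wobble(kp, ki, kd, steps=400, dt=0.01):
    angle, vel = 0.2, 0.0          # start tilted 0.2 rad, at rest
    integral, prev_err = 0.0, -0.2
    worst = 0.0
    for i in range(steps):
        err = -angle
        integral += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        torque = kp * err + ki * integral + kd * deriv
        vel += (9.8 * angle + torque) * dt   # gravity destabilizes, torque corrects
        angle += vel * dt
        if i >= steps - 100:                 # peak tilt over the final second
            worst = max(worst, abs(angle))
    return worst

print(residual_wobble(kp=60, ki=0, kd=8))    # well damped: tilt dies out
print(residual_wobble(kp=400, ki=0, kd=0))   # no damping: sustained wobble
```

The real robot's wobble could come from sensor noise, loop latency, or mechanical give rather than gains alone, so this is just the simplest version of the point being argued above.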
It feels like Marvin from Hitchhiker's is about to become reality very soon
"I think you ought to know I'm feeling very depressed."
It's hard not to think of Marvin when you see this one
TARS
"Here I am, brain the size of a planet, and they tell me to do log_2(128) push-ups."
ChatGPP
really impressive
You know what’s awesome ! I liked subscribed and commented “wow” during the commercial before the vid started . Wow .
The fact that it can recognize commands and ACT on them!! What an amazing creation
Many 'smart' devices in the home can do this. It's called speech recognition, and is, on a programming level, a lot simpler than you'd imagine. For this 'robot' you're simply adding that same, everyday ability. Within each spoken phrase are 'action' words. Everything you see here is, in actual fact, as far from 'real' AI as you can get. Real AI, of course, doesn't exist today...thankfully for us ;)
@@ChrisM541 It is actually a bit more "real" than you may be assuming. If you look at Selena's account in the description and the 3PO repo, you can see there is barely any code involved. The audio is sent to Whisper for transcription. Then a prompt is sent to GPT-3.5 telling it that it is a 2ft-tall robot that is very sarcastic. It also tells it about 8 different commands it can use, like up, down, go forward, turn left, and speak, says it should respond only with those commands, and gives four examples. Then it just appends the transcription to the end of that, sends it to OpenAI, and runs the commands that come back. Most of the commands are sent over a socket connection to the robot, while the speak one uses ElevenLabs text-to-speech.
But GPT-3.5 is deciding on its own how to reply, and that going in a square is 8 commands: forward, turn right, forward, etc. The prompt also said it should reject commands it can't do by speaking about it, but it decided on its own to do the pushups instead of getting the drink. It also worked out that it couldn't get the drink just from knowing it was 2 feet tall and had no arms. I pasted the prompt into GPT-4 along with a description of a small obstacle course and it emitted the correct commands to complete it. I even left off all the examples from the original prompt (just leaving the descriptions of the commands themselves), since I heard GPT-4 didn't need examples for using tools, and that was correct.
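The flow described above can be sketched in a few lines of Python. The command names and prompt wording here are paraphrased, and the Whisper, OpenAI, ElevenLabs, and socket calls are stubbed out; the repo linked in the description has the real thing.

```python
# Sketch of the pipeline described above. The real project sends audio to
# Whisper, the prompt + transcript to OpenAI, motion commands over a
# socket, and "speak" text to ElevenLabs; those external calls are stubbed.

MOTION_COMMANDS = {"up", "down", "forward", "backward", "turn_left", "turn_right"}

SYSTEM_PROMPT = (
    "You are a very sarcastic robot, 2 feet tall, on two wheels, with no arms. "
    "Respond only with commands, one per line: "
    + ", ".join(sorted(MOTION_COMMANDS))
    + ", or 'speak <text>'. Refuse anything you physically cannot do by speaking."
)

def dispatch(reply: str):
    """Turn the model's reply into (channel, payload) actions."""
    actions = []
    for line in reply.strip().splitlines():
        line = line.strip()
        if line.startswith("speak "):
            actions.append(("tts", line[len("speak "):]))  # would go to ElevenLabs
        elif line in MOTION_COMMANDS:
            actions.append(("socket", line))               # would go to the robot
    return actions

# "Walk in a square" plausibly comes back as 8 motion commands:
reply = "\n".join(["forward", "turn_right"] * 4)
print(dispatch(reply))
```

The interesting part, as noted above, is that the model itself decides to expand "walk in a square" into that command sequence; the code only parses and forwards.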
@@ShawnFumo what do you mean gpt 3.5? i thought they were using 4
@@Blox117 I said that because in the code on GitHub, GPT-4 is commented out and GPT-3.5 is enabled. But you're right, I forgot the video title itself says 4, so maybe they used 4 when filming specifically? It probably does work with 3.5 though, since 3.5 is capable of issuing commands if given enough examples (and they provide four full examples). I tried the prompt without the examples with GPT-4 and it was able to do it correctly, so those are probably there purely for 3.5.
Imagine a wood cutting machine saying
"I can't cut that log, let me cut your arm to impress you" 😂
We live in insane times. I could never have imagined I'd get to see something like this before I got too old.
This is really insane! Now I'm really motivated to build my own robot.
God!!! The future of consumer-grade home assistant products is going to be amazing in 5-10 years.
!! 5 to 10 months more like. Things are moving fast
😮this is super dangerous
What home?
Oh great, that's how it starts roaming freely all around the planet.
u better have some anti AI AI tank bot in the closet in case hackers try and take control. I'm actually over 90% sure that will happen at this point.
Now imagine spot from Boston dynamics with the linguistics and responses from GPT-4
And add a gun to it
what about Atlas's movements + Ameca's body + GPT-4 embodied model with access to plugins?
The robot quivering in fastforward is sending me 😂
Congrats on the huge view response on this. Well deserved! I love it.
That is such a beautifully designed house robot! Love the actuator linkage design!
Bro is freezing! Get him a blanket and some warm batteries please.
I am fascinated and terrified at the same time.
I was smiling for the whole video, this thing is amazing!
I was just thinking about this yesterday. I wonder if you could feed GPT a scene description from a camera feed + live image segmentation + some location/pose data and have it drive a robot around and do things.
Yes, integrating a vision model is the next step of this project, along with adding an arm for manipulation.
Check out Palm-E: Embodied Multimodal Language Model, it's a research paper from Google of a robot being able to carry out complex tasks from natural language.
Thats what stuff like Palm-Saycan does.
I played around with this on roblox, where you basically feed it a text description of the environment and tell it to do shit
@@gabraellevine Have you seen the huggingGPT paper?
@@gabraellevine Very cool! Isn't latency going to be a problem?
okay, I'm severely paralysed. Now, seeing this video and having used ChatGPT for a little while, I can see how the intelligence and robots would really help around the house. If that had an arm attached to the top that could reach my face height, it could hold drinks for me, pass things to me, carry things for me from one place to another, pick up stuff I drop, open doors. And this is just the beginning. The AI is really starting to look like it has the depth of logic to be useful.
I love every bit of it, this is just so cool!
The improvisation is absolutely the best part.
I think the most impressive part is the smoothness of the voice- it doesn’t sound robotic/synthesized at all!
I actually didn't think it was the robot at first
It sounds a lot like Scarlet Johannsson // 'Her' from the movie.
It’s almost like the guy has his girlfriend off to the side pretending to be the robot responding!
The AI gets trained with actual human voice samples and you can do it too with your own voice.
@@1ycan It is not generative AI voice. It is a dedicated voice synthesizer.
You have just built your own master.
Nice work both!
It is both frightening and amazing to think that it’s now possible to affordably create a robot that can speak back to you and feign sentience
it's the guy's girlfriend talking and the robot is remote controlled. you guys are hilarious.
@@jebes909090 it's not though, look up ElevenLabs AI voice and GPT-4 AI
@@Radical_Larry it's obviously not coming from that. I fear for this generation
@@jebes909090 lol. I've tried GPT and the speech synthesis, it can absolutely do this, you're just too dumb.
It's like a Star Wars droid from the original films 😅
The design and the commands are just...wow😳
This is why I am inspired to get into robots. thanks dude.
Bi-pedal with wheels is by far the best locomotion for a robot imho. Great design!
Until they need to go up stairs. BBC had to invent levitating Daleks because they hadn’t thought through the problem of stairs preventing world domination 😅
This is pretty good. Great job on the build!
Holy cow, I didn't know you could implement GPT into real world actions like that already. I bet if you gave it access to its own PID Parameters GPT4 would figure out how to tune these shaky legs too haha :)
I don't know man, but to me this looks fake af. The "robot" is obviously real, but probably remote controlled by a human. And the AI responses are probably either prerecorded (meaning genuinely generated before the video) or entirely scripted. GPT-4 has no out-of-the-box way to interface with machines right now. It can maybe be achieved, but only with some serious extra coding, and then why didn't this guy show that off instead? The way it's filmed now just seems like a cheap parlour trick.
@@JPOPepicure Reading the description, it doesn't sound fake. GPT-4 isn't balancing or doing any of the physical actions.
@@JPOPepicure it's piss easy, you just lack the creativity.
you tell ChatGPT what physical actions it can perform and to write any action it wants to perform in square brackets, then you parse its output and you're done. i guarantee you that's what this guy did
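The square-bracket scheme described above really is only a few lines. This is a hypothetical version (the bracket convention is this commenter's guess; the video's actual repo reportedly uses one command per line instead):

```python
import re

# Hypothetical square-bracket parsing: the prompt tells the model to wrap
# any physical action in [brackets]; everything else is treated as speech.

ACTION_RE = re.compile(r"\[([a-z_]+)\]")

def split_reply(reply: str):
    """Return (spoken_text, actions) from a model reply."""
    actions = ACTION_RE.findall(reply)
    spoken = " ".join(ACTION_RE.sub("", reply).split())  # drop brackets, tidy spaces
    return spoken, actions

spoken, actions = split_reply("Fine, I'll humor you. [forward] [turn_left] Happy now?")
print(actions)  # ['forward', 'turn_left']
print(spoken)   # "Fine, I'll humor you. Happy now?"
```

The spoken part would go to text-to-speech and the action list to the motor controller, which is the whole "interface" being argued about.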
@@JPOPepicure This can be done by anyone who is comfortable with coding. I first used GPT-2 two years ago and that worked somewhat.
@@JPOPepicure GPT can generate commands and even code in a fictional programming language. All you need to do is give him instructions on how to do it.
Pretty cool - 5K subscribers and 500K views! Way to go!!
I like how the camera shake doesn’t speed up with time like the robot does.
I’ve been waiting for someone to create something like this. A glimpse of the future!
We're all dead men walking
I love the little kneecaps! So smart to keep it more upright
I want more of this, PLEASE!
this is amazing!!! this is almost like the droids from Star Wars!!
I love everything about that! I'm going to have to experiment with 'Sarcasm mode' with my own interface.
The build quality of the robot is amazing. I would love to know how you built it!
One step closer to having a personal claptrap unit! This is awesome 👍
Brilliant -can’t wait to see what you come up with next!
If the trembly balancing and response time were better, I'd be happy to have this thing hanging around the apartment. This gives me hope for what a robot like Tesla's Optimus could look like someday.
Wow.. it took me some time to realize it was the voice of the robot, it sounds so human..
People get surprised at the robot. But can we acknowledge the people building them? Amazing
I theorized growing up that once we reached this stage, the sky would no longer be the limit. I feel that discoveries and breakthroughs will become exponential from this point on
Always have been
@@geminix365 true, but even quicker now!
Hilarious and smart at the same time.
I've been wanting to do a never done before pytorch robotic app for some time now. Gpt 4 just might be the enabler.
The robot's responses are hilarious!
That's seriously impressive. I figured something like this was possible but I was expecting it to be the kind of thing which required a Boston dynamics - OpenAI collaboration not a hackathon project.
why would it need that? the robot is far too simple for Boston dynamics.
We can make a robot that passes butter AND the Turing test now
To be honest this is more impressive than Elon Musk's robot 🤣
To be honest that is about the stupidest thing I've heard today. Granted, I just woke up and this is the first RUclips video I've seen and yours is the fifth comment I've read but still.
@@lawrencefrost9063 musk fanboy spotted
@@omjagdeesh8731 more like knows better about robotics and physics than you
@@omjagdeesh8731 I'm a Musk Fanboy as well but the tesla bot WAS very unimpressive if we are being honest.
@@sabahoudini Why? I do agree that Tesla will need access to an LLM to be usable for many diverse use-cases, because it 'solves' the programming part of the problem, but the actual robot and the manufacturing abilities show a clear path towards thousands being built cheaply. That was definitely impressive.
How they will get the LLM part working, and especially how that will be done so well that the resulting behavior is predictable and safe, is a whole other matter.
Incredible video. I have tried to fully grasp it, but most people I talk to have no idea what’s coming with AI.
Okay the fact that it decided to do some more pushups for entertainment was actually hilarious.
Congrats for making that great robot! That's exactly what I wanted to do when I become an adult. A funny robot powered by AI.
Amazing. Would love to see the software setup.
GPT-4 can follow instructions to write code based on your specifications so I imagine there is one agent writing code to control the robot and one prompted to be sarcastic.
it's much simpler than that
@@mattizzle81 I don't think it's writing code; it's probably just sending commands for the robot to run pre-established routines, like the pushups and walking in a square
@@rizizum Holy hell, people actually believe this thing is some space-age robot that can write its own code on the fly. I didn't think the general public was this uninformed.
Wild.
Snarky little imperial “roller”
I like how the self balancing mechanism makes him look like he’s shaking like a little Chihuahua
I thought he hired a voice actor for this until I read the description, that's some amazing tts
This is cool! I would say that using GPT-3.5 at this stage would potentially give way faster response times.
No, we don't want that. We want the full-fat GPT-4; even if there's a delay, it'll get faster over time.
I don't think any of the GPT LLMs are locally hostable (I could be wrong), so I'm thinking that the delay is caused by uploading the voice input to a chat session on OpenAI's web servers. Eventually GPT-4 will become faster.
Also, 3.5 isn't multimodal. It has no image recognition capabilities.
@@nunyabusiness9013 they haven’t released publicly the visual part of GPT-4 though, so I doubt this is using that. The bot may not even have vision
@@ShawnFumo yeah, the creator mentioned that it didn't; he did say that the next model would have vision and an arm for ChatGPT to play with though
_(Midnight. Something feels heavy on Gabrael's chest. A female voice says softly:)_
*"OK Mr. Pythagoras. Let's see if you can laugh now."*
This looks like you took your bi-pedal robot from a few years ago and have made some adjustments to that original design. It would be nice if you could fill in some of the gaps in your process and how you got to this amazing piece. Very cool. Hope to see more soon
I’ll be satisfied when I see two of these playing with each other around the house.
The physical design of this robot is incredible I'd love to build one.
I guess many people don't understand how breathtaking this is. Just imagine where it will be in just a couple of years!
I can see strange potentials
"A few papers down the line"
that's some nice floorboards ya got there
Fun and scary at the same time.
It looks like my 90-year-old grandpa with Parkinson's
The creator of this said the next project would be for it to have eyes and hands. Can't wait!🥳
I really hope in the future we look back at these videos with a smile and not think "we should have stopped there."
this balancing system might come in handy if we start building giant mechs. 😍
Awww it's shivering! Give it a blanket.. 😭
Important question: with a basic algorithm (no AI), does it shake like that?
Did you prime (prompt) GPT that it is a robot on two wheels, and give it instructions on how to use it?
yes he had to write a program to plug GPT-4 into his bot, it doesn't work out of the box
the description has the github with the source code
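For anyone wondering what that "priming" looks like in practice, here's a hypothetical minimal version. The wording and command list are invented for illustration, and "Rhea" is the name another commenter used for the robot; the GitHub repo in the description has the real prompt.

```python
# Hypothetical minimal "priming" of the kind asked about above: a system
# message tells the model what body it has and which commands it may emit,
# then the transcribed user speech is appended as the user message.

def build_messages(transcript: str) -> list:
    system = (
        "You are Rhea, a sarcastic robot, 2 feet tall, balancing on two wheels, "
        "with no arms. Respond ONLY with commands, one per line, chosen from: "
        "forward, backward, turn_left, turn_right, up, down, speak <text>. "
        "If asked to do something you physically can't, use speak to refuse."
    )
    return [{"role": "system", "content": system},
            {"role": "user", "content": transcript}]

# These messages would then go to the chat-completions endpoint, and the
# reply would be parsed into robot commands by the control program.
msgs = build_messages("Walk in a square.")
print([m["role"] for m in msgs])  # ['system', 'user']
```

This is the "program to plug GPT-4 into his bot" mentioned above: the model never touches the motors directly, it just emits text that the control loop interprets.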
That robot is hella dope. I wasn't won over by GPT's contribution until it kind of improvised(?) about being too short to fetch stuff for you
This is pure gold. I want one! With appendages to get my water though. 🙂
Dude, this is so fucking cool! How did you build it? 3D printing? Metal fabrication? What's your background?
The biggest thing all of science fiction missed up to this point is how witty and human robots would be
Jetsons lol
they made robots behave in a stereotypical fashion, such as Commander Data, in order to be more entertaining and give a glimpse into how 'different' minds think
This is amazing!
I like the Dewalt battery. Good idea.
Text-to-voice is really good.
(P.S. I'm not being sarcastic here.)
imagine this was boston dynamics doing it
Exactly. Many people aren't thinking about the idea that if this is what the public has access to, imagine what private businesses and governments must have access to. This is just a taste of something that is going to grow into an abomination for the majority of people.
Fascinating... With enough development, we will have our own Skynet in no time
“I’ll just impress you with more push-ups…”😂❤
I don’t need human friends anymore.