
Why I Built This Robot

  • Published: 30 May 2021
  • AD For your chance to win a custom Tesla Model S and $20,000 and support a great cause, enter at omaze.com/jame....
    It's time for an update on the Really Useful Robot, and to show some of the progress I've been working on. I've managed to write some hacky scripts that pull the depth detection out of the Intel RealSense D435i camera, so I can pinpoint items in 3D space that are detected with the NVIDIA Inference/TensorRT deep learning models. This will allow the robot to manipulate specific items in the environment by driving the inverse-kinematic model for the arm.
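For illustration, the pixel-plus-depth to 3D step described above is just pinhole-camera math; this is the model behind librealsense's `rs2_deproject_pixel_to_point`. A minimal sketch — the intrinsics (fx, fy, cx, cy) below are made-up example values, not the D435i's real ones, which you would read from the stream profile:

```python
def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Convert a detected pixel (u, v) plus its depth reading (metres)
    into a 3D point (x, y, z) in the camera frame."""
    x = (u - cx) / fx * depth_m
    y = (v - cy) / fy * depth_m
    return (x, y, depth_m)

# A detection at the image centre, 1 m away, sits on the optical axis:
pt = deproject(320, 240, 1.0, 600.0, 600.0, 320.0, 240.0)
print(pt)   # -> (0.0, 0.0, 1.0)
```

Feeding the centre pixel of a detected bounding box through this, using the depth value at that pixel, gives the target coordinates for the arm's inverse kinematics.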
    Walking Robot video: • My Walking Robotic His...
    You can support me on Patreon or buy my Merchandise:
    ***************************
    Patreon: / xrobots
    Merchandise: teespring.com/...
    ***************************
    Affiliate links - I will get some money if you use them to sign up or buy something:
    ***************************
    Matterhackers 3D printing supplies: www.matterhacke...?aff=7500
    Music for your YouTube videos: share.epidemics...
    ***************************
    Other socials:
    ***************************
    Instagram: / xrobotsuk
    Facebook: / xrobotsuk
    Twitter: / xrobotsuk
    ***************************
    CAD and Code for my projects: github.com/XRo...
    Huge thanks to my Patrons, without whom my standard of living would drastically decline. Like, inside-out Farm Foods bag decline. Plus a very special shoutout to Lulzbot, Inc who keep me in LulzBot 3D printers and support me via Patreon.
    HARDWARE/SOFTWARE
    Below you can also find a lot of the typical tools, equipment and supplies used in my projects:
    Filament from: www.3dfuel.com/
    Lulzbot 3D Printers: bit.ly/2Sj6nil
    Lincoln Electric Welder: bit.ly/2Rqhqos
    CNC Router: bit.ly/2QdsNjt
    Ryobi Tools: bit.ly/2RhArcD
    Axminster Micro Lathe: bit.ly/2Sj6eeN
    3D Printer Filament: bit.ly/2PdcdUu
    Soldering Iron: bit.ly/2DrNWDR
    Vectric CNC Software: bit.ly/2zxpZqv
    Why not join my community, which is mostly made up of actual geniuses. There’s a Facebook group and everything: / 287089964833488
    XROBOTS
    Former toy designer, current YouTube maker and general robotics, electrical and mechanical engineer, I’m a fan of doing it yourself and innovation by trial and error. My channel is where I share some of my useful and not-so-useful inventions, designs and maker advice. Iron Man is my go-to cosplay, and 3D printing can solve most issues - broken bolts, missing parts, world hunger, you name it.
    XRobots is the community around my content where you can get in touch, share tips and advice, and more. Build FAQs, schematics and designs are also available.

Comments • 233

  • @philippk736
    @philippk736 3 years ago +87

    5:30 Love how the cup is briefly categorized as a toilet...

    • @bastianfrei5447
      @bastianfrei5447 3 years ago +6

      and then as a dining table

    • @DylanLisansky
      @DylanLisansky 3 years ago +7

      He was categorized as an umbrella, and he and the chair together were categorized as a couch at around the same time.

    • @Johanss-on
      @Johanss-on 3 years ago +4

      It got drunk for a sec

    • @CaliHime
      @CaliHime 3 years ago +2

      and here I'm wondering why James would put his printer on a suitcase...

    • @hugodeandres1497
      @hugodeandres1497 3 years ago +4

      He was also a dog 10:41

  • @nielsencs
    @nielsencs 3 years ago +264

    "And although I'm not a great software developer, I feel I can copy and paste well enough to get robots to do what I want."! 😆👍

    • @markmarney622
      @markmarney622 3 years ago +7

      A man after my own heart

    • @BrandonPoulton
      @BrandonPoulton 3 years ago +9

      Thank you! This sentence makes me feel more adequate.

    • @devilduckietu
      @devilduckietu 3 years ago +7

      Ah, yes. The most important software development skill.

    • @youjustlostthegame2028
      @youjustlostthegame2028 3 years ago +3

      I feel called out and I'm not even past college yet.

    • @nielsencs
      @nielsencs 3 years ago +2

      Of course it is a bit more than that!

  • @almondtech
    @almondtech 3 years ago +17

    I just realized I've been following you since 2013-2014 or so, before I even had a YT channel to subscribe. lol
    Keep it going!

  • @koosdeboer798
    @koosdeboer798 3 years ago +21

    It's really insane what this guy can make. Even if an entire company was making these robots I would be impressed. Hopefully you can inspire more young people like me to do more with robotics!

  • @stevensteltenpool904
    @stevensteltenpool904 3 years ago +4

    @James Bruton, the RealSense library supports converting from one sensor's pixel/space/point to another sensor's pixel/space/point via the sensors' intrinsics/extrinsics. This way you can completely remove the parallax error between the RGB sensor and the depth sensor once a depth is found for that pixel (the first IR camera is the centre of the depth sensor). The nice thing about these cameras is that they are pre-calibrated, and you can request these intrinsics/extrinsics for each sensor.
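The extrinsics mapping this comment describes boils down to a rigid transform between sensor frames. A minimal sketch — the rotation and the roughly 15 mm depth-to-RGB baseline below are hypothetical example values; on the camera you would query the real ones (e.g. via `get_extrinsics_to` in librealsense):

```python
def depth_to_color(p, R, t):
    """Map a 3D point p from the depth sensor's frame into the RGB
    sensor's frame (p_color = R @ p + t), removing the parallax offset."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i]
                 for i in range(3))

R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # illustrative: sensors assumed parallel
t = [0.015, 0.0, 0.0]                   # ~15 mm horizontal baseline (metres)
p_depth = (0.10, 0.05, 1.0)             # point measured by the depth sensor
p_color = depth_to_color(p_depth, R, t)
print(p_color)                          # same point, RGB sensor's frame
```

With the real calibration loaded, this replaces any hand-tuned static pixel offsets between the two streams.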

  • @prakharjain2548
    @prakharjain2548 3 years ago +8

    He completely read out "Thanks to the DYNAMIXEL XC430-W240-T" lol. Inspiring stuff :P

  • @mozkitolife5437
    @mozkitolife5437 3 years ago +46

    James Bruton: fulfilling every man's dream of retrieving a beer from the fridge without leaving your recliner.

  • @joshuadsuper101
    @joshuadsuper101 3 years ago +4

    You should check out the MIT Kimera semantic SLAM stuff. I just got it running tonight and it’s pretty cool stuff. I would love to see you playing around with this powerful tool.

    • @joshuadsuper101
      @joshuadsuper101 3 years ago +2

      It’s a collection of programs which can do mapping and localization in real time, as well as label the map by the type of object, i.e. floor, table, etc. There have since been improvements and more added functionality, but I’m just learning about it now. I definitely think you would appreciate taking a look at it.

    • @jamesbruton
      @jamesbruton  3 years ago +2

      Is there a ROS node/support?

    • @joshuadsuper101
      @joshuadsuper101 3 years ago +2

      @@jamesbruton Yup it’s all ros based!

  • @MattJoyce01
    @MattJoyce01 3 years ago +11

    You're building a robot to fetch you a beer, we all know it, and frankly it's what robots are for!!!

    • @jamespicking1140
      @jamespicking1140 3 years ago

      How do you know it's just to fetch beer? Have you ever seen that episode of Big Bang Theory with Howard and the robotic arm? 😭

  • @redd_rl
    @redd_rl 3 years ago +12

    Feels good to be so early to a video.

    • @michael4576
      @michael4576 3 years ago

      @@kevinbissinger it's a rush!

  • @RND_ADV_X
    @RND_ADV_X 3 years ago +42

    I just realized... James is a young Rick Sanchez, and this is the butter passing robot 😉 (j/k)
    James, I have an idea for a project: I'm a DJ and music producer, and I love all the fun tech there is out there for electronic music.
    So, you know going from a traditional piano keyboard to a drum machine is stepping up from 1d to 2d, and adding knobs to a drum machine makes it 3d editing, so to speak?
    What if you used digital object recognition...
    To create a music device that uses dancers' body to define and edit procedurally generated music?
    So instead of an audience dancing to a song, the song creates ITSELF based on the people dancing?
    I dunno... It's weird but I think you might get a kick out of the idea... And I don't know ANYTHING about object recognition. So I'm gifting the idea to you if you want it!
    Also I'm getting kind of excited to see where your latest dog bot project is going!

    • @michael4576
      @michael4576 3 years ago +2

      Love this idea! I thought along the same lines during a daydream many years ago... Back then, the technology didn't exist to realize the concept.

    • @twobob
      @twobob 3 years ago

      Pretty sure this exists fella. Sorry.

    • @RND_ADV_X
      @RND_ADV_X 3 years ago

      @@twobob where?
      Even if it did, that doesn't mean someone couldn't try to do it BETTER. James didn't invent ROBOTS, and yet here he is enjoying building his own.

    • @ThatGuy-fi9bm
      @ThatGuy-fi9bm 3 years ago

      @@twobob ditto on where. That sounds cool, I want to look it up. You can’t leave us hanging like that.

    • @twobob
      @twobob 3 years ago

      @@ThatGuy-fi9bm Well I have seen several projects along those lines. Some follow the dancers' shapes with a camera, some react to the relative movements of markers (or whatever fiducial); all you are saying here is convert object recognition into the "fiducial". This is not even remotely hard. Other than lacking a UI for noobs, this already seems like a very done thing. Frankly it's just The Chemical Brothers' "Star Guitar" video in reverse... (look it up). If you want a device that is probably "ready" to add a UI, look at the studio version of Ableton. If you want specific links to some of these aforementioned projects I can look them up. All the research into MIDI data gloves /pretty much/ covers 90% of the code you need to get this up and running "as your vision". Couple of days' project with an Nvidia Jetson.

  • @richardlincoln886
    @richardlincoln886 3 years ago +2

    Any mileage in going full human mode - i.e. visually detect the gripper and, rather than using trig to theoretically put it where the object is, visually match the coords?
    Thus any error in the kinematic model would null out.

    • @Ithirahad
      @Ithirahad 3 years ago

      Humans don't really 'visually match the coordinates' either; we run on both visual and kinesthetic/vestibular senses to error-correct one another constantly.

  • @rizkyp
    @rizkyp 3 years ago +1

    What my family thinks of me: Robot Genius
    Me: copy-pasting random code from GitHub.

  • @scottmudge2580
    @scottmudge2580 3 years ago +1

    The correct way to map RGB data onto the depth frame is to find the rectification transforms between the two optical systems (RGB and D). OpenCV (required by ROS) makes this incredibly easy, as it has very turn-key methods/functions for this. You'll need to print out some kind of target (checkerboard grid or dot grid), and then invoke the required OpenCV methods. It will then spit out a few matrices (including those for lens distortion) that you can use to call a different OpenCV method to rectify the RGB image based on the depth camera's optical properties. The RGB data will then be mapped pixel-to-pixel (or very close to it) with the depth camera.
    Hacking static offsets without using any lens distortion coefficients (which can be significant in these RealSense cameras) is going to cause a lot of problems for you later when you're trying to track and target objects.
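The lens-distortion part of that calibration can be illustrated with the Brown-Conrady radial model that OpenCV fits (the k1, k2 coefficients `calibrateCamera` returns). A toy sketch with made-up coefficients, inverting the distortion by fixed-point iteration:

```python
def distort(x, y, k1, k2):
    """Apply radial (Brown-Conrady) distortion to a normalized point."""
    r2 = x * x + y * y
    s = 1 + k1 * r2 + k2 * r2 * r2
    return x * s, y * s

def undistort(xd, yd, k1, k2, iters=20):
    """Invert distort() by fixed-point iteration on the scale factor."""
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        s = 1 + k1 * r2 + k2 * r2 * r2
        x, y = xd / s, yd / s
    return x, y

# Round trip with illustrative coefficients k1=0.1, k2=0.01:
xd, yd = distort(0.2, 0.1, 0.1, 0.01)
x, y = undistort(xd, yd, 0.1, 0.01)
print(round(x, 6), round(y, 6))   # recovers approximately (0.2, 0.1)
```

In practice you would let OpenCV's `undistort`/rectification functions do this with the calibrated matrices rather than hand-rolling it; the sketch just shows why static pixel offsets can't substitute for the per-lens model.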

  • @joannot6706
    @joannot6706 3 years ago +13

    2:55 Story of my life lmao!!!

  • @Metal-Possum
    @Metal-Possum 3 years ago +4

    Does it piss beer into a cup?

    • @scottr3237
      @scottr3237 3 years ago

      He's coding the robodog to do that.

  • @BlueScreenCorp
    @BlueScreenCorp 3 years ago +1

    I think it's hilarious that the algorithm has 60-70% confidence that a cup and a pair of jeans are a dog

  • @haydonwolfe7536
    @haydonwolfe7536 3 years ago +6

    “And although I'm not a great software developer, I feel I can program well enough to get robots to do what I want”

    • @bigsteve6729
      @bigsteve6729 3 years ago

      That's not what he said

    • @haydonwolfe7536
      @haydonwolfe7536 3 years ago

      @@bigsteve6729 Alright, so the joke is I swapped the words "copy and paste" for "programming", because a lot of people say they program (myself included) but use copy and paste (me also)

  • @kevinmcaleer28
    @kevinmcaleer28 3 years ago +1

    I like how you're using the weekly YouTube videos to drive your knowledge of robotics, so that you can make use of what you've learned for future projects - I do this too!

  • @Sustaita.handmade
    @Sustaita.handmade 3 years ago

    I may not know much about robots, coding, or engineering but I enjoy your videos and seeing how you solve problems. I hope that by watching your videos I can learn even just a little bit about robots over time.

  • @Keneo1
    @Keneo1 3 years ago +3

    You could train your model to recognise the robot hand and then find the vector it needs to move along to touch the cup; then you can make it move relative to the cup and not bother with the Cartesian coordinates.
    You’ll probably want to do this anyway when you have a grabbing hand and different sized cups to grab
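This relative-move idea is essentially image-space visual servoing: detect both the gripper and the cup, and command a small step along the pixel error vector each frame. A minimal sketch — the gain and pixel values are illustrative, not tuned for any real arm:

```python
def servo_step(gripper_px, cup_px, gain=0.1):
    """One visual-servoing step: move the gripper a fraction of the way
    along the image-space error vector toward the cup."""
    return tuple(gain * (c - g) for g, c in zip(gripper_px, cup_px))

# Simulate a few frames: the gripper converges on the cup without ever
# needing its absolute Cartesian position to be known accurately.
gripper, cup = [100.0, 100.0], (200.0, 150.0)
for _ in range(50):
    dx, dy = servo_step(gripper, cup)
    gripper[0] += dx
    gripper[1] += dy
print([round(v, 1) for v in gripper])   # close to [200.0, 150.0]
```

Because the error is measured visually on every frame, errors in the kinematic model wash out as the gripper closes in, which is the point the thread is making.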

    • @SoundsLikeOdie
      @SoundsLikeOdie 3 years ago

      Good idea. Maybe add some aruco markers on it. Then he will have location and orientation.

    • @ziad_jkhan
      @ziad_jkhan 3 years ago +1

      There is potential for eliminating the need for encoders on the motors with this method as well

    • @evannibbe9375
      @evannibbe9375 3 years ago

      Vectors necessarily consist of Cartesian coordinates (as far as deriving one from the other goes).

  • @nonchip
    @nonchip 3 years ago

    i like how at 10:50 the robot randomly decides that a cup plus your legs equals a dog... might be helpful to give it some kind of "short term memory" by having it bias the detected-object certainty in favour of things it saw in the previous few frames at the same position? Like, instead of each frame taking the "most likely result", keep a record of all the things it thought it might be over the last few frames along with the certainty it calculated, then average those out and sort by average certainty to get rid of temporary glitches?
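That short-term-memory suggestion could look something like the sketch below: keep the last few per-class confidences (zero when a class isn't seen that frame) and report the class with the best windowed average, so a one-frame "dog" glitch gets outvoted. Class names, scores, and window size are all illustrative:

```python
from collections import defaultdict, deque

class DetectionSmoother:
    def __init__(self, window=5):
        self.window = window
        self.history = defaultdict(lambda: deque(maxlen=window))

    def update(self, frame_scores):
        """frame_scores: {class_name: confidence} for the current frame.
        Returns the class with the best average over the last `window` frames."""
        for name in set(self.history) | set(frame_scores):
            # classes absent this frame contribute 0.0, so they decay away
            self.history[name].append(frame_scores.get(name, 0.0))
        return max(self.history,
                   key=lambda n: sum(self.history[n]) / self.window)

smoother = DetectionSmoother()
for scores in [{"cup": 0.8}, {"cup": 0.8}, {"cup": 0.8}, {"dog": 0.9}]:
    label = smoother.update(scores)
print(label)   # the single-frame "dog" glitch is outvoted: "cup"
```

A real version would also need to associate detections by position across frames (the commenter's "at the same position" caveat); this sketch only handles the confidence-averaging half.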

  • @ashaygoli3014
    @ashaygoli3014 3 years ago

    Amazing!!...
    You can use MoveIt for the arm, which can handle all the interpolation and obstacle avoidance.

  • @Aubron
    @Aubron 3 years ago

    Please never, ever stop using that B-roll of you and the Wolverine claws. Please. It should always be shown in its entirety, and with audio. Thank you.

  • @richardbloemenkamp8532
    @richardbloemenkamp8532 3 years ago +1

    This may well become your most successful project ever. Some ideas:
    - Try to remote-control your robot while you look at the video from the camera. If you can do useful things that way, then in principle a good A.I. should be able to do similar things.
    - As you move the end-effector closer to the target, decrease the speed and detect both the target and the end-effector (you can put a special easy-detection marker on it) with the camera. This will allow you to control the last part of the movement much better.
    - It might be interesting to have a PyBullet or Gazebo or other model for your robot and environment.
    - Think about a battery-charging solution like robot vacuum cleaners have: it can return by itself to the charger bay.
    - It appears there is a small risk that your robot falls over. Don't make the top part too heavy and try to move heavy elements to the base.
    - Think about real applications of your robot, such as: helping people that have difficulty walking, having the robot walk around the house or company when you are on vacation, cleaning, watering the plants, checking/feeding pets, getting a beer from the fridge, playing games, etc.
    Good luck
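The slow-down-near-target suggestion in that list is easy to sketch: scale the commanded speed with remaining distance, clamped between a crawl and a travel speed. The gains below are illustrative values, not tuned for this robot:

```python
def approach_speed(distance_m, k=0.5, v_max=0.2, v_min=0.01):
    """Proportional approach profile: commanded speed shrinks as the
    end-effector nears the target, clamped to [v_min, v_max] m/s."""
    return max(v_min, min(v_max, k * distance_m))

print(approach_speed(1.0))    # far away: capped at v_max
print(approach_speed(0.05))   # close in: proportional crawl
```

Combined with camera-based detection of both the marker and the target, this gives the fine final-approach control the comment describes.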

  • @oscar8005
    @oscar8005 3 years ago +2

    Are you going to put the Omni directional drive system on this project? Seems perfect for it

    • @jamesbruton
      @jamesbruton  3 years ago +1

      Not for this one, but it's an option for other robots

  • @NeoFrontierTechnologies
    @NeoFrontierTechnologies 3 years ago

    Fascinating. The right channel to watch when you like to tinker with robots or develop something of actual use. Totally made me subscribe.

  • @qvcybe
    @qvcybe 3 years ago

    like how for a second the cup was called a toilet XD and later the person turns in to a dog for a second man i love tech stuff like this

  • @sddiymakeitworthit7512
    @sddiymakeitworthit7512 3 years ago +4

    Your skills are awesome 👏 👍

  • @alex.thedeadite
    @alex.thedeadite 3 years ago

    The Purple walker you made looks like a half dismantled GONK power droid from star wars

  • @vaisakhkm783
    @vaisakhkm783 3 years ago +2

    If(probability (person) >96){
    Kill();
    }
    Oops! Syntax error, I meant to use 'Skill()' 😱

  • @knightdeluxegaming
    @knightdeluxegaming 3 years ago +5

    Can we get a playlist of your AI robot videos? And how can I start with a cheap machine-vision microcontroller?

  • @makecodeandhardware1395
    @makecodeandhardware1395 3 years ago

    Subscribed! Our students, ages 9-15, study these concepts and imagine an automated Starship factory; Mecanum wheels and various communication protocols are studied to better understand smart devices. Thank you for your practical research: mobility and automation.

  • @louisepage716
    @louisepage716 3 years ago

    Wow! James, you are an inspiration. So humble, open and impressive. I’m very happy I found your channel. Awesome stuff!

  • @Emile_v
    @Emile_v 3 years ago +1

    oh cool idea

  • @alexisveynachter2699
    @alexisveynachter2699 1 year ago +2

    @james any follow up for this robot ? No manipulator planned ?

  • @faruknane
    @faruknane 3 years ago +1

    Did you try to detect the arm of the robot with the cam? Then you could just make a feedback mechanism for the motors to move the arm under the cup.

  • @charlesashurst1816
    @charlesashurst1816 3 years ago

    I'm counting on you guys to have robots ready to take care of me in my declining years. Keep up the good work.

  • @maxschik4684
    @maxschik4684 3 years ago

    small tip: RealSense cameras should publish a topic called "aligned_depth" or something, which is aligned with the color image, so you don't have to do that yourself

  • @khalifdove7955
    @khalifdove7955 3 years ago +2

    Caught it 42 minutes after upload what a wonderful number

    • @khalifdove7955
      @khalifdove7955 3 years ago +1

      Thank you very much for uploading!

  • @tanjiro3285
    @tanjiro3285 3 years ago +1

    hey James, I came back to like and comment on this great informative video 12 hrs after watching it.
    Edit: I am really promoting your channel 🔥 Brother

  • @sammflynn6751
    @sammflynn6751 3 years ago +1

    You may want to look at *open3d* for point-cloud processing; it has built-in functions which can segment objects from a point cloud and find the centroid in the camera reference frame. Also, why not use ROS TF info for coordinate transformation?

  • @Sirenhound
    @Sirenhound 3 years ago

    Robot... red cup...
    for a moment I forgot which channel I was looking at and thought the robot would be pissing beer into the cup.

  • @Saxandviolins
    @Saxandviolins 3 years ago +5

    Crazy inspiring!

  • @kristofferhellstrom
    @kristofferhellstrom 3 years ago

    Is James a genius or what? I'm blown away

  • @dineroparson6778
    @dineroparson6778 3 years ago +2

    This man needs to go to BattleBots!!!! This intelligence and technology in a lifesize heavyweight would definitely dominate it.

  • @JulianMakes
    @JulianMakes 3 years ago

    You are so awesome james!

  • @jonathanmoothart8038
    @jonathanmoothart8038 3 years ago

    Nothing but awesome videos from this channel! I love your stuff!

  • @thescienceysworkshop5368
    @thescienceysworkshop5368 3 years ago +1

    Sir, can you please tell me which modules you used in your omni wheel machine? I didn't quite understand it.

  • @extremeroom7u7
    @extremeroom7u7 3 years ago +1

    10:44 robot is like "dude, that's a cup"; robot brain: "no, it's a dog"; "well then, it's both"

  • @timschafer2536
    @timschafer2536 3 years ago

    Would it be beneficial to use the camera's view of the arm in a kind of feedback loop to detect the delta between the two?

  • @interestedinstuff1499
    @interestedinstuff1499 3 years ago +1

    That was very groovy.

  • @parthd714
    @parthd714 3 years ago +1

    Wait, I didn't subscribe??
    How did I not?
    I thought I subscribed cause you are always in my recommendations, whatever imma sub

  • @EvilSpyBoy
    @EvilSpyBoy 3 years ago

    Silly Question (because I haven't finished the video yet) - Have you considered adding an identifier of some sort on the robot arm so you can object rec on the 'hand' to give you some relative information to play with too? A QR code or something as a sticker on the joints for registration like Mo-Cap suits

    • @jamesbruton
      @jamesbruton  3 years ago

      Yes potentially - it would help a lot with the left-right detection once the arm is in view. I need to make a gripper first though

  • @AMTunLimited
    @AMTunLimited 3 years ago

    James, you should look at how RC airplane pilots strap their remotes to themselves. I feel like the way you're doing it could lead to some strain

  • @vainerfever8776
    @vainerfever8776 3 years ago +1

    Woah so cool

  • @thomasre8073
    @thomasre8073 3 years ago

    This and the robodog projects are my favorite.

  • @rdyer8764
    @rdyer8764 3 years ago +6

    You're a Brainiac. In many quarters (including those I inhabit) that's among the greatest of compliments.

  • @tantamounted
    @tantamounted 3 years ago +1

    James, have you ever experimented with virtual biology, such as computer-simulated bodies, bio-inspired robotic functions, or other such things? I just have this hunch about creating instinct-driven AI in part through simulated bodily function.

  • @pebre79
    @pebre79 3 years ago +1

    I want a robot that will cook me cuisine for breakfast, lunch, and dinner. Just biding my time and i cant wait! Haha

  • @wilgarcia1
    @wilgarcia1 3 years ago

    well damn man. It's really moving along. Have you considered using binocular vision as opposed to these cameras? I have only seen one or two examples, but that was years ago. I like it cause it's more like human vision and you just need a set of cheap webcams to make it work.

  • @woolliedev
    @woolliedev 3 years ago +1

    Wow, which things do you use for it? I want to make a kind of robot with an Arduino; it sounds really cool to me.

  • @AsbjornOlling
    @AsbjornOlling 3 years ago

    This is amazing! The title really undersells the video.

  • @webjoeking
    @webjoeking 3 years ago +2

    Bring back the 914 PC Bot!

  • @mcdevel1
    @mcdevel1 3 years ago

    0:40 Gonk droid from star wars xd

  • @nilsdock
    @nilsdock 3 years ago

    why not use the inverse-kinematic model to calibrate the position of the camera?
    then the arm could be fitted with an easily recognized pattern,
    and the arm could increase the precision of the movements by "hand-eye coordination"

  • @diegocarvajal233
    @diegocarvajal233 3 years ago +1

    I'm also starting with ROS - isn't it supposed to be able to calculate the transform between the camera frame and the arm for the inverse kinematics?

    • @TS-kg4lf
      @TS-kg4lf 3 years ago

      Yes, if you published the end_effector

  • @amirbahadormoineddini1639
    @amirbahadormoineddini1639 3 years ago +2

    Love it, Amazing!

  • @MrSeanana
    @MrSeanana 3 years ago

    Hi James, I was curious if you had any recommendations for adults with kids who want to get into robotics? I know my kids would love it but this is a request more for me instead of my kids. I want something that is pretty complicated but teaches the fundamentals really well. Any ideas? Thanks.

  • @Genubath1
    @Genubath1 3 years ago

    Binocular vision!

  • @OZtwo
    @OZtwo 3 years ago +2

    Have you given any thought to building a robot using RL? I'm still new to all this, yet I really want to look into using RL, as then the robot will seem to evolve more as it learns... I hope, anyway.

    • @jamesbruton
      @jamesbruton  3 years ago +2

      Yes I have a project coming up like that

  • @darkener3210
    @darkener3210 3 years ago

    This man about to home brew a robot person

  • @kabelloseskabel7029
    @kabelloseskabel7029 3 years ago

    Hi, I want to build something similar, but with more objects that I want to manipulate one by one. Is this possible?

  • @ayanvaidya2727
    @ayanvaidya2727 3 years ago

    Really really useful

  • @abreezy1016
    @abreezy1016 3 years ago +2

    You and Michael Reeves need to do a collab on pissing beer into a cup from a distance, with like laminar flow or something!!!

    • @jamesbruton
      @jamesbruton  3 years ago +1

      I'm planning a p*ssbot 10000 with my own openDog at some point.

    • @parthd714
      @parthd714 3 years ago +1

      @@jamesbruton will you one day collab with Pranshu Tople?
      His tutorials actually helped me more than any others (including yours).
      He also made a URDF file of your robot

  • @r.iyushofficial5318
    @r.iyushofficial5318 3 years ago +3

    AWESOME 👌

  • @Car_Ram_Rod
    @Car_Ram_Rod 3 years ago +1

    Ugh i want that nvidia jetson!

  • @MuhammadDaudkhanTV100
    @MuhammadDaudkhanTV100 3 years ago

    Amazing idea

  • @naiyahp
    @naiyahp 2 years ago +1

    I like how no one controls your robots. 🙂🙂🙂🙂🙂

  • @enaudeni
    @enaudeni 3 years ago

    Hi James, great videos, I am always amazed at what you get up to and build! I have a "Big Ask!" - have you ever thought of doing a series on how to build an easier robot / hexapod / or something else for us technical wannabes, or for kids? Yes, the kids!!?!! Do it for the kids!!!!! Most people have 3D printers and a basic knowledge of Arduino, so if you had to do a series for "kids" or even the kid at heart, a project broken up into small chunks so there is a better understanding of how it works and how it's programmed, building up to a medium project when all put together. OK, yes, there are many tutorials on YouTube: on Arduino, on programming, on 3D printing, on designing, etc... But there is only one of you; considering you started before YouTube, and all the robots you have made, you have the perfect experience and long-term working knowledge. You could even sell a series like that! Just have a think about it, mention this to those who know you and see what they think. Anyway, thanks for all the content and sharing your passion with us. Much appreciated.

  • @sethmcbride8490
    @sethmcbride8490 3 years ago

    10:45 when your legs are dog

  • @syedsulaiman8380
    @syedsulaiman8380 3 years ago +1

    Amazing

  • @katherineespedido4978
    @katherineespedido4978 3 years ago +1

    This is good

  • @sleepingneeds2588
    @sleepingneeds2588 3 years ago

    Question: could you make a program that remembers the layout of the house and can track where you are in the house, so the robot can come to that area? Or would it be too hard?

  • @arnaudherault9168
    @arnaudherault9168 3 years ago

    james you are the best

  • @mandar3567
    @mandar3567 3 years ago

    Awesome!

  • @VLSystems
    @VLSystems 3 years ago

    Hi James, I have a robot with ROS.
    What should I do if I want to control my robot remotely over the internet?

  • @skuzlebut82
    @skuzlebut82 3 years ago

    Here I am in the US. I should be going to sleep but this video is more important.

  • @AHHEUS
    @AHHEUS 3 years ago +1

    got this in recommended, cool video, you gained a subscriber

  • @manojsk6189
    @manojsk6189 3 years ago +1

    Super

  • @AliceEverglade
    @AliceEverglade 3 years ago

    you should teach it what its own arm looks like

  • @jstro-hobbytech
    @jstro-hobbytech 1 year ago

    The formerly 400 dollar Xavier is now being sold MSRP-agnostic, aka (enabled scalping), by Nvidia for over 1000, and the Jetson that used to be 150 CAD is now 400 to 800 depending where you buy it. I can't imagine how much the Ampere/Hopper ones will be, given they've EOL'd the Tegra X1 based on Maxwell, which the Jetson B01 runs on. Nvidia is catering to higher-end customers. It sucks trying to develop a program as a volunteer teacher.

  • @nmanbamboo1980
    @nmanbamboo1980 3 years ago

    Copy and Paste ... yessshhh.. you are not alone.

  • @bonvi2896
    @bonvi2896 3 years ago

    Dude, I'm afraid, in the same measure as delighted, that someday I'll handle a robot in whose hand there's a cutting edge, and on the back we'll see "JB Design"

  • @joewow1229
    @joewow1229 3 years ago

    He made the GONK droid

  • @ali_the_maze_master
    @ali_the_maze_master 3 years ago

    Mmm yes Dining table 10:36

  • @itssid4ree_yt
    @itssid4ree_yt 3 years ago

    Reminds me of a Michael Reeves video 👀

  • @Toaster1
    @Toaster1 3 years ago +1

    swag

  • @giacomomilan8692
    @giacomomilan8692 3 years ago

    Why doesn't the software remember the object, instead of only recognizing it? For example the cup: sometimes it's less than 50% sure that it's a cup even if 1 second before it was 80% sure. If the object didn't leave the field of view it should be the same as before.

    • @jamesbruton
      @jamesbruton  3 years ago +1

      That's interesting - I guess it's down to the pre-trained model that detects any cup; I could retrain it for this specific cup and it would be a higher % certainty. Ultimately it's doing the detection on every loop from scratch, so it will vary as I move the cup around.

  • @MEan0207
    @MEan0207 3 years ago

    Awesome