Teaching my custom AI drone to track humans

  • Published: 19 Jun 2024
  • I paired an NVIDIA Jetson Nano and an OAK-D Lite depth camera to allow my drone, Stanley, to follow me around!
    This project has been a year in the making. In this episode, I cover details on the hardware I used, alongside the approaches I took in building the Python control code.
    I really struggled to fit all the content in under 20 mins!
    Make sure to check out the rest of the content in this series!
    Part 1: • My Raspberry Pi drone:...
    Part 2: • Stereo depth mapping w...
    Part 3: • You Only Look Once: ob...
    Part 4: • Meet Stanley, my NVIDI...
    Part 5: (this video)
    Please feel free to ask any questions in the comments, and I'll try my best to answer!
    😎 Social Stuff
    Twitter: / akamatchstic
    Patreon: / akamatchstic
    🌐 Links
    GitHub repository for my code
    github.com/Matchstic/stanley
    Jetson Nano GPS interference
    forums.developer.nvidia.com/t...
    Labforge Bottlenose
    www.labforge.ca/features-bott...
    DepthAI SDK
    docs.luxonis.com/projects/sdk...
    3D printed parts
    www.dropbox.com/sh/ip13e0wacf...
    🎞️ Chapters
    00:00 - Last time...
    00:28 - Repairs
    01:40 - Take to the skies
    02:39 - The OAK-D Lite
    04:58 - Virtual drones?
    06:51 - Subsumption
    08:25 - Setting up the Nano
    10:19 - Gimbal time
    11:30 - Full SITL test
    12:40 - Flight one
    13:50 - A descent into madness
    15:24 - Let's try that again
    15:49 - Test? Test.
    19:12 - That's a wrap!
    ⚙️ Hardware
    OAK-D Lite depth camera
    docs.luxonis.com/projects/har...
    NVIDIA Jetson Nano
    developer.nvidia.com/embedded...
    ℹ️ Attributions
    THE Hasan
    Ubuntu + Xbox Kinect 360 nite depth sensing visualization
    • Video
    OpenCV AI
    OpenCV AI Kit OAK-D-LITE Kickstarter Launch Video
    • OpenCV AI Kit OAK-D-LI...
    🎵 Music
    Punch Deck - Neon Underworld
    Listen: • Punch Deck - Neon Unde...
    DivKid - Icelandic Arpeggios
    Listen: • Video
    Alex-Productions - Startup
    Listen: • Minimal Technology Cor...
    Punch Deck - Persistence
    Listen: • Punch Deck - Persistence
    Bensound - Summer
    Listen: www.bensound.com/royalty-free...
    TABAL - An Unknown Journey
    Provided by Lofi Records
    Listen: • TABAL - An Unknown Jou...
    Corbyn Kites - Instant Crush
    Listen: • corbyn kites - instant...
    nihilore - Garden
    Download: www.nihilore.com/prog-rock
    Alex-Productions - Shine
    Listen: • Ambient Technology Mus...
    Scott Buckley - The Things That Keep Us Here
    Listen: • 'The Things That Keep ...
    French Fuse - Depth Fuse
    Listen: • Video
    Home - We're Finally Landing
    Watch: • HOME - We’re Finally L...
    NEFFEX - Grateful
    Listen: • NEFFEX - Grateful [Cop...
  • Science

Comments • 95

  • @akamatchstic
    @akamatchstic  1 year ago +24

    [Q and A]
    1. Would a Raspberry Pi work for this too?
    It would for this current iteration! In the future, I plan to add voice and gesture recognition, and so want the ability to expand the scope of the project to utilise more GPU-orientated tasks.
    2. Did you use ROS (Robot Operating System)?
    I did not. This is because my code only relies on incoming data from the camera, and then pushes commands back out via MAVLink. I didn't think the pipelining of data that ROS allows would really improve the development experience in this case.
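    As a rough illustration of that "camera in, MAVLink out" shape (not the Stanley repo's actual code; the serial device, baud rate, and stream rate below are assumptions):

    ```python
    from pymavlink import mavutil

    # Open a MAVLink link from the companion computer to the flight controller.
    fc = mavutil.mavlink_connection('/dev/ttyTHS1', baud=57600)
    fc.wait_heartbeat()  # block until the autopilot is talking to us

    # Ask the autopilot to stream telemetry at 4 Hz, then read one position back.
    fc.mav.request_data_stream_send(
        fc.target_system, fc.target_component,
        mavutil.mavlink.MAV_DATA_STREAM_ALL, 4, 1)
    msg = fc.recv_match(type='GLOBAL_POSITION_INT', blocking=True)
    print(msg.lat / 1e7, msg.lon / 1e7)  # degrees
    ```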

    • @robertboateng-duah9555
      @robertboateng-duah9555 1 month ago

      1. A Raspberry Pi would work, but not as effectively as the Jetson Nano, because its GPU isn't nearly as powerful as the Jetson Nano's.
      2. Using ROS may improve your development experience, depending on what your long-term goal is. ROS comes with some pre-packaged tools that help you effectively simulate the drone and make it easy to integrate sensors before you try them in the real world.

  • @ellawasincredible
    @ellawasincredible 1 year ago +1

    Your editing on this is awesome and it is so lovely to watch you being so excited and joyful when Stanley recovered! :)

  • @labforge
    @labforge 1 year ago +2

    Thanks for the mention, Matt! This has been a great project to watch unfold - so happy Stanley made it through all the troubleshooting and had a successful run in the end!

  • @healy9898
    @healy9898 1 year ago +5

    Watching Stanley follow you, it seemed like he was attached to you and literally looking out for you. Absolutely amazing what you are doing, keep it up!!

  • @HoffmanTactical
    @HoffmanTactical 1 year ago +3

    This is very cool. I'm trying to learn more about autonomous systems, computer vision, and AI. Your videos are interesting! Thank you for sharing.

  • @steveangell4426
    @steveangell4426 5 months ago

    I watched this and one other video of your first drone build. I say great work man!! It is very impressive what you designed and built. Keep up the great work!!

  • @KarthikArumugham
    @KarthikArumugham 1 year ago

    Great stuff mate! There are a lot of videos on DIY drones, but not on custom AI on a DIY drone. So thanks for leading the pack on this. Looking forward to more such videos.
    I'm getting the parts to build one. Wish me luck.

  • @kanebuckthorpe1
    @kanebuckthorpe1 1 year ago +2

    Nice work man! What an amazing accomplishment. Stanley is sick!

  • @Gavthefox
    @Gavthefox 1 year ago +1

    You have a real talent for making informative, fun and well edited videos. These are really helpful! Also based in the UK and attempting a similar project :D

  • @Paiadakine
    @Paiadakine 11 months ago

    Very impressed. I worked so hard to get my S550 hexacopter to fly, I'm still working to get my 3D printer to make quality parts, and I don't know chit about code. You are a one-man design, integration, and test IPT.

  • @XRay52909
    @XRay52909 1 year ago +4

    That’s the coolest thing man I want to do the same thing as you

  • @eprohoda
    @eprohoda 1 year ago +2

    Yahoo, love it. You made awesome footage, see ya, friend! 👋

  • @MarcoPono
    @MarcoPono 1 year ago

    Thank you so much. I have been working on a similar project and your videos have been infinitely helpful.

  • @AbdullahJirjees
    @AbdullahJirjees 10 months ago

    You could have simply given up and said this will never work. I love the NO GIVE UP attitude you showed in this project, man. You made my day; I am starting my work day watching your video. I LOVED IT. KEEP IT UP 💪💪

  • @Sonofzudema
    @Sonofzudema 1 year ago

    Just picked up a Jetson nano with a similar idea in mind! Solid work.

  • @_AG_63
    @_AG_63 1 year ago +1

    This is so cool. I am from India and I am currently preparing for an entrance exam for engineering college. I love tech and projects like this. I have planned some things that I would make in college. Keep up the good work. Love your content

  • @martinfreeman1825
    @martinfreeman1825 1 year ago +1

    Awesome job, Matt!

  • @alexandresibony1345
    @alexandresibony1345 1 year ago +1

    You HAVE done it! Bravo !!

  • @snovak1
    @snovak1 1 year ago

    This deserves more views. Good job bro.

  • @ViridiRobotics
    @ViridiRobotics 1 year ago

    I love it
    Hope to one day be as good an engineer as you are
    Keep up the great work, can't wait for the next video and other projects

  • @tonmoysarker3447
    @tonmoysarker3447 12 days ago

    I watched the whole video series. So inspiring

  • @nishantnarsale6279
    @nishantnarsale6279 1 year ago

    Man, you're cool... please don't stop. Keep going.

  • @miketesznar7711
    @miketesznar7711 1 year ago +1

    Great work.

  • @guylionel9909
    @guylionel9909 1 year ago +2

    You are a king 🔥

  • @neilclay5835
    @neilclay5835 7 months ago

    tl;dr: message from an old git: don't drive yourself nuts.
    I think that with projects like this, when you get an issue, it's easy to assume that the problem you're having is something really complicated. And because you've been continuously learning all the time whilst building the system, you're still naturally in that mode, and think that you need to learn EVEN MORE to get it working. IMO, the best thing at that point is to walk away. Clear your head. And then come back and look at the simple stuff, one item at a time. I can't tell you how many times that has worked for me.
    Don't forget the basics. Your RF interference issue there was an absolute classic. All the best.

  • @mindaugask1638
    @mindaugask1638 15 days ago

    Great project BRO!!!

  • @AdamWeatherall
    @AdamWeatherall 2 months ago

    Great work here 👍

  • @vtrandal
    @vtrandal 11 days ago

    Very very good. I want to do this. I want to repeat your steps and mistakes and fixes until it works. I have not watched all of your relevant videos (for this drone project) from start to finish. I am unclear how the Pixhawk and the Jetson Nano communicate. Or maybe they don't send messages to each other. Why should they? I want to start at the same starting point you did and go step by step. Fantastic! I've heard the Jetson Orin is much faster than the Jetson Nano. But I think you chose the Oak D-Lite to speed up the image processing. Yes?

  • @savvaskouloutsides3847
    @savvaskouloutsides3847 1 year ago

    Amazing project!!! I am trying to do something similar and I wanna use the exact same visualizer for the human so the drone can follow (in SITL). Could you walk me through how you created that "human"?

  • @i_think_2_much277
    @i_think_2_much277 1 year ago +2

    So cool, you've done an amazing job
    you're such an inspiration…
    I'm currently working on a tank drive robot with an Xbox 360 Kinect, using a Jetson Nano, and playing around with obstacle avoidance, recognition, and SLAM navigation
    Programming is hard and it's nice to know that there are other people out there researching these kinds of things with makeshift robots
    Keep it up!
    I love to watch your videos!!!! 🤘
    Ps my Jetsons wifi connectors are also messed up a bit,
    I feel your pain lol

    • @akamatchstic
      @akamatchstic  1 year ago +1

      Programming absolutely can be difficult, especially when you have to deal with the real world!
      Best of luck with your robot, sounds like a great project :)

  • @kaiwang629
    @kaiwang629 1 year ago

    Good job!

  • @Edward3C
    @Edward3C 1 year ago +1

    Great series. Would love to see you go back to the original goal of indoor (slam?) mapping and modeling autonomously...
    Question about Stanley, were you able to implement any form of object permanency? Could you turn your back to the drone and still be recognized? Have you considered implementing gesture recognition to control distance and altitude while following?

    • @akamatchstic
      @akamatchstic  1 year ago

      > object permanency
      Kind of. The YOLO model in use only labels a person and doesn't track them, but does still work if you turn your back. I set it up to follow whichever person detection is closest
      > gesture recognition
      I do eventually plan to add this (or spoken commands) to do things like takeoff. Controlling altitude likely would be a good idea too, since I struggled with ensuring the terrain height was followed properly!
      > indoor mapping and modeling autonomously
      This is a big maybe. It would require me to find somewhere I can actually fly indoors due to the size of Stanley, outside of handling a GPS-denied environment!
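      (For the curious, the "closest detection" part boils down to something like this sketch. The field names follow DepthAI's spatial detection output and the person class index depends on the model's label map; both are assumptions here, not lifted from the repo.)

      ```python
      PERSON = 0  # class index for "person"; depends on the model's label map

      def pick_target(detections):
          """Return the closest "person" detection, or None if nobody is in frame.

          Assumes DepthAI-style spatial detections, where spatialCoordinates.z
          is the forward distance from the camera in millimetres.
          """
          people = [d for d in detections if d.label == PERSON]
          if not people:
              return None
          return min(people, key=lambda d: d.spatialCoordinates.z)
      ```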

  • @sefutho
    @sefutho 1 year ago

    Thank you very much for this video 🙏🏾. I'm definitely going to be asking questions from you in the near future. I hope you will be kind enough to share some of your knowledge

  • @aliakturin8124
    @aliakturin8124 10 months ago

    KillSwitch is what they should have done in the Terminator movie xD. Love your project!

  • @droneguy69
    @droneguy69 7 months ago

    Great job man. How long did it take you to write the code?

  • @ilan_mittelman
    @ilan_mittelman 1 year ago +1

    that's dope

  • @theabyss5647
    @theabyss5647 1 year ago +2

    I enjoyed watching this project and your knowledge evolve very much!
    Shame you didn't try to run away from the drone :D I wonder what would be the result.

    • @akamatchstic
      @akamatchstic  1 year ago

      I did consider it! Unfortunately under UK law, you gotta have eyes on the UAV at all times so it was off the cards 😅

    • @theabyss5647
      @theabyss5647 1 year ago

      @@akamatchstic Even if so, I hope you tried anyway and it worked! Because of course if you did you can't say! :P

    • @akamatchstic
      @akamatchstic  1 year ago

      @@theabyss5647 Last week I did get a bit cheeky when showing it off to my family, and actually did try this! Their eyes were on it so technically all was a-ok 😅

  • @Ajay-di3zn
    @Ajay-di3zn 8 months ago

    Dedication ❤❤❤

  • @santhoshmamidisetti
    @santhoshmamidisetti 8 months ago

    Could you share some videos / resources on how to set up and use the SITL simulator (Software In The Loop), especially for your case? That would help a lot. I reckon that would itself be a separate video 😅

  • @Rizefz.
    @Rizefz. 1 year ago

    So cool. I have a task like this: can the camera be a webcam, and what would need to change? Thanks very much

    • @akamatchstic
      @akamatchstic  1 year ago

      If you're looking to do this, you're going to need a way to at least get people detection running. I'd recommend using an NVIDIA Jetson TX2 (or newer) in this case, which will have to do the heavy lifting of running inference over your webcam's video feed. That'll be the biggest change in the software architecture - it's doable, but not as easy as using something like the OAK-D!
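      (To make that concrete, here is a stand-in sketch using OpenCV's built-in HOG person detector over a webcam. A YOLO model running on the Jetson's GPU would be faster and more accurate, and note that a plain webcam gives you no depth, unlike the OAK-D.)

      ```python
      import cv2

      # OpenCV's classic HOG + SVM people detector, run on every webcam frame.
      hog = cv2.HOGDescriptor()
      hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

      cap = cv2.VideoCapture(0)  # first attached webcam
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))
          for (x, y, w, h) in boxes:
              cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
          cv2.imshow('people', frame)
          if cv2.waitKey(1) == 27:  # Esc to quit
              break
      cap.release()
      ```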

  • @StuartRobinson123
    @StuartRobinson123 11 months ago

    What made you switch from the Waveshare camera in your initial project to the more expensive OAK-D gadget you're using here with Stanley? And thank you for these videos!!!

    • @akamatchstic
      @akamatchstic  11 months ago +1

      I found that running the stereo algorithm on the Jetson to get depth perception was pretty taxing, and was concerned that there wouldn’t be enough compute resources left to be able to do gesture recognition later down the line. The OAK-D just so happened to be available for preorder right when I made the decision to use an external depth camera
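      (For context, this is roughly the per-frame work that gets offloaded. A minimal OpenCV block-matching sketch; the image file names are placeholders and the parameters are illustrative only.)

      ```python
      import cv2

      # Compute a disparity map from a rectified left/right pair on the CPU.
      left = cv2.imread('left.png', cv2.IMREAD_GRAYSCALE)
      right = cv2.imread('right.png', cv2.IMREAD_GRAYSCALE)

      stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
      disparity = stereo.compute(left, right)

      # With calibration, depth = focal_length_px * baseline_m / disparity.
      # Doing this every frame, on top of person detection, is the load the
      # OAK-D's on-board chip takes off the Jetson.
      ```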

  • @user-qe6gf1qq7z
    @user-qe6gf1qq7z 6 months ago

    So cool.

  • @neilclay5835
    @neilclay5835 7 months ago

    Just wondering if that jitter is due to a PID loop that's not quite balancing.

  • @Cordic45
    @Cordic45 1 year ago

    Would you tell me how to install a brushless 2-axis camera gimbal on the drone?

  • @KarthikArumugham
    @KarthikArumugham 1 year ago

    Could you please share the details of the 3D printed parts, especially for mounting the OAK-D Lite camera with the 2-axis gimbal?

    • @akamatchstic
      @akamatchstic  1 year ago

      Sure, these are available here: www.dropbox.com/sh/ip13e0wacflxtbb/AADVNoiI6ry5AH1D5ktzPIPRa?dl=0

  • @naherglamour8128
    @naherglamour8128 1 year ago +1

    Hello, I'm from Ukraine, so I apologise for my bad English; I'm using a translator. I'll tell you how I found this video. I want to make a drone for our army now, for obvious reasons. One of the problems is transferring video in high quality. I found the Herelink system and it's great, but it only works with the Pixhawk Orange Cube, which is a bit pricey. I want something cheaper, and I saw the Open HD system on Raspberry Pis. But it seems to me that that system weighs a lot and cannot process high-quality video, so I decided to learn about NVIDIA, saw this system, and landed on your video. You explain and film things in a very interesting way; I would really like to see more videos from you. You are cool, and it is a very interesting story. After your video, I got fired up about creating my own drone and creating AI for unique control. Thank you for your work, you are awesome.
    Did you know a programming language before, or did you learn while doing your project? Did it take you one year? Did you spend a lot of time? Do you know of any other systems for creating a drone with control through voice or movement? Thank you very much.

  • @45throoster
    @45throoster 3 months ago

    Okay now set your code to run waypoint missions at the park until it detects a person and have it follow them.

  • @dsyeta174omkshirsgar2
    @dsyeta174omkshirsgar2 1 year ago

    Can you make a video series on the programming of Stanley, and modify the program to land the drone in the center of a solid-color square? Please help me with the image processing and control of drones, as I am already working on this as my final project for engineering. I appreciate the effort you put into making this drone, as I know what effort and patience are needed to achieve this goal. I am using ROS for detecting objects and hovering over them; I am able to detect the object and mark its center, but I am not able to take the drone over the detected object. Please help me through this. I am using ROS, Gazebo, OpenCV, Python, and QGIS.

    • @akamatchstic
      @akamatchstic  1 year ago +1

      Sorry, I’m not planning to do this with Stanley. The code I wrote is all available for free on GitHub, which should be in the description of this video. You’ll probably want to look at research around detecting “registration marks” in OpenCV from a downward facing camera, and use that data with a corresponding distance sensor to control landing. (Reading your comment again, looks like you have that already!)

  • @abeshifat7707
    @abeshifat7707 1 year ago

    Would you please also make a video on how I can make an indoor flying drone that performs like a flying robot? For example, if I want, it could move a glass of water from one room to another and put the glass down in an accurate position. What hardware is needed to do this, like a Jetson Nano, Pixhawk 2.4.8, and GPS? And lastly, how can I control it automatically using a laptop or computer? By the way, I can't write English properly, so I hope you will kindly forgive me if I make some mistakes writing this. I will be waiting for your video. Carry on, brother.

  • @sairajesh4817
    @sairajesh4817 11 months ago

    I need a suggestion: how do I make and program an AI drone, and which types of sensors are used? Please help me with it, sir

  • @ArpitParashar-fr9dv
    @ArpitParashar-fr9dv 1 year ago

    Amazing work !!
    Kinda working on the same in India.
    Would love to get some suggestions. 🎉

  • @gemnicherry2670
    @gemnicherry2670 1 year ago

    Around 18:00 we can see Stanley improving in follow-me mode. Was that because of the Jetson's AI learning through all the trials you had done, or were you changing the code? Note: I'm not versed in computers or coding, but I'm trying to learn a little from you. What's your education background? What kind of work do you do? Do you have any intention to incorporate some kind of obstacle avoidance into the AI?

    • @akamatchstic
      @akamatchstic  1 year ago

      That was me changing the code itself - updating some configuration parameters I added, and adjusting the output of the follow code. The AI side is mainly used for detecting people in the camera feed, which then goes into a more “conventional” control algorithm.
      Education - good question! I taught myself how to code from 2011 onwards, starting with modifications for jailbroken iOS devices. From 2014 to 2018 I did an undergraduate + masters degree in Computer Science, and now in industry I'm a full-stack software engineer for web services.
      If you're looking to get into robotics, I would recommend playing around with an Arduino or a Raspberry Pi Pico. The community and support around those platforms is very good, meaning a Google search for "how to do X" should result in useful information 😅

  • @shuaili5656
    @shuaili5656 1 year ago

    Hi, are you using the autofocus or the fixed-focus version of the OAK-D Lite? I saw there are two models

    • @akamatchstic
      @akamatchstic  1 year ago

      This is the fixed focus - reading up on the auto mode led me to think that the drone's vibrations would cause issues keeping things focussed.

    • @shuaili5656
      @shuaili5656 1 year ago

      @@akamatchstic thanks, I made a mistake: I just got an autofocus one for my drone, let me try it and see how bad it can be

  • @simonbelmont1986
    @simonbelmont1986 5 days ago

    Can you program it to follow strangers? For Halloween

  • @ImpulsiveMigea
    @ImpulsiveMigea 11 months ago +1

    Is it possible to complete the task using another FC? For example SpeedyBee?

    • @akamatchstic
      @akamatchstic  11 months ago +1

      I’d expect so. The key thing is that you can control the usual pitch + yaw + roll to do appropriate navigation. The code I wrote assumes it’s talking to a MAVLink capable flight controller, which handles the conversion of “move x meters” to attitude changes for me
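      (Roughly, that "move x meters" request looks like the sketch below with pymavlink, assuming an ArduPilot-style controller already in GUIDED mode. This is a hedged illustration, not the repo's actual helper.)

      ```python
      from pymavlink import mavutil

      def goto_offset(fc, forward_m, right_m, down_m=0.0):
          """Ask the flight controller to move by an offset relative to its heading."""
          fc.mav.set_position_target_local_ned_send(
              0,                           # time_boot_ms (0 = not used)
              fc.target_system,
              fc.target_component,
              mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED,  # offsets from current pose
              0b110111111000,              # type_mask: use the position fields only
              forward_m, right_m, down_m,  # x, y, z in metres (NED: +z is down)
              0, 0, 0,                     # velocity (ignored)
              0, 0, 0,                     # acceleration (ignored)
              0, 0)                        # yaw, yaw rate (ignored)
      ```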

  • @antonyshani9705
    @antonyshani9705 2 months ago

    Sir, can I use this same code on a Raspberry Pi 5?

  • @ensam9317
    @ensam9317 1 year ago

    I couldn't figure out where the Jetson Nano gets its 5V from: is it from the BEC or from the Pixhawk?

    • @akamatchstic
      @akamatchstic  1 year ago

      I’m using the BEC to power the Jetson Nano, which supplies 5V via GPIO. I’m half certain my ESCs double as a BEC which could have also worked

    • @ensam9317
      @ensam9317 1 year ago

      @@akamatchstic thank you so much

  • @Cordic45
    @Cordic45 1 year ago

    Thanks a lot
    Do you have any ideas about swarms?

    • @akamatchstic
      @akamatchstic  1 year ago

      Swarm as in those big light show drones, right? I've thought a little about it, but the big limiting factor will be getting enough $$$ to actually have enough drones to make it look cool!

  • @MrTombraider69
    @MrTombraider69 9 months ago

    Fuck yeah!

  • @JoaoVitor-qi5ce
    @JoaoVitor-qi5ce 11 months ago +1

    How did you power the Jetson Nano?

    • @akamatchstic
      @akamatchstic  11 months ago

      I’ve got a BEC hooked up to the power distribution board that outputs 5V, which in turn feeds power to the 5V (and GND) GPIO pins

  • @TangoBolshevik
    @TangoBolshevik 1 year ago

    How does this have only 9k views?

  • @sooryaraji6700
    @sooryaraji6700 9 months ago

    Sir, one doubt: when the man runs fast, will the drone follow and catch up to him? Is it possible or not?

    • @akamatchstic
      @akamatchstic  9 months ago

      If you run faster than the drone's speed (I set this to 1 m/s), the distance would increase until the camera can no longer detect the person. The solution would be to match the drone's acceleration to the person's, maintaining a reasonable distance so the camera can continue detecting them.
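      (In code terms, the follow-speed side of that reduces to something like this sketch. The target distance and gain are invented; only the 1 m/s cap comes from the video.)

      ```python
      TARGET_DISTANCE_M = 4.0  # desired follow distance (assumed value)
      SPEED_GAIN = 0.5         # m/s of forward speed per metre of distance error
      MAX_SPEED_MS = 1.0       # the speed cap mentioned above

      def forward_speed(person_distance_m: float) -> float:
          """Speed up when the person pulls away, back off when they get too close."""
          error = person_distance_m - TARGET_DISTANCE_M
          return max(-MAX_SPEED_MS, min(MAX_SPEED_MS, SPEED_GAIN * error))
      ```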

  • @cameronrose1737
    @cameronrose1737 11 months ago

    Did youtube make you change the name of this video??? I remember a different title that I preferred much more.

    • @akamatchstic
      @akamatchstic  11 months ago

      I ended up changing it, was experimenting with how different names changed the viewing metrics

    • @cameronrose1737
      @cameronrose1737 11 months ago

      @@akamatchstic it took me forever to find this again because I was looking for "human attack drone" lol. I'm trying to do something similar: make a drone follow a moving object. I'm testing in SITL and having a little trouble figuring out the flow of MAVLink commands to make it fly around from point to point. Do you have any advice for that? I can make it take off and then land, but I can't send it to a specific GPS location or make it fly X distance in some desired direction. Also, I connected my Jetson Nano to SITL on my Windows laptop so I can test the real code! So that's pretty cool.

  • @tk-maker
    @tk-maker 1 year ago +1

    Instead of putting the Jetson on the drone, why not do it like an FPV drone: use a VTX to transmit video down to a ground unit, and the Jetson becomes the pilot, flying the drone through the remote via the trainer port (e.g. SBUS on a TX16S)?

    • @matthewarchibald5118
      @matthewarchibald5118 3 months ago

      It's for the stereo camera

    • @Scitus
      @Scitus 2 months ago

      The scenario you are talking about has a delay and doesn't allow real-time reactions like following a person

  • @08halit
    @08halit 1 year ago

    Drop the GoPro, it has too much lag. Use a camera with a faster response.

    • @akamatchstic
      @akamatchstic  1 year ago

      It worked well enough to get some aerial footage 🤷‍♂️ I do have a DJI Phantom now that has yet to see its maiden flight (for me, anyway)

  • @IamG3X
    @IamG3X 1 year ago +1

    Can I recommend a project idea? A self-balancing cube, like this one:
    ruclips.net/video/n_6p-1J551Y/видео.html
    I think it will be interesting and challenging (probably not) for you.

  • @kubricksghost6058
    @kubricksghost6058 7 months ago

    this is great. now all you need is a hentai waifu

  • @piconano
    @piconano 2 months ago

    I'd like to watch the rest, but your "music" is too annoying.