This Motorised Mechanical Eyeball is built with AI (p2)

  • Published: 2 Apr 2022
  • This Robotic Eye works!!! It uses a Jetson and an ODrive, with some powerful motors. I encountered many issues with this build, and at some point I thought it was never going to work. But in the end it works really great! I have also included a Jetson AGX Orin unboxing :)
    Special thanks to my special Patrons: Shounak Bhattacharya and M. Aali!
    Please subscribe. This will help me develop more projects like this, to bring the bright future closer!
    One time donation:
    www.paypal.me/Skyentific
    If you want to help this channel, please support me on Patreon:
    / skyentific
    Instagram: / skyentificinsta
    Facebook: / skyentificface
    Twitter: / skyentifictweet
    #DIY #robot #AI
  • Science

Comments • 115

  • @Skyentific
    @Skyentific  2 years ago +55

    I need your like! Please help me. The more likes I have, the more views the video will get. More views bring more subscribers, and the more subscribers we have, the more videos I will make.

    • @Nicholas-my4xj
      @Nicholas-my4xj 2 years ago +1

      Looks great, good work as always!

    • @DanielIzguerra2012
      @DanielIzguerra2012 2 years ago

      Any update on the brushless motor arm?

    • @MagicGumable
      @MagicGumable 2 years ago

      Could you try to shield the cable with simple tinfoil connected to ground? It may not be as elegant, but it's dirt cheap ;)

    • @tyeth
      @tyeth 2 years ago

      You should add affiliate links for some of the featured parts, like the CSI-to-HDMI adapters. It would also be nice to see it look for faceless people (i.e. with the back of the head turned to the camera, or the face obscured), and to have a slow roaming mode when no faces are found, probably similar to some film :)
      Keep up the good work!

  • @Corbald
    @Corbald 2 years ago +7

    Maybe it's the yellow color, but this fits into 'Creepy-Cute' to me. It seems _so happy_ to see you!

  • @WalkingEng
    @WalkingEng 1 year ago

    This is just brilliant, probably one of the best robot applications yet

  • @freakinccdevilleiv380
    @freakinccdevilleiv380 2 years ago +5

    Awesome and funny, good job Sky. It would be a nice exercise to go to the other extreme and try to do it as cheaply as possible.

  • @DanielIzguerra2012
    @DanielIzguerra2012 2 years ago +11

    Amazing, thank you for your videos! You inspire us

  • @knoopx
    @knoopx 2 years ago +1

    hahaha cool glasses and nice workout! and awesome project of course!

  • @marcusluis_s
    @marcusluis_s 2 years ago +2

    Great job @Skyentific 👏

  • @JaySmith91
    @JaySmith91 1 year ago

    This is fantastic; thanks for sharing the process.

  • @organicelectrics
    @organicelectrics 2 years ago +1

    So cool! Love the fluid motion with the higher frame rate.

  • @dfn808
    @dfn808 2 years ago

    This is an awesome project. Thank you for taking the time to share and explain everything clearly.

  • @Dangineering
    @Dangineering 2 years ago +2

    Incredible project! Thank you for sharing!

    • @Skyentific
      @Skyentific  2 years ago +1

      Thank you for watching my videos and for this comment!

  • @jrohit1110
    @jrohit1110 2 years ago

    Those shades are badass!

  • @danielghani3903
    @danielghani3903 2 years ago

    thank you

  • @joels7605
    @joels7605 2 years ago +2

    Spectacular work. I love it. Well done, sir.

  • @JohnDuthie
    @JohnDuthie 2 years ago +3

    I love how the motors react at the framerate of the Jetson Nano. It's odd but cool to see how everything is dependent on the vision.

  • @imadjawad4408
    @imadjawad4408 1 year ago

    outstanding videos every time I watch, A+

  • @charlesb689
    @charlesb689 2 years ago

    Amazing video!

  • @PhG1961
    @PhG1961 2 years ago +3

    Awesome !! Of course I hit the thumbs up button ! I always do !

  • @ericcarabetta1161
    @ericcarabetta1161 2 years ago +7

    This is so cool, I'd love to be able to implement something like this in an art project I had in mind.

    • @Inertia888
      @Inertia888 2 years ago +1

      it reminds me of the laser-guided systems that fighter jets use

  • @sash710
    @sash710 2 years ago +1

    WOW 100% Cool

  • @plotze0692
    @plotze0692 2 years ago +1

    Very cool project!

  • @vishalsingh-yf9es
    @vishalsingh-yf9es 2 years ago

    You're the most inspiring person when it comes to ROBOTICS. God bless you, man :)

  • @JuanCarlos-ff2rh
    @JuanCarlos-ff2rh 2 years ago

    WONDERFUL!!! I'm going to make one like this, but with Windows. I love your videos. Thank you very much

  • @nathaniellangston5130
    @nathaniellangston5130 2 years ago

    Great Video! I am REALLY impressed with how fast the fancy pants Jetson was able to track you! I too have a hard time doing anything non-overkill!

  • @taitywaity1836
    @taitywaity1836 2 years ago

    keep making these! cool socks btw

  • @worksasintended4997
    @worksasintended4997 2 years ago

    That thing is smooth! Great work, great to see it done.
    I need to build something like that and add a water hose at the top. It will be hilarious fun with the children.

  • @thunderinvader9031
    @thunderinvader9031 2 years ago

    It was fun eyeballing )

  • @josgraha
    @josgraha 1 year ago +2

    I love Mr. Bruton but I think you are the engineer in the room here. :). Thanks again for your video sir! BTW, it would be interesting to see you use the ROS2 IK solvers for some of these geometric challenges. Can't wait for you to release more videos, I always learn a lot from you and I think I have more fun than you (possibly) seeing all these very cool projects you share with us.

  • @EatRawGarlic
    @EatRawGarlic 2 years ago +5

    Very cool! I did something similar with a Pi4 to kill some time during the lockdown, but soon ran into its processing limitations.
    What I did manage to do was make it play the Metal Gear warning sound upon recognition of a face :D

    • @wetfish412
      @wetfish412 2 years ago

      I have dyslexia; I read that as "to kill someone".

  • @user-xx3lj2ul3c
    @user-xx3lj2ul3c 2 years ago

    You always make cool videos !!!!!!

  • @beefsand419
    @beefsand419 2 years ago

    Nice

  • @MJ12GRAVITON
    @MJ12GRAVITON 1 year ago

    Amazing! 👁👁👁👀👀👀

  • @tszulpinedo757
    @tszulpinedo757 2 years ago

    Your glasses are really cool, professor...

  • @baxter1484
    @baxter1484 1 year ago +1

    this is just the Portal rocket sentry core

  • @AtTheZebo
    @AtTheZebo 2 years ago

    Give it the body of a "minion" and teach it how to multiply itself!

  • @robottinkeracademy
    @robottinkeracademy 2 years ago +3

    Awesome work, next step an end effector to slap anyone that isn't you that the robot sees 😀

  • @user-jl3ti3tc2j
    @user-jl3ti3tc2j 2 years ago +2

    Use quaternions! Look for an explanation on the 3Blue1Brown channel.

    • @Skyentific
      @Skyentific  2 years ago

      Great video, thank you for the advice.

  • @CyberSyntek
    @CyberSyntek 2 years ago +3

    Please give it a linear microphone array to detect which direction sound is coming from also! 🙏

    • @Skyentific
      @Skyentific  2 years ago +1

      Great idea! And I have one :)

    • @CyberSyntek
      @CyberSyntek 2 years ago

      @@Skyentific If you get that working, your already-existing legend status will grow even further in my mind! :)
      That is actually something I really wanted to play with and get working: a combo of filters limiting the sound-response signals to a sane range, so the bot (or in this case, eye) doesn't have a meltdown, plus stop/start handoffs between the linear mic array and OpenCV, so that we have seeing and hearing robots available to the open-source community. Life became a bit complicated recently and I haven't found the time to play around with that, or much at all. Hopefully that will change in the near future, but honestly I believe you are a much more qualified man for the job than I am.
      Beyond that, man, I love the response times you are getting on this eye. Very, very nice. What a cool combo to have using the ODrive. Amazing stuff.

    • @CyberSyntek
      @CyberSyntek 2 years ago

      Just as an example of what I mean:
      if the sound level is within whatever range the filters are set to, then stop/disable OpenCV, moveTo(wherever the sound was), and restart OpenCV.
      Though I suppose we would want to take it a step further with some if-statements, so that if the detected sound is still within the range/direction of where the eye is currently looking (i.e. the person speaking is already being tracked), it wouldn't disable OpenCV, since that could cause some strange movement patterns between words. 🤔
      I'm thinking more towards a 6-mic linear array, as that would likely make things easier to work with.
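The stop/move/restart handoff sketched in the comments above could look like the following. This is a hypothetical illustration, not anyone's actual code: the callbacks (`stop_tracking`, `move_to`, `start_tracking`), the dB filter band, and the tolerance are all made-up placeholders.

```python
# Sketch of the commenter's idea: pause vision tracking, slew toward a
# detected sound, then resume vision. All names and thresholds here are
# assumptions for illustration, not a real API.

SOUND_DB_MIN, SOUND_DB_MAX = 45.0, 90.0   # assumed filter band
LOOK_TOLERANCE_DEG = 15.0                 # "already looking there" threshold

def on_sound_event(level_db, sound_bearing_deg, current_bearing_deg,
                   stop_tracking, move_to, start_tracking):
    """Turn toward a sound only when it is loud enough and off-axis."""
    if not (SOUND_DB_MIN <= level_db <= SOUND_DB_MAX):
        return False        # outside the filter band: ignore
    if abs(sound_bearing_deg - current_bearing_deg) <= LOOK_TOLERANCE_DEG:
        return False        # already looking there: keep vision running
    stop_tracking()         # avoid vision and audio fighting over the motors
    move_to(sound_bearing_deg)
    start_tracking()
    return True
```

The "already looking there" guard implements the commenter's second point: if the speaker is already being tracked visually, the vision loop is never interrupted.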

  • @lionelheavener3396
    @lionelheavener3396 2 years ago

    imagine seeing this in a large animatronic

  • @buidelrat132
    @buidelrat132 2 years ago +2

    Great job! Love the analytical solution. Could quaternions have simplified the math?
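For the quaternion suggestion above, a minimal "look-at" rotation can be built directly from the camera's forward axis and the target direction, with no trigonometry. This is a generic sketch (standard half-angle shortcut), not the video's actual solution.

```python
import numpy as np

def look_at_quaternion(forward, target_dir):
    """Unit quaternion (w, x, y, z) rotating `forward` onto `target_dir`."""
    f = np.asarray(forward, float)
    f /= np.linalg.norm(f)
    t = np.asarray(target_dir, float)
    t /= np.linalg.norm(t)
    d = float(np.dot(f, t))
    if d > 1.0 - 1e-9:                     # already aligned: identity
        return np.array([1.0, 0.0, 0.0, 0.0])
    if d < -1.0 + 1e-9:                    # opposite: 180° about any perpendicular axis
        axis = np.cross(f, [1.0, 0.0, 0.0])
        if np.linalg.norm(axis) < 1e-9:
            axis = np.cross(f, [0.0, 1.0, 0.0])
        axis /= np.linalg.norm(axis)
        return np.concatenate([[0.0], axis])
    # Half-angle shortcut: w = 1 + f·t, xyz = f×t, then normalize.
    q = np.concatenate([[1.0 + d], np.cross(f, t)])
    return q / np.linalg.norm(q)
```

The resulting quaternion can be converted to the two motor angles, or slerped for smooth motion, which is where quaternions would simplify the math.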

  • @manyirons
    @manyirons 2 years ago

    Cool! Now see if you can make it track a fly.

  • @Tetsujinfr
    @Tetsujinfr 2 years ago +3

    Really cool project, congrats! One note about Nano vs NX use: if you wanted a more fluid set of motions, you could keep using the Nano and add a smoothing interpolation algorithm (a local cubic spline, for instance) on top of the raw points, then read the interpolated (and extrapolated) points at, say, 50 Hz; it should work really well. If you wanted to reduce the eye-tracking latency, though, then you indeed have no choice but to reduce the initial compute time, so it starts moving ASAP to track the target.

    • @frollard
      @frollard 2 years ago

      Now we just need another AI layer projecting where it thinks the target will be; there will still be ~70 ms latency to the first move, but once moving, the targeting solution could look ahead.

    • @Tetsujinfr
      @Tetsujinfr 2 years ago

      @@frollard Good point: with a pose detector there is a lot of info to use to forecast the head position, but deep learning is compute-expensive for those little embedded computers.
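The spline idea in this thread can be sketched in a few lines: fit a cubic spline through the low-rate detections and sample it at the motor command rate, letting queries past the last detection extrapolate the trend. The rates and the drifting-target data below are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def smooth_targets(t_detect, angles, t_query):
    """Fit a cubic spline through low-rate detections and sample it at a
    higher rate; queries past the last detection extrapolate the trend."""
    spline = CubicSpline(t_detect, angles, extrapolate=True)
    return spline(t_query)

# 15 Hz raw detections of the pan angle (degrees), target drifting at 10 deg/s...
t_raw = np.arange(0.0, 0.4, 1 / 15)
raw = 10.0 * t_raw
# ...resampled at 50 Hz for the motor commands, including a short look-ahead
t_cmd = np.arange(0.0, 0.45, 1 / 50)
cmd = smooth_targets(t_raw, raw, t_cmd)
```

This keeps the detector at its native frame rate while the motors receive smooth high-rate setpoints, which is exactly the decoupling the comment proposes.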

  • @jacquesb5248
    @jacquesb5248 2 years ago +4

    Dude, you're brilliant. If there were two faces, which one would it follow?

    • @Skyentific
      @Skyentific  2 years ago +2

      Great question! It is programmed to follow the biggest (and therefore closest) face. But I have not tested this yet :)

  • @BrainSlugs83
    @BrainSlugs83 2 years ago

    Very cool project. I would have stayed with the PID loop (a full PID loop, though); the little overshoots the eyeball does could be overcome by tuning the Ki and Kd parameters.

  • @Muny
    @Muny 2 years ago +4

    I'm not sure I follow the formula you created for figuring out the angle to point at. I did a very similar thing recently, and I just calculated the angle per pixel, horizontal and vertical (for my particular lens + sensor), and multiplied the pixel error by that to get the required change in angle: (HFOV/ImageWidth) for horizontal, (VFOV/ImageHeight) for vertical.
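The angle-per-pixel approach described above is a two-line calculation. The FOV and resolution numbers below are assumed for illustration; they are not the actual camera's specs.

```python
# Pixel-error to angle-error conversion, as the commenter describes.
H_FOV_DEG, V_FOV_DEG = 62.2, 48.8   # assumed lens field of view (degrees)
IMG_W, IMG_H = 1280, 720            # assumed image resolution

def pixel_error_to_angles(err_x_px, err_y_px):
    """Degrees to move per axis for a given pixel offset from image centre."""
    return (err_x_px * H_FOV_DEG / IMG_W,
            err_y_px * V_FOV_DEG / IMG_H)
```

This small-angle approximation is accurate near the image centre, which is where a tracking loop spends most of its time anyway.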

  • @iloverobotics113
    @iloverobotics113 2 years ago

    Ya. Really cool!!!! This is the eye of God.

  • @ikkeniikkewel
    @ikkeniikkewel 2 years ago

    not bad.

  • @fischX
    @fischX 2 years ago

    Could be fun to mount this on a camera crane

  • @riccardoberra9476
    @riccardoberra9476 2 years ago +2

    Very impressive! I'm doing a really similar project (a robot with the same principles of movement, but alongside the camera there's also an electric airsoft carriage for shooting drones 😂: detection, tracking and shooting).
    Your problem solving and the solutions you chose inspired me a lot. Nice job!!

    • @user-qy9rg3nt2l
      @user-qy9rg3nt2l 2 years ago +1

      I have a similar project going on too. Fully auto Airsoft sentry.

  • @Wyld1one
    @Wyld1one 1 year ago

    What would happen if you put a picture of yourself on the wall behind you?
    I also noticed something interesting: on large movements it looks like it overcompensates a bit on the amount it moves, and then backs up slightly. I wonder if it would be better, when it has a large distance to move, to cover say 90% of it quickly and then move slowly over the remaining distance?
    I was also watching a video about the neurology of how we read. An interesting bit was how the eye moves: it tends to jump, or snap, to a new direction. This eye could probably do that, since it's lightweight enough to move that way.

  • @joshieeee20
    @joshieeee20 1 year ago

    It could actually be useful if you mounted a spare smartphone to it to record HD video: make it track your body, offset in front of you, and mount it on the ceiling somewhere.

  • @frollard
    @frollard 2 years ago

    I wonder if rate limiting the odrive would reduce the jerkiness of the motion from the nano; the robot could get to the destination too fast for 1/15s to pass for the new video frame. An interpolation/planner layer between input and output could send high frequency odrive instructions from low frequency input coordinates and still maintain very fast reactions.

    • @ChrisSivanich
      @ChrisSivanich 2 years ago

      PID (Proportional Integral Derivative) algorithms can do the interpolation you're thinking of. They can adjust motor power at a much higher frequency than the target position input and therefore provide smooth transition to any target while accounting for over/undershoot. They're extremely useful when doing relative motion change with encoders in limited compute situations, especially with low frequency + small movements.
      They're very cheap performance-wise too, my PID implementation (which is in no way optimal) can easily control 3 motors in under 1ms of compute with only an M0 equipped Arduino (no image compute/tracking in my case, just targeted positioning).
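A minimal positional PID of the kind this reply describes fits in a dozen lines. This is a generic textbook sketch, not the commenter's actual implementation; gains and timestep are up to the application.

```python
class PID:
    """Minimal positional PID loop: output = Kp*e + Ki*∫e dt + Kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, target, measured, dt):
        """One control step; call at a fixed rate with timestep dt (seconds)."""
        error = target - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Run at a high rate (e.g. 1 kHz) against a setpoint that updates at only 15 Hz, this is what provides the smooth transitions between low-frequency targets that the comment mentions.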

  • @dawitsarsenbaev2333
    @dawitsarsenbaev2333 2 years ago

    Well done, bro

  • @armurak
    @armurak 2 years ago +1

    That's the Squid Game killer

  • @MYouMusikTV
    @MYouMusikTV 2 years ago +1

    Would it work to shield the old camera cable with a piece of grounded aluminum foil wrapped around the flat cable?

  • @rextalon7763
    @rextalon7763 2 years ago +2

    As a next step, just for interesting cosmetic reasons, maybe put in a range finder and an iris. When the detected object is far away open the iris, and as the object gets closer, have the iris close tighter. (or the other way around, idk)

  • @johnkoester7795
    @johnkoester7795 2 years ago

    That's what I want to do for my robots: give them the ability to see

  • @regularfryt
    @regularfryt 2 years ago +1

    I do wonder how much of the need for a more powerful Jetson is because you're doing full pose estimation (expensive) rather than face location (which is comparatively cheap). You could probably get a *much* higher frame rate using something like YOLO, and *that* would let you use something far smaller and cooler to drive the motion.

    • @Skyentific
      @Skyentific  2 years ago

      Very good point. You are probably right; I have not tried YOLO. I can only say that object detection on the Jetson Nano is still relatively slow (25 fps). That is better than pose estimation (15 fps), but not as fast as the Jetson Xavier NX (more than 60 fps for pose estimation).

  • @Idlecodex
    @Idlecodex 2 years ago

    Did you consider using a slip ring instead?

  • @dwalthers
    @dwalthers 1 year ago

    Would love to have a pair of eyes that work in unison in a paintable white material. Would you design and make these for me?

  • @TheNadOby
    @TheNadOby 2 years ago

    Nice project, with a great color scheme, but the computation requirements seem too big.
    Have you tried something old-fashioned, like OpenCV?

  • @turnedup28
    @turnedup28 2 years ago

    How well does it work if you are farther from the camera? Most of your testing was really close up, but it would be neat to see it used for something like a sporting event. As a suggestion, try using the rule of thirds when framing the subject. It might not feel as creepy to the person being followed, and the captured video will look more natural.

  • @FaithfulMC
    @FaithfulMC 2 years ago +4

    Who won the RTX graphics card?

    • @Skyentific
      @Skyentific  2 years ago +4

      It will be announced very soon (this coming week).

    • @oldemand
      @oldemand 2 years ago

      @@Skyentific Did the winner get announced?

  • @alexdorand
    @alexdorand 3 months ago

    Do you have, sell, or offer "how to build ....?" guides for your robots and projects?

  • @constantinehelen9935
    @constantinehelen9935 2 years ago +4

    What happens with 2 people in the field of view?? I know Posenet can do multiple people at once.

    • @Skyentific
      @Skyentific  2 years ago +2

      Great question. It will follow the person with the greatest distance between the eyes (normally this is the closest person). Although, I have not tested this yet :)

    • @constantinehelen9935
      @constantinehelen9935 2 years ago

      @@Skyentific Ahh nice!! Great work! I love your channel :)
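The selection rule the author describes (follow the person with the greatest inter-eye distance, a proxy for being closest) can be sketched as below. The `left_eye`/`right_eye` keypoint layout is an assumed simplification of what a pose estimator such as PoseNet returns.

```python
import math

def pick_closest_person(poses):
    """Given poses as {'left_eye': (x, y), 'right_eye': (x, y)} dicts,
    return the pose with the greatest inter-eye pixel distance, which
    normally corresponds to the person closest to the camera."""
    def eye_gap(pose):
        (lx, ly), (rx, ry) = pose['left_eye'], pose['right_eye']
        return math.hypot(lx - rx, ly - ry)
    return max(poses, key=eye_gap) if poses else None
```

The tracker then feeds only the selected person's keypoints to the angle computation, so multiple detected people never fight over the eye.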

  • @IronChad_
    @IronChad_ 2 years ago +1

    How did you get the Jetson Nano? They keep going out of stock. Please help

    • @Skyentific
      @Skyentific  2 years ago

      I bought this Jetson Nano a couple of years ago.

  • @kahwigulum
    @kahwigulum 1 year ago

    okay but what about two people
    who will it follow

  • @DPTech_workroom
    @DPTech_workroom 2 years ago +2

    It turned out cool.
    Ukraine could use a thing like this on every building, only with protection against the missiles and other debris that rain down from the sky from hostile rashists.
    I had problems with an I2C sensor on a 3-axis camera gimbal; its cable also ran parallel to the motor power lines. (first channel intro video)

    • @backgammonbacon
      @backgammonbacon 2 years ago

      It turned out nice.
      That would be such a thing for Ukraine, for every building, only with protection against missiles and other debris that falls from the sky from unfriendly rashists.
      I had problems with the I2C sensor on a 3-axis camera gimbal. The cable also ran parallel to the motor power lines. (first channel intro video)

  • @ChrisSivanich
    @ChrisSivanich 2 years ago +1

    Booting into the desktop environment then starting the tracking program might be wasting some performance - my experience with low power embedded applications has always been better without X/GNOME getting in the way. If the NVIDIA drivers require X, you can start it up without GNOME. Especially on those lower power boards, I'd love to see what impact it'd have.
    Also, I was excited to see how you'd implement and tune a PID algorithm for this, but you found a way around that with raw math 😄

    • @Skyentific
      @Skyentific  2 years ago +1

      Completely agree, without GNOME it should be better. I will try.
      I think the analytical solution should perform better than even a perfectly tuned PID. And I am really bad at tuning PIDs :)))

    • @tyeth
      @tyeth 2 years ago

      @@Skyentific Please let us know the results. The frame rate from 11:48 onwards is nowhere near 15 frames per second (the rate mentioned when upgrading from the Nano to the NX with ~80 fps); I wonder if there is significant delay in the video processing when the robot "reacts", i.e. does things other than checking video frames for faces. The GNOME/X point is definitely a good shout. I'm very sorry to admit I skipped some sections of the video, but I plan to revisit them once I have a Jetson :)

  • @gillespons4053
    @gillespons4053 2 years ago +1

    What happens with more people in the frame? 😅

    • @Skyentific
      @Skyentific  2 years ago

      Great question! I programmed it to detect the closest person (more precisely, the person with the greatest distance between the two eyes). But I have not tested it yet :)

  • @fischX
    @fischX 2 years ago

    Divide by zero is, for all practical purposes, perfectly estimated as 0.
    It is mathematically wrong, but good enough ;)

  • @JimCGames
    @JimCGames 2 years ago +1

    Built

  • @lucerino1973
    @lucerino1973 1 year ago

    Congratulations, I would pay to have 1/10 of your knowledge

  • @to1704
    @to1704 2 years ago

    Wow, what an accent))
    Why such a huge eye? ))) The camera inside is tiny. But it's cool anyway

  • @sergeyworm1476
    @sergeyworm1476 2 years ago

    How old is this boy? 🙂

  • @orhansezaikisioglu5038
    @orhansezaikisioglu5038 4 months ago

    Hello. Can you share the source code? I am a student

  • @jakobfindlay4136
    @jakobfindlay4136 1 year ago

    I keep seeing Nvidia dev boards; what about the Coral TPU?

  • @timsteel1060
    @timsteel1060 2 years ago

    An ordinary parish school with an English slant ))))) I can't wrap my head around how anyone can speak so fast with such an accent )))) but anyway: it's amazing! Thank you for your job, and all those things. It is especially valuable that I understood every word, when usually I don't understand even half ))))

  • @SirTodd.
    @SirTodd. 2 years ago

    I want those sunglasses. Pure sexy.

  • @Chris-bg8mk
    @Chris-bg8mk 2 years ago +2

    🇺🇦

  • @andre7417
    @andre7417 2 years ago

    I wonder how funny it would be to program the robot to do the opposite: instead of focusing on the person, avoid "eye" contact as much as possible. Completely useless, but interesting nonetheless.

  • @mrpeaceful1
    @mrpeaceful1 1 year ago

    this video is dogecoin

  • @ivprojects8143
    @ivprojects8143 2 years ago

    Really nice result! Well done.