This Motorised Mechanical Eye Ball is built with AI (p2)
- Published: 2 Apr 2022
- This Robotic Eye works!!! It uses a Jetson and an ODrive, with some powerful motors. I encountered many issues with this build, and at some point I thought it would never work. But in the end it works really well! I have also included a Jetson AGX Orin unboxing :) .
Special thanks to my special Patrons: Shounak Bhattacharya and M. Aali!
Please subscribe. It will help me develop more projects like this and bring the bright future closer!
One time donation:
www.paypal.me/Skyentific
If you want to help this channel, please support me on Patreon:
/ skyentific
Instagram: / skyentificinsta
Facebook: / skyentificface
Twitter: / skyentifictweet
#DIY #robot #AI
I need your like! Please help me. The more likes I have, the more views there will be. More views bring more subscribers, and the more subscribers we have, the more videos I will make.
Looks great, good work as always!
Any update on the brushless motor arm?
Could you try to shield the cable with simple tinfoil connected to ground? It may not be as elegant, but it's dirt cheap ;)
You should add affiliate links for the featured items, like the CSI-to-HDMI adapters. It would also be nice to see it look for face-less people (i.e. back of the head turned to the camera, or face obscured), and to have a slow roaming mode when there are no faces, probably similar to some film :)
Keep up the good work!
Maybe it's the yellow color, but this fits into 'Creepy-Cute' to me. It seems _so happy_ to see you!
This is just brilliant, probably one of the best robot applications yet
Awesome and funny, good job Sky. It would be a nice exercise to go to the other extreme and try to do it as cheaply as possible.
Amazing, thank you for your videos! You inspire us
hahaha cool glasses and nice workout! and awesome project of course!
Great job @Skyentific 👏
This is fantastic; thanks for sharing the process.
So cool! Love the fluid motion with the higher frame rate.
This is an awesome project. Thank you for taking the time to share and explain everything clearly.
Incredible project! Thank you for sharing!
Thank you for watching my videos and for this comment!
Those shades are badass!
thank you
Spectacular work. I love it. Well done, sir.
Thank you!
I love how the motors react at the framerate of the Jetson Nano. It's odd but cool to see how everything is dependent on the vision.
outstanding videos every time I watch, A+
Amazing video!
Awesome !! Of course I hit the thumbs up button ! I always do !
Thank you!
This is so cool, I'd love to be able to implement something like this in an art project I had in mind.
It reminds me of the laser-guided systems that fighter jets use.
WOW 100% Cool
Very cool project!
Thank you!
You're the most inspiring person when it comes to ROBOTICS. God bless you, man :)
WONDERFUL!!! I'm going to make one like this but with Windows. I love your videos. Thank you very much.
Great Video! I am REALLY impressed with how fast the fancy pants Jetson was able to track you! I too have a hard time doing anything non-overkill!
keep making these! cool socks btw
That thing is smooth! Great work, great to see it done.
I need to build something like that and add a water hose at the top. It will be hilarious fun with the children.
It was fun eyeballing )
I love Mr. Bruton but I think you are the engineer in the room here. :). Thanks again for your video sir! BTW, it would be interesting to see you use the ROS2 IK solvers for some of these geometric challenges. Can't wait for you to release more videos, I always learn a lot from you and I think I have more fun than you (possibly) seeing all these very cool projects you share with us.
Very cool! I did something similar with a Pi4 to kill some time during the lockdown, but soon ran into its processing limitations.
What I did manage to do, was make it play the Metal Gear warning sound upon recognition of a face :D
I have dyslexia, I read that as to kill someone.
You always make cool videos!!!!!!
Nice
Amazing! 👁👁👁👀👀👀
Your glasses are really cool, professor...
This is just the Portal rocket sentry core.
Give it the body of a "minion" and teach it how to multiply itself!
Awesome work, next step an end effector to slap anyone that isn't you that the robot sees 😀
Great idea! :)
Use the quaternions! Look for an explanation on the 3Blue1Brown channel.
Great video, thank you for the advice.
Please give it a linear microphone array to detect which direction sound is coming from also! 🙏
Great idea! And I have one :)
@@Skyentific if you get that working your already existing legend status will grow even beyond what it already is in my mind! :)
That is actually something I really wanted to play with and get working: a combo of filters limiting the sound response signals to a sane range, so the bot (or in this case the eye) doesn't have a meltdown, plus stop/start handover between the linear mic array and OpenCV, so that we have seeing and hearing robots available to the open source community. Life became a bit complicated recently and I haven't found the time to play around with that much at all. Hopefully that will change in the near future, but honestly I believe you are a much more qualified man for the job than myself.
Beyond that stuff, mannn love the response times you are getting on this eye. Very very nice. What a cool combo to have using the odrive. Amazing stuff.
Just as example of what I mean.
If the sound is within range (whatever range the filters are set to): stop/disable OpenCV > moveTo(wherever the sound was) > restart OpenCV.
Though I suppose we would want to take it a step further with some if statements so that if the sound detected is still within the range/direction of where it is currently looking (if the person speaking is already being detected while speaking) that it wouldn't disable openCV since it may cause some strange movement patterns between words. 🤔
I'm thinking more towards a 6 mic linear array as that would likely make things easier to work with.
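A rough sketch of the gating logic this thread describes: pause vision-based tracking only when a sound comes from outside the current gaze direction, otherwise keep trusting the camera. Every name and threshold here is hypothetical, invented for illustration.

```python
def choose_target(sound_dir_deg, current_dir_deg, face_dir_deg,
                  sound_detected, gate_deg=15.0):
    """Decide where to look.  If a detected sound lies outside the current
    gaze cone, turn toward it (vision tracking would be paused during the
    slew); otherwise keep following the face the camera sees."""
    if sound_detected and abs(sound_dir_deg - current_dir_deg) > gate_deg:
        return sound_dir_deg, "sound"   # new sound source off to the side
    return face_dir_deg, "face"         # sound is roughly where we already look

# Sound from 90 deg while looking at 10 deg: snap toward the sound.
target, source = choose_target(90.0, 10.0, 12.0, sound_detected=True)
```

This avoids the "strange movement patterns between words" the comment worries about: speech from the person already being tracked falls inside the gate and never interrupts OpenCV.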
imagine seeing this in a large animatronic
Great job! Love the analytical solution. Could quaternions have simplified the math?
Cool! Now see if you can make it track a fly.
Really cool project, congrats! One note about Nano vs NX use: if you wanted a more fluid set of motions you could just keep using Nano and add some smoothing interpolation algo (local cubic spline for instance) on top of the raw points and read the interpolated points and extrapolate as well, at say 50hz, it should work really well. Now if you wanted to reduce the eye tracking latency, then you indeed have no choice but to reduce the initial compute time to start moving asap to track the target.
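The upsampling idea above might look like the sketch below. It uses numpy's linear interpolation as a simple stand-in for the cubic spline the comment suggests, and the sample timestamps and angles are invented for illustration.

```python
import numpy as np

# Raw face positions arriving from the detector at ~15 Hz (times in seconds)
t_raw = np.array([0.000, 0.066, 0.133, 0.200])
x_raw = np.array([10.0, 14.0, 22.0, 25.0])   # e.g. pan angle in degrees

# Resample onto a 50 Hz command grid for the motor loop, so the ODrive
# receives smooth intermediate setpoints between detector frames
t_cmd = np.arange(0.0, 0.2, 0.02)
x_cmd = np.interp(t_cmd, t_raw, x_raw)
```

A cubic spline (e.g. `scipy.interpolate.CubicSpline`) would additionally give smooth velocity at the knots, and extrapolating its last segment gives the look-ahead behaviour discussed below in the thread.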
Now we just need another AI layer that projects where it thinks the target will be; there will still be ~70 ms latency to the first move, but once moving, the targeting solution could look ahead.
@@frollard good point, with a pose detector there is a lot of info to use to forecast the head position, but deep learning is compute expensive for those little embedded computers.
Dude, you're brilliant. If there were two faces, which one would it follow?
Great question! It is programmed to follow the biggest (i.e. closest) face. But I have not tested this yet :)
Very cool project. -- I would have stayed with the PID loop (a full PID loop though), the little overshoots the eyeball does could be overcome by tuning the Ki and Kd parameters.
I'm not sure I follow the formula you created for figuring out the angle you need to point at. I did a very similar thing recently, and I just calculated the Angle per Pixel for horizontal and vertical (for my particular lens+sensor) and multiplied the pixel error by that to get the change in angle necessary. (HFOV/ImageWidth) for horizontal, (VFOV/ImageHeight) for vertical.
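The angle-per-pixel mapping described in the comment can be sketched like this; the FOV and resolution numbers are placeholders, not the actual camera's specs:

```python
def pixel_error_to_angle(err_px_x, err_px_y,
                         hfov_deg=62.2, vfov_deg=48.8,   # example lens values
                         width=1280, height=720):
    """Convert a pixel offset from frame centre into the pan/tilt angle
    change needed to centre the target: delta = error * (FOV / resolution)."""
    deg_per_px_x = hfov_deg / width    # horizontal angle covered by one pixel
    deg_per_px_y = vfov_deg / height   # vertical angle covered by one pixel
    return err_px_x * deg_per_px_x, err_px_y * deg_per_px_y

# Face detected 128 px right of and 72 px below frame centre:
d_pan, d_tilt = pixel_error_to_angle(128, 72)
```

Note this linear mapping is only exact for an ideal pinhole lens near the image centre; toward the edges (or with a wide-angle lens) the true relation involves `atan`, which may be why the video derives a fuller formula.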
Ya. Really cool!!!! This is the eye of God.
not bad.
Could be fun to mount this on a camera crane
Very impressive, I'm doing a really similar project (a robot with the same principles of movement, but alongside the camera there's also an electric airsoft carriage for shooting drones 😂: detection, tracking and shooting).
Your problem solving and the solutions you took inspired me a lot. Nice job!!
I have a similar project going on too. Fully auto Airsoft sentry.
What would happen if you put a picture of yourself on the wall behind you?
Also noticed something interesting: on large movements it looks like it's overshooting a bit and then backing up. I wonder if it would be better, when it has a large distance to cover, to move say 90% of it quickly and then slowly cover the remaining distance?
I was also watching a video about the neurology of how we read. An interesting bit was how the eye moves: it tends to jump or snap to a new direction. This eye could probably do that too, because it's lightweight.
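The "jump ~90%, then creep in" idea from the comments could be sketched as a waypoint generator; the fractions and tolerance below are illustrative, not tuned values:

```python
def saccade_steps(start, goal, coarse_fraction=0.9, fine_gain=0.5, tol=0.01):
    """Generate waypoints: one fast jump covering most of the distance,
    then small corrective steps easing in toward the goal."""
    pos = start + coarse_fraction * (goal - start)   # the big "snap"
    steps = [pos]
    while abs(goal - pos) > tol:
        pos += fine_gain * (goal - pos)              # halve the remaining error
        steps.append(pos)
    return steps

path = saccade_steps(0.0, 100.0)   # snap to 90, then 95, 97.5, ...
```

In a real controller the coarse jump would run at full motor speed and the fine steps at a reduced velocity limit, which is roughly how biological saccades settle on a target.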
It could actually be useful: mount a spare smartphone to it to record HD video, make it track your body offset in front of you, and hang it from the ceiling somewhere.
I wonder if rate limiting the odrive would reduce the jerkiness of the motion from the nano; the robot could get to the destination too fast for 1/15s to pass for the new video frame. An interpolation/planner layer between input and output could send high frequency odrive instructions from low frequency input coordinates and still maintain very fast reactions.
PID (Proportional Integral Derivative) algorithms can do the interpolation you're thinking of. They can adjust motor power at a much higher frequency than the target position input and therefore provide smooth transition to any target while accounting for over/undershoot. They're extremely useful when doing relative motion change with encoders in limited compute situations, especially with low frequency + small movements.
They're very cheap performance-wise too, my PID implementation (which is in no way optimal) can easily control 3 motors in under 1ms of compute with only an M0 equipped Arduino (no image compute/tracking in my case, just targeted positioning).
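As a toy illustration of the comments above, here is a minimal PID sketch; the gains and the simulated one-axis plant are made up for the example, not taken from the video:

```python
class PID:
    """Minimal PID controller sketch (illustrative, untuned gains)."""
    def __init__(self, kp=1.0, ki=0.0, kd=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# Drive a simulated 1-D axis toward a fixed target at 100 Hz, even though
# a camera would only refresh the target ~15 times a second.
pid = PID(kp=4.0, kd=0.2)
pos, target, dt = 0.0, 10.0, 0.01
for _ in range(300):
    pos += pid.update(target, pos, dt) * dt  # treat output as a velocity command
```

The derivative term is what damps the overshoot-and-back-up behaviour noted elsewhere in the comments; raising `kd` (or adding `ki` to remove steady-state error) is the usual tuning knob.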
Well done, bro!
That's the Squid Game killer.
Would it work to shield the old camera cable with a piece of grounded aluminum foil wrapped around the flat cable?
As a next step, just for interesting cosmetic reasons, maybe put in a range finder and an iris. When the detected object is far away open the iris, and as the object gets closer, have the iris close tighter. (or the other way around, idk)
That’s what I wanna do for my robots: give them the ability to see.
I do wonder how much of the need for a more powerful Jetson is because you're doing full pose estimation (expensive) rather than face location (which is comparatively cheap). You could probably get a *much* higher frame rate using something like YOLO, and *that* would let you use something far smaller and cooler to drive the motion.
Very good point. Probably you are right; I have not tried YOLO. I can only say that object detection on the Jetson Nano is still relatively slow (25 fps). That is better than pose estimation (15 fps), but not as fast as the Jetson Xavier NX (more than 60 fps for pose estimation).
Did you consider using a slip ring instead?
Would love to have a pair of eyes that work in unison in a paintable white material. Would you design and make these for me?
Nice project, with a great color scheme, but the computation requirements seem too big.
Have you tried something old-fashioned, like plain OpenCV?
How well does it work if you are farther from the camera? Most of your testing was really close up, but it would be neat to see it used for something like a sporting event. As a suggestion, try using the rule of thirds for framing the subject. It might not feel as creepy to the person being followed, and the captured video will look more natural.
Who won the RTX graphics card?
It will be announced very soon (this coming week).
@@Skyentific Did the winner get announced?
do you have or sell or offer "how to build ....?" of your robots and projects?
What happens with 2 people in the field of view?? I know Posenet can do multiple people at once.
Great question. It will follow the person with the greatest distance between the eyes (normally this is the closest person). Although, I have not tested this yet :)
@@Skyentific ahh nice!! Great work! I love your channel :)
How did you get the Jetson Nano? They keep going out of stock. Please help.
I bought this Jetson Nano a couple of years ago.
Okay, but what about two people?
Who will it follow?
It turned out nice.
Ukraine could use a thing like this on every building, only with protection against missiles and the other debris that falls from the sky from the hostile rashists.
I had problems with an I2C sensor on a 3-axis camera gimbal. There, too, the cable ran parallel to the motor power wires. (first channel intro video)
Booting into the desktop environment then starting the tracking program might be wasting some performance - my experience with low power embedded applications has always been better without X/GNOME getting in the way. If the NVIDIA drivers require X, you can start it up without GNOME. Especially on those lower power boards, I'd love to see what impact it'd have.
Also, I was excited to see how you'd implement and tune a PID algorithm for this, but you found a way around that with raw math 😄
Completely agree, without Gnome it should be better. I will try.
I think the analytical solution should perform better than a perfectly tuned PID. And I am really bad at tuning PIDs :)))
@@Skyentific Please let us know the results. The frame rate from 11 min 48 s onwards is nowhere near 15 frames per second (the rate mentioned when upgrading the Nano to the NX with ~80 fps); I wonder if there is significant delay in the video processing when the robot "reacts", i.e. does things other than check video frames for faces. The GNOME/X idea is definitely a good shout. I'm very sorry to admit I skipped some sections of the video, but I plan to revisit once I have a Jetson :)
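For reference, a minimal sketch of booting without the desktop, assuming a stock JetPack/Ubuntu image managed by systemd (paths and targets are the standard Ubuntu ones, untested on every Jetson release):

```shell
# Boot to a text console (no GNOME/X) on the next reboot,
# freeing RAM and GPU for the inference workload
sudo systemctl set-default multi-user.target
sudo reboot

# To restore the graphical desktop later:
sudo systemctl set-default graphical.target
```

If the tracking program needs CUDA but not a display, this is usually enough; if a display server is still required, a bare X session without GNOME is the middle ground the comment mentions.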
What happens with more people on the frames ? 😅
Great question! I programmed it to follow the closest person (more precisely, the person with the greatest distance between the two eyes). But I have not tested it yet :)
Divide by zero is, for all practical purposes, perfectly estimated as 0 - it is mathematically wrong but good enough ;)
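In code, that pragmatic convention is just a guarded division, e.g. (a trivial sketch, not from the video's source):

```python
def safe_div(num, den, fallback=0.0):
    """Return num/den, or a fallback when the denominator is zero --
    mathematically wrong, but often good enough in a control loop
    where the degenerate case means 'no correction needed'."""
    return num / den if den != 0 else fallback

ratio = safe_div(6.0, 3.0)   # normal case
gain = safe_div(1.0, 0.0)    # degenerate case, falls back to 0.0
```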
Built
Congratulations, I would pay to have 1/10 of your knowledge.
Wow, you've got quite an accent ))
Why such a huge eye? ))) The camera inside is tiny. But it's cool anyway.
How old is this boy? 🙂
Hello. Can you share the source code? I am a student.
I keep seeing Nvidia dev boards; what about the Coral TPU?
An ordinary parochial school with an English slant ))))) I can't wrap my head around how you can speak so fast with such an accent )))) But overall: it's amazing! Thank you for your job, and all those things. What I especially appreciate is that I understood every word, when usually I don't even understand half ))))
I want those sunglasses. Pure sexy.
🇺🇦
I wonder how funny it would be to program the robot to do the opposite and instead of focusing on the person avoid 'eye' contact as much as possible. Completely useless but interesting nonetheless.
this video is dogecoin
Really nice result! Well done.