Robot that makes me an Aiming Pro | Physical Aimbot

  • Published: 14 Oct 2024
  • Thanks for watching! Leave a like and subscribe if you like the video.

Комментарии • 880 (Comments • 880)

  • @Alex-de7rz
    @Alex-de7rz 2 years ago +2401

    For people who code, we all know this is not as simple as his explanation. Great work.

    • @ProdByCari
      @ProdByCari 2 years ago +18

      How would this be able to work in an actual FPS game? There are so many different colors everywhere.

    • @mannysingh1358
      @mannysingh1358 2 years ago +72

      @@ProdByCari good question. It would probably require a lot more AI 'training'

    • @Alex-de7rz
      @Alex-de7rz 2 years ago +24

      Never tried to make one, but I think the difference is that in Aim Lab the target and background already have contrasting colors, so he can 'increase' the contrast. In a real FPS game the target is a character that blends into the background, so I assume the program would need to access the enemy character model, render it in a color that contrasts with the background, and then continue with the same process. Again, just an assumption.

    • @welovethemallthesame1125
      @welovethemallthesame1125 2 years ago +18

      @@Alex-de7rz I think object recognition + image classification might do the trick. No need to pinpoint the exact boundary of the character model, just draw a box over the character model and try to hit the center of that frame. May need to handle the delay and character movement though.

    • @ignaciotello6371
      @ignaciotello6371 2 years ago +4

      @@welovethemallthesame1125 In my thesis project, to save time I switched from TensorFlow to AWS Rekognition, and training the model was very easy. At the time I built a video-surveillance system that detected lost personal items; I suppose the application would be very similar, except that on detecting a Valorant character it would fire... Of course it's easier said than done, there's a lot of trial and error.
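
The thread above sketches the color-based approach: threshold pixels that match the target's color, then aim at their centroid. A minimal pure-Python illustration, with all pixel values and thresholds hypothetical:

```python
# Color-threshold target detection, as discussed in the thread above.
# A "frame" is a grid of (r, g, b) tuples; the target is assumed to be
# the only region that is strongly blue (thresholds are made up).

def find_target(frame, min_blue=200, max_rg=100):
    """Return the centroid (row, col) of pixels matching the target color."""
    hits = [
        (r, c)
        for r, row in enumerate(frame)
        for c, (red, green, blue) in enumerate(row)
        if blue >= min_blue and red <= max_rg and green <= max_rg
    ]
    if not hits:
        return None  # no target visible in this frame
    return (sum(h[0] for h in hits) / len(hits),
            sum(h[1] for h in hits) / len(hits))
```

As the replies note, this simple mask breaks down in a real FPS where characters blend into the background, which is where object-detection models come in.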

  • @Lightinway
    @Lightinway 2 years ago +4564

    Please give that robot a better mouse. You're holding it back

    • @barjuandavis
      @barjuandavis 2 years ago +197

      I agree. You need to take note of:
      1. Whether the mouse has mouse accel on; make sure it's off.
      2. How many Gs the mouse can handle in terms of speed. Most modern gaming mice can handle Gs that we mere humans cannot reach.
      3. Whether you turn off Windows pointer precision; leaving it on may force you to do that complicated math.
      My point is: replace the mouse and turn off Windows pointer precision. I recommend at minimum a Logitech G304, since that's the cheapest adequate wireless gaming mouse I can think of; the more variables you can control, the merrier. Nice work!

    • @matinprsd
      @matinprsd 2 years ago +165

      Give that robot some RGB lighting plus a mouse with RGB lighting: instantly 200x pro

    • @thickgirlsneedlove2190
      @thickgirlsneedlove2190 2 years ago +9

      Nah he doesn't have to

    • @guadalupe8589
      @guadalupe8589 2 years ago +37

      @@thickgirlsneedlove2190 does he want a gamer robot or not? Make that mouse look like a Christmas tree!

    • @naoltitude9516
      @naoltitude9516 2 years ago +20

      @@barjuandavis Windows pointer precision doesn't affect games that use raw input (99% of modern shooters). Also, if anything, he should ideally just turn the sens up a shit ton and make the motors as precise as possible.

  • @Chrexter
    @Chrexter 3 months ago +237

    Way late, but this was one of the coolest videos I've seen. Good ingenuity

    • @up.grayedd
      @up.grayedd 3 months ago

      This is true physical: "I Made Real-life Airsoft AIM-ASSIST: Aimbot V3"

    • @DrDeFord
      @DrDeFord 3 months ago

      3:10 “This took me 2 painstaking months.”

  •  2 years ago +383

    This proves that you can get pro level aim with a cheap wireless mouse and a high latency screen.

    •  2 years ago +22

      @QUAD849 His OpenCV code taking screenshots is slow, compared to "pro gaming 100000Hz RGB monitors".

    • @thickgirlsneedlove2190
      @thickgirlsneedlove2190 2 years ago

      @ Nooo it isn't slow

    • @re4796
      @re4796 2 years ago +7

      No it doesn't, it's quite literally a robot

    •  2 years ago +35

      @@re4796 that's not my point. It proves that your mouse and screen don't matter that much when your movements are perfect. You don't need a fancy gaming mouse to be that good, otherwise the robot would not be able to achieve these scores.

    • @re4796
      @re4796 2 years ago +5

      @ I hope to God you're joking

  • @lordsiomai
    @lordsiomai 2 years ago +69

    Now I wanna see an aim lab tournament with this against 5 pros lol

  • @moatddtutorials
    @moatddtutorials 2 years ago +525

    Possible improvements:
    (physical)
    - unshell the mouse and mount its optical sensors directly onto the robot. This will reduce rattle/wiggle and ditch the mass of the shell.
    - replace the mouse buttons with relays that can be directly triggered.
    - move as many components as possible into a separate stationary breakout box, again, to reduce the mass of the mobile unit. Ideally, your mousedroid will have only motor drives, wheels and the optical sensor(s) within the mobile unit.
    (possible complete reconstruction)
    - consider keeping the sensor itself completely stationary and inverted (pointing up) and moving the surface instead (a thin piece of roughened plastic that is easily optically trackable and could be moved with a geared/belted X/Y setup).
    - or, mount the sensor in the same way as the printing head of a 3D printer over the tracking surface.
    (programming)
    - keep tabs on the velocity of the mousedroid and don't just take the proximity of the target into account, but also try to select targets that are in line with the current direction of motion, to reduce the amount that the mouse has to turn.
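
The last suggestion can be sketched in a few lines. This is an illustrative scoring rule, not anything from the video: discount a target's distance when it lies along the mouse's current direction of travel (the weight and formula are made up).

```python
import math

def pick_target(position, velocity, targets, align_weight=0.5):
    """Pick the target that is close AND roughly in line with current motion."""
    speed = math.hypot(*velocity)

    def score(t):
        dx, dy = t[0] - position[0], t[1] - position[1]
        dist = math.hypot(dx, dy)
        if speed == 0 or dist == 0:
            return dist  # no motion yet: fall back to nearest target
        # cosine of the angle between motion direction and target direction
        cos = (dx * velocity[0] + dy * velocity[1]) / (dist * speed)
        return dist * (1 - align_weight * cos)  # aligned targets score lower

    return min(targets, key=score)
```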

    • @KamalCarter
      @KamalCarter  2 years ago +143

      Thanks Mark for watching! These suggestions are great, and some of them I do plan on doing for a V2. This was really a demo, and me really seeing if I could do it. But I don't like how bulky the system is, so I am going to take apart a mouse and make a new, much slimmer and faster system coming soon. Also, I love the name you gave to the mouse --- mousedroid is fantastic, and I might steal that for a future video!

    • @gSys1337
      @gSys1337 2 years ago +21

      @@KamalCarter You can also completely ditch the mouse and make the microcontroller pretend to be a mouse.

    • @topkek5853
      @topkek5853 2 years ago +26

      @@gSys1337 yes, but the whole point is to have a physical aimbot

    • @mrED123
      @mrED123 2 years ago +1

      Dude joined YouTube 15 years ago!

    • @sarimbinwaseem
      @sarimbinwaseem 2 years ago +4

      @@topkek5853 a microcontroller can act as an HID device too, so that would be convenient.

  • @mattcelder
    @mattcelder 2 years ago +39

    Sick idea, even sicker execution!

    • @HurlingMongroach
      @HurlingMongroach 3 months ago +2

      I had the same idea, I just never learned to code because I got sick

  • @joshuaam7701
    @joshuaam7701 2 years ago +12

    Awesome work man, very cool project!

  • @akiko009
    @akiko009 2 years ago +46

    Awesome stuff. An algorithm change to allow shots while on the move should give you those extra points. Plan ahead for the optimum path, re-evaluate if targets change, shoot as you pass the target on the way to the next. That's how the humans do it. Never slow down, and estimate distances in terms of time to get there, not physical distance.
    You might also want to get an old school rubber ball mouse, toss the ball, and turn the rollers directly.

    • @niezbo
      @niezbo 2 years ago +8

      Using a ball mouse is actually a pretty damn good idea!! It would only require two motors with small torque. It could be extremely fast!

    • @akiko009
      @akiko009 2 years ago +9

      @@niezbo And if you want, you could skip the motors entirely, and instead simply blink LEDs at a rate that suits you.
      The movement in a particular direction is detected by two photo junction devices that look at a LED through a perforated disk. Skip the disk and the LED, and excite the sensors with two of your own LEDs directly to create the movement you want. The challenge is to get the timing right, but once there, the number of moving parts is 0.
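
For reference, the two phase-shifted signals form a 4-state Gray code, and "creating movement" means stepping through that sequence forwards or backwards. A sketch of the sequence generator (the timing, which the comment calls the hard part, is left out):

```python
# One full quadrature cycle: the (A, B) sensor states in order.
QUAD = [(0, 0), (1, 0), (1, 1), (0, 1)]

def quadrature_steps(n, start=0):
    """Yield (A, B) states for n counts; negative n reverses direction."""
    state = start
    for _ in range(abs(n)):
        state = (state + (1 if n > 0 else -1)) % 4
        yield QUAD[state]
```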

    • @Shedding
      @Shedding 2 years ago +12

      Go a step further. See how the mouse sends the electrical signal through USB and use an Arduino to simulate mouse movement by sending the right VDC x and VDC y.

    • @akiko009
      @akiko009 2 years ago +5

      @@Shedding There's (some) danger in that one. It is conceivable that a future gaming product might look for that kind of a hack. So if you do it, it had better emulate all of a gaming mouse's behavior... That said, the standard mouse signaling over USB is well defined and plenty of sample code exists out there.

    • @Shedding
      @Shedding 2 years ago +1

      @@akiko009 yep. There might already be someone who did this already. :)

  • @fir3haz4rd50
    @fir3haz4rd50 3 months ago +141

    Move the mousepad, not mouse

    • @jamessever8936
      @jamessever8936 3 months ago +14

      Ohhhh!!! Damn smart!!!

    • @АртемГоловко-п1ъ
      @АртемГоловко-п1ъ 3 months ago +4

      It’s cheating

    • @laughitout1331
      @laughitout1331 3 months ago +4

      It's easy to say, but the whole program will change if you invert the moving medium. He didn't make the code, he just copied it.

    • @respise
      @respise 3 months ago +5

      It's even easier to just emulate a mouse with an Arduino

    • @laughitout1331
      @laughitout1331 3 months ago +6

      @@respise You don't have any idea how GameGuard/anti-cheat programs work. Arduino devices are an old-school cheat; using one will get you detected easily.

  • @maldo96
    @maldo96 2 years ago +43

    RIP SHOOTERS 1987-2022, THANKS

  • @chrisroberts5287
    @chrisroberts5287 2 years ago +282

    That was an awesome project! I was super impressed with how well it turned out. I’m sure that you could beat TenZ with a little more optimizing.

    • @KamalCarter
      @KamalCarter  2 years ago +23

      I'm happy you enjoyed it. I, too, think I could have beaten TenZ, but it would have taken considerably more work.

  • @omaryahia
    @omaryahia 2 years ago +62

    I am a backend developer, but I took 2 embedded systems courses, and I like this a lot
    combining hardware with software with machine learning is just amazing
    and a complex task
    oh man, you are a "real" software engineer: you had an idea, you made it into reality, and kept developing it
    congratulations

    • @KamalCarter
      @KamalCarter  2 years ago +11

      Thanks for the kind words, and for understanding how much work went into this! Software engineering is really difficult, but these one-off programming challenges are fun.

  • @oyster4465
    @oyster4465 2 years ago +8

    This was awesome man. Actually so cool

  • @bcd398
    @bcd398 2 years ago +22

    Another genius using his powers for evil lol

  • @amanvishwanathan7336
    @amanvishwanathan7336 2 years ago +7

    This is such an awesome project. Came across the video on your TikTok and you're making super high quality content; keep it up man, I can't wait to see how far you can go

  • @giovannimela504
    @giovannimela504 2 years ago +58

    Cool video and nice work!
    I feel like the PID tuning would be much easier if you focus on singular events. For example, if you let the robot shoot a couple of targets and then make a plot of "error" over time, you might have an easier time making the PID feedback faster while keeping an eye on overshoot and unwanted oscillations.
    Probably too much work, but it would be cool to see these kinds of plots!
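
The plot the comment asks for is easy to produce offline with a toy model of the system. A sketch, where the unit-mass "plant" and all gains are made up purely to show the error-over-time idea:

```python
def simulate_pid(kp, ki, kd, target=100.0, steps=200, dt=0.01):
    """Drive a unit-mass 'cursor' toward target with PID; return the error trace."""
    pos = vel = integral = 0.0
    prev_err = target - pos
    errors = []
    for _ in range(steps):
        err = target - pos
        integral += err * dt
        deriv = (err - prev_err) / dt
        force = kp * err + ki * integral + kd * deriv
        prev_err = err
        vel += force * dt  # unit mass, no friction
        pos += vel * dt
        errors.append(err)
    return errors
```

Plotting this trace for a few gain sets makes overshoot and oscillation visible at a glance, which is exactly the tuning feedback the comment describes.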

    • @KamalCarter
      @KamalCarter  2 years ago +14

      First off, thanks for watching. Yeah, I have some plots of those errors; I didn't think they made for interesting content, but I will write a technical post about that. And when I make a V2 I can add that in.

    • @youngwang5369
      @youngwang5369 2 years ago +3

      I think an even bigger improvement could be made by using optimal control techniques instead of PID. In the video, it looks like the robot tends to overshoot some targets when it moves large distances. Maybe using LQR or something and optimizing to minimize some objective would give a much better system response than tuning PID by hand.

    • @Cassiusisback
      @Cassiusisback 3 months ago

      I just posted this elsewhere; I see now that this would have been the right place.
      CNC servo motors are controlled with 3 PIDs: position -> PID -> speed -> PID -> torque -> PID -> duty cycle. Sounds complicated, but it makes tuning much easier in the end.

  • @Omar-mm6ms
    @Omar-mm6ms 2 years ago +5

    Very cool! The fastest way to get a screenshot is to copy directly from the game’s back buffer by hooking the Present function in the D3D render pipeline. This involves writing a small library and injecting the DLL into the game’s process. A much simpler method involves using the Windows GDI API. It’s not as fast, but you can still reach a few hundred FPS. You’d probably be able to achieve much higher scores by improving your kinematics. Directly driving two axes rather than using omniwheels will reduce slipping and allow you to increase acceleration.

    • @KamalCarter
      @KamalCarter  2 years ago +2

      Wow, thanks for watching and for the information; the screenshotting tips sound really helpful and I will look into them. My V2 will still use wheels, but down the line I definitely want to try a 2-axis gantry.

    • @sma2981
      @sma2981 3 months ago

      Does that mean a kernel-level anti-cheat would certainly detect that?

  • @shabanfarooq5833
    @shabanfarooq5833 3 months ago +1

    The extent to which it can still be improved makes me think about the reach of robots' and AI's powers. And bro just casually made it.

  • @Hullbreachdetected
    @Hullbreachdetected 3 months ago +2

    Legends say even the game developers approved of his innovative idea.

  • @Kaizien
    @Kaizien 2 years ago +1

    @kamal carter: I love the hardware aspect of this project, but I see the microcontroller you're using and I was wondering what the point of the mouse was? Couldn't we just send HID mouse/keyboard commands from the microcontroller to emulate the mouse? It would remove several of the variables, right?

    • @KamalCarter
      @KamalCarter  2 years ago +1

      Yeah, I really wasn't trying to do it the best way; I just wanted to see if this idea could work with limited resources. The better way is definitely to spoof the mouse inputs.

  • @sailorgaijin8838
    @sailorgaijin8838 2 years ago +1

    And people thought learning ML via C++ was tough. Your dedication is insane, keep up the good work.

  • @erickdelgado8515
    @erickdelgado8515 2 years ago +1

    Wassup Carter! Found this video on TikTok and I'm super glad I found the channel. Super excited to watch more of your content.

  • @pedrohenriquefischer
    @pedrohenriquefischer 2 years ago +29

    Use a gaming mouse with a good switch and a good sensor. The higher DPI will help the robot's tracking. By using inhumanly high sensitivity you can get the robot to aim with small (but precise) movements

    • @resarier4779
      @resarier4779 2 years ago +1

      Yeah bro, also higher sens allows the mouse to hit a target without moving a long distance, which will speed it up a bit.

  • @icelezz5253
    @icelezz5253 1 year ago +1

    First of all, this creation is fantastic! I got that exact idea myself like a year ago. If you are still "working" on this, or will in the future, I have a thought (and to point out, I have close to no expertise in this area). Would it be an option to use a trackball mouse and wire the controller board to the 2 sensors? That way you could input the exact number of movements (like a stepper motor?) and also counteract the gravity of the device itself. Just something I came to think about as I stumbled on this video.
    Awesome work. Good video. 🙂👌

    • @KamalCarter
      @KamalCarter  1 year ago +1

      Yeah, I plan on using a trackball in the future; I've been so busy with other things, but thank you for the suggestion

  • @robertbriese7349
    @robertbriese7349 2 years ago +3

    wow, what a nice piece of robot you made 😳👍
    I have a few ideas for the V2 if you want 😊

    • @KamalCarter
      @KamalCarter  2 years ago

      Let me hear them! I have ideas in my head and am working on some things but would love to hear what others say.

  • @머쓱타드-w4t
    @머쓱타드-w4t 2 years ago +3

    how can a human being be this creative

  • @evanbarcells1371
    @evanbarcells1371 2 years ago +1

    And this was the humble beginning of Mr. Glass

  • @MintMilk.
    @MintMilk. 3 months ago +4

    0:01 "whole round of ammo" 💀

  • @ChakaHamilton
    @ChakaHamilton 2 years ago +10

    Glad to see you're back creating content, hopefully on a more regular basis.

  • @martijn3463
    @martijn3463 2 years ago +5

    would love to see you develop this further

  • @RandoNetizen27
    @RandoNetizen27 2 years ago +7

    Not a Python developer, but have you tried the Pillow module to take the center of the screen instead of the whole screen as a screenshot? I just imagine there's a lot of dead space around the edges that could be cropped to make the image smaller for faster encoding. Possibly faster to use .PNG encoding, but not 100% sure on that.
    from PIL import ImageGrab
    ss_region = (300, 300, 600, 600)  # (left, top, right, bottom) in pixels
    ss_img = ImageGrab.grab(ss_region)  # capture only this region
    ss_img.save("SS3.png")

  • @Syntox94
    @Syntox94 2 years ago +2

    It's like watching a video from "Stuff Made Here", similar style, talking etc.

  • @tortiboy142
    @tortiboy142 3 months ago +3

    I think that you could get a couple more points by optimising the targeting algorithm. Instead of targeting the closest circle, you could turn this into a traveling salesman problem to select the shortest path.
    Because there are only 3 circles visible at once, there are only 6 (3!) possibilities.
    You could add those paths into an array, sort it by total distance, pick the shortest path, and repeat after every hit.
    Also, tuning the values for the mouse sounded like something you could also turn into an ML project
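
With only three targets, brute force over all 3! = 6 visiting orders is instant. A minimal sketch of that idea (coordinates are arbitrary screen positions):

```python
from itertools import permutations
from math import hypot

def shortest_path(cursor, targets):
    """Return the visiting order that minimizes total travel from cursor."""
    def length(order):
        total, pos = 0.0, cursor
        for t in order:
            total += hypot(t[0] - pos[0], t[1] - pos[1])
            pos = t
        return total
    return min(permutations(targets), key=length)
```

Since new circles spawn after every hit, the plan would be recomputed each time, as the comment suggests.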

  • @hectatusbreakfastus6106
    @hectatusbreakfastus6106 2 years ago +3

    I just heard about this. Pretty impressive skills you are showing off. Great work!

    • @KamalCarter
      @KamalCarter  2 years ago

      Thank you for watching. More content to come!

  • @MicroplaysMC
    @MicroplaysMC 2 years ago +1

    "The crux of it was, I was changing three numbers around for days" - yep, sounds about right.

  • @111Crytek
    @111Crytek 3 months ago +1

    Great work! A couple of suggestions:
    First, the OpenCV library in Python is easy to use and set up, but if you manage to code it in C++ you would improve the performance.
    Second, as far as the PID controller goes: I don't know the gains you set, but it's important to have the correct sample time (for you, the frequency at which you execute the PID) since it influences the effectiveness of the control. Moreover, in your case it might be enough to have a PD controller, which basically has the integral gain set to zero (or just a very small value compared to the other two). The proportional part gives you a fast response, while the derivative keeps the mouse from going past the target.
    However, again, great work! I love it!
    Edit: I realized too late that the video is old.. lol

    • @KamalCarter
      @KamalCarter  3 months ago

      @@111Crytek hey, I love suggestions; the video is old, but there is definitely room for improvement

  • @teal.9710
    @teal.9710 2 years ago +7

    Bro you have potential on YouTube, keep making these bangers

  • @TheDanteBoots
    @TheDanteBoots 3 months ago +1

    You could strip the mouse down to just its sensor and board for a smaller footprint. It'd probably be more accurate with its reduced weight minimising inertia. You could also suspend it in place and have a small sliding base under the sensor.

  • @RealButcher
    @RealButcher 3 months ago

    Wow... what a nice piece of wheels... cleverrrr ❤
    And also, nice coding!

  • @GambleCasts
    @GambleCasts 2 years ago +5

    Great robot man. Great video pacing and editing too

  • @mightyfp
    @mightyfp 2 years ago +15

    Old trackball mice might be better for this as the mechanics required can be smaller and fit in the void for the ball. In reality you could also just disassemble the mouse completely to get to the bar rollers...

  • @parahax5878
    @parahax5878 2 years ago +8

    put that robot on a gaming chair it will hit 200k no cappp

  • @gyntergyntersen9998
    @gyntergyntersen9998 3 months ago +1

    Awesome job, man. Keep it up. I'd like to see more stuff like that

  • @BellCranel2030
    @BellCranel2030 2 years ago +2

    Nice 😂 nice idea. Would like to have something like that too 😂

  • @jacksonashby7471
    @jacksonashby7471 2 years ago +2

    when the terminators attack we gotta call tenz

  • @mukulkumar8681
    @mukulkumar8681 2 years ago +14

    You just need to detect a colored pixel, so one optimization is the size of the screen capture you take (resolution): make it as small as possible while keeping detection accuracy high, so as to reduce the computational load. Maybe as low as 128x128 for a 1:1 display.
    Let me know how this may affect your performance
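
The idea above is to detect on a shrunken copy of the frame and map the hit back to full resolution. A pure-Python nearest-neighbor sketch (a real pipeline would use an image library's resize):

```python
def downscale(frame, factor):
    """Keep every factor-th pixel in both axes (nearest-neighbor shrink)."""
    return [row[::factor] for row in frame[::factor]]

def detect_downscaled(frame, factor, is_target):
    """Find the first matching pixel in the small frame; map back to full res."""
    small = downscale(frame, factor)
    for r, row in enumerate(small):
        for c, px in enumerate(row):
            if is_target(px):
                return r * factor, c * factor
    return None
```

The trade-off is real, though: every halving of resolution throws away detail, so small targets can fall between the kept pixels.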

    • @vulpix9210
      @vulpix9210 2 years ago +1

      the problem with this is a real game: the more you lower the resolution, the less information you have, which may cause the robot to not detect a head

  • @natedagrate227
    @natedagrate227 2 years ago +11

    You could try using a 3d printer gantry to move the mouse faster and more precisely, the mouse could be held stationary while the mousepad moves under it

  • @abyssamun588
    @abyssamun588 2 years ago +5

    Great job; as a vision engineer, I'm really impressed! I don't use Python/open source in my industrial projects, but maybe OpenCV in C++ is faster?
    The modeling in discrete automation is really math-heavy; you've done a great job finding the values by trial and error!

    • @KamalCarter
      @KamalCarter  2 years ago

      Hello, thank you for your kind words, especially from someone that does computer vision for a living. How would you go about simplifying this model of the system? At the moment it's definitely nonlinear, and wheel slippage will be a problem. Would you first get it into state-space form?

    • @SafiUllah-jo9gr
      @SafiUllah-jo9gr 2 years ago

      Totally agreed, C++ will make a better program in terms of speed, and of course an AMAZING PROJECT! Love it!

    • @chawakornchaichanawirote1196
      @chawakornchaichanawirote1196 2 years ago +2

      OpenCV-Python is just a wrapper around the C++ library, so switching shouldn't actually be much faster, provided you are using numpy and not native Python features

    • @KamalCarter
      @KamalCarter  2 years ago +1

      @@chawakornchaichanawirote1196 correct, the CV was very fast thanks to the Python wrapper

  • @Il_Principessa
    @Il_Principessa 3 months ago +1

    I swear engineering people are the ones we shouldn't be messing with

  • @SparksandCode
    @SparksandCode 2 years ago +1

    Great job. I had this same idea a few months ago, then moved it down on my list. Using wheels is a great idea. Again, great job.

  • @labradoodlesilver3756
    @labradoodlesilver3756 1 year ago +1

    I also do computer vision stuff, and I usually cv2.split(frame) to get the blue, green and red frames, and then just get the mask with cv2.subtract(g, b) so that only green stuff remains

    • @labradoodlesilver3756
      @labradoodlesilver3756 1 year ago +1

      I find it faster than HSV thresholding; I work on computer vision for FRC competitions

    • @labradoodlesilver3756
      @labradoodlesilver3756 1 year ago

      then you can blur or erode the mask to filter out noise, then get the contours, centers of contours, and so on
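
The channel-subtraction trick works because cv2.subtract saturates at zero: grey and blue-ish pixels vanish, and only pixels where green clearly exceeds blue survive. A pure-Python equivalent of the mask step (the threshold value is illustrative):

```python
def green_mask(frame, threshold=50):
    """Per-pixel max(g - b, 0), binarized at threshold (255 = target)."""
    return [
        [255 if max(g - b, 0) > threshold else 0 for (r, g, b) in row]
        for row in frame
    ]
```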

  • @noirnit.
    @noirnit. 2 years ago +4

    I had a similar idea but with a metal frame: when you move one end of the frame to the other it does a 360 turn, so you can get more consistent results with a steadier frame for faster flicks

  • @Google_Does_Evil_Now
    @Google_Does_Evil_Now 3 months ago +1

    Kamal, you could apply to NASA or similar. You've got a natural interest, you can solve the problem, you can get the results. You achieve.

  • @ZennySilverhand
    @ZennySilverhand 2 years ago +2

    Really cool! I’d suggest maybe using some small brushless motors. It will be a more complicated wiring setup but likely would be quite a bit quicker and more accurate. I’d also try more securely mounting the mouse to the plate, it seems there’s slight wiggle which could cause lower accuracy. Well done!

    • @KamalCarter
      @KamalCarter  2 years ago +1

      Wait, you might be on to something; I am going to look into brushless motors, they would be a lot better. One thing though: I doubt they get a lot of torque; the motors I used are geared down a lot to do this.

    • @ZennySilverhand
      @ZennySilverhand 2 years ago

      @@KamalCarter nah they’ll have lots of torque. That’s really what they’re for, speed and power.

    • @schlomoshekelstein908
      @schlomoshekelstein908 2 years ago +2

      @@ZennySilverhand why brushless motors instead of stepper motors like those used on a CNC machine? Also, take the mouse apart and just use the guts; no need for the extra weight of the case

  • @chiconosauh
    @chiconosauh 2 years ago +1

    I want to see it vs people in game. Did you use a cam to track it or just record the screen?

  • @localdude2979
    @localdude2979 3 months ago +1

    This whole video was entertaining af

  • @michaelgiglia2158
    @michaelgiglia2158 2 years ago +2

    To comment on the screenshot thing: you might not need a faster screenshot process if you do trajectory planning on each screenshot, then use PID (or cascade control, my favorite) to track the trajectory. This may also give you an even higher score, as you can plan an optimal path so you never have to slow down the mouse; it's always moving. I always wanted to do this but with a stereo camera looking at the screen. That may also increase your screenshot refresh rate.

    • @KamalCarter
      @KamalCarter  2 years ago +2

      Hmm 🤔 pointing the camera at the screen would be cool. It would increase the "it's just like a human playing" aspect. Yeah, I'm also thinking about doing more advanced/optimal controls in the future; the jerkiness of the PID is definitely not sufficient if I want the hardware not to implode.

    • @michaelgiglia2158
      @michaelgiglia2158 2 years ago

      @@KamalCarter can't wait to see it :P

  • @cosmilitebar6772
    @cosmilitebar6772 2 years ago +1

    What were you using to take those screenshots? The fastest way known to me is the library "mss". Also, you could use something like YOLOv5 (or YOLOv7, but I didn't test that; most likely better overall) to get those screen coords. I know machine learning is a bit overkill for shooting at same-colored spheres, but it's GPU-accelerated and probably much faster. Plus you could train it to recognize and shoot players. (I did exactly that.)

    • @KamalCarter
      @KamalCarter  2 years ago +1

      Yes, I used mss for this demo; I found a new library called dxcam that worked really fast. And you're going to like my next video if you know about YOLO algorithms

  • @emolasher
    @emolasher 3 months ago +1

    3D printers cap out at 5-6 meters per second. They are incredibly accurate too. I suggest modifying your setup to use this tech: gut the mouse and just have the laser attached to the print head. Lock the Z axis, and set some real records!!

  • @Joe-zw9ep
    @Joe-zw9ep 2 years ago +2

    Did you use Brett Beauregard's PID library or made your own? :)

    • @KamalCarter
      @KamalCarter  2 years ago +1

      I did my own PID. I have done PID tuning enough to not need a library unless I have a lot of gains going.

  • @mar504
    @mar504 2 years ago +1

    Really cool project! Thanks for sharing!

  • @jellobob3103
    @jellobob3103 3 months ago +1

    I've never seen the subscribe button glow like that when you said like and subscribe, lol, that was cool

  • @RevvingInPeace
    @RevvingInPeace 3 months ago

    The pro player training session was insane

  • @Blank_Immortal
    @Blank_Immortal 4 months ago +1

    It's really impressive to see your robot hit 118k even with a burnt-out motor. Before I stopped playing FPS games my Aim Lab score sat around 103k, though I am no expert by any means; realistically anything higher than 100k isn't necessary for anyone at all.

  • @Flame_Playz2019
    @Flame_Playz2019 2 years ago +2

    Props for not using this strong aim bot in game :)

    • @KamalCarter
      @KamalCarter  2 years ago

      Thanks man. But you might not like what I do in the next video.

  • @MidoriShibuya
    @MidoriShibuya 2 years ago +1

    Honestly, this is crazy.

  • @guy16year4month6
    @guy16year4month6 2 years ago +1

    This is why you shouldn't mess with engineers

  • @dukedub
    @dukedub 2 years ago +1

    Yo, this is an awesome project! Great work

    • @KamalCarter
      @KamalCarter  2 years ago

      Thank you so much, glad you liked it!

  • @SL33py45H
    @SL33py45H 3 months ago +1

    Bruh, why didn't YouTube recommend this much sooner? I subbed for the work you put into this video

  • @NeelixMaster
    @NeelixMaster 3 months ago +10

    Actually, people didn't win; if you give the robot a better mouse and faster click and rotation speed, it could dominate e-sports players too

  • @hul1ann
    @hul1ann 4 months ago

    this is really cool! If your gameplay gets audited, most games will ban you for dual mouse inputs though; spoofing an Arduino as a mouse fixes this. In your case you might need 2 Arduinos + host shields, since you have a physical robot rather than just sending the Arduino serial commands.

  • @nunyamyob9555
    @nunyamyob9555 3 months ago +1

    I welcome our robot masters.

  • @congriofish
    @congriofish 2 years ago +1

    You went the extra mile using those motors instead of just modifying the mouse lol. That was impressive

  • @biomechanism1
    @biomechanism1 2 years ago +1

    Did you explore moving the surface below the mouse to make the pointer move?

  • @ryssesabel2750
    @ryssesabel2750 3 months ago +1

    you're a genius, motivation 101

    • @KamalCarter
      @KamalCarter  3 months ago

      Thank you for the kind words

  • @martinsmith331
    @martinsmith331 2 years ago +1

    This video deserves 1 million views

  • @corneliusblackwood9014
    @corneliusblackwood9014 3 months ago +1

    I've done 91,876; definitely not as good as I used to be, but I'm 40 years old with arthritis.
    That's what 30 years of video games gets you.

  • @dubber889
    @dubber889 2 years ago +1

    Why did you cut the wheel-encoder cable on the motors? You could achieve better wheel-speed control by using that.

    • @KamalCarter
      @KamalCarter  2 years ago

      Wow, you noticed that 😲. Yeah, I could have done a lot knowing the position and velocity from the encoders, but once the controls started working the encoders were overkill

  • @paulmeynell8866
    @paulmeynell8866 3 months ago +1

    That's brilliant, great work

  • @cluelessblueberries
    @cluelessblueberries 2 года назад +1

    I have subbed, bro keep it up!!

    • @KamalCarter
      @KamalCarter  2 года назад

      Thanks for watching and subbing

  • @tony_0878
    @tony_0878 2 года назад +2

An auto-aiming mousepad feels like it would be a good idea.
That way the motors wouldn't have to bear the weight of the mouse anymore.

  • @henryteng5191
    @henryteng5191 2 года назад +1

    amazing, this video deserves a like and a sub

  • @nicholastanzola488
    @nicholastanzola488 2 года назад

Came from TikTok, your content is right up my alley🤙

    • @KamalCarter
      @KamalCarter  2 года назад

      Appreciate all my TikTok viewers hopefully I can keep producing content you enjoy.

  • @matejlipa2913
    @matejlipa2913 3 месяца назад +25

    Where is V2 ?

    • @Sillybandsrock
      @Sillybandsrock 3 месяца назад

      Actually sounds like a great idea

    • @dreadfulman5191
      @dreadfulman5191 3 месяца назад

      He already made it. It's the second most recent video, called undetectable AI robot aimbot

  • @julianpetit4180
    @julianpetit4180 2 месяца назад +1

I believe Sentdex (on YT) has made a GTA bot. He also ran into slow screen capture but solved it somehow.

  • @Spork888
    @Spork888 2 года назад +1

    Multiplayer's days are numbered

  • @KlarenceMS
    @KlarenceMS 2 года назад +1

Yes, do more videos about this; it could destroy p2w games

  • @MrTropicalFusion
    @MrTropicalFusion 2 года назад +4

    That was fantastic! I'd love a 2nd channel where you dive into the code of things. Subbed!

  • @chain3519
    @chain3519 2 года назад +6

    Did you use the root locus method for your PID tuning? You may be able to squeeze a little more performance out of it by optimizing rise and settling time

    • @KamalCarter
      @KamalCarter  2 года назад +3

Wow, root locus, that's a term I haven't heard in a minute 😂. No, I did not do root locus because I did not want to find the transfer function of the system. But if I were really engineering this, I would definitely go down a path like that.

    • @chain3519
      @chain3519 2 года назад

I've been thinking about it. Since you already have it, you could just back out the transfer function of your plant. Then maybe the Ziegler-Nichols method would work well. It's really simple, you just need to find the open-loop gain. No root locus necessary

    • @KamalCarter
      @KamalCarter  2 года назад

@@chain3519 Hmm, intriguing. I guess I can assume this is a second-order system, find the damping ratio and natural frequency by reading the response-time plots, and build the model that way. It's an interesting proposition, more research-focused than perhaps a YouTube video, but if I come across the data I will send it your way.

    • @chain3519
      @chain3519 2 года назад +1

@@KamalCarter Second order would definitely cover it. I think something like that could work, but seeing as you already have something that works...
My suspicion is you could back out that stuff with a ruler and an iPhone using slo-mo. Then it's just a quadratic fit in Excel or something (m·a + c·v + k·x = F → X(s)/F(s) = 1/(m·s² + c·s + k), making sure m matches what is measured on a scale).
Ziegler-Nichols probably would have just been a really nice starting point for the tuning you did anyway.
Also thought of a solution that is simpler than anything we thought of / you did: use an Arduino Pro Micro that poses as a mouse. Have it draw straight lines to the target and inject noise into the path so it's harder to detect. What you did is a lot more fun though.

    • @KamalCarter
      @KamalCarter  2 года назад

@@chain3519 Yeah, I can get odometry data from the servos for the tracking part. Also, your last point is spot on: I could have just spoofed mouse inputs and made it seem as real as possible, but why do that when I can just literally make the mouse move.
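
    For readers following the tuning discussion in this thread, here is what the classic Ziegler-Nichols rules plus a minimal positional PID look like. Ku and Tu would come from the oscillation experiment described above; none of the numbers here are from the video:

    ```python
    def ziegler_nichols_pid(ku: float, tu: float):
        """Classic Z-N PID gains from the ultimate gain Ku (proportional gain
        at which the loop oscillates steadily) and oscillation period Tu."""
        kp = 0.6 * ku
        ki = 1.2 * ku / tu        # Kp / Ti, with Ti = Tu / 2
        kd = 0.075 * ku * tu      # Kp * Td, with Td = Tu / 8
        return kp, ki, kd

    class PID:
        """Minimal positional PID controller, a starting point for tuning."""
        def __init__(self, kp: float, ki: float, kd: float):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, error: float, dt: float) -> float:
            self.integral += error * dt
            derivative = (error - self.prev_error) / dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative
    ```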

  • @Synaptic_Synthesizer
    @Synaptic_Synthesizer Год назад

    A few ways to take screenshots in Python:
    Using the PyAutoGUI library: PyAutoGUI is a cross-platform Python library that can automate keyboard and mouse actions, including taking screenshots.
    Using the PIL library: the Python Imaging Library (PIL) provides a way to capture the screen on Windows, but it requires installing the third-party Pillow package.
    Using the mss library: MSS (Multi-Screen Shot) is a cross-platform library that can be used to take screenshots in Python.
    (Just let me know if you need any starter code for this... I will be glad to help)
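
    As a concrete starting point for the libraries listed above, a capture loop with mss usually follows the commented pattern below. Since a display isn't always available, this sketch times a stand-in grab function instead; the mss calls in the comment are its standard usage, everything else is illustrative:

    ```python
    import time

    def fake_grab():
        """Stand-in for sct.grab(monitor) so the timing harness runs anywhere."""
        return [[0, 0, 0, 255]] * 4   # a tiny dummy "frame"

    def measure_fps(grab, frames: int = 100) -> float:
        """Rough frames-per-second of a screen-capture callable."""
        start = time.perf_counter()
        for _ in range(frames):
            grab()
        return frames / (time.perf_counter() - start)

    # Real usage with mss (pip install mss) would be roughly:
    #   import mss
    #   with mss.mss() as sct:
    #       monitor = sct.monitors[1]           # primary monitor
    #       fps = measure_fps(lambda: sct.grab(monitor))
    ```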

  • @vitorfox
    @vitorfox 2 месяца назад +1

    That is amazing!!!

  • @recklessroges
    @recklessroges 3 месяца назад +1

    Nicely done.

  • @parkjammer
    @parkjammer 2 года назад +1

    Fun project, well and interestingly done!

  • @Prototiphrom
    @Prototiphrom 3 месяца назад

A human trains for a whole life in Aim Lab; the robot almost beats them after a few days of adjustments

  • @ChunkyROX
    @ChunkyROX 2 года назад +1

What you proved here is the concept, and it works. I have no doubt that you can significantly improve the performance once you start min-maxing each factor. Outside gaming: is there a use case where you want the control to come from an external device (like a mouse/joystick) and not be programmed via an application in the OS?

    • @KamalCarter
      @KamalCarter  2 года назад +1

No real use case, just thought it would be fun. Yeah, it could be significantly improved, but for this version I liked how it all came together

  • @SyntheticFuture
    @SyntheticFuture 2 месяца назад +1

This with an arm would probably rip. I think the wheels and traction, as cool as they are, add a lot of inaccuracy and slowness to the entire system. Still sick though 😁

  • @siaaaae
    @siaaaae 2 года назад +1

    Sometimes your opponent is having a really good day.

  • @IplayGames-ep2wy
    @IplayGames-ep2wy 3 месяца назад +1

This guy's content is underrated🙌👏👏
Idk why he stopped uploading tho

  • @koray4550
    @koray4550 Год назад +1

A fast way to take screenshots in Python would probably be dxcam; it's pretty much the fastest way to do screenshots.