robot mania
robot mania
  • Videos: 70
  • Views: 430,469
YOLOv8 OBB Training and ROS Implementation
In this tutorial I explain how to train a custom YOLOv8-OBB model and implement it in ROS2.
Ubuntu 22.04
ROS2: Humble
The project is here:
drive.google.com/drive/folders/1Mw3TFLMtiMTCbRAGX8L5hWzXaIctwXgd?usp=sharing
How to Use YOLOv8 with ROS2:
ruclips.net/video/XqibXP4lwgA/видео.html
labelImg2:
github.com/chinakook/labelImg2
yolov5-utils:
github.com/otamajakusi/yolov5-utils
ros-drivers/usb_cam:
github.com/ros-drivers/usb_cam
Views: 553

Videos

Path Tracking Algorithm Simulation in ROS2
Views: 819 • 28 days ago
In this tutorial I explain the basics of two path tracking algorithms: Pure Pursuit and Model Predictive Control. Ubuntu 24 ROS: Jazzy The project is here: drive.google.com/drive/folders/1whczoN0XLgl3wqbGHqMemEAvdWoba8Ep?usp=sharing How to Create GUI for Robot State Visualization and Control (Gazebo Harmonic, ROS2 Jazzy and pyqt): ruclips.net/video/u54WAlAewMU/видео.html MPC: atsushisakai.github.io...
How to Create GUI for Robot State Visualization and Control (Gazebo Harmonic, ROS2 Jazzy and pyqt)
Views: 894 • A month ago
In this tutorial I show how to convert gazebo topics to ROS topics and use them to create a simple GUI. ROS2:Jazzy Gazebo Harmonic The project is here: drive.google.com/drive/folders/1qRyeqFk1crV68H3dioz4YGdWjbULWmTh?usp=sharing How to Use ROS2 Jazzy and Gazebo Harmonic for Robot Simulation ruclips.net/video/b8VwSsbZYn0/видео.html Gazebo Sim API Reference 8.5.0 gazebosim.org/api/sim/8/classgz_1...
How to Use ROS2 Jazzy and Gazebo Harmonic for Robot Simulation
Views: 3.6K • 2 months ago
In this tutorial I explain how to use the newly released ROS2 Jazzy and Gazebo Harmonic to do robot simulation. OS: Ubuntu 24.04 ROS2: Jazzy Gazebo-Harmonic The project is here: drive.google.com/drive/folders/1Dlog1E_v9GqShoheGZMatSKFkB045kdi?usp=sharing Simulation of a 4WS Robot Using ROS2 Control and Gazebo: ruclips.net/video/VX53gAXafUA/видео.html ROS2 Jazzy installation link: docs.ros.org/e...
Comparing On-policy and Off-policy Methods in Reinforcement Learning Using a Simple Simulation
Views: 644 • 3 months ago
In this tutorial I am doing experiments using the well-known on-policy and off-policy algorithms to understand their differences. Ubuntu: 22.04 LTS ROS: Humble Gazebo Classic The project is here: drive.google.com/drive/folders/1V6svAdVtXhJuUAUx7sBPv6bQY9zmg-E2?usp=sharing 4WS-tutorial ruclips.net/video/VX53gAXafUA/видео.html
Using Transformer in a Multi-Agent Cooperative Task
Views: 670 • 3 months ago
In this tutorial I attempt to use a Transformer network to solve a cooperative multi-agent reinforcement learning problem. The training itself was not successful, but this work should give direction for future success. Ubuntu 22.04, ROS2 Humble, Gazebo Classic The project is here: drive.google.com/drive/folders/1h4eXUNd2M1AmCdYJCJWODMwy-_CB1QAp?usp=sharing Simulation of a 4WS Robot Using ROS2 C...
Obstacle Avoidance with Imitation Learning
Views: 2.1K • 4 months ago
In this tutorial I explain how to use an imitation learning technique to train a robot to avoid obstacles while heading to the goal. Ubuntu 22.04 LTS ROS: Humble The project is here: drive.google.com/drive/folders/1qm7ogub1sXPb8kOupMs7jDWufHvciPqY?usp=sharing
How to Make a Simple Surveillance System Using Yolov9 with Triton Inference Server
Views: 1.2K • 5 months ago
In this tutorial I explain how to use Triton Inference Server, Python and a camera to create a simple surveillance system. The project is here: drive.google.com/drive/folders/1if2tJdLm2c2fptmTfhs9iDD8jSyZA8Bl?usp=sharing
Object Recognition with Orin Nano (Jetpack 6.0) Using YOLOv9, TensorRT and RealSense
Views: 3.7K • 5 months ago
In this tutorial I explain how to do inference with Orin Nano (Jetpack 6.0) using Yolov9, TensorRT and a RealSense camera. The project is here: drive.google.com/drive/folders/1V6V4o_e-ocI43Rplpg_08evZQcFhk2QL?usp=sharing Yolov9 repository: github.com/WongKinYiu/yolov9 yolov9-tensorrt repository: github.com/LinhanDai/yolov9-tensorrt Installing PyTorch for Jetson Platform: docs.nvidia.com/deeplea...
Cooperative Multi-Agent RL Simulation Using ROS2 and Gazebo
Views: 901 • 6 months ago
In this tutorial I explain the basic theory of VDN and try to solve a cooperative MARL problem. ROS2: Humble The project is here: drive.google.com/drive/folders/1qEd6kn_sTZN1YTe0fS2C3ANwRagQAHZo?usp=sharing
Competitive Multi-Agent RL Simulation Using ROS2 and Gazebo
Views: 1.4K • 7 months ago
In this tutorial we will train two robots (agents) to play a simple game against each other. ROS2: Humble The project is here: drive.google.com/drive/folders/1tXbfbfvEk38sYa8R2A_gSQ_gJQTUOIKt?usp=sharing
Multiple Robot Simulation in Gazebo Using ROS2 and ROS2 Control
Views: 1.7K • 7 months ago
In this tutorial I explain how to simulate multiple robots that use ros2_control in Gazebo. ROS2: Humble The project is here: drive.google.com/drive/folders/1cz_JNckrTkZL4eVs8M5ulcQZQDDprm4q?usp=sharing
Object detection using Yolo3D with ROS2
Views: 2.5K • 8 months ago
In this tutorial I explain how to use Yolo3D with ROS2. ROS: Humble The project is here: drive.google.com/drive/folders/1SyyDtQC7LpSIld-jmtkI1qXXDnLNDg6w?usp=sharing YOLO3D: github.com/ruhyadi/YOLO3D
Surround View with ROS2
Views: 511 • 9 months ago
In this tutorial I explain how to create a surround view using ROS2. Ubuntu 22.04 LTS ROS: Humble A joypad is required for this simulation. The project is here: drive.google.com/drive/folders/1z0a4BmmG9AFv2um4gavnhM-vRU67-q-A?usp=sharing Surround view theory: arxiv.org/pdf/2205.13281.pdf surround view system introduction repository: github.com/hynpu/surround-view-system-introduction/blob/master/doc/...
How to use DeepStream with Jetson Orin Nano and ROS2
Views: 7K • 10 months ago
In this tutorial I explain how to use DeepStream with Jetson Orin Nano. JetPack version: 5.1.2 DeepStream SDK version: 6.3 The project is here: drive.google.com/drive/folders/1zmRSXi7I7zGK0HpKS1Y6N2ZWUqRwCk8m?usp=sharing NVIDIA DeepStream SDK Developer Guide: docs.nvidia.com/metropolis/deepstream/dev-guide/ deepstream python apps github repository: github.com/NVIDIA-AI-IOT/deepstream_python_ap...
Object Recognition with Jetson Orin Nano using YOLOv8 and RealSense
Views: 17K • 10 months ago
Object Recognition with Jetson Orin Nano using YOLOv8 and RealSense
How to Operate Brushless Motors from a Radio Controller Using ROS2
Views: 1.4K • 11 months ago
How to Operate Brushless Motors from a Radio Controller Using ROS2
How to Control a Robot Using Smartphone and ROS2
Views: 1.8K • A year ago
How to Control a Robot Using Smartphone and ROS2
YOLOv8 Semantic Segmentation: Custom Class Training and Implementation
Views: 1.7K • A year ago
YOLOv8 Semantic Segmentation: Custom Class Training and Implementation
How to Control a Robot using WebRTC
Views: 2.6K • A year ago
How to Control a Robot using WebRTC
Oriented Object Detection using YOLOv5-OBB: Inference, Training Using GPU and Implementation
Views: 3.9K • A year ago
Oriented Object Detection using YOLOv5-OBB: Inference, Training Using GPU and Implementation
Oriented Object Detection using ROS2
Views: 2.5K • A year ago
Oriented Object Detection using ROS2
Object tracking with YOLOv8 using Jetson Nano
Views: 19K • A year ago
Object tracking with YOLOv8 using Jetson Nano
How to Use YOLOv8 with ROS2
Views: 17K • A year ago
How to Use YOLOv8 with ROS2
Autonomous Navigation with Deep Reinforcement Learning Using ROS2
Views: 19K • A year ago
Autonomous Navigation with Deep Reinforcement Learning Using ROS2
How to Create a Gazebo Model Using CAD and Blender
Views: 6K • A year ago
How to Create a Gazebo Model Using CAD and Blender
Creating a Point Cloud Using LIO-SAM with ROS2 and Gazebo
Views: 6K • A year ago
Creating a Point Cloud Using LIO-SAM with ROS2 and Gazebo
Navigation of an Omni Wheel Robot Using Only a Lidar
Views: 4.7K • A year ago
Navigation of an Omni Wheel Robot Using Only a Lidar
Simulation of a 4WS Robot Using ROS2 Control and Gazebo
Views: 14K • A year ago
Simulation of a 4WS Robot Using ROS2 Control and Gazebo
Object Recognition Using Yolov7 and TensorRT
Views: 6K • A year ago
Object Recognition Using Yolov7 and TensorRT

Comments

  • @begood5427
    @begood5427 13 hours ago

    Is it supported in ROS Jazzy?

  • @edwilliams9914
    @edwilliams9914 A day ago

    Thanks for another really useful tutorial. But I'm afraid I got lost at one point: at 3:30, where you list the terms for DLS, there is the term I which you don't tell us what that is. (Maybe it's some constant or something you would assume we would just know, but I'm self-taught so there are big things I'm missing.) What is I? Thanks for a really useful series of videos on ROS! I look forward to absorbing them all (if I can).

    • @robotmania8896
      @robotmania8896 A day ago

      Hi Ed Williams! Thanks for watching my video! “I” denotes the identity matrix. The term “I” is generally used for the identity matrix in linear algebra.
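The damped least squares (DLS) update this reply refers to can be sketched in plain Python. This is a minimal illustration of where the identity matrix I appears, not code from the video; the Jacobian and error values below are made up, and the 2x2 closed-form inverse is used only to keep the example dependency-free.

```python
# One DLS inverse-kinematics step: dq = J^T (J J^T + lam^2 * I)^-1 * e.
# The lam^2 * I term (identity matrix scaled by the damping factor) is what
# keeps the inversion stable near singularities.

def dls_step(J, e, lam):
    """DLS update for a 2x2 Jacobian J, task-space error e, damping lam."""
    # A = J J^T + lam^2 * I  (I adds lam^2 on the diagonal only)
    a = J[0][0]*J[0][0] + J[0][1]*J[0][1] + lam*lam
    b = J[0][0]*J[1][0] + J[0][1]*J[1][1]
    c = b
    d = J[1][0]*J[1][0] + J[1][1]*J[1][1] + lam*lam
    det = a*d - b*c
    # Closed-form inverse of the 2x2 matrix A applied to e: y = A^-1 e
    y0 = ( d*e[0] - b*e[1]) / det
    y1 = (-c*e[0] + a*e[1]) / det
    # dq = J^T y
    return [J[0][0]*y0 + J[1][0]*y1,
            J[0][1]*y0 + J[1][1]*y1]

# Made-up Jacobian and error, small damping
dq = dls_step([[1.0, 0.5], [0.0, 1.0]], [0.1, 0.2], 0.01)
```

With lam = 0 and an invertible J this reduces to the plain pseudo-inverse step; increasing lam shrinks the step, trading tracking accuracy for stability.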

  • @a.t10
    @a.t10 A day ago

    You are awesome. Thank you for everything that you do.

    • @robotmania8896
      @robotmania8896 A day ago

      Hi A. T.! It is my pleasure if my videos help you!

  • @elastico_007
    @elastico_007 2 days ago

    Hi, I modelled an Ackermann chassis car and spawned it in Gazebo Classic, but I can't spawn it in Ignition. I changed all the params from gz to ign.

    • @robotmania8896
      @robotmania8896 2 days ago

      Do you have any errors in the terminal?

    • @elastico_007
      @elastico_007 A day ago

      @@robotmania8896 Nope, nothing. Just the world loaded and the robot didn't load.

    • @robotmania8896
      @robotmania8896 A day ago

      @@elastico_007 Is your directory for mesh files correct? If the model does not show up, maybe the directory in the xacro file is wrong.

  • @AbdelkaderBelabed-j2d
    @AbdelkaderBelabed-j2d 3 days ago

    Hello, it's great work, thanks for sharing. I have a question: I trained DDQN for this robot for 24 hours, 2500 episodes, using the CPU. But sometimes the robot swings right and left in front of the goal_point or obstacles and does not move forward. Any tips to improve training? These are the hyperparameters used in the training: Lr = 0.001, Discount_factor = 0.99, tau = 1e-3, update target every 4 steps, Batch_size = 64

    • @robotmania8896
      @robotmania8896 2 days ago

      Hi Abdelkader Belabed! Thanks for watching my video! I think the hyperparameters are fine. I think this problem can be negated to some extent by giving a proper reward. You may make the reward bigger when the robot reaches the goal.

    • @AbdelkaderBelabed-j2d
      @AbdelkaderBelabed-j2d 2 days ago

      @@robotmania8896 OK, thanks, I will try it.

    • @AbdelkaderBelabed-j2d
      @AbdelkaderBelabed-j2d A day ago

      @@robotmania8896 It's working, thank you very much.

  • @StoneGeoffrey-n5n
    @StoneGeoffrey-n5n 4 days ago

    Viola Villages

    • @robotmania8896
      @robotmania8896 4 days ago

      Hi Stone Geoffrey! Thank you for watching my video!

  • @Myfcollbodybuilding
    @Myfcollbodybuilding 5 days ago

    Is there a way to measure the speed of a tracked moving object by using DeepStream?

    • @robotmania8896
      @robotmania8896 4 days ago

      Hi Yang Liu! Thanks for watching my video! If you know the real coordinates of the object you are tracking, it is easy to calculate its velocity. But I don't think there is a module in DeepStream to do that. Also, to do this, you have to use an RGBD camera.
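The velocity calculation mentioned in the reply can be sketched as follows. This is a minimal illustration assuming you already obtain timestamped 3D positions of the tracked object (e.g. from an RGBD camera); the sample values are made up.

```python
# Average speed of a tracked object between two (t, x, y, z) samples.
import math

def speed(p0, p1):
    """Return the average speed between two samples (t, x, y, z), in units/s."""
    t0, x0, y0, z0 = p0
    t1, x1, y1, z1 = p1
    dist = math.sqrt((x1 - x0)**2 + (y1 - y0)**2 + (z1 - z0)**2)
    return dist / (t1 - t0)

# Object moved 0.5 m along x in 0.25 s -> 2.0 m/s
v = speed((0.00, 1.0, 2.0, 3.0), (0.25, 1.5, 2.0, 3.0))
```

In practice you would average over several frames, since per-frame depth and detection jitter makes two-sample estimates noisy.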

    • @Myfcollbodybuilding
      @Myfcollbodybuilding 3 days ago

      @@robotmania8896 Thanks

  • @user-ur7kf3wf2l
    @user-ur7kf3wf2l 6 days ago

    Subscribed, hope it will work, because I'm having trouble with multi-robot control. I gave each robot a namespace, but I just cannot spawn the second controller (no matter the order). It seems that the resource manager for the second robot cannot recognize any hardware interface (the manager exists in the node list).❤❤❤

    • @robotmania8896
      @robotmania8896 5 days ago

      Hi 郭少! Thanks for watching my video! I hope this video will help you!

    • @user-ur7kf3wf2l
      @user-ur7kf3wf2l 5 days ago

      @@robotmania8896 Thank you for your response! Inspired by your video, I've solved the problem by adding the <robot_param_node> tag to the gazebo-ros2-control plugin part!

  • @CathyNorris-y7t
    @CathyNorris-y7t 6 days ago

    Lilly Track

    • @robotmania8896
      @robotmania8896 4 days ago

      Hi Cathy Norris! Thank you for watching my video!

  • @ileonrd
    @ileonrd 6 days ago

    I am using ros2-humble, and the biggest problem is the velodyne_simulator package because it is not compatible with ROS2 Humble. Has anyone found a solution for ROS2 Humble?

  • @alancyriac8610
    @alancyriac8610 7 days ago

    Thanks a lot for the video. I was able to complete it successfully using a keyboard instead of a joystick controller.

    • @robotmania8896
      @robotmania8896 7 days ago

      Hi Alan Cyriac! Thanks for watching my video! Glad to hear that! Sometimes people ask me whether it is possible to move the robot using a keyboard.

  • @FrancyLlamado
    @FrancyLlamado 7 days ago

    And may I ask why you did not use Roboflow for annotating your dataset? Is using labelImg for annotation faster?

    • @robotmania8896
      @robotmania8896 7 days ago

      You may use whatever tool is convenient for you. I prefer using tools without any logins or via the Internet.

  • @FrancyLlamado
    @FrancyLlamado 7 days ago

    Is there a way to generate synthetic data for YOLOv8-OBB?

    • @robotmania8896
      @robotmania8896 7 days ago

      Hi Francy Llamado! Thanks for watching my video! I have never done it, but you may use DALL·E to generate a dataset.

  • @elastico_007
    @elastico_007 7 days ago

    Hi, I'm modelling an Ackermann chassis car in Fusion and converting it to URDF. Can I put the steer joint and the rotation joint at the same place, with both joints perpendicular?

    • @robotmania8896
      @robotmania8896 7 days ago

      Hi ELASTICO! Thanks for watching my video! Yes, you can. I have several videos with an Ackermann steering mechanism. You can refer to the URDF files for those videos.

  • @user-zh6fe4co1v
    @user-zh6fe4co1v 8 days ago

    Hello, teacher! I have a question. I have a Jetson Orin Nano dev kit (JetPack 6). I want to operate an Intel RealSense with YOLO and ROS2 on the Jetson, but when I try, errors come up. Without the Jetson I can operate the RealSense with ROS2 and YOLOv8, but on the Jetson I can't. Can you check what the problem is, and what should I do? 😢 The error: when I run the code in the terminal, the connection doesn't exist. But when I check the Jetson's ports, I can see the RealSense is connected.

    • @robotmania8896
      @robotmania8896 8 days ago

      Hi 다비! Thanks for watching my video! What error exactly do you have?

    • @user-zh6fe4co1v
      @user-zh6fe4co1v 7 days ago

      @@robotmania8896 Can I get your email? I want to show my project's procedure... I have already done so much troubleshooting... and I want to cry ㅠㅠ Ha.... Basically: 1. install ROS Humble 2. install PyTorch, torchvision 3. build OpenCV with CUDA 4. install the ros-realsense wrapper (but I realized JetPack 6.0 doesn't support librealsense... so I should build from source, but even that gives errors; I guess JetsonHacksNano's librealsense is too old a version) 5. install the YOLOv8-ROS wrapper... The purpose is to operate an Intel RealSense D455, and if you give me your email I can explain better, because I can upload photos of the errors... please help me ㅠㅠ

    • @robotmania8896
      @robotmania8896 5 days ago

      Here is my e-mail: robotmania8867@yahoo.com

    • @mr.9489
      @mr.9489 4 days ago

      @@robotmania8896 Thank you! I sent an email! I'm 제천대성

  • @cihanokay3798
    @cihanokay3798 8 days ago

    Hello, thank you for the video. I followed each step in the video but I encountered a problem. In the controller tab, all of the commands work properly except "move the arm". When I click on it, the robot arm goes somewhere else. Also, how can I edit those commands so that I can control the robot arm as I wish? Maybe I can solve the problem that way. Thank you!

    • @robotmania8896
      @robotmania8896 5 days ago

      Hi Cihan Okay! Thanks for watching my video! If the arm moves in a wrong direction, probably the waypoint is not set properly. Please check whether the block position is recognized correctly. Yes, you can edit commands by modifying the “Pick_up_the_block” function in the “controller.py” script. If you would like to edit the UI, please edit the “controller_window.ui” file using the Qt Designer tool and convert it to a Python script.

  • @muhammadibrahim-og4lf
    @muhammadibrahim-og4lf 8 days ago

    Hello sir, do you have a Python version? I am working on a 3D model of an irrigation robot with four independently moving wheels in Gazebo. I hope the link you shared will work in ROS1 Noetic on Ubuntu 20. Thank you so much. I made the URDF of my model using Isaac Sim.

    • @robotmania8896
      @robotmania8896 8 days ago

      Unfortunately, I don't have a Python version for this tutorial, but you may use the code of the tutorial you originally commented on. You should be able to use the 4WS part with small modifications.

    • @muhammadibrahim-og4lf
      @muhammadibrahim-og4lf 8 days ago

      Thank you, sir, for your guidance and information

  • @petersobotta3601
    @petersobotta3601 8 days ago

    Awesome stuff! This is exactly what I need for my robot. Will it work on a Jetson Xavier using exactly the same steps?

    • @robotmania8896
      @robotmania8896 8 days ago

      Hi Peter Sobotta! Thanks for watching my video! Yes, it should work.

  • @GhOsT-id6qy
    @GhOsT-id6qy 9 days ago

    Hi, I have tried your launch files for a custom mobile robot. It seems to work fine in Gazebo, but when I check the TF tree, only the first robot gets TF. The other robots do not have TF. Any input would be great.

    • @robotmania8896
      @robotmania8896 5 days ago

      Hi GhOsT! Sorry for the late response. I visualized the TF tree using the “ros2 run rqt_tf_tree rqt_tf_tree” command. The TF tree for the first robot is shown normally, but the TF tree for the second robot is shown only up to “robot_2_base_link” due to the “TF_DENORMALIZED_QUATERNION” error. I am not sure what is causing this error in this case, since the robots are completely identical.

  • @MohamedAssanhaji
    @MohamedAssanhaji 10 days ago

    God bless you, king! I'm going to implement this for my visual servoing application with a Universal Robot 3. I implemented most of your tutorials. Any advice for implementing this one (how can I send the extracted position to the robot arm via Ethernet, using Python and YOLOv8 OBB + ROS)?

    • @robotmania8896
      @robotmania8896 9 days ago

      Hi Mohamed Assanhaji! Thanks for watching my video! If you would like to send information using Ethernet, you can use socket communication. Regarding the Ethernet communication, the code for this tutorial will help you. ruclips.net/video/40p2avodFV0/видео.html

  • @muhammadibrahim-og4lf
    @muhammadibrahim-og4lf 10 days ago

    Hello sir, please tell me, will it work in ROS1?

    • @robotmania8896
      @robotmania8896 9 days ago

      Hi Muhammad Ibrahim! Thanks for watching my video! No, this tutorial will not work with ROS1. But if you are interested in four-wheel steering with ROS1, this tutorial will work. ruclips.net/video/4_bftUbQTaM/видео.html

    • @muhammadibrahim-og4lf
      @muhammadibrahim-og4lf 9 days ago

      @robotmania8896 Thank you, sir. I am working on a 3D model of an irrigation robot with four independently moving wheels in Gazebo. I hope the link you shared will work in ROS1 Noetic on Ubuntu 20. Thank you so much.

  • @temitopeibrahimamosa2885
    @temitopeibrahimamosa2885 12 days ago

    Thank you for providing this tutorial and the code. Please guide me: the robot is loading upside down in Gazebo. It only loads correctly when ros2_control is installed, but then the robot fails to move.

    • @robotmania8896
      @robotmania8896 11 days ago

      Hi Temitope Ibrahim Amosa! Thanks for watching my video! What do you mean by “fail to move”? Are there any errors in the terminal?

  • @vihan2549
    @vihan2549 12 days ago

    Very useful videos. Please keep posting manipulator videos also.

  • @user-qp6uv3ut3s
    @user-qp6uv3ut3s 13 days ago

    Hello, thanks for this video, it is very helpful. I need your help: I have implemented YOLOv5 as an engine in TensorRT and used Deep SORT for tracking after that, but I ran into a low-memory situation and I can't continue. What should I do?

    • @robotmania8896
      @robotmania8896 12 days ago

      Hi Massar Sara! Thanks for watching my video! Try to close all unnecessary programs to free as much memory as possible, because editors and browsers consume a lot of memory. You may also use a smaller image size for inference. Lastly, you may use a smaller model.

  • @prescriptionoatmeal7970
    @prescriptionoatmeal7970 14 days ago

    Quick and easy!

    • @robotmania8896
      @robotmania8896 13 days ago

      Hi Prescription Oatmeal! Thanks for watching my video! It is my pleasure if this video has helped you!

  • @user-ri5gp7pb4f
    @user-ri5gp7pb4f 14 days ago

    Thank you for making very good content. What if I don't have the controller? How can I use a keyboard to control the robot? Could you please suggest?

    • @robotmania8896
      @robotmania8896 14 days ago

      Hi อภิวัฒน์ พิทักษ์ศิลป์! Thanks for watching my video! If you would like to use a keyboard, you can use this package to publish commands: cmower/ros2-keyboard: Keyboard driver for ROS 2. (github.com)

  • @iamshakeelsindho
    @iamshakeelsindho 14 days ago

    Thank you for your video.

    • @robotmania8896
      @robotmania8896 14 days ago

      Hi iamshakeelsindho! Thanks for watching my video! It is my pleasure if this video has helped you!

  • @andrewli1126
    @andrewli1126 15 days ago

    Thank you for the nice video! However, when I execute my Python script on the last step, the pipeline fails to start, returning the runtime error "No device connected." Could you provide some insight on this? (The camera is connected to a USB 3 port with a USB 3 cable.)

    • @robotmania8896
      @robotmania8896 15 days ago

      Hi andrewli1126! Thanks for watching my video! Usually, this error occurs when you have a problem with the USB cable. Can you obtain an image using the “realsense-viewer” application?

    • @andrewli1126
      @andrewli1126 13 days ago

      @@robotmania8896 Thanks for the reply! Yes, the camera works fine with realsense-viewer, and I can also import the pyrealsense2 module. It's just that when running the script, the pipeline fails to start, returning the error: "No device connected."

    • @robotmania8896
      @robotmania8896 13 days ago

      @@andrewli1126 Can you get frames from the RealSense using this code? (The index may not be 0.)

      import cv2
      capture = cv2.VideoCapture('/dev/video0')
      while True:
          try:
              ret, frame = capture.read()
              cv2.imshow('frame', frame)
              cv2.waitKey(1)  # needed so imshow actually refreshes the window
          except:
              import traceback
              traceback.print_exc()

    • @andrewli1126
      @andrewli1126 10 days ago

      @@robotmania8896 Thank you for the reply! I have resolved the issue by uninstalling pyrealsense2 and rebuilding the wrapper from source. Cheers!

  • @happynewyeartt
    @happynewyeartt 15 days ago

    The roboportal website page is missing. What should I do?

    • @robotmania8896
      @robotmania8896 15 days ago

      Hi happy new year! Thanks for watching my video! Indeed, it seems that the page is not responding. I don't have any information about whether they have closed or not, so I guess all we can do is wait. Or you may open an issue on the “roboportal/bot_box” page to ask about this matter.

  • @semiuadebayo9626
    @semiuadebayo9626 17 days ago

    Great work, thank you. Do you do paid robotics training? I'm interested in learning more.

    • @robotmania8896
      @robotmania8896 17 days ago

      Hi Semiu ADEBAYO! Thanks for watching my video! Unfortunately, I don’t have any other paid material. What subjects are you interested in?

  • @user-hh3ww2cr4h
    @user-hh3ww2cr4h 21 days ago

    Having issues running colcon build in the src dir due to CMake.

    • @user-hh3ww2cr4h
      @user-hh3ww2cr4h 21 days ago

      Edit: forgot to set up ROS / source etc.

    • @robotmania8896
      @robotmania8896 20 days ago

      Hi Zachary Reid! Thanks for watching my video! So, now you can build the packages successfully?

  • @omarsalem5832
    @omarsalem5832 23 days ago

    Thank you!

  • @maese.
    @maese. 23 days ago

    Hi! I am using this robot (with modifications) in a project with ROS2. Do you have it on GitHub or similar so I could properly mention your work? Thanks for your work!

    • @robotmania8896
      @robotmania8896 23 days ago

      Hi Maese! Thanks for watching my video! I don't have GitHub repositories with this robot. If you would like to mention my work, you may cite this video as described in this post. Thank you! www.scribbr.com/citing-sources/cite-a-video/

  • @sutanmuhamadsadamawal715
    @sutanmuhamadsadamawal715 24 days ago

    Very nice project. Is it possible to run it with ROS Humble and Gazebo Classic? What should I adjust for that?

    • @robotmania8896
      @robotmania8896 24 days ago

      Hi Sutan Muhamad Sadam Awal! Thanks for watching my video! Yes, it is possible. If you are using Gazebo Classic, to get the model pose, instead of subscribing to the ”/model/fws_robot/pose” (tf2_msgs.msg) topic, you should subscribe to the “/gazebo/model_states” (gazebo_msgs.msg) topic. Also, the launch file will be different. Please refer to the launch files from my other projects.

  • @sageralansi7858
    @sageralansi7858 24 days ago

    Please dowenload file tetcher

    • @robotmania8896
      @robotmania8896 24 days ago

      Hi Sager Alansil! Thanks for watching my video! What is “file tetcher”?

  • @mohammedbourouba9274
    @mohammedbourouba9274 25 days ago

    Hello, thank you for this amazing video, but I want my robot to recognize traffic signs and react according to them. How can I do that?

    • @robotmania8896
      @robotmania8896 25 days ago

      Hi Mohammed Bourouba! Thanks for watching my video! In that case, you have to train your own model.

    • @mohammedbourouba9274
      @mohammedbourouba9274 25 days ago

      Yes, I did that, but I want my robot to turn left automatically when it detects, for example, a "turn left" sign.

    • @robotmania8896
      @robotmania8896 23 days ago

      In that case, you can publish the “cmd_vel” topic when the robot detects the “turn left” sign. You will probably need a depth camera as well to make the robot turn at the right point.
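The reaction logic suggested in the reply can be sketched as a plain function that maps a detected sign label to the (linear, angular) velocity pair you would publish on cmd_vel. This is a minimal illustration under assumptions: the labels and speed values are made up, and in a real node the tuple would be packed into a geometry_msgs/Twist and published with an rclpy publisher.

```python
# Map a detected traffic-sign label to a (linear_x, angular_z) command for
# the cmd_vel topic. Labels and speeds below are hypothetical example values.

def sign_to_cmd_vel(label):
    """Return (linear_x in m/s, angular_z in rad/s) for a detected sign."""
    commands = {
        "turn_left":  (0.2,  0.5),   # keep moving, turn counter-clockwise
        "turn_right": (0.2, -0.5),   # keep moving, turn clockwise
        "stop":       (0.0,  0.0),   # halt the robot
    }
    # Unknown or no sign: drive straight at a default speed
    return commands.get(label, (0.2, 0.0))
```

As noted in the reply, the depth camera would decide *when* to issue the turn command, i.e. only once the sign is within some distance threshold.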

    • @mohammedbourouba9274
      @mohammedbourouba9274 14 days ago

      What are the changes to make if I want it to be autonomous? Is it possible?

    • @robotmania8896
      @robotmania8896 14 days ago

      @@mohammedbourouba9274 Do you mean that you would like to do navigation?

  • @user-mu8gr1wm9g
    @user-mu8gr1wm9g 28 days ago

    Is it possible to change the robot size?

    • @robotmania8896
      @robotmania8896 27 days ago

      Do you mean changing the robot size during the simulation? If so, it is possible if you build your robot that way.

  • @chiyoungkwon
    @chiyoungkwon 28 days ago

    Is it possible to modify the code to make the size of obstacles increase over time in the maze-solving system?

    • @robotmania8896
      @robotmania8896 27 days ago

      Hi chiyoung kwon! Thanks for watching my video! For a simple shape, I think there are ways to do that.

  • @vilsonwenisbelle9041
    @vilsonwenisbelle9041 28 days ago

    Great. Thanks for another video!!!

    • @robotmania8896
      @robotmania8896 28 days ago

      Hi Vilson Wenis Belle! It is my pleasure if this video has helped you!

  • @mr.9489
    @mr.9489 28 days ago

    Thank you for saving me ㅠ.ㅠ

  • @elastico_007
    @elastico_007 A month ago

    Hi, I want to use an Intel D435i in Gazebo Sim and I couldn't find a plugin. Can you suggest what to do?

    • @robotmania8896
      @robotmania8896 28 days ago

      Hi ELASTICO! Thanks for watching my video! I cannot find a plugin for Gazebo Sim either. I guess you have to wait until someone makes it, or make it yourself.

    • @elastico_007
      @elastico_007 26 days ago

      Hi, thanks. I added the RGBD camera plugin in Gazebo Sim and used the ROS-Gazebo bridge. I can visualise depth and camera in rqt image view, but not in RViz2; it's like a blank white screen.

    • @robotmania8896
      @robotmania8896 25 days ago

      @@elastico_007 If I remember correctly, if no topics are received, there will be a “No image” message in the window. So the topic is probably being received correctly. But I cannot tell you anything else specific.

  • @indramal
    @indramal A month ago

    I like this guy and have similar interests.

    • @robotmania8896
      @robotmania8896 A month ago

      Hi Indramal Wansekara! Thanks for watching my videos! I hope you will find videos you are interested in!

    • @indramal
      @indramal A month ago

      @@robotmania8896 Definitely, looking for more interesting videos.

  • @tianxie3954
    @tianxie3954 A month ago

    I am really curious: how long have you been coding for? The repos that you've shown here are some of the most advanced ROS2 tutorials on RUclips...

    • @robotmania8896
      @robotmania8896 A month ago

      Hi tian xie! I've been coding for roughly 15 years. But I personally think that the most important thing is to understand the theory behind the code. Also, a valuable skill nowadays is to find information (code) quickly, combine it with your knowledge, and get the outcome you want.

  • @kawthertrabelsi4996
    @kawthertrabelsi4996 A month ago

    Hello :) Is it possible to use a Jetson Nano 2 GB with 2 USB webcams and YOLOv8?

    • @robotmania8896
      @robotmania8896 A month ago

      Hi Kawther Trabelsi! Thanks for watching my video! With a small image size and a small model, you will probably be able to execute YOLOv8 on the Jetson Nano. But you will not be able to execute inference simultaneously on both cameras; you will need to execute inference sequentially.

  • @AlexisKM100
    @AlexisKM100 A month ago

    Amazing project and video, pal. Just one question: in the video at 9:03 you show the TensorBoard graph, and it says it took 20 hrs of training. Is that accurate? If not, how much time did it take to converge?

    • @robotmania8896
      @robotmania8896 A month ago

      Hi Serapf-p! Thanks for watching my video! It depends on the computational capabilities of your machine. As far as I remember, I used the CPU for training in this project. Using a GPU, the training time will be shorter, but I cannot tell you by how much exactly.

  • @tianxie3954
    @tianxie3954 A month ago

    Keep it up!

  • @tianxie3954
    @tianxie3954 A month ago

    Man, I am so impressed by your channel. Finally someone teaching ROS2 with more than just "OK, this is how you write a subscriber and publisher".

    • @robotmania8896
      @robotmania8896 A month ago

      Hi tian xie! Thank you for your heartwarming comment!

  • @nhatnet479
    @nhatnet479 A month ago

    I want to ask one more question. Currently I am following your video; I got the 3D point coordinates in the image coordinate system, and I want to make a project related to robot perception. More specifically, I want the robot to understand the surrounding objects. So after calculating (X, Y, Z) in the camera coordinate system like this (X = dist*(x_center - self.cx)/self.fx, Y = dist*(y_center - self.cy)/self.fy, Z = dist), we need to transform them to the robot coordinate system and then to the map? Could you explain more so I can understand this more deeply? Or is following the steps in your video suitable for my project? Thanks

    • @robotmania8896
      @robotmania8896 A month ago

      Hi Nhat Net! Thanks for watching my video! If I understand correctly, you would like to estimate object coordinates in the map coordinate system? In that case, you have to do exactly what I did in this tutorial.

    • @nhatnet479
      @nhatnet479 A month ago

      @@robotmania8896 Yes, I want the robot to perceive the surrounding objects and then use this information for an obstacle avoidance task. So, after calculating X, Y and Z, we transform to the map. Is this the right pipeline? And based on my understanding, currently we just determine the position of the object; should I determine the orientation of the object as well, and how do I do that? Could you recommend materials for this? Thanks. I learned a lot of useful knowledge from your channel.😀

    • @nhatnet479
      @nhatnet479 A month ago

      Currently I am doing object avoidance for robot navigation based on object coordinates from the camera. Could you recommend some materials for this? Thanks

    • @robotmania8896
      @robotmania8896 A month ago

      @@nhatnet479 It is my pleasure if my videos have helped you. Since general navigation algorithms use a grid map, you may plot the detected object position (x, y) on the grid map and then use the usual algorithms. Generally, estimating object orientation is even harder than estimating object position. In some cases, knowing the orientation of the object helps to predict its direction of movement, resulting in even smoother object avoidance. But for most applications, knowing the object position is enough. I found this repository regarding object avoidance using a camera. Maybe it will help you. github.com/Onlee97/Object-Detection-and-Avoidance-with-Intel-Realsense

    • @nhatnet479
      @nhatnet479 A month ago

      @@robotmania8896 Thanks. In your code, I saw you transformed the pixel coordinates from the camera center to the robot coordinate system and then to the map coordinate system. With my understanding, with this method the map knows the position of the object in the map coordinate system, like when we transform the pose of the robot from base_link -> odom -> map, where the map knows the position and orientation of the robot in the map coordinate system. Is that right? And in your situation, instead of coding the transform matrices from scratch, we could also use the TF lib to transform it. Is that right?

  • @nhatnet479
    @nhatnet479 A month ago

    Perspective projection transformation means that we calculate the position of the object (x, y) with respect to the camera, and from this we can calculate its orientation, so we do not need the extrinsic parameters? In my case I am using an actual RealSense camera and I get two topics, "camera/color/camera_info" and "camera/depth/camera_info". Should I subscribe to these topics to get suitable params?

    • @robotmania8896
      @robotmania8896 A month ago

      I have several videos in which I am using a RealSense camera to estimate object position. I hope this video may help you. ruclips.net/video/oKaLyow7hWU/видео.html
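The back-projection formula quoted earlier in this thread (X = dist*(x_center - cx)/fx, Y = dist*(y_center - cy)/fy, Z = dist) can be sketched as a small function. This is a minimal pinhole-model illustration; in practice the intrinsics fx, fy, cx, cy would come from the camera_info topic, and the numbers below are made-up values roughly typical of a 640x480 stream.

```python
# Back-project a pixel plus its measured depth into a 3D point in the
# camera coordinate frame, using the pinhole camera model.

def pixel_to_camera(u, v, depth, fx, fy, cx, cy):
    """Return (X, Y, Z) in the camera frame for pixel (u, v) at given depth."""
    x = depth * (u - cx) / fx   # horizontal offset scaled by depth
    y = depth * (v - cy) / fy   # vertical offset scaled by depth
    return (x, y, depth)        # Z is the depth itself

# Hypothetical intrinsics; real values come from camera_info (K matrix)
p = pixel_to_camera(u=400, v=300, depth=2.0,
                    fx=600.0, fy=600.0, cx=320.0, cy=240.0)
```

Transforming the resulting point onward to the robot or map frame is then a rotation plus translation, which is exactly what the TF library automates.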

  • @bahajouili737
    @bahajouili737 A month ago

    Hello sir, with these tools, can I control my robot over different IP networks?

    • @robotmania8896
      @robotmania8896 A month ago

      Hi baha jouili! Thanks for watching my video! Yes, you can.

    • @bahajouili737
      @bahajouili737 A month ago

      @@robotmania8896 OK, thank you.