Incredible introduction!
I'm building a cheap robot with two Kinects, an LDS-01 lidar and an Atomic Pi! I already got visual-based SLAM working and I'm now shifting to lidar SLAM to compare the results! Thank you for the great video! Installing Cartographer was a breeze!
Thank you. I'm glad that you liked the video. It would be very interesting to see your robot, especially how you combined the two Kinects.
@@RoboticsWeekends Okay! I can record a video and upload it to YouTube, I'll let you know as soon as it's up!
It would be awesome! Looking forward to it!
Sorry I'm taking too long, my computer broke, so I'm unable to record the video I promised! As soon as it's fixed and the video is ready I'll let you know!
Sure, no problem, take your time. And thank you for the update!
The RPLidar A1M8 is the first product produced by our company. Please support us. We also have higher-quality lidars. The company has been established for 10 years. I hope everyone likes it. We will continue to produce and develop more powerful products.
Wow!!! It's amazing, I was looking for just this! Thanks for sharing. I promise that if I finally get my project done I'll send you a reference!
You are welcome 🙂
Very interesting! Why did you set the use imu param to true although there was no IMU (9:21)?
Brilliant video! Thank you for the starting point.
Thanks! You're welcome.
I have been looking for something like this, and Google knows it. Maybe that is why they sent me here :D good job
It's magic! 😀 I'm glad that you liked the video! Stay tuned!
Would be so nice if there was a way to get it to actually do a 3D scan of the inside of the room, not just 2D. Still, very cool. Nice job.
To be fair, the sensor only scans in a 2D plane. You would need a 3D lidar.
Good work
Neat job 😄👍🏽
Thanks:)
good stuff.
Thanks!
Pro tip
Use heat-shrink tubing around the legs of the RPLIDAR to make it a snug fit in the 3D-printed holes ;)
That's a good idea! Thanks;)
Great job! Thank you
You are welcome:) I'm glad that you liked it👍
Why do you approach it with mapper_ws instead of the workspace made by Cartographer?
Do I have to create a separate mapper_ws? Or will it be installed automatically?
Ubuntu version 18.04 isn't available. If I use a newer version, is it the same process?
How did you send the LIDAR data from the Raspberry Pi to ROS on the laptop? Was the map generated on the PC, or already on the Pi?
The map is generated on the RPi. The laptop is used only for visualisation of the ROS topics: it runs rviz only and is connected to the RPi's roscore over the network.
@@RoboticsWeekends Thanks for your answer. Do you have detailed documentation or a manual on how to rebuild the system at home? I am very new to this topic, but I bought my first Raspberry and the lidar, because I think your project is really cool.
I don't have much time for now, but maybe later I will prepare a basic configuration as a ROS package and put it on GitHub.
@@Bibbs420 I put the project on GitHub and added a text description on Medium: github.com/Andrew-rw/gbot_core
@@RoboticsWeekends You are my man! Thanks for your work.
The map is shifting. How can you handle this?
Brother, can you make a video on 3D mapping?
Hey thanks for your video. Quite helpful. Is it similarly possible to do something like this where I’d record laser output with a Pi only and then process/visualize/solve it all at a workstation several hours later?
I'm glad that you liked the video. And yes, offline processing is possible. First you record all sensor messages into a bag file, then "play" that data back on the other computer and feed it to Cartographer. Cartographer itself supports both online and offline map processing modes. For more information please refer to the rosbag ROS wiki page and also look for Cartographer's offline_node in the documentation.
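A minimal sketch of that workflow - the topic and file names here are only examples, record whatever your sensors actually publish:

# On the robot, while driving around:
rosbag record -O mapping_session.bag /scan /tf /tf_static

# Later, on the workstation: use the bag's timestamps, start your Cartographer
# launch file (without the lidar driver), then replay the data:
rosparam set use_sim_time true
rosbag play --clock mapping_session.bag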
@@RoboticsWeekends amazing. Thanks for the reply!
@@RoboticsWeekends I probably got ahead of myself. Your video gave me some confidence I shouldn't have had as a beginner, haha.
I'm getting stopped at around the 7:50 mark. The RPLidar is spinning but I have yet to see any data from it. I don't have ttyUSB0 or any other entry in lsusb that seems to be my RPLidar, so the launch file hits an error and stops.
Looking forward to any help you can offer. Thanks!
Maybe it's ttyACM0. Have you read the post on Medium related to this video? There I provided more information than in the video. The link is in the video description.
Thanks again for your quick response. I had read the article - yes. I figured it out - the cable I was using was enough to power the lidar to spin, but the power light on the serial board never came on. Tried a new cable and magically everything started working. I have seen the scan on a Windows machine using the test executables. Just working on loading the laser output in rviz now.
Hi! I have a couple of questions about your project. I am doing something similar as well.
You mentioned that you installed ROS Melodic, Cartographer and rviz onto your laptop to visualize with the rviz tool, correct?
Does your laptop have Ubuntu as well or does it have a different OS? Also, what if your Pi has ROS Kinetic instead?
Asking because my Pi has ROS Kinetic and my laptop is running Windows.
Hello. Yes, you are right, ROS is also installed on my laptop. I'm using it for ROS node development as well.
Yes, I use Ubuntu as the main OS on my computer. I'm not using Windows for development, so I can't say for sure, but ROS can be installed on Windows as well. This code will work on ROS Kinetic too.
Thank you for the video! Do you have a GitHub page for this project by any chance?
You are welcome. I haven't uploaded the project to GitHub yet. But I see that many people ask for that, so I will upload it soon.
@@RoboticsWeekends Will definitely follow!
@robotics_weekends
Hi, how can I configure the Cartographer node to work with an IMU? Or is there another algorithm to do this? Using an IMU would increase the accuracy of the localization data, correct?
Hi, yes, although it is optional for 2D mapping, you can add an IMU source to increase the mapping quality. By default Cartographer expects IMU data on the /imu topic. Also don't forget to provide the transformation between the IMU and the lidar - you can do it in the URDF or by specifying a static transform. Please refer to the Cartographer wiki on how to enable IMU support in the configuration.
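A rough sketch of that setup, assuming hypothetical frame names (base_link, imu_link) and an IMU mounted 5 cm above the base: in the Cartographer lua config you would set TRAJECTORY_BUILDER_2D.use_imu_data = true, and the static transform can be published like this:

rosrun tf static_transform_publisher 0 0 0.05 0 0 0 base_link imu_link 100

(the arguments are x y z yaw pitch roll parent_frame child_frame period_in_ms)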
@@RoboticsWeekends awesome, thanks for the reply/info!
Do you have the files uploaded to github? I encountered some errors following the video and having the configuration files might help. Thanks!!
Honestly, I didn't think about that. But you are not the first one to ask. I will do it soon. Stay tuned:)
@@RoboticsWeekends Thanks!
Would be nice!!!
@@Bibbs420 Finally I did it: github.com/Andrew-rw/gbot_core
@@RoboticsWeekends Great!!!
That was a great video. Do you know how to transfer the data from the lidar to point cloud software? Thanks.
Thank you. For converting a laserscan to a pointcloud please take a look at the "laser_geometry" ROS package.
Is there any way to save the map and reuse it? For example, labeling the first square in rviz as 'living room', and wherever the robot is located, I want it to come to the living room when I command it.
For saving and reusing the map (occupancy grid) please refer to the map_server package. As for the labeling use case, nothing comes to mind. If you find something, please let me know.
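For reference, a minimal sketch with map_server - the map name is just an example, and the /map topic must be published (e.g. by Cartographer's occupancy grid node):

rosrun map_server map_saver -f living_room_map      # writes living_room_map.pgm + living_room_map.yaml
rosrun map_server map_server living_room_map.yaml   # republishes the saved map on /map later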
Hello. Do you have a document about its internal structure and working principles?
If you mean Cartographer's internals, then the official documentation has it.
Awesome video! But can you please answer this question? I'm done installing Ubuntu and ROS Melodic on the Raspberry Pi, but where can I get the Cartographer package?
Should I use SSH again to download it on the Raspberry Pi?
Hi, thank you! Yes, connect to the Raspberry Pi and install Cartographer with: sudo apt install ros-melodic-cartographer-ros
Hi! Nice work. I have a question: the RPLidar rotates 360°, can we ignore a certain portion of the scan and use only, say, 270° or 180° for mapping?
Hi, thanks. There is a special ROS package for this purpose - laser_filters.
This is a great video to get started with SLAM. For this project, should one install Ubuntu 18.04 Desktop, Server or core? Also, what ROS version should be used?
Thanks. It doesn't matter, any version of Ubuntu 18.04 will be fine. Well, maybe not the Core version... The only difference is how many additional dependencies will be installed. For Ubuntu 18.04, ROS Melodic is recommended.
@@RoboticsWeekends Thanks. Can you share the procedure / links on how to install Ubuntu and ROS Melodic?
For ubuntu you can follow this manual: wiki.ubuntu.com/ARM/RaspberryPi
And for ROS this one: wiki.ros.org/melodic/Installation/Ubuntu
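In case the links change, the core steps from the ROS wiki look roughly like this (check the wiki for the current package key before running anything):

sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
# add the ROS package key as described on the wiki, then:
sudo apt update
sudo apt install ros-melodic-desktop
echo "source /opt/ros/melodic/setup.bash" >> ~/.bashrc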
@@RoboticsWeekends Thanks for quick response
Hi, is it possible to use Cartographer SLAM for mapping not in real time? I want a robot to inspect a drainage, and there will be disruptions in the connection between the laptop and the RasPi. Is it possible to view the constructed map after the robot finishes inspecting the drainage?
Yes. Cartographer supports so-called "offline mapping".
Really nice video! It helped me figure out how to move forward with a project I am trying to develop at my university. Sorry to ask this, I am kinda new, but can you give me some insight into the URDF for the robot description?
I will share the full project and an article with a detailed description on medium.com/robotics-weekends soon.
Hello, can you make a video on how to use this map for autonomous navigation?
Hello, right now I don't have much spare time, but I'll do my best.
[QUESTION] How much compute does Cartographer use when running on a Raspberry Pi 3B+ with your configuration files?
It depends on the map size. For an indoor map of about 50 square meters, CPU load stays below 50%.
Hello, thank you for the excellent tutorial. I was wondering if I can use this map made with SLAM and apply path planning algorithms using only the Raspberry Pi? I am new to this topic, and any hint or help would be useful. Thank you.
You can convert the Cartographer map into an occupancy grid by using occupancy_grid_node. The occupancy grid can be used in the ROS navigation stack as a map source. For navigation configuration please refer to the ROS navigation wiki.
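If it helps, that node can be started like this - the 5 cm resolution is just the value I use, adjust as needed:

rosrun cartographer_ros cartographer_occupancy_grid_node -resolution 0.05

The grid it publishes on /map can then be used as the map source for the navigation stack.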
@@RoboticsWeekends Thank you! Can I do that using the RPi only? Or should I transfer the grid to a laptop and use ROS navigation there?
It should work on the RPi, but I'd recommend an RPi 4 with at least 2 GB of RAM.
@@RoboticsWeekends thank you so much for your help 😍
Can you make a rover in the next video that maps a whole floor along with an MPU6050?
Well done sir...
Thank you. I'll think about what I can do. Actually, if we use the RPLidar and the robot is not too fast, we can extract its position and rotation from Cartographer's pose estimation.
@@RoboticsWeekends Oh, okay sir, you are a great tutor, keep it up! Expecting more videos of this sort...
Thankyou sir☺️
You are my hero! Well done and excellent video. Would an RPi 4 make a difference w.r.t. performance? It looks like the Pi is not a limiting factor under your current setup, but I'm not sure if you had to trade accuracy/robustness for a performance boost. Also, are you planning to augment your setup with an IMU and make an episode 2 video :) ?
Hi. Thank you. The RPi 4 will show better performance, especially if you are going to map a bigger area than my apartment;-). I'll think about another video showing IMU fusion in Cartographer.
I wonder if this would work with a Pi Zero?
The Pi Zero has a relatively weak CPU and less RAM, so I think it wouldn't be possible.
I have done the same process but for hector_slam. I am able to set up a WiFi connection, but rviz is not showing any map while on WiFi. Can you suggest a solution?
Check if ROS_MASTER_URI and ROS_IP/ROS_HOSTNAME are set correctly on both computers. Also check if all the needed hector_slam message packages are installed on the computer running rviz.
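As a sketch, assuming the robot is 192.168.1.10 and the laptop is 192.168.1.20 (substitute your own addresses):

# on the robot (where roscore runs):
export ROS_MASTER_URI=http://192.168.1.10:11311
export ROS_IP=192.168.1.10
# on the laptop (running rviz):
export ROS_MASTER_URI=http://192.168.1.10:11311
export ROS_IP=192.168.1.20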
Hi Arpit,
If you're in India, can you please drop your contact at nikgaurav32@gmail.com? Thanks.
I was waiting for a tutorial like this. Can I implement SLAM, and how do I do the localisation part? I have an MPU6050 and a 12 V encoder motor connected to a Mega; the Mega is a node to the Pi. I don't know where to go next from here.
Localization is already provided by Cartographer - that's the blue trace on the map. But if you need a full autonomous navigation stack, then you have to add path planning algorithms on top of the current software set. I'd recommend looking at the linorobot.org open project for more information. As far as I remember, their firmware supports the Mega.
Good work
Hi,
An excellent tutorial. I'd been using the RPLidar with an Arduino just to see if it worked and finally got around to following your tutorial.
Finding an image of Ubuntu 18.04 Desktop for the Raspberry Pi 4 took a bit of doing, but I finally succeeded in installing it.
I had a few hiccups along the way, particularly around setting up a catkin workspace and configuring the environment correctly, e.g. ensuring that the ROS files are found by sourcing devel/setup.bash.
Finally all up and working (ROS and visualisation all done by the Pi 4), so I'm now looking to see how I can use ROS on the Pi 4 to actually move a robot around.
Thanks again
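For anyone else stuck at that step, the usual catkin workspace setup is roughly:

mkdir -p ~/catkin_ws/src
cd ~/catkin_ws
catkin_make
echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
source ~/catkin_ws/devel/setup.bash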
This was a superb tutorial. Got it working after a bad time trying to make it work with Noetic and Ubuntu 20.04. Ugly! :) Wish I had started with 18.04 from the beginning.
@@Pleiades-gm3mi Have you found a way to make it work with ROS Noetic?
@@rushikajoshi8202 I have not really tried after getting it going with Melodic. My problem was getting the code compiled on Noetic. At the time there was a direct install available on Melodic. I haven't messed with it recently, but perhaps by now they have something similar available for more recent Ubuntu releases.
Very informative! Thank you!
Your launch file started a node called gbot_core executing capture_server_node.py. What is this node doing? It's the only file or call in this project I do not understand.
Thank you. This is a leftover from another project and can be ignored. It is not involved in the Cartographer computation flow.
@@RoboticsWeekends It took off on its own like magic :-), but instead of a map, I got a warning: W0627 13:51:54.000000 8649 tf_bridge.cc:52] "laser" passed to lookupTransform argument source_frame does not exist. I'm using a copy of cartographer's backpack_2d.urdf file, so I can't figure out why I'm getting this.
I don't take your excellent support for free. Reply with your Patreon and I'll buy you a case of beer ;-).
@@RoboticsWeekends Nevermind. Simple solution. In the robot.launch file, I changed the param name="frame_id" from value="laser" to value="horizontal_laser_link" to match the URDF sample provided by Cartographer. Silly. I'll still buy you that brew if you reply with your Patreon. Use the money for more toys and videos.
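For anyone hitting a similar lookupTransform warning, two quick ways to inspect the TF tree and spot a mismatched frame name (the frame names below are just examples):

rosrun tf view_frames                         # listens for a few seconds and writes frames.pdf
rosrun tf tf_echo map horizontal_laser_link   # prints the transform if it exists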
Glad that you've solved the issue. Unfortunately I have no Patreon account, but I can offer a PayPal donation link if that is convenient for you: paypal.me/roboticsweekends
Thank you:)
@@RoboticsWeekends Done! Ping me if it didn't go through.
Hello. Is it compatible with ROS2?
Hi. I don't think so. But I'm going to figure this out soon.
Nice video! How does laserscan odometry work without an IMU?
Thanks! Odometry is provided by Cartographer. It's based on laserscan point matching and does not need an IMU if you have a dense (and preferably high-rate) laserscan data stream. IMU data can be added as an option and increases accuracy if point matching constantly fails because of scans with sparse points.
Hello, I have a Raspberry Pi 3 B+. I successfully installed Ubuntu 18.04 MATE and ROS Melodic from wiki.ros.org/ROSberryPi/Installing%20ROS%20Melodic%20on%20the%20Raspberry%20Pi. But when I install ROS Cartographer from its site (google-cartographer-ros.readthedocs.io/en/latest/compilation.html), the installation hangs on the line src/cartographer/scripts/install_proto3.sh. How can I solve this issue? Is there any solution for this?
Hello. I suppose it's due to the low amount of RAM.
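If it is indeed RAM, a temporary swap file sometimes gets the build through - a sketch with a 1 GB swap file (reducing the number of parallel build jobs also helps):

sudo dd if=/dev/zero of=/swapfile bs=1M count=1024
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile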
Great work! Thanks
You are welcome! I'm glad that you liked it!
Keep it up..!!
Good afternoon, super video. Could you please tell me what accuracy you get when building the map?
Regards
Good afternoon, thank you:) The resolution is 5 cm per pixel. The resolution can be set in the parameters, but with this sensor, over an area like this, and without additional auxiliary sensors, that won't improve the map accuracy.
@@RoboticsWeekends Thank you very much for the quick answer, and happy holidays! To save your time, I'll ask one more question: I'm building a spider robot for my son on a Raspberry Pi 4. I want to build a map of a small area and connect it with Python - ROS or OpenCV - for further autonomous movement and obstacle detection. What would you recommend - a lidar or a Kinect? I don't have a lidar yet, but I'm already connecting a Kinect following your video. Thanks in advance.
Each of the sensors you mentioned has pros and cons.
In short, let's assume you plan to build a 2D map. Kinect pros:
- higher data rate - information about the environment arrives at a higher frequency;
- thanks to the point cloud you can detect not only walls, but also bumps that are slightly lower than the lidar yet still impassable, or places where the robot won't fit height-wise, and mark those places on the 2D plane;
- the ability to build a 3D map (not sure why you'd need it, but it looks impressive);
- a built-in camera, even if a poor one.
Cons:
- limited field of view - especially noticeable when there are few objects in the environment (for example an empty 10x10 m room);
- blindness in sunlight;
- relatively short effective range of ~6 m, plus it doesn't detect objects closer than 0.5 m;
- requires an extra 12 V supply (but that is solvable - check my page on Medium.com);
- if you work with the point cloud, you need plenty of RAM and CPU resources.
A lidar (I assume we are talking about a 360° LDS, for example one of the cheap ones like the RPLIDAR A1M8) has these pros:
- 360-degree coverage - this helps a lot when building the map and navigating afterwards;
- simple to use and connect - plug it into USB and you're ready.
Cons:
- low data rate on cheap lidars - 5-7 Hz. Because of this the robot's movement speed is limited, but for a walking robot I don't think that's a problem;
- it also goes blind in the sun, but at least in the shade it can still see something;
- moving parts - an extra point of potential mechanical failure;
- also, a walking robot usually sits low, and if the lidar tilts, it will catch reflections from the floor and, obviously, treat them as obstacles. With a point cloud and a known tilt angle, the laserscan can be recalculated algorithmically to account for the tilt.
For a walking robot I would probably choose, if not the Kinect itself, then something Kinect-like. But again - it all depends on the environment the robot will operate in, its characteristics, and the tasks it has to perform.
...that's the short version)))
@@RoboticsWeekends Thank you very much for the informative and comprehensive answer. I read your article - excellent articles! I'm also going down the path of stripping weight off the Kinect, but I kept the fan.
Hello. Can I do it with RPLidar A1?
Yes, you can
@@RoboticsWeekends And can I use Ubuntu 20.04 and ROS Noetic? I'm having a lot of trouble with the old versions.
@@GabriGlider I didn't try to run it on Noetic, but I'm 99% sure that it will work. You may need to build some packages which are missing in the Noetic repository.
@@RoboticsWeekends OK, thank you. I'm trying to make it work for my final degree project drone. If I manage to get it working, do you mind if I use your GitHub for it? Obviously with a reference to you in the bibliography.
@@GabriGlider sure, you're welcome to use it
Hello! I'm starting a project using a Kinect, a Raspberry Pi and ROS. Can you please make a tutorial about how to extract the skeletal joints of a human using a Raspberry Pi and a Kinect?
Hi. It's a bit out of scope for my current experiments. But I'd recommend taking a look at NiTE, OpenNI2, or similar open frameworks.
Robotics Weekends that's too bad, but thanks anyway! ☺️
I am getting an error when I follow the Cartographer build and install steps. Can you please mention all the details you used in the project, like the Ubuntu version? I am using Ubuntu 18.04 with ROS Melodic and getting a clang error when I try to build and install using ninja. Can you please help me?
In this demo I'm using the Cartographer binaries from the ROS repository. No compilation is needed, just 'apt install ros-melodic-cartographer-ros'.
Could you please give me the 3d file of the box?
Sure, here it is - www.thingiverse.com/thing:3970110
Can I see your gbot launch file?
As lots of people have asked, I will share a ready-to-launch project and an article with a detailed description on medium.com/robotics-weekends soon.
github.com/Andrew-rw/gbot_core
How did you manage to install Cartographer on the Raspberry Pi? This way: google-cartographer-ros.readthedocs.io/en/latest/compilation.html? My Raspberry Pi 3 1GB seems to be too slow for that...
It is available as a binary ROS Melodic package in the Ubuntu repository.
@@RoboticsWeekends Could you please provide a link? Thank you in advance
@@Bibbs420 If you used repositories mentioned in the ROS installation Wiki, then: sudo apt install ros-melodic-cartographer-ros
Hello, can I get your email bro? I need to ask you some questions.
Hi. Feel free to ask here in comments