Autonomous RC Truck with Wireless Charging (ROS/Jetson TX2)
- Published: 2 Oct 2024
- A self-driving system developed with Linux & ROS on an NVIDIA Jetson TX2 to demonstrate dynamic inductive charging from the road. The sensor package consists of a lidar and a camera. Hector SLAM is used for mapping and localization. Pure Pursuit is used to follow a manually recorded path. Machine learning (behavioral cloning) is used to let the vehicle drive with the camera only: recorded camera images and steering input were used as training data for a deep neural network. The truck is controlled with PWM signals from a Teensy 3.2 microcontroller. There are four driving modes: manual, record new path, lidar/SLAM mode, and AI mode. The truck has an onboard induction charger system that communicates with the TX2 over CAN bus. There is also a custom GUI to monitor the states of the truck and battery.
Link to paper: ntnuopen.ntnu....
Github: github.com/jon...
Master Thesis in Cybernetics & Robotics by Jon Eivind Stranden at the Norwegian University of Science and Technology (NTNU) in cooperation with SINTEF Energy Research, 2019.
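The Pure Pursuit follower mentioned in the description can be sketched roughly like this (a minimal illustration only, not the thesis code; the lookahead distance and wheelbase defaults are invented placeholders):

```python
import math

def pure_pursuit_steering(pose, path, lookahead=0.5, wheelbase=0.3):
    """Return a bicycle-model steering angle (rad) that steers the vehicle
    at `pose` toward the first waypoint at least `lookahead` metres ahead.
    pose = (x, y, heading_rad); path = list of (x, y) waypoints."""
    x, y, heading = pose
    # Pick the first waypoint at least one lookahead distance away
    target = path[-1]
    for wx, wy in path:
        if math.hypot(wx - x, wy - y) >= lookahead:
            target = (wx, wy)
            break
    # Transform the target point into the vehicle frame
    dx, dy = target[0] - x, target[1] - y
    local_x = math.cos(-heading) * dx - math.sin(-heading) * dy
    local_y = math.sin(-heading) * dx + math.cos(-heading) * dy
    # Pure Pursuit: curvature of the arc through the target point,
    # then the steering angle from the kinematic bicycle model
    dist_sq = local_x ** 2 + local_y ** 2
    if dist_sq == 0.0:
        return 0.0  # already at the target; no steering needed
    curvature = 2.0 * local_y / dist_sq
    return math.atan(wheelbase * curvature)
```

At each control tick the resulting angle would be converted to a servo PWM value and sent on to the microcontroller.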
Music:
Lensko - Let's go
Self driving truck future
This is your daily dose of Recommendation
The self-driving part is a bit sketchy, but it looks like a very fun project to do for a thesis. Good job.
I would love to have the money to buy all of the equipment to test it out myself. This is so cool!
Really like this concept
Is that tiny rotating thing a lidar? It's really cute😍
And expensive....😂
The price of that thing is astronomical, Elon is even scared to use it 😂
@@Alkebic They're like ~$100 lol
@@Alkebic you can get one for less than 200 with a simple google search
@@Alkebic Lidars can be very expensive, but this one is not lol
THIS PROJECT IS GREAT, CONGRATULATIONS!
1:10 that's awesome, the whole thing is awesome
On Christmas Day, under the tree, we let it drive around the house all night. I know that the energy it consumes is not worth it. But having the sound of electronics running during the night and seeing the kids smile while watching it is worth it. It is not a lonely night.
Can you please let me know how you communicate remotely with the TX2 from your PC?
@@zafrankhan1829 Hi, use SSH or ROS over network
@@joneivinds How can I use SSH? Can you please guide me? Please email me at zafrankhan7133@gmail.com
@@joneivinds I'll be thankful to you
@@zafrankhan1829 ssh username@192.168.x.x
Oh hey! I saw this guy sitting in the snake robot lab a bunch! Cool to see it in action!
That's really cool. I want to do this with my mini quad
Great job mate... Go ahead. Good luck 🤞🤞🤞👌
Great video
that stock music made me puke...
It’s “let’s go” by NCS. I like it
It's _Let's go_ by _Lensko._
NCS did nothing but re-publish the song.
Coffin dance background music, yeaaahhh
Unbearable
This is a really outstanding project! Well done!
It's awesome, the only problem is the cost of the Jetson!
Really cool
good job !
The next big thing. The wave of the future.
Make a tutorial please
Hello, can you please share a link for the RC truck model that you used? And perhaps where you purchased it? Thank you
It's awesome... awesomely expensive
Nice, really cool, keep the work going (-: And will it be commercially available?
I love Tamiya truck and robotics projects, so this is really amazing!!! I must try something similar with my King Hauler :-D
Badass
For some reason I knew it was gonna have this song
What the actual fuck
Perfect, wonderful
I would love to know where you bought the truck itself
Another wireless charging crap. If somebody thinks wireless charging for heavy loads is the way, then something is missing :D
What did you do to get your localisation accurate? Wheel encoders? IMU? Lidar? I am doing something similar but my localisation is very poor; I am only using wheel encoders.
Nice job, lidar is doomed tho. I'm currently working on an RC car project relying only on a camera, GPS, and IMU.
Hello, I too want to do this kind of project. Can you please let me know if there is any documentation to read about it in more detail? If possible, can you share the details about this project with me by email? If so, I can share my ID here. Thanks.
danish.reza1@gmail.com
@@mohammeddanishreza4902 Start learning C++ and OpenGL. These are low-level enough to utilize all the power of the Raspberry Pi. Don't try to use compute shaders, they are broken (actually got fixed in a newer driver, but the update is not listed in the package manager). For OpenGL, you can use the Khronos documentation. The Raspberry camera is also a pain in the A (doesn't give you raw data). If I'm being honest, you're better off buying an Orange Pi 4B or something with better driver support. The RPi 4 was meant to be better than the Pi 3, but at the moment it's crap if you want to build a low-level application (mostly because of driver support).
@@skrya1248 Thanks alot for the information 😀
What kind of Lidar is this?
What is the Lidar sensor?
I want to do the same xd
I’m trying to do an RC car with automatic braking. Cost and complexity are my main concerns. This truck is beautiful, complicated, and expensive.
Very nice project!
Fuck beat me to my idea
How are you guys able to know the current position of your robot? I'm struggling with that part, please help me out
If you use lidar on a self driving vehicle you are doomed said Elon Musk
How did you make the wireless charging?
perfect !
i'm in love
They did.
@@torinion wow.... want to learn it
Came for the autonomous driving of a small-scale electric truck model with dynamic wireless charging, stayed for the music! 🎵🎶
Thank you for sharing your work. Very beautiful design! :)
Why is the thing on top of the truck always turning?
Amazing work!!
Why jetson tx2? Why not jetson nano?
Night drive please~
How do you transmit signals from the Jetson TX2 to your IMU/motor controller? I am using an Olimex STM32 and I am facing trouble communicating between them
Hi, I use a Teensy 3.2 microcontroller with rosserial, that is connected to the Jetson with USB. The IMU is connected to the Teensy (i2c) and it transmits data via ROS. The motor controller is also controlled via the Teensy with PWM signals going to the different servo inputs. The PWM values are set via ROS.
Check out github.com/joneivind/Self-Driving-Truck/blob/master/Teensy/teensy_car_controller.ino
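The last step in that chain, mapping a normalized steering/throttle command to an RC servo pulse width, typically looks like the sketch below. The 1000/1500/2000 µs values are the common RC servo convention, assumed here rather than taken from the linked repo:

```python
def command_to_pwm(cmd, pwm_min=1000, pwm_center=1500, pwm_max=2000):
    """Map a normalized command in [-1, 1] to an RC servo pulse width (us).
    -1 -> full left/brake, 0 -> neutral, +1 -> full right/throttle."""
    cmd = max(-1.0, min(1.0, cmd))  # clamp out-of-range commands
    if cmd >= 0:
        return int(pwm_center + cmd * (pwm_max - pwm_center))
    return int(pwm_center + cmd * (pwm_center - pwm_min))
```

The microcontroller then writes this pulse width to the steering servo or ESC input at the usual 50 Hz RC frame rate.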
A Biltema battery, eh :D ^^
Lol, that looks like it works better than the auto scrubbers we have at work >
Who the hell wants an RC that drives itself 💀💀💀
It is concept development. Once they have refined the idea, and find the best solution then they can try it on real trucks.
Great work. I would like to ask about powering the TX2. You are using 12 V DC power, but the required voltage for max performance of the TX2 is 18 V. Can you please suggest a solution if I want to use a 4s or 6s LiPo battery? How can I get a stable output voltage? Any help regarding powering the TX2 would be appreciated.
18 V and 80 W??? TX2 uses 15 W at full power.
@@dini86 Thank you for your reply. Sorry, my mistake, I don't know why I wrote 80 watts. But which LiPo battery to power the TX2? Please suggest one for full performance and 1.5 hours of operating time.
@@arifanjum84 To be on the safe side, let's calculate with 25 W max consumption (this includes losses, and maybe you want to use some USB or other devices). For 1.5 h of operation you need 25 W * 1.5 h = 37.5 Wh, so roughly a 40 Wh battery.
Input voltage can be anything between 5.5 and 19.6 V. With a 2s LiPo battery we have 7.4 V and with 4s we get 14.8 V. The 5s nominal voltage is 18.5 V, so that looks okay. BUT when a LiPo cell is fully charged it sits at 4.2 V, so a 5s battery would be 5 * 4.2 V = 21 V, which is too much!!
Based on this I would choose a 4s battery pack, because the higher the voltage, the lower the current. A battery that only needs to give out less current is cheaper than a high-current-capable battery.
So with 4s and 40 Wh we need a battery capacity of 40 Wh / 14.8 V = 2.7 Ah = 2700 mAh. The current drawn by the device is 25 W / 14.8 V = 1.7 A = 1700 mA. This means that you can choose a 4s 1C battery pack.
To sum it up, buy any battery that is at least 4s, 1C, 2700 mAh.
For example, the battery below would work. It has a C rating of 75, which means that it can give out a current of 240 A (3.2 Ah * 75). This is more than enough for your needs.
www.banggood.com/ZOP-Power-14_8V-3200mAh-75C-4S-Lipo-Battery-XT60-Plug-for-RC-Airplane-p-1548932.html?rmmds=search&cur_warehouse=CN
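The sizing arithmetic in the reply above can be collected into a small helper (a sketch: 3.7 V nominal per cell is assumed, and the exact 37.5 Wh is used rather than the rounded-up 40 Wh, hence a slightly smaller capacity figure):

```python
def size_battery(power_w, hours, cells, cell_nominal_v=3.7):
    """Return (required capacity in Ah, steady current in A) for a LiPo pack."""
    energy_wh = power_w * hours          # energy needed for the runtime
    pack_v = cells * cell_nominal_v      # nominal pack voltage
    capacity_ah = energy_wh / pack_v     # capacity to store that energy
    current_a = power_w / pack_v         # average current drawn at that power
    return round(capacity_ah, 2), round(current_a, 2)
```

For example, `size_battery(25, 1.5, 4)` gives roughly 2.5 Ah at roughly 1.7 A, matching the 4s recommendation in the thread.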
@@dini86 I really appreciate your reply. It saved me a lot of effort, as I had been searching for a long time. Thank you
I used a 3s 5500 mAh LiPo; input voltage 5.5-19.6 V as mentioned :)
Now make the software aware of the trailer
Hi, I wish to learn to make it, it seems so cool
Awesome!
Awesome, dude
What crappy music (noise)
good
The best EVER ...
From Iran
WOW. Awesome work. Sweet
Need more information on the software
I would like to understand that first path-following system, the one that prevents the device from leaving its original path when something immovable comes into the path. I can give you some subscribers, thank you
ENGINEERING COMMUNITY
EDIT:
A subscriber is a cheap deal, we can do a lot more... Editing the comment after 9 months,
Thanks
Hi, check out links in description
sir can you please send the mentioned course link
Udacity Self driving Nano Degree.
Good
Make it use the camera image only, nothing else
Hi, with camera only: ruclips.net/video/wexNknhHwoU/видео.html
@@joneivinds Good, now keep using camera data only for AI Driving
lidar is finally getting cheaper
Hee that's actually pretty cool
How to program its path?
Can raspberry Pi handle the autonomous driving?
it can, but it would be very slow vs the jetson
Will the Nano be sufficient or will have to go for TX series?
1:46 made with unity ha ha
Haha cool
Which Lidar was used
Oh my God
How to make or buy all the modules?
Very impressive, great job 👍
Good job
Wow superb work.
That is pretty impressive
Did you try using only an IMU and magnetometer instead of lidar for positioning? How does the lidar behave in open environments with few close objects to scan?
Yes, I tried using the throttle input (estimated speed vs input) and the yaw angle from an IMU, but it wasn't precise enough without wheel encoders, although the concept worked :)
I think the lidar has a range of about 12 m, so it will lose position if the objects are out of range, hence why I implemented the AI controller that uses the camera instead
Thank you for your replies. I'm working on a project very similar to yours (Jetson TX2 + neural network) that uses an IMU and encoders, but my team is struggling with the cumulative errors from the IMU readings and the magnetometer drift. We also use a camera to recognize objects, but only when they are close. It seems hard to build a good positioning system that doesn't rely on external references.
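The IMU-plus-speed dead reckoning discussed in this thread amounts to integrating heading and speed over time. A bare-bones sketch (an illustration, not either project's code) shows why it drifts: every noisy speed or yaw sample is summed into the position and, without an external reference like lidar or GPS, never corrected:

```python
import math

def dead_reckon(start, samples):
    """Integrate (speed_mps, yaw_rad, dt_s) samples into a 2-D position.
    Any bias in speed or yaw is integrated too, so the estimate drifts."""
    x, y = start
    for speed, yaw, dt in samples:
        x += speed * math.cos(yaw) * dt
        y += speed * math.sin(yaw) * dt
    return x, y
```

One second at 1 m/s heading east moves the estimate one metre along x; a constant 1% speed bias would likewise add a metre of error every 100 m driven.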
Nice work!
Which programming language did you use?
Nice project
Wow!!
👍 Spectacular
Congratulations, what a great project!
Can it sweep my floor?
Am I correct in assuming that there are no wheel encoders?
Yes, the position is estimated using lidar
@@joneivinds Am I also correct in assuming that you are using move_base with the TEB planner? But anyhow, cool project.
No, I wrote my own path logger and path tracker based on the A* algorithm
@@joneivinds Did you notice any performance or map quality issues without it? We plan on putting together a 3D slam robot without an odometer
Edit: also, do you have a link to your paper? Would be nice to see it.
@@cwill6491 Link to paper: ntnuopen.ntnu.no/ntnu-xmlui/handle/11250/2625670
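The A*-based path logic mentioned above is, in its textbook form, something like this minimal grid version (an illustration of the algorithm only, not the thesis implementation):

```python
import heapq

def a_star(grid, start, goal):
    """Minimal 4-connected A* on an occupancy grid (0 = free, 1 = blocked).
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    open_set = [(h(start), 0, start, [start])]  # (f, g, node, path)
    seen = set()
    while open_set:
        _, cost, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(open_set, (cost + 1 + h((nr, nc)), cost + 1,
                                          (nr, nc), path + [(nr, nc)]))
    return None
```

On a SLAM-built occupancy map the same idea applies, with the grid derived from the lidar map and the cost function tuned for the vehicle.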
It is not Autonomous if you keep using your RC.
Now make it drive AGGRESSIVELY
Hello there,
How did you do that? I want to do that too, please help. Is there any link for learning it, and is there documentation for it...??
It is possible that it is unrealistic to use inductive charging when it is so much easier and more efficient to just plug in.
Why ROS?
How else would you do it
@@RakshithPrakash without it
is this special?
Wonderful design, horrible music.
Volvo: you want a job?