- Videos: 72
- Views: 50,191
CE-Bench
USA
Joined Jan 16, 2016
An unscripted computer engineer's video blog: software, electronics, multirotors, and other unmanned vehicles of various flavors. You know, stuff I find worthwhile.
PYNQ Tutorial - Xilinx Virtual Cable (XVC)
In this video we're gonna do away with all those expensive smart links! I know you've got a couple of them dangling off your workbench / test fixtures. They get in the way, the ribbon cables are short AF, you're always running out of them, or even worse: "Somebody poaches yours while you're working remote!" We're gonna skip all that hassle and slap that puppy right down in the PL itself. Whoop 🙌 How to use Xilinx Virtual Cable (XVC) with PYNQ:
GitHub Repo:
github.com/Austin-Parks/PYNQ_XilinxVirtualCable
forked from Xilinx Virtual Cable repo:
github.com/Xilinx/XilinxVirtualCable
Instructions are in the README...
- Austin P.
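For context, the XVC 1.0 wire protocol that such a server speaks is tiny: three ASCII-prefixed commands over TCP. Below is a minimal, transport-agnostic sketch in Python (my own illustration; names like `XvcHandler` are hypothetical and not taken from the repo), with the JTAG shift stubbed out as a TDI-to-TDO loopback so the framing can be exercised without hardware.

```python
import struct

def nbytes(bits: int) -> int:
    """XVC packs scan vectors into bytes; length is ceil(bits / 8)."""
    return (bits + 7) // 8

class XvcHandler:
    """Minimal XVC 1.0 command handler (hypothetical name, transport-agnostic).

    A real PYNQ/PL version would drive the JTAG state machine from the
    shift callback; the default here just loops TDI back as TDO.
    """
    def __init__(self, max_vector=4096, shift_fn=None):
        self.max_vector = max_vector
        self.shift_fn = shift_fn or (lambda bits, tms, tdi: tdi)

    def handle(self, cmd: bytes) -> bytes:
        if cmd.startswith(b"getinfo:"):
            # Advertise protocol version and max vector length in bytes.
            return b"xvcServer_v1.0:%d\n" % self.max_vector
        if cmd.startswith(b"settck:"):
            # TCK period in ns, 4 bytes little-endian; echo the achieved period.
            return cmd[7:11]
        if cmd.startswith(b"shift:"):
            # 4-byte LE bit count, then TMS vector, then TDI vector.
            (bits,) = struct.unpack("<I", cmd[6:10])
            n = nbytes(bits)
            tms = cmd[10:10 + n]
            tdi = cmd[10 + n:10 + 2 * n]
            return self.shift_fn(bits, tms, tdi)
        raise ValueError("unknown XVC command")
```

Wrapping this in a `socketserver.TCPServer` and pointing Vivado's hardware manager at the port is the remaining plumbing; the repo's README covers the PYNQ-specific setup.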
Views: 60
Videos
FPGA Midi Synth - (ArtyZ7 + Petalinux)
228 views, 3 years ago
Just an informal video blog update on the progress of a personal DIY project of mine. ArtyZ7 MIDI Synth: The aim of the project is to create a standalone MIDI synthesizer using the Arty Z7-20 project board (from Digilent, Inc.). It features an advanced Xilinx Zynq-7020 SoC (System on Chip) that combines a dual-core 650 MHz ARM Cortex-A9 with Artix-7-equivalent FPGA fabric in a single chip package. I'll ...
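A MIDI synth needs note-to-frequency conversion somewhere; below is the standard equal-temperament math, plus the phase-accumulator tuning word a typical FPGA NCO/DDS oscillator would use. This is a sketch of the general technique, not the project's actual code; the 48 kHz sample rate and 32-bit accumulator are my assumptions.

```python
def midi_to_freq(note: int) -> float:
    """Equal-temperament frequency for a MIDI note number (A4 = note 69 = 440 Hz)."""
    return 440.0 * 2.0 ** ((note - 69) / 12.0)

def phase_increment(freq: float, sample_rate: int = 48_000, acc_bits: int = 32) -> int:
    """Tuning word for a fixed-point phase accumulator: the oscillator adds
    this value every sample tick, and the accumulator's top bits index a
    wavetable (assumed sample rate and accumulator width)."""
    return round(freq / sample_rate * (1 << acc_bits))
```

In fabric the accumulator is just a free-running adder, so per-voice cost is one register and one add per sample.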
Autonomous Navigation - True Mine Sweeping
78 views, 5 years ago
Did some quirky optimizations to reduce and concentrate the search space in the middle of the squares. This would only be for the purposes of competition; real-world mines will be placed more randomly.
Plan B
35 views, 5 years ago
So I believe I may have found a viable solution to our autonomous navigation crisis. This ROS package (frontier_exploration) should work out of the box, as long as we properly calibrate and tune up our robot platform. Worth a shot, I guess...
ROS Tutorial - Using Qt Creator
8K views, 5 years ago
Here I'm just showing the steps necessary to use Qt Creator with a pre-existing ROS workspace using the catkin build system (not "catkin_make", but the Python catkin-tools method, "catkin build"). This enables autocomplete, highlighting, and many other nice modern features for ROS C++ development. This shows how to open up an entire ROS workspace in Qt Creator from start to finish.
Static Path Sequencing
38 views, 5 years ago
So I've figured out how to take control of move_base and force a sequence of static paths. This would be for situations where you know you're doing a (20×20) meter search square and just want to map the detection of metal objects.
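One way to generate such a static path sequence offline is a simple boustrophedon (lawnmower) sweep. The sketch below is my own illustration, not the video's code; it emits the waypoint pairs you would then feed to move_base one static path at a time (the actionlib plumbing is left out).

```python
def lawnmower_waypoints(size=20.0, lane_spacing=1.0, origin=(0.0, 0.0)):
    """Boustrophedon sweep of a size x size search square.

    Returns [(x, y), ...] ordered so each consecutive pair is one
    straight pass along x, stepping by lane_spacing in y and
    alternating direction every lane.
    """
    x0, y0 = origin
    pts = []
    n_lanes = int(size / lane_spacing) + 1
    for i in range(n_lanes):
        y = y0 + i * lane_spacing
        # Even lanes go left-to-right, odd lanes right-to-left.
        xs = (x0, x0 + size) if i % 2 == 0 else (x0 + size, x0)
        pts.append((xs[0], y))
        pts.append((xs[1], y))
    return pts
```

Lane spacing would be set to (or slightly under) the metal detector's effective swath width so passes overlap.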
Global Path Planner move base plugin
465 views, 5 years ago
Finally figured out how to interface with move_base at the plugin level. Much improved results for simple mine sweeping.
Servo Tracking Control - Box Sequence Test
14 views, 6 years ago
Servo position tracking control - test 1
32 views, 6 years ago
The video is great; you need to keep promoting your channel. It's not hard.
If driving servos, use the Servo library instead, to get proper servo PWM signals.
Could you share the source GitHub packages with us?
Instead of defining your own FOSC you could just use F_CPU, I think.
Thanks for sharing man!❤️ And easily explained👍🏽
Can you share the image shown initially (the ROS, Gazebo, MAVLink & QGroundControl flow chart you drew, at 1:51)? Thank you.
Best tutorial. Thanks from India.
Hi, are you able to share a video about setting up Qt QML together with ROS?
Next time, use the draw.io program to graph things; there's basically no learning curve.
Hi, would you please send me your code?
Hello, I need some help after opening the renamed file, because I do not have the CMake kit (my kits are empty). What can I do to fix this?
Hello, I accidentally found your video and I think this is the approach I need to follow. I'm doing an experiment which consists of using a Raspberry Pi to send command messages to an Arduino Mega via a serial link. The problem I'm facing (just like the example you initially showed) is that these command messages are not entirely read, and I cannot constantly poll the serial buffer to check for new messages since my Arduino needs to be doing other stuff. What I need is an interrupt that fires as soon as the Arduino receives data from the Pi, just like I think you're doing here.

However, your code is a bit complex for me to understand, so I would like to ask if there's any chance I could achieve the same output using Arduino libraries? By the way, do you have any documentation you can suggest that explains your approach in "baby steps" so I can try to learn how to fix this problem? I've been searching for days now and still haven't found a proper solution xD (I don't want to simply copy code because I wouldn't understand what it does). Thank you for this video; it has enlightened me a lot.
Amazing
Source code available?
Nice video! If possible, would you make a tutorial about rqt (the C++ and Python versions of rqt) to control our drone with ROS / Gazebo / PX4 too? Thank you very much.
After seeing this I just changed my mind :’D
Hello, I am using a Sabertooth 2x60 motor driver and I want to interface it with ROS. Can you share the code and hardware interface? Thank you so much.
wow
Hello, could you help me interface a Sabertooth 2x60 with Arduino and ROS?
Hi, can we have the Firmware folder? I've downloaded it from GitHub but my build folder doesn't contain the px4_rover app!
I have made a video on PX4 Gazebo simulation: ruclips.net/video/pLNs_c5SLEI/видео.html
Works like a charm
Guys, refer to this for PX4 Gazebo simulation: ruclips.net/video/pLNs_c5SLEI/видео.html
Could you by any chance send me the code?
Good intro. Can you post that hand-drawn diagram? Thank you.
can I access the code to this, I would love to play around with it
can i have the source code? iarc 2016 dev
I am also an engineering student at UAH (MAE). Do you mind me asking what course teaches this? In the MAE department we have studied robotic control more theoretically, but this seems to get into practical application.
Are you still developing this unit?
deaf!
thanks a lot
This is so helpful! I am wondering what is the version of Gazebo that you're using and did you build from source or binary?
Fascinating work! Do you know how to set up a ground rover in PX4 using a Pixhawk Mini? Whenever I look for the "Rover" airframe in the PX4 setup from QGroundControl I can never find it. Is PX4 only for air vehicles?
No, it works for rovers too. It's not easy to find good info on how to set it up, though. I know I went through a nightmare a while back and gave up after I got the results in this video with a hacked-together steering controller (deprecated at this point). Eventually, a newer firmware version was released that finally shipped with the missing rover attitude / position controllers as standard. I'm not sure why you wouldn't see it as an option though...? ¯\_(ツ)_/¯ What version of QGroundControl are you using? What version of the PX4 firmware are you running? I don't know if it makes a difference, but I use the current development branch.
I understand it must have been an intense nightmare. Do you know which is the latest or best version of PX4 that supports rovers?
My apologies, I didn't see the remainder of your reply. I installed the latest version of both QGC and PX4 and now I see the Rover "airframe" option! Thanks for the hint. Can't wait to get to building/tuning! One last question: I see the 3 options for rover vehicle types are "generic ground vehicle, axial RC car, and the Stampede RC car"; is there a skid-steer or tank-steer vehicle option, or only Ackermann-style vehicles?
What you need is a motor controller that can take the Ackermann-style (2-channel RC PWM | steering / throttle) signals from the Pixhawk and control a differential / skid-drive motor pair. I'll link a good one below that works for most motors:

www.robotshop.com/en/sabertooth-dual-2x32a-6v-24v-regenerative-motor-driver.html

The control mode I'm talking about is detailed on page 20 of the data sheet:

www.robotshop.com/media/files/pdf/rb-dim-47-datasheet.pdf

A video demo of this using a wheelchair, raw RC receivers, and this motor controller (no Pixhawk involved): ruclips.net/video/hS_yKXXlUsc/видео.html (works with wheelchair motors)
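The mixed mode being described boils down to simple arcade mixing of the two RC channels. A rough Python illustration of the idea (the Sabertooth does this onboard in hardware; this is just to show the math, and the function name is mine):

```python
def mix_to_differential(throttle: float, steering: float):
    """Convert (throttle, steering) commands in [-1, 1] into left/right
    motor commands for a skid/differential drive, clamped to [-1, 1].
    Positive steering turns right: left wheel speeds up, right slows down."""
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(throttle + steering), clamp(throttle - steering)
```

Full throttle with full steering saturates one side, which is why mixed-mode controllers feel sluggish in tight turns at speed.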
That's exactly what I was looking into you read my mind. Thanks again for all the help I'll keep you posted as the project unfolds.
Hi, where can I download the IGVC-Dev folder?
Hi, I was asking how you added the physical camera to the Iris model, because I tried to modify iris.sdf but it does not work! Thank you very much.
So first off, the "iris.sdf" file gets re-generated every time you run make posix_sitl_gazebo (be careful modifying it in place; it might get overwritten). I think it's made from a ".urdf" file. I don't know anything about URDF, so I just copy the iris model folder from wherever it gets built to my ROS workspace (I use "ROS_workspace/models" for my model directory, but anywhere works as long as it's set in your ".bashrc" file | GAZEBO_MODEL_PATH). Then I modify the file to add the camera. I added this INSIDE of the <base_link> tag towards the bottom:

<collision name='base_link_lump::camera_link_collision_1'>
  <pose frame=''>0.12 0 -0.01 0 0.7 0</pose>
  <geometry>
    <box>
      <size>0.05 0.05 0.05</size>
    </box>
  </geometry>
  <surface>
    <contact>
      <ode/>
    </contact>
    <friction>
      <ode>
        <mu>0.2</mu>
        <mu2>0.2</mu2>
      </ode>
    </friction>
    <bounce/>
  </surface>
  <max_contacts>10</max_contacts>
</collision>
<visual name='base_link_lump::camera_link_visual_1'>
  <pose frame=''>0.12 0 -0.01 0 0.7 0</pose>
  <geometry>
    <box>
      <size>0.05 0.05 0.05</size>
    </box>
  </geometry>
  <material>
    <script>
      <name>Gazebo/Red</name>
      <uri>file://media/materials/scripts/gazebo.material</uri>
    </script>
  </material>
</visual>
<sensor name='camera1' type='camera'>
  <update_rate>20</update_rate>
  <camera name='head'>
    <horizontal_fov>1.39626</horizontal_fov>
    <image>
      <width>320</width>
      <height>240</height>
      <format>R8G8B8</format>
    </image>
    <clip>
      <near>0.02</near>
      <far>300</far>
    </clip>
    <noise>
      <type>gaussian</type>
      <mean>0</mean>
      <stddev>0.007</stddev>
    </noise>
  </camera>
  <plugin name='camera_controller' filename='libgazebo_ros_camera.so'>
    <alwaysOn>true</alwaysOn>
    <updateRate>20</updateRate>
    <robotnamespace>iris</robotnamespace>
    <cameraName>mavlink/</cameraName>
    <!-- Video feed will be published as: ... -->
    <imageTopicName>image_raw</imageTopicName>
    <cameraInfoTopicName>cam_info</cameraInfoTopicName>
    <frameName>camera_link</frameName>
    <hackBaseline>0.07</hackBaseline>
    <distortionK1>0.0</distortionK1>
    <distortionK2>0.0</distortionK2>
    <distortionK3>0.0</distortionK3>
    <distortionT1>0.0</distortionT1>
    <distortionT2>0.0</distortionT2>
  </plugin>
  <pose frame=''>0.12 0 -0.01 0 0.7 0</pose>
</sensor>

This is from an old workspace, so I hope it still works for you. Let me know how it goes.
Thank you very much for your help!!
Hi, I am using Gazebo, RViz, and QGroundControl but am unable to get all three working simultaneously. Can you please suggest which commands we need to use to work with all three applications?
What you need is the 'mavros' ROS package (I'm on Indigo):

sudo apt-get install ros-indigo-mavros ros-indigo-mavros-extras

It provides a way for the virtual PX4 to connect to it, and from there allows a 'gcs_bridge' UDP address to be forwarded to QGroundControl. Sorry it took me so long. Here's my mavros launch script:

<!-- ___________________________ MAVROS ________________________ -->
<!-- this is the PX4 SITL address, the QGroundControl default I think -->
<arg name="fcu_url" default="udp://:14550@127.0.0.1:14557" />
<!-- mavros forwards a multiplexed MAVLink stream to this UDP port -->
<arg name="gcs_url" default="udp://@127.0.0.1:12600" />
<!-- gotta start PX4 and mavros first; otherwise, QGroundControl will bind
     to the default port. I disabled the default connection.
     note: you'll have to create a new connection in QGroundControl's
     general settings and provide the correct UDP port to mavros -->
<arg name="tgt_system" default="1" />
<arg name="tgt_component" default="50" />
<arg name="log_output" default="screen" />
<include file="$(find mavros)/launch/node.launch">
  <arg name="pluginlists_yaml" value="$(find mavros)/launch/px4_pluginlists.yaml" />
  <arg name="config_yaml" value="launch/px4_config.yaml" />
  <arg name="fcu_url" value="$(arg fcu_url)" />
  <arg name="gcs_url" value="$(arg gcs_url)" />
  <arg name="tgt_system" value="$(arg tgt_system)" />
  <arg name="tgt_component" value="$(arg tgt_component)" />
</include>

In short, the Gazebo rotor / IMU plugins connect to the PX4 SITL by default. The PX4 SITL connects to the mavros node as if it were QGroundControl. Then, mavros exposes a multiplexed MAVLink port to QGroundControl. Now mavros can send commands to the PX4, or QGroundControl can send commands (through mavros) to the PX4, and they all communicate properly. Yeah, it's kind of a frustrating setup at times.
Hi, what a great performance! Could you please explain how you added a camera to the simulation? I am using the PX4 prepared per the Developer Guide section of the PX4 site (dev.px4.io/en/simulation/gazebo.html), and the code is here (github.com/PX4/Firmware.git). The problem is that Gazebo runs separately from ROS, so I am not able to add a camera and receive its image in ROS. Your previous explanation would have been useful if you had run Gazebo via ROS rather than separately, and the link you mentioned above isn't working, so your previous explanation didn't help me.
How did you add a ground texture in Gazebo?
I want to know the same thing
Hi, great work!!! Which software or SDK do you use? It looks very professional!!! I have motors from LDPower (www.ldmpower.com). Maybe you want to test these? If you are interested, I can of course provide more accurate data. We usually do machine construction. If you are interested, we could develop this project further...
Hello, nice work! Can you explain and/or send a link to understand how to add a camera to the simulation? Thanks!
Yeah, it took me many many hours to understand how to get a ROS-topic-enabled camera in the simulation; there's very little documentation on it. I host the simulation on a live gzweb server: uah-iarc-dev.org:8080

The bulk of the work is in the Gazebo 'model.sdf' files. Gotta have Gazebo 6, ros-indigo-gazebo-pkgs, and either launch Gazebo via a roslaunch script or manually run it with the ROS server plugin:

gazebo --server-plugin /opt/ros/indigo/lib/libgazebo_ros_api_plugin.so [Your_File.world]

Be sure to set all the environment variables for Gazebo, namely the plugin path variable:

export GAZEBO_PLUGIN_PATH=${GAZEBO_PLUGIN_PATH}:/opt/ros/indigo/lib

Total pain in the ass; trial and error got it to work for me.

################ Here are some important bits of the .sdf file I used ###############
###### Camera definition / this is the .sdf telling Gazebo to add a camera at some
###### pre-defined position on the robot, a.k.a. "the camera link".
###### This also defines the ROS topic name / image format and various other camera
###### parameters.
<sensor name='camera1' type='camera'>
  <update_rate>20</update_rate>
  <camera name='head'>
    <horizontal_fov>1.39626</horizontal_fov>
    <image>
      <width>752</width>
      <height>480</height>
      <format>R8G8B8</format>
    </image>
    <clip>
      <near>0.02</near>
      <far>300</far>
    </clip>
    <noise>
      <type>gaussian</type>
      <mean>0</mean>
      <stddev>0.007</stddev>
    </noise>
  </camera>
  ### here is where most of the ROS stuff gets defined. Be sure that GAZEBO_PLUGIN_PATH
  ### (bash environment variable) includes the ros-gazebo-pkgs directory.
  <plugin name='camera_controller' filename='libgazebo_ros_camera.so'>
    <alwaysOn>true</alwaysOn>
    <updateRate>20</updateRate>
    <robotnamespace>iarc-robot</robotnamespace>
    <cameraName>sensor/cam_1</cameraName>
    <imageTopicName>image_raw</imageTopicName>
    <cameraInfoTopicName>cam_1_info</cameraInfoTopicName>
    ### this next part is important; make sure to have defined this link.
    ### May need to read up on the .sdf spec and how scripted links work.
    <frameName>camera_link</frameName>
    <hackBaseline>0.07</hackBaseline>
    <distortionK1>0.0</distortionK1>
    <distortionK2>0.0</distortionK2>
    <distortionK3>0.0</distortionK3>
    <distortionT1>0.0</distortionT1>
    <distortionT2>0.0</distortionT2>
  </plugin>
  <pose frame=''>0.12 0 -0.01 0 0.7 0</pose>
</sensor>

Just send me a PM if you need any more help or if you've got a specific error message that keeps ruining your day.
Thank you! I'll let you know about the final result! :)
Glad to help, I'll be looking forward to it then.
Congrats, man! Huge work, and for sure people will be interested in a DIY kit like this with software (me first!)
+macfly1202 I did it for a university project. I wound up spending more money and time on it than I had planned. :-/ I just don't have the spare time or money to do much with it at this point.
Love it! Well done.
It flies much better after dialing up the gains. I've added 2 CCTV board cameras. One is kind of a wide-angle front view, and the other is a target-drop camera that faces straight down. I can switch between them with a switch, and the OSD changes between the target-drop and normal front-facing views.
The software is really buggy at the moment. I've never tried to make an actual gui application before this project. The dyno controller firmware isn't that reliable either. I just need some time to work out some of the bugs and remove unused / broken features, as well as fix the features that I want to work.