bashrc
  • 145 videos
  • 137,327 views
Freedombone interactive installer
An interactive installer for the Freedombone self-hosted web server
460 views

Videos

Demo of big-endian Linux kernel running on Jetson TK1 (ARM Tegra)
408 views · 9 years ago
The final proof! See wiki.baserock.org/How_to_install_a_big-endian_Linux_system_to_NVIDIA_Jetson_TK1
FreedomBox Foundation - James Vasile
18K views · 12 years ago
Freedom Box Executive Director James Vasile speaks at the Contact Summit, hosted by Douglas Rushkoff and held at the Angel Orensanz Foundation in New York on October 20, 2011.
Kevin Warwick and the Seven Dwarf robots (1996)
1K views · 12 years ago
Kevin Warwick demonstrating the Seven Dwarf robots at an exhibition in Glasgow in 1996. The robots show learning of obstacle avoidance and flocking behaviors.
Navigation with the Turtlebot
459 views · 12 years ago
A successful A to B and B to A navigation from one end of the house to the other. Being able to do this multiple times is a good indication of how reliable the navigation system is.
Navigation with the Turtlebot
162 views · 12 years ago
A pretty successful navigation from one end of the house to the other, as seen within the Rviz GUI. On the way back the bot gets stuck at the final doorway.
Turtlebot following behavior
740 views · 12 years ago
The following behavior works quite well. A possible improvement would be that if it sees something which looks like a flat surface (a wall) then it should rotate to search for new people to follow.
Testing 2D SLAM
769 views · 13 years ago
The 3D point cloud from the Kinect sensor is converted to a fake laser scan and a 2D SLAM algorithm is then applied.
Testing 2D SLAM
1.6K views · 13 years ago
The 3D point cloud from the Kinect sensor is converted to a fake laser scan and a 2D SLAM algorithm is then applied.
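
The conversion described above (Kinect point cloud to fake laser scan) can be sketched roughly as follows. This is a minimal illustration, not the code used in the video; the function name fake_laser_scan and its parameters are hypothetical, and a real system would hand the result to an existing 2D SLAM package.

```python
import numpy as np

def fake_laser_scan(points, n_beams=64, fov=np.pi):
    """Collapse a 3D point cloud (x forward, y left, z up) into a 2D scan.

    Each angular bin within the field of view keeps the range of its
    closest point, as if a planar laser scanner had measured it.
    """
    x, y = points[:, 0], points[:, 1]
    angles = np.arctan2(y, x)
    ranges = np.hypot(x, y)
    scan = np.full(n_beams, np.inf)
    # Map each point's bearing onto a beam index
    beam = ((angles + fov / 2) / fov * n_beams).astype(int)
    valid = (beam >= 0) & (beam < n_beams)
    for b, r in zip(beam[valid], ranges[valid]):
        if r < scan[b]:
            scan[b] = r
    return scan
```

The resulting 1D range array has the same shape as a real laser scan, which is why a stock 2D SLAM algorithm can consume it unchanged.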
GROK2 Manual Jog in Rviz
54 views · 13 years ago
Manually jogging the robot using the joystick. Odometry isn't calibrated, but approximately somewhere in the right ballpark.
GROK2 viewing its environment
197 views · 13 years ago
The alignment of views is not very good. One of the reasons is that the Kinect sensor is heavier than the previous stereo cameras, reducing the repeatability of movement in the pan axis.
URDF model showing pan/tilt head motion
353 views · 13 years ago
URDF model showing pan/tilt head motion
Testing the Kinect on the GROK2 robot
60 views · 13 years ago
Testing the Kinect on the GROK2 robot
Kinect initial test
339 views · 13 years ago
Kinect initial test
Detecting the horizontal surface of a chair and table
46 views · 13 years ago
Detecting the horizontal surface of a chair and table
Detecting horizontally oriented surfaces within a point cloud
103 views · 13 years ago
Detecting horizontally oriented surfaces within a point cloud
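
One simple way to find horizontally oriented surfaces in a point cloud is a histogram over point heights: table tops and chair seats show up as strong peaks at a single z value. This is an illustrative sketch only (the method actually used in the video is not stated here), with a hypothetical function name and thresholds:

```python
import numpy as np

def horizontal_surfaces(points, bin_size=0.02, min_points=50):
    """Return candidate heights of horizontal surfaces in a point cloud.

    Histogram the z coordinates and report the centre of every bin with
    at least min_points supporting points.
    """
    z = points[:, 2]
    edges = np.arange(z.min(), z.max() + bin_size, bin_size)
    counts, edges = np.histogram(z, bins=edges)
    peaks = np.where(counts >= min_points)[0]
    return [(edges[i] + edges[i + 1]) / 2 for i in peaks]
```

A height histogram only works when the sensor's vertical axis is known; tilted views would first need rotating into a gravity-aligned frame.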
Two chairs
77 views · 13 years ago
Two chairs
Detecting horizontally aligned surfaces from the GROK2 robot
35 views · 13 years ago
Detecting horizontally aligned surfaces from the GROK2 robot
View of a table and chair observed by the GROK2 robot
25 views · 13 years ago
View of a table and chair observed by the GROK2 robot
View of the desktop from the Rodney robot
80 views · 13 years ago
View of the desktop from the Rodney robot
Point cloud model of the author
266 views · 13 years ago
Point cloud model of the author
View of the desktop from the Rodney robot
27 views · 13 years ago
View of the desktop from the Rodney robot
GROK2 looks at the computer desk
74 views · 13 years ago
GROK2 looks at the computer desk
The GROK2 robot views a pile of books
42 views · 13 years ago
The GROK2 robot views a pile of books
View of a room from the GROK2 robot
85 views · 13 years ago
View of a room from the GROK2 robot
Composite point cloud from the GROK2 robot
1.5K views · 13 years ago
Composite point cloud from the GROK2 robot
First composite point cloud model from the GROK2 robot
105 views · 13 years ago
First composite point cloud model from the GROK2 robot
pcloud: A point cloud viewing utility
242 views · 13 years ago
pcloud: A point cloud viewing utility
The importance of looking earnest
40 views · 13 years ago
The importance of looking earnest
Mesh model of Surveyor SRV robot
42 views · 13 years ago
Mesh model of Surveyor SRV robot

Comments

  • @Anton-wy9qu · 4 months ago

    We thought this was great when we were children, until we realised the schematics aren't accurate: some circuits don't function or are "dummies". On Cybot, one of the 'burnouts' isn't actually soldered to the right pin-outs 😅

  • @x66Hawk66x · 1 year ago

    I have the Cybot, which was based on these robots.

  • @joaopauloamado5026 · 5 years ago

    Can you share the code or more information?

  • @vaishnavj2589 · 7 years ago

    When I run "v4l2stereo -0 /dev/video1 -1 /dev/video0 --features" it says "v4l2stereo: command not found". Any idea how to resolve this?

  • @astqxrandom · 7 years ago

    Could you tell me how it works and what sensors you used?

  • @sivaprasadrajubairaju4088 · 8 years ago

    Hello sir, could you please send me a message about where you got the data for this video? I have the same project.

  • @closingtheloop2593 · 8 years ago

    This made my day... Seriously, I have been trying to verify whether this cheap webcam can do this. Thanks a bunch!

  • @cr4zyw3ld3r · 8 years ago

    I keep getting a libhighgui.so.2.1 error saying there is no such file or directory after I run the video1 and video0 command. Any help?

  • @johnsmith6990 · 11 years ago

    Amazing! I did a project back in university to detect red and green lights. It worked great in the lab, but I didn't get around to trying it on the road. Are you detecting color with a lookup table or math? I used statistics and created a square filter; it works best if calibrated against the surrounding light source. Are you using RGB? I use YUV. I think that if you have one axis as the intensity range of the surrounding light source (i.e. sun and maybe street lights), you can use the other two axes to determine the color.
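
The YUV idea in this comment, keeping intensity on the Y axis and making the color decision on the two chroma axes (U, V) so it is less sensitive to overall illumination, can be sketched like this. The conversion coefficients are the standard BT.601 ones; the classifier thresholds and function names are hypothetical illustrations, not anyone's actual traffic-light detector:

```python
def rgb_to_yuv(r, g, b):
    """BT.601 RGB -> YUV conversion (inputs in 0..255)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.14713 * r - 0.28886 * g + 0.436 * b
    v = 0.615 * r - 0.51499 * g - 0.10001 * b
    return y, u, v

def classify_light(r, g, b, chroma_threshold=40):
    """Rough red/green classification on the U/V chroma plane.

    The Y (intensity) component is ignored, so the decision depends
    mainly on chroma rather than on how bright the scene is.
    """
    _, u, v = rgb_to_yuv(r, g, b)
    if v > chroma_threshold and v > abs(u):
        return "red"
    if v < -chroma_threshold / 2 and u < 0:
        return "green"
    return "unknown"
```

A grey pixel has near-zero chroma and falls through to "unknown", which is the property that makes the intensity/chroma split attractive under changing light.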

  • @motters2001 · 11 years ago

    The code for this is available on Google Code. It's only sparse stereo, since the DSPs and the very limited bandwidth between them can't handle much more than that.

  • @jonrod1693 · 11 years ago

    How did you embed the program on the robot? I need to embed a program on the Surveyor robot with a single camera. Your help will be greatly appreciated.

  • @motters2001 · 11 years ago

    On Google Code, under libv4l2cam.

  • @SurabhiKannan · 11 years ago

    Where do I find the code for this?

  • @motters2001 · 11 years ago

    Yes. Search for v4l2stereo, since I don't think you can post URLs in YouTube comments.

  • @motters2001 · 11 years ago

    No. This is simple edge detection using non-maximal suppression, which is then stereo matched.
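
A rough sketch of what sparse edge features with non-maximal suppression can look like along a single image row. This illustrates the general technique only, not the actual v4l2stereo code; the function name and threshold are hypothetical:

```python
def edge_features(row, threshold=10):
    """Sparse edge features along one image row.

    Compute an absolute central-difference gradient, then apply
    non-maximal suppression so each edge yields exactly one feature
    position for later left/right stereo matching.
    """
    grad = [abs(row[i + 1] - row[i - 1]) for i in range(1, len(row) - 1)]
    features = []
    for j in range(1, len(grad) - 1):
        # Strictly greater than the left neighbour but only >= the right,
        # so a flat-topped gradient plateau still gives a single feature.
        if grad[j] >= threshold and grad[j] > grad[j - 1] and grad[j] >= grad[j + 1]:
            features.append(j + 1)  # offset back to row coordinates
    return features
```

Matching a handful of such features between the left and right rows is far cheaper than dense correlation, which is what makes sparse stereo feasible on limited hardware like the Surveyor's DSPs.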

  • @motters2001 · 11 years ago

    This uses the Minoru stereo webcam.

  • @RasoulMojtahedzadeh · 11 years ago

    Is the code available? I would like to do some experiments with my home-built webcam-based stereo vision system.

  • @RasoulMojtahedzadeh · 11 years ago

    Are you using SURF feature matching between left and right images?

  • @RasoulMojtahedzadeh · 11 years ago

    What is the model of the webcams?

  • @KingMysion · 12 years ago

    This is real-time enough for a car. Not for a missile, but for slow things it works.

  • @roidroid · 12 years ago

    Hehe, just found this video after stumbling across your blog and wiki during a Google search. Imagine my surprise to see that I was ALREADY subscribed to your YouTube channel. Small world, eh?

  • @cuongmits · 13 years ago

    Motters, please tell me: what is the problem you want to solve in this video?

  • @dislexable · 13 years ago

    @motters2001 And with this particular dataset?

  • @RasoulMojtahedzadeh · 13 years ago

    Please produce a point cloud demonstration in which the RGB-D voxels can be seen.

  • @motters2001 · 13 years ago

    For 320x240 resolution images on a computer several years old, yes, you can do realtime dense stereo.
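
For reference, dense stereo by block matching can be sketched as below. This naive Python version is nowhere near realtime at 320x240; practical implementations (OpenCV's StereoBM, for example) rely on SIMD and incremental cost computation. The function name and parameters are illustrative, not from any code discussed here:

```python
import numpy as np

def disparity_map(left, right, max_disp=16, block=5):
    """Brute-force SAD block matching for rectified grayscale images.

    For each pixel in the left image, try every horizontal shift up to
    max_disp into the right image and keep the shift whose block window
    has the lowest sum of absolute differences.
    """
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(int)
            best_cost, best_d = np.inf, 0
            for d in range(max_disp):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1].astype(int)
                cost = np.abs(patch - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

On a synthetic pair where the right image is the left shifted horizontally, interior pixels recover the shift exactly, which is a useful sanity check before trying real camera images.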

  • @dislexable · 13 years ago

    Can you create the disparity map in realtime, i.e. at the video's 10 fps? And with what processing requirements?

  • @dubbelfriz123 · 13 years ago

    I like the sound.

  • @anfedres · 13 years ago

    Did you do this on an actual robot? Which programming language did you use?

  • @motters2001 · 13 years ago

    In this case, yes.

  • @motters2001 · 13 years ago

    They could be, but more likely it's just an artefact of the matching algorithm.

  • @scythiconatron · 13 years ago

    Are the horizontal lines caused by interlacing in the original stereo images? I encountered this when trying to make 3D videos.

  • @motters2001 · 14 years ago

    There isn't a binary package release of this yet, but there will be soon.

  • @Plasticpals · 14 years ago

    Quick and easy is right! Looks very simple, even for someone like me?

  • @pirobotproductions · 14 years ago

    @motters2001 I see from this article (Google "sensors_sharpirrange.shtml") that the IR beam is narrower than I imagined. Funny how hard it is to find an actual specification of the beam width posted anywhere. I found one posting that measured a 0.25 cm width at 10 cm, which would be about 1.5 degrees. So this would require 60 samples per 90-degree sweep. For a 2-second sweep that would be about 33 ms per sample, which I guess is actually pretty good!
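
The arithmetic in this comment can be checked with a short calculation. The sweep_budget helper below is hypothetical, and the 0.25 cm beam width at 10 cm range is the figure quoted in the comment itself:

```python
import math

def sweep_budget(beam_width_cm, at_range_cm, sweep_deg, sweep_time_s):
    """Per-sample time budget for a swept IR ranger.

    The beam width sets the angular resolution (small-angle
    approximation), which fixes the number of distinct samples per
    sweep, which in turn fixes the time available per sample.
    """
    beam_deg = math.degrees(beam_width_cm / at_range_cm)
    samples = sweep_deg / beam_deg
    ms_per_sample = sweep_time_s / samples * 1000.0
    return beam_deg, samples, ms_per_sample

# 0.25 cm beam at 10 cm range, 90 degree sweep taking 2 seconds
beam, samples, ms = sweep_budget(0.25, 10.0, 90.0, 2.0)
```

This gives roughly a 1.4-degree beam, about 63 samples per sweep, and around 32 ms per sample, consistent with the rounded figures in the comment.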

  • @motters2001 · 14 years ago

    Possibly the scan rate could be faster, but as the scan time decreases you get less range data (looks like wider arcs) and there is more mechanical stress and noise.

  • @pirobotproductions · 14 years ago

    @motters2001 Thanks for your reply. BTW, does 2 seconds seem a little slow to you? If I am reading the specs correctly (which is not at all certain), the time between samples can be as short as 5 ms. So even at 50 ms we'd get 20 samples per second. Assuming the beam width is about 15 degrees (I can't find this in the specs), that would be 6 samples per 90 degrees, which would take only 300 ms. I suppose it depends on the microcontroller you are using and how fast it can query the IR sensors.

  • @motters2001 · 14 years ago

    The fastest looks like about 2 seconds for a 90 degree sweep. I'm also checking to see how this compares to ranges from stereo vision, since the characteristics of this sensor are unspecified by the manufacturer.

  • @pirobotproductions · 14 years ago

    This is really nice! I was wondering what is the fastest you can pan the IR sensors (say, in sweeps per second or degrees per second) while still sampling the distance readings fast enough not to introduce gaps in the map? (Hope that makes sense.)

  • @tmtyler · 14 years ago

    Heh: you name your mind children around when they are conceived.

  • @motters2001 · 14 years ago

    I was thinking of calling it "babelbrox", after the Tower of Babel and Zaphod Beeblebrox (who had two heads).

  • @tmtyler · 14 years ago

    I like it! You should call any resulting robot "max headroom" :-)

  • @motters2001 · 14 years ago

    The main reason for trying this configuration is to simplify the algorithms. With multiple mirrors on a plane it is also possible to get stereo ranges, but the data association problem remains complicated. In this arrangement there is a special epipolar geometry where we only need to compare features along radial lines, which might make this approach suitable for use on low-powered systems or DSPs.
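
The special epipolar geometry described here can be illustrated with a toy matcher: with two coaxial spherical mirrors, corresponding features share a bearing angle from the image centre, so candidate matches only need to be searched along each radial line rather than across the whole image. Everything below (the polar feature format, names, tolerance) is a hypothetical sketch of that idea:

```python
import math

def match_radial(features_inner, features_outer, angle_tol=0.02):
    """Pair features between the two mirror views of a coaxial
    catadioptric stereo rig.

    Features are (radius, angle) in polar image coordinates. Because
    the epipolar curves are radial lines, a feature in one view can
    only correspond to features at (nearly) the same bearing angle in
    the other view.
    """
    matches = []
    for r1, a1 in features_inner:
        best = None
        for r2, a2 in features_outer:
            # Smallest angular difference, wrapping around +/- pi
            da = abs(math.atan2(math.sin(a1 - a2), math.cos(a1 - a2)))
            if da < angle_tol and (best is None or da < best[1]):
                best = ((r1, a1, r2, a2), da)
        if best is not None:
            matches.append(best[0])
    return matches
```

The search space per feature shrinks from the full second image to one radial line, which is the property that makes the approach plausible for DSPs and other low-powered hardware.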

  • @tmtyler · 14 years ago

    I didn't mean moving the bottom reflector - I meant adding 4 ball bearings. I know that the software to make use of the extra information would be the difficult bit...

  • @tmtyler · 14 years ago

    I see this is called "catadioptric" vision.

  • @motters2001 · 14 years ago

    Yes you're right. You can imagine the mirrors not existing and this just being two cameras with fisheye lenses.

  • @tmtyler · 14 years ago

    Well, you don't focus on the mirror - you focus on what you are looking at. The reflected objects seen in the top mirror are further away - but you *probably* just want to focus on infinity here in any case.

  • @motters2001 · 14 years ago

    There is, but this is just because the field of view is rectangular and the mirrors are spherical. Moving the lower mirror closer to the camera makes getting a good focus on both more difficult. A solution to the focus problem would be to mount a lens in the hole of the lower mirror, but this increases the cost/complexity.

  • @tmtyler · 14 years ago

    It seems as though there is some wasted space in the corners. Room for four more reflectors there - perhaps? :-)