Unlocking the Performance of Proximity Sensors by Utilizing Transient Histograms

  • Published: 30 Jun 2024
  • Directly utilizing low-level information generated by optical time-of-flight sensors allows recovery of planar geometry and albedo from a single sensor measurement.
    Code is now available: github.com/uwgraphics/Proximi...
    Project website: cpsiff.github.io/unlocking_pr...
    Robotics and Automation Letters (RA-L) / To Appear: ICRA 2024
    0:00 Intro to Transient Sensors
    1:10 This Work
    1:53 Baseline
    2:08 Our First Method
    2:37 Our Second Method
    3:33 Evaluation
    3:48 Results
    4:17 Robot Demo
    5:47 Next Steps
  • Science

Comments • 103

  • @uwgraphics
    @uwgraphics  7 months ago +73

    To clarify something we've seen many comments on: our results don't mean that the manufacturer's solution for distance estimation is "bad". The manufacturer's solution runs onboard the sensor, which has a minuscule amount of computing power, and was likely designed to support generic use cases for their sensor, not the very specific planar recovery that we perform.

  • @DaveEtchells
    @DaveEtchells 7 months ago +22

    Brilliant work! I was surprised how well simple calibrated peak detection worked, compared to the much more complicated differentiable method.
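The "simple calibrated peak detection" baseline mentioned above can be sketched in a few lines. The bin width and calibration offset here are illustrative assumptions, not the paper's actual sensor parameters:

```python
# Assumed sensor parameters, for illustration only.
BIN_WIDTH_M = 0.01      # ~1 cm of distance per histogram time bin
CALIB_OFFSET_M = 0.02   # fixed offset recovered by calibration

def peak_detect_distance(histogram):
    """Estimate distance as the calibrated location of the strongest bin."""
    peak_bin = max(range(len(histogram)), key=lambda i: histogram[i])
    return peak_bin * BIN_WIDTH_M + CALIB_OFFSET_M

# Synthetic transient histogram with a return peak at bin 42.
hist = [0.0] * 128
hist[42] = 500.0
hist[41] = hist[43] = 200.0  # neighboring bins catch some photons too

print(round(peak_detect_distance(hist), 3))  # prints 0.44
```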

  • @tombowen9861
    @tombowen9861 7 months ago +10

    I'm immediately thinking of all the super low cost ways you could now provide backup orientation data to an IMU. With the simplicity and price-point, you could use it to provide a crosscheck to combat sensor drift over long trips. If the sample rate is too low, use two! haha. So many lunar landers have gotten turned around over the years due to sensor confusion, and here's a few dollars that can provide surface geometry data that might be used to reorient. Really neat project!

  • @Ender_Wiggin
    @Ender_Wiggin 8 months ago +69

    Very cool. It always surprises me how bad the manufacturers' solutions are. Goes to show that even if you have all the hardware you could want, software is still king.
    This makes me cry at night, being a hardware guy/lover.

    • @emiliaolfelt6370
      @emiliaolfelt6370 7 months ago +4

      just get better at hardware, duh

    • @kerimgueney
      @kerimgueney 7 months ago +6

      Kind of disappointing how the manufacturer's own "proprietary algorithm" is so extremely lackluster. Who knows what kind of amazing potential is lost because we can't actually utilize our existing hardware effectively.

    • @956870733
      @956870733 7 months ago +4

      This doesn't necessarily seem like a software issue. It's about having a different perspective on what the data is telling us, together with some math.

    • @Jakedasnake1066
      @Jakedasnake1066 7 months ago +14

      I imagine there are tradeoffs being made in the proprietary algorithm to account for harsher requirements than this paper was restricted to. For instance, it seems that they used a full-fledged PC to perform their histogram analysis, whereas the chip manufacturers must make use of the microcontroller hardware onboard the chip itself. I imagine that under those circumstances, the proprietary algorithm performs quite well.

    • @Nobody-Nowhere
      @Nobody-Nowhere 4 months ago

      Not really, as it needs to run on the chip itself. If you take the raw data out and use a much more powerful processor to filter it, it makes sense that you can get much better results.

  • @borisbadinov7757
    @borisbadinov7757 7 months ago +4

    Fantastic. Also, the clarity of your technical descriptions overcame my YouTube-induced ADHD.

  • @Lion_McLionhead
    @Lion_McLionhead 7 months ago +7

    Surprised these sensors evolved into low resolution 3x3 cameras. Definitely an advantage to apply offboard computing power to the histogram analysis. The integrated processor is a Cortex M0.

  • @jakobjorgensen7773
    @jakobjorgensen7773 7 months ago

    I've been looking for a solution like this - excellent work! Thank you

  • @ThisIsToolman
    @ThisIsToolman 7 months ago +1

    Very interesting. Looking forward to future videos involving this sensor.

  • @MyrLin8
    @MyrLin8 7 months ago +1

    Excellent work. Impressive signal use.

  • @ConsultingjoeOnline
    @ConsultingjoeOnline 7 months ago

    YES!!! Amazing research guys! Keep it up!

  • @brotherdust
    @brotherdust 7 months ago +1

    Wow! This would be useful to implement as part of an automatic bed leveling routine on a 3D printer! Nice work! Subscribed!

  • @rileybrown7023
    @rileybrown7023 8 months ago +2

    This is so cool. I love transient histograms!

  • @gluonsmx
    @gluonsmx 7 months ago

    Really smart! Thanks for sharing the rationale and code

  • @bob2859
    @bob2859 7 months ago +1

    Very cool. Looking forward to seeing if you can get geometry reconstruction working!

  • @jovaraszigmantas
    @jovaraszigmantas 7 months ago +5

    Amazing video. Can not wait to see 3D scanning.

    • @Stinktierchen
      @Stinktierchen 7 months ago

      Wasn't it done ages ago with a cheap Kinect camera from the Xbox?

  • @SirasPK
    @SirasPK 7 months ago +23

    Very good work, guys. Outperforming a company this way makes me wonder if they let the interns write the code for these things.

    • @overflow7276
      @overflow7276 7 months ago +2

      They most likely do. I once bought a muscle sensor from Seeed Studio, and the only code example they provided used the raw unfiltered signal to light up an LED bar. This is so far off from the usual field of use for these sensors that it really made me scratch my head why nobody over at Seeed Studio has actually bothered trying to get a filtered, clean signal from their hardware, especially since the filter settings depend a lot on exactly that hardware.

    • @Fennecbutt
      @Fennecbutt 7 months ago

      It's because all the C-levels are just biz majors whose only purpose in life is to convince investors and customers that their company has some sort of "secret sauce". I fucking loathe companies that NDA their datasheets, like Pixart Imaging (and plenty of others).

    • @uwgraphics
      @uwgraphics  7 months ago +2

      See the pinned comment - we actually don't think the manufacturer's solution is bad. It's just more constrained on compute and designed for more generic sensor use cases.

  • @luisca92
    @luisca92 7 months ago

    Just got a UR10e for R&D, can't wait to try this

  • @user-ui8jt6cx2k
    @user-ui8jt6cx2k 7 months ago

    Great Work!

  • @chemicburn
    @chemicburn 7 months ago

    Incredible work thanks for sharing!

  • @rojirrim7298
    @rojirrim7298 7 months ago

    Damn that was interesting. Thanks a lot for uploading this!

  • @EPMTUNES
    @EPMTUNES 5 months ago

    Pretty cool stuff here!

  • @friendlybear5924
    @friendlybear5924 7 months ago

    Very informative video. Thank you! Enjoyed it a lot.

  • @pontosinterligados
    @pontosinterligados 7 months ago

    Wow! Kudos guys! 👏

  • @MrFennicus
    @MrFennicus 8 months ago

    Awesome demo 👌🏻

  • @AlaskanInsights
    @AlaskanInsights 7 months ago

    Nice, this is a cool workaround for sure.

  • @therealnagia2461
    @therealnagia2461 7 months ago

    hooooley crap! who knew simple tof sensors had so much output you could gather such useful data from! thank you for this!

  • @jhoughjr1
    @jhoughjr1 7 months ago

    this is awesome, earned a sub.

  • @royalag007
    @royalag007 8 months ago +1

    Great video 😊 thanks a lot

  • @acatfrompoland5230
    @acatfrompoland5230 7 months ago

    Very inspiring video :3

  • @FullCircleTravis
    @FullCircleTravis 7 months ago

    Thanks for sharing this.

  • @jumpstar9000
    @jumpstar9000 5 months ago

    Very cool. You could make a MIDI controller as a demo, something like a theremin. It might also be nice to arrange a few of these in a hemispherical or triangulated configuration for a desktop device. Then you could use both hands as manipulators for a 3D input device. Also use with prosthetics. Really cool stuff guys 👍

  • @ryanreedgibson
    @ryanreedgibson 7 months ago

    This would be great to use with my thrust vectoring rocket.

  • @VacuousCat
    @VacuousCat 7 months ago

    Works like echolocation. Animals can use echolocation to detect shape, distance, and texture.

  • @mumblety
    @mumblety 7 months ago

    Cool!

  • @JaydenLawson
    @JaydenLawson 7 months ago

    3:55 wow your results are MUCH better than the proprietary distance estimates

  • @ollie-d
    @ollie-d 7 months ago +2

    Good work and clean explanation. I imagine you could get quite good results with a sensor mounted to an unbalanced motor. Although I’m not sure what the sampling rates are with these devices and it would certainly be introducing more cost and points of failure.

    • @pmj_studio4065
      @pmj_studio4065 7 months ago

      You can set the sample rate of TMF8820 anywhere on the order of 1 to 1000 Hz, trading off accuracy - the actual sample rate is something like 50kHz, but individual samples are combined for better accuracy. (I don't remember the exact numbers, but you get the point)

    • @uwgraphics
      @uwgraphics  7 months ago +3

      That's right. In our configuration the sensor sends out 4,000,000 pulses of light for every measurement. It is still able to report measurements at about 10Hz (but we are bottlenecked by the I2C interface to about 3Hz for full histograms). I'm not sure how low you can go. I expect for something like a spinning LiDAR with this sensor you would be limited by the interface bandwidth, especially if you want to get full histograms.

    • @DaveEtchells
      @DaveEtchells 7 months ago +1

      I wonder if anyone makes a cheap ToF sensor with an SPI interface?
      (Just checked; the first result was some Broadcom modules that are ~$70 in onesies, but they're intended for greater distances (up to 10m, or 100m with dual frequencies). Didn't go further; the ~$20 modules are perfect for home use cases.)

    • @uwgraphics
      @uwgraphics  7 months ago +2

      @@DaveEtchells The new ST VL53L8CH has an SPI interface. Unfortunately the example code is all in I2C so it's taking us some time to get it working over SPI. If anyone gets it working, we'd love to get in touch!

    • @DaveEtchells
      @DaveEtchells 7 months ago +1

      @@uwgraphics Ah, cool! That should give you a huge boost in frame rate. Fantastic work, it’s astonishing how much better the precision is than with the manufacturer’s own code. I can see this having lots of applications: It’s a very cheap and performant solution for a common need.
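A rough back-of-envelope for the I2C bottleneck discussed in this thread. Every number here (bus speed, bytes per bin, protocol overhead) is an illustrative assumption, not a datasheet value:

```python
# Why pulling full transient histograms over I2C caps the frame rate
# at a few Hz. All figures below are illustrative assumptions.
I2C_BITS_PER_SEC = 100_000   # standard-mode I2C bus (assumed)
ZONES = 9                    # the TMF8820 reports a 3x3 grid of zones
BINS_PER_ZONE = 128
BYTES_PER_BIN = 3            # 24-bit bin counts (assumed)
OVERHEAD = 1.25              # ACK bits, addressing, register setup (assumed)

bits_per_frame = ZONES * BINS_PER_ZONE * BYTES_PER_BIN * 8 * OVERHEAD
max_fps = I2C_BITS_PER_SEC / bits_per_frame
print(f"{max_fps:.1f} histogram frames per second")  # ~2.9 with these numbers
```

With these assumed figures the bus alone limits full-histogram readout to roughly 3 Hz, in line with the rate quoted in this thread; an SPI interface would raise the ceiling substantially.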

  • @LanceThumping
    @LanceThumping 7 months ago +1

    I'd be interested in seeing a version of this that maximizes linear accuracy. The setup I envision: the sensor is mounted on a linear rail, with a board/reflector fixed at a right angle to one end.
    Then the sensor is set at a known distance to the board, and the differentiation is run with known values for the distance/angle to calibrate for the surface reflectance.
    Finally, the sensor runs on the linear rail, using the calibrated data and the fixed angle relative to the surface to provide only distance data.
    I'm curious how much all this boiling down of the data would increase the accuracy of the linear distance measurement.
    The reason is that I've been interested in using these types of sensors for closed-loop control of a 3D printer (I like the idea of knowing exactly where the print head is at a given time), but one issue is accuracy. The sensor frequency would definitely be too slow, but there are other comments addressing that which could be explored.

  • @frosty1433
    @frosty1433 7 months ago

    Stereo cameras are also very interesting.

  • @constantinosschinas4503
    @constantinosschinas4503 7 months ago +1

    Would adding more sensors, possibly at different angles, lower the distance error across different materials?

  • @u9vata
    @u9vata 7 months ago +8

    Two questions:
    - Is there source code available for just the plane reconstruction (excluding robot moving parts and such)?
    - Is there a similar solution for longer-range sensors of this kind? I'd be pretty interested if there is, and I might have a use case in AR. Maybe the sensor range is high enough already, but I'm not sure. I know some of these ToF sensors have a longer range, around 10m at most (which would be enough for my case), but I guess this exact sensor has a smaller range.

    • @uwgraphics
      @uwgraphics  7 months ago +13

      Code has just been released! github.com/uwgraphics/ProximityPlanarRecovery
      The sensor used in this work typically has a max range of about 3 meters. We set it to "low range, high accuracy" mode to give it a max range of 1m for higher temporal resolution. Our solution could in principle work with higher range sensors. The highest I know of in this form factor have a quoted max of 5 meters (e.g. ST VL53L8)

    • @maciejurbanski6146
      @maciejurbanski6146 7 months ago +3

      @@uwgraphics brilliant work; thank you for publishing the code!

    • @paulwesley3862
      @paulwesley3862 7 months ago +1

      @@uwgraphics Thanks for publishing the code. You said you restricted it for

    • @u9vata
      @u9vata 7 months ago +1

      @@uwgraphics Thanks! I just realized it's 3 FPS because of I2C, though, so it'll likely need some sensor fusion. But this is a very cool project, with low-cost sensors becoming much more powerful than they are from the factory! Awesome work!

    • @uwgraphics
      @uwgraphics  7 months ago +1

      @@paulwesley3862 There are 128 bins, but the first 15 or so are never used. The bin resolution is the equivalent of about 1cm. That doesn't mean the final output of the sensor is limited to 1cm resolution, though.
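The ~1 cm bin figure above is consistent with time-of-flight geometry: light covers the target distance twice (out and back), so a bin of temporal width dt spans c*dt/2 of range. The ~66.7 ps bin width below is an assumed value chosen to match the quoted resolution, not a datasheet number:

```python
# Distance spanned by one histogram time bin: the pulse travels out and
# back, so range resolution is c * dt / 2.
C_M_PER_S = 299_792_458.0    # speed of light
BIN_WIDTH_S = 66.7e-12       # assumed temporal bin width (~66.7 ps)

bin_resolution_m = C_M_PER_S * BIN_WIDTH_S / 2
print(f"{bin_resolution_m * 100:.2f} cm per bin")  # prints 1.00 cm per bin
```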

  • @bendingsands87
    @bendingsands87 7 months ago +2

    I wonder if this could be used to make a device for 3D printer bed leveling, either to build a bed mesh or for real-time manual bed leveling. This is very cool.

    • @robstamm60
      @robstamm60 7 months ago +3

      Unless you come up with some insane hack to increase the accuracy from a few millimeters to less than 0.1mm, you are out of luck. These sensors measure the time of flight of light; to measure with 0.1mm accuracy you would need less than 1ps of time resolution...
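The sub-picosecond claim above checks out: in direct time-of-flight, the timing precision needed for a distance accuracy Δd is Δt = 2Δd/c. A quick sketch:

```python
# Timing precision required for a given distance accuracy in direct
# time-of-flight: the pulse covers the distance twice (out and back).
C_M_PER_S = 299_792_458.0  # speed of light

def timing_for_accuracy(delta_d_m):
    return 2 * delta_d_m / C_M_PER_S

print(f"{timing_for_accuracy(1e-4) * 1e12:.2f} ps")  # 0.1 mm -> prints 0.67 ps
```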

  • @maxmyzer9172
    @maxmyzer9172 7 months ago

    cool!

  • @syber-space
    @syber-space 7 months ago +1

    A very neat technique! Will the demo code be released publicly so this can be applied to community libraries, or is this method being kept proprietary?

    • @uwgraphics
      @uwgraphics  7 months ago +6

      We would like to release it publicly, it's just a matter of having the time to clean up the code and document it. Reach out to the first author if you're interested in getting your hands on some "research quality" code.

    • @uwgraphics
      @uwgraphics  7 months ago +4

      Code has just been released: github.com/uwgraphics/ProximityPlanarRecovery

    • @syber-space
      @syber-space 7 months ago

      @@uwgraphics awesome!

  • @ChandrashekarCN
    @ChandrashekarCN 8 months ago

    💖💖💖💖

  • @qwertyboguss
    @qwertyboguss 7 months ago

    Cool. Cool cool cool

  • @ethaneveraldo
    @ethaneveraldo 7 months ago +1

    I've been longing for an affordable, compact 3D scanner/solid-state lidar for years. So many things you could do with something like this. I use a lot of single-point ToF sensors in my projects; I didn't know there were 9-point measurement ones. Maybe by having a small array of them, each at a small angle, you could create a working 3D sensor?
    Video is incomplete. Need to see more

    • @uwgraphics
      @uwgraphics  7 months ago +4

      Combining data from multiple distributed sensors is an interesting direction for future work.
      There are higher resolution versions of these sensors available! Check out the ST VL53L8, which reports 8x8 zones. The VL53L8CH even reports histograms, but they're lower temporal resolution than the TMF8820 that we use.

  • @matthewsheeran
    @matthewsheeran 7 months ago

    I wonder if these are the new "induction" sensors (previously PIR) used in lights to detect a passing human and turn the light on?

  • @Sonic_Shroom
    @Sonic_Shroom 8 months ago

    cool

  • @dmendesf
    @dmendesf 7 months ago

    What's the model of the robotic arm? It's really cool.

    • @uwgraphics
      @uwgraphics  7 months ago

      It's a Universal Robots UR5

  • @andybrice2711
    @andybrice2711 7 months ago

    Is this a similar principle to RADAR and ultrasound scanners?

    • @uwgraphics
      @uwgraphics  7 months ago

      Those often also operate on time-of-flight, so yes!

    • @andybrice2711
      @andybrice2711 7 months ago

      @@uwgraphics But also I think they have to try and parse out multiple reflections from complex surfaces and various layers of changing impedance.

  • @MyKidFPV
    @MyKidFPV 8 months ago +1

    Very cool! Can you share your Arduino code?

    • @uwgraphics
      @uwgraphics  8 months ago +2

      yes, send an email to the first author Carter: sifferman@wisc.edu

    • @BeefIngot
      @BeefIngot 7 months ago +4

      @@uwgraphics I'm assuming then that this isn't going to be the sort of thing that gets MIT-licensed in a nice neat library, but instead licensed back to these companies to improve their sensors?

    • @uwgraphics
      @uwgraphics  7 months ago +2

      @@BeefIngot We have no partnership with a company or plans to license this. We are happy to release it with a permissive license, we just have to find the time to clean it up and document it. If you would like to get your hands on some "research quality" code, please reach out to the first author.

    • @uwgraphics
      @uwgraphics  7 months ago +2

      Our code has been released: github.com/uwgraphics/ProximityPlanarRecovery

  • @retinapoliyn7462
    @retinapoliyn7462 7 months ago

    You could easily do the same with the VL53L5X sensor

  • @Joso997
    @Joso997 7 months ago

    what about the Sun?

    • @uwgraphics
      @uwgraphics  7 months ago

      Good question! SNR is lower in sunlight because ambient light crowds out many of the "good" photons. In our testing (separate of this paper) the sensor itself is still able to function up to about half a meter even in direct sunlight. We would expect the performance of our plane finding algorithm to perform more poorly in sunlight because of the low SNR. We haven't tested it thoroughly because it's hard to get a robot arm outside!

  • @roninbadger7750
    @roninbadger7750 7 months ago

    When is this going to be in the next Samsung or Pixel, 6 months? Next Apple 5 years? next LG last year?
    "Ai assisted 3D modeling" next feature on a smart phone.

  • @BUrbbable
    @BUrbbable 7 months ago

    This won't be very accurate with translucent materials, I assume?

    • @uwgraphics
      @uwgraphics  7 months ago +2

      We haven't tested it thoroughly but probably not. There may be ways to deal with it, especially if you know the material ahead of time and have a model for its subsurface scattering.

    • @Shinobubu
      @Shinobubu 7 months ago +1

      @@uwgraphics Yeah I was thinking about modeling the material. This might be a good option for us 3D printer users as an alternative for bed leveling similar to Bambu Lab's Lidar bed scanner.

  • @TouYubeTom
    @TouYubeTom 7 months ago

    the source code is missing

    • @uwgraphics
      @uwgraphics  7 months ago +2

      It has just been released! github.com/uwgraphics/ProximityPlanarRecovery

    • @TouYubeTom
      @TouYubeTom 7 months ago

      @@uwgraphics cool thanks

  • @BitSmythe
    @BitSmythe 7 months ago

    1:55. The blue plane doesn't match the table. It looks like the bottom-left corner should be further away.

  • @benargee
    @benargee 7 months ago +2

    One small step towards no longer having to pay tips to human waiters.

  • @1kreature
    @1kreature 7 months ago

    Would just like to point out: Arduino is NOT a microcontroller.
    It is a framework and development environment, with compatible/compliant development boards containing microcontrollers from different companies such as Atmel and STM.

  • @markrix
    @markrix 8 months ago +1

    Transients... Hobos... Travelers.. oh wait no, these are way neater!!

  • @icebluscorpion
    @icebluscorpion 5 months ago

    Optimize this for bed leveling in 3D printers; other uses are pointless and stupid

  • @luzookiipcbway
    @luzookiipcbway 8 months ago

    Hello there. We found your video doing PCB-related content very interesting, and were wondering if our PCB(A) related custom services may help in your future projects? Would love to supply freely and reach any YT collab together! (PCBWay luz)

  • @napent
    @napent 7 months ago

    Nice - would love to play with that data in Python