Unlocking the Performance of Proximity Sensors by Utilizing Transient Histograms

  • Published: Jan 9, 2025

Comments • 104

  • @uwgraphics
    @uwgraphics  1 year ago +77

    To clarify something we've seen many comments on: our results don't mean that the manufacturer's solution for distance estimation is "bad". The manufacturer's solution runs onboard the sensor, which has a minuscule amount of computing power, and was likely designed to support generic use cases for their sensor, not the very specific planar recovery that we perform.

  • @DaveEtchells
    @DaveEtchells 1 year ago +22

    Brilliant work! I was surprised how well simple calibrated peak detection worked, compared to the much more complicated differentiable method.
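
A minimal sketch of what "calibrated peak detection" on a transient histogram can look like; this is an illustrative reconstruction, not the authors' baseline code, and the bin width, interpolation step, and calibration offset are all assumptions:

```python
import numpy as np

def peak_distance(histogram, bin_width_m=0.01, offset_m=0.0):
    """Estimate target distance from a transient histogram via its peak.

    bin_width_m (~1 cm assumed here) and offset_m are calibration
    values you would measure against a target at a known distance.
    """
    hist = np.asarray(histogram, dtype=float)
    i = int(np.argmax(hist))  # bin with the strongest return
    peak = float(i)
    # Parabolic interpolation around the peak for sub-bin precision.
    if 0 < i < len(hist) - 1:
        a, b, c = hist[i - 1], hist[i], hist[i + 1]
        denom = a - 2 * b + c
        if denom != 0:
            peak = i + 0.5 * (a - c) / denom
    return peak * bin_width_m + offset_m
```

For a return pulse that is symmetric about a bin center this recovers the peak bin exactly; the parabolic step only matters when the return straddles two bins.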

  • @borisbadinov7757
    @borisbadinov7757 1 year ago +6

    Fantastic. Also, the clarity of your technical descriptions overcame my YouTube-induced ADHD.

  • @tombowen9861
    @tombowen9861 1 year ago +11

    I'm immediately thinking of all the super low cost ways you could now provide backup orientation data to an IMU. With the simplicity and price-point, you could use it to provide a crosscheck to combat sensor drift over long trips. If the sample rate is too low, use two! haha. So many lunar landers have gotten turned around over the years due to sensor confusion, and here's a few dollars that can provide surface geometry data that might be used to reorient. Really neat project!

  • @Lion_McLionhead
    @Lion_McLionhead 1 year ago +7

    Surprised these sensors evolved into low-resolution 3x3 cameras. Definitely an advantage to apply offboard computing power to the histogram analysis. The integrated processor is a Cortex-M0.

  • @Ender_Wiggin
    @Ender_Wiggin 1 year ago +69

    Very cool. It always surprises me how bad the manufacturers' solutions are. Goes to show that even if you have all the hardware you could want, software is still king.
    This makes me cry at night, being a hardware guy / lover.

    • @emiliaolfelt6370
      @emiliaolfelt6370 1 year ago +4

      just get better at hardware, duh

    • @kerimgueney
      @kerimgueney 1 year ago +6

      Kind of disappointing how the "proprietary algorithm" from the manufacturer itself is so extremely lackluster. Who knows what kind of amazing potential is lost because we can't actually utilize our existing hardware effectively.

    • @956870733
      @956870733 1 year ago +4

      This doesn't necessarily seem like a software issue. It's about having a different perspective on what the data is telling us, together with some math.

    • @Jakedasnake1066
      @Jakedasnake1066 1 year ago +15

      I imagine there are tradeoffs being made in the proprietary algorithm to account for harsher requirements than this paper was restricted to. For instance, it seems that they used a full fledged PC to perform their histogram analysis, whereas the chip manufacturers must make use of the microcontroller hardware onboard the chip itself. I imagine that under those circumstances, the proprietary algorithm is performing quite well.

    • @Nobody-Nowhere
      @Nobody-Nowhere 11 months ago

      Not really, as it needs to run on the chip itself. If you take out the raw data and use a much more powerful processor to filter it, it makes sense that you can get much better results.

  • @jovaraszigmantas
    @jovaraszigmantas 1 year ago +5

    Amazing video. Can't wait to see 3D scanning.

    • @Stinktierchen
      @Stinktierchen 1 year ago

      Wasn't it done ages ago with a cheap Kinect camera from the Xbox?

  • @SirasPK
    @SirasPK 1 year ago +23

    Very good work, guys. Outperforming a company this way makes me wonder if they let the interns write the code for these things.

    • @overflow7276
      @overflow7276 1 year ago +2

      They most likely do. I once bought a muscle sensor from Seeed Studio and the only code example they provided used the raw unfiltered signal to light up an LED bar. This is so far off from the usual field of use for these sensors that it really made me scratch my head why nobody over at Seeed Studio has actually bothered trying to get a filtered, clean signal from their hardware, especially since the filter settings depend a lot on exactly that hardware.

    • @Fennecbutt
      @Fennecbutt 1 year ago

      It's cause all the C levels are just biz majors whose only purpose in life is to convince investors and customers that their company has some sort of "secret sauce". I fucking loathe companies that NDA their datasheets like Pixart Imaging (and plenty of others).

    • @uwgraphics
      @uwgraphics  1 year ago +3

      See the pinned comment - we actually don't think the manufacturer's solution is bad. It's just more constrained on compute and designed for more generic sensor use cases.

  • @ThisIsToolman
    @ThisIsToolman 1 year ago +1

    Very interesting. Looking forward to future videos involving this sensor.

  • @brotherdust
    @brotherdust 1 year ago +1

    Wow! This would be useful to implement as part of an automatic bed leveling routine on a 3D printer! Nice work! Subscribed!

  • @ConsultingjoeOnline
    @ConsultingjoeOnline 1 year ago

    YES!!! Amazing research guys! Keep it up!

  • @luisca92
    @luisca92 1 year ago

    Just got a UR10e for R&D, can’t wait to try this

  • @JaydenLawson
    @JaydenLawson 1 year ago +1

    3:55 wow your results are MUCH better than the proprietary distance estimates

  • @MyrLin8
    @MyrLin8 1 year ago +1

    Excellent work. Impressive signal use.

  • @gluonsmx
    @gluonsmx 1 year ago

    Really smart! Thanks for sharing the rationale and code

  • @jakobjorgensen7773
    @jakobjorgensen7773 1 year ago

    I've been looking for a solution like this - excellent work! Thank you

  • @LanceThumping
    @LanceThumping 1 year ago +1

    I'd be interested in seeing a version of this that maximizes linear accuracy. The setup I envision: the sensor is mounted on a linear rail, with a board/reflector fixed at a right angle to one end.
    Then the sensor is set at a known distance from the board, and the differentiable method is run with known values for the distance/angle to calibrate for the surface reflectance.
    Finally the sensor runs on the linear rail, using the calibrated data and the fixed angle relative to the surface to provide only distance data.
    I'm curious how much all this boiling down of the data will increase the accuracy of the linear distance measurement.
    The reason is that I've been interested in the idea of using these types of sensors for closed-loop control of a 3D printer (I like the idea of knowing exactly where the print head is at a given time), but one issue is the accuracy. The sensor frequency would definitely be too slow, but there are other comments addressing that which could be explored.

  • @VacuousCat
    @VacuousCat 1 year ago

    Works like echolocation. Animals can use echolocation to detect shape, distance, and texture.

  • @rileybrown7023
    @rileybrown7023 1 year ago +2

    This is so cool. I love transient histograms!

  • @jumpstar9000
    @jumpstar9000 11 months ago

    Very cool. You could make a MIDI controller as a demo, something like a theremin. It might also be nice to arrange a few of these in a hemispherical or triangulated configuration for a desktop device. Then you could use both hands as manipulators for a 3D input device. Also useful for prosthetics. Really cool stuff guys 👍

  • @bob2859
    @bob2859 1 year ago +1

    Very cool. Looking forward to seeing if you can get geometry reconstruction working!

  • @mohammade.8770
    @mohammade.8770 10 hours ago

    Good explanation.

  • @nagiaVR
    @nagiaVR 1 year ago

    hooooley crap! who knew simple ToF sensors had so much output you could gather such useful data from! thank you for this!

  • @u9vata
    @u9vata 1 year ago +8

    Two questions:
    - Is there source code available for just the plane reconstruction (excluding robot moving parts and such)?
    - Is there a similar solution for higher-range sensors? I would be pretty interested if there is, and might have a use case for AR. Maybe the sensor range is high enough already, but I'm not sure. I know some of these ToF sensors have longer ranges, around 10m at most (which would be enough for my case), but I guess this exact sensor has a smaller range.

    • @uwgraphics
      @uwgraphics  1 year ago +13

      Code has just been released! github.com/uwgraphics/ProximityPlanarRecovery
      The sensor used in this work typically has a max range of about 3 meters. We set it to "low range, high accuracy" mode to give it a max range of 1m for higher temporal resolution. Our solution could in principle work with higher range sensors. The highest I know of in this form factor have a quoted max of 5 meters (e.g. ST VL53L8)

    • @maciejurbanski6146
      @maciejurbanski6146 1 year ago +3

      @@uwgraphics brilliant work; thank you for publishing the code!

    • @paulwesley3862
      @paulwesley3862 1 year ago +1

      @@uwgraphics thanks for publishing the code. you said you restricted it for

    • @u9vata
      @u9vata 1 year ago +1

      @@uwgraphics Thanks! I just realized it's 3 FPS because of I2C though, so it would likely need some sensor fusion, but this is a very cool project, with low-cost sensors becoming much more powerful than they are from the factory! Awesome work!

    • @uwgraphics
      @uwgraphics  1 year ago +1

      @@paulwesley3862 There are 128 bins, but the first 15 or so are never used. The bin resolution is the equivalent of about 1cm. That doesn't mean the final output of the sensor is limited to 1cm resolution, though.
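
The ~1 cm bin figure maps to time via round-trip time of flight: one-way distance is c·t/2, so a 1 cm bin corresponds to roughly 67 ps. A small sketch of the conversion (`bin_to_distance` and its defaults are illustrative helpers based on the reply above, not TMF8820 driver code):

```python
C = 299_792_458.0  # speed of light, m/s

def bin_to_distance(bin_index, bin_width_s, first_used_bin=15):
    """Map a histogram bin index to a one-way distance in metres.

    first_used_bin reflects "the first 15 or so are never used";
    bin_width_s would come from calibration.
    """
    t = (bin_index - first_used_bin) * bin_width_s  # round-trip time
    return C * t / 2.0  # halve it: the light travels out and back

# A ~1 cm bin resolution implies a bin width of roughly 67 ps:
bin_width_s = 2 * 0.01 / C  # ≈ 6.67e-11 s
```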

  • @ryanreedgibson
    @ryanreedgibson 1 year ago

    This would be great to use with my thrust vectoring rocket.

  • @rojirrim7298
    @rojirrim7298 1 year ago

    Damn that was interesting. Thanks a lot for uploading this!

  • @bendingsands87
    @bendingsands87 1 year ago +2

    I wonder if this could be used to make a device for 3D printer bed leveling, either to make a bed mesh or for real-time manual bed leveling. This is very cool.

    • @robstamm60
      @robstamm60 1 year ago +3

      Unless you come up with some insane hack to increase the accuracy from a few millimeters to less than 0.1mm you are out of luck. These sensors measure the time of flight of light - to measure with 0.1mm accuracy you would need to measure with less than 1ps time resolution...
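
The "< 1 ps" figure follows directly from the speed of light: a depth step Δd changes the round-trip time by 2·Δd/c. A quick check of the arithmetic:

```python
C = 299_792_458.0  # speed of light, m/s

def required_timing_resolution(depth_step_m):
    """Round-trip time difference corresponding to one depth step."""
    return 2.0 * depth_step_m / C

dt = required_timing_resolution(1e-4)  # 0.1 mm depth step
# dt comes out to about 0.67 ps, consistent with the "< 1 ps" claim
```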

  • @friendlybear5924
    @friendlybear5924 1 year ago

    Very informative video. Thank you! Enjoyed it a lot.

  • @EPMTUNES
    @EPMTUNES 1 year ago

    Pretty cool stuff here!

  • @AndreasB-p8w
    @AndreasB-p8w 1 year ago

    Great Work!

  • @ollie-d
    @ollie-d 1 year ago +2

    Good work and clean explanation. I imagine you could get quite good results with a sensor mounted to an unbalanced motor. Although I’m not sure what the sampling rates are with these devices and it would certainly be introducing more cost and points of failure.

    • @pmj_studio4065
      @pmj_studio4065 1 year ago

      You can set the sample rate of TMF8820 anywhere on the order of 1 to 1000 Hz, trading off accuracy - the actual sample rate is something like 50kHz, but individual samples are combined for better accuracy. (I don't remember the exact numbers, but you get the point)

    • @uwgraphics
      @uwgraphics  1 year ago +3

      That's right. In our configuration the sensor sends out 4,000,000 pulses of light for every measurement. It is still able to report measurements at about 10Hz (but we are bottlenecked by the I2C interface to about 3Hz for full histograms). I'm not sure how low you can go. I expect for something like a spinning LiDAR with this sensor you would be limited by the interface bandwidth, especially if you want to get full histograms.
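
A back-of-the-envelope estimate of why the bus, not the sensor, limits full-histogram rates. Every parameter here (bytes per bin, wire overhead, Fast-mode I2C at 400 kHz) is an illustrative assumption, not a TMF8820 datasheet value:

```python
def max_full_histogram_rate(zones=9, bins=128, bytes_per_bin=2,
                            i2c_hz=400_000, wire_bits_per_byte=9):
    """Crude upper bound on full-histogram frames/s over I2C.

    wire_bits_per_byte = 9 accounts for the ACK bit per byte;
    addressing and register-read protocol overhead are ignored.
    """
    payload_bits = zones * bins * bytes_per_bin * wire_bits_per_byte
    return i2c_hz / payload_bits
```

With these assumed numbers the raw payload alone caps the bus near 19 frames/s, so a reported ~3 Hz is plausible once register-read and protocol overhead are added on top.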

    • @DaveEtchells
      @DaveEtchells 1 year ago +1

      I wonder if anyone makes a cheap ToF sensor with an SPI interface?
      (Just checked: the first result was some Broadcom modules that are ~$70 in onesies, but they're intended for greater distances (up to 10m, or 100m with dual frequencies). Didn't go further; the ~$20 modules are perfect for this use case.)

    • @uwgraphics
      @uwgraphics  1 year ago +2

      @@DaveEtchells The new ST VL53L8CH has an SPI interface. Unfortunately the example code is all in I2C so it's taking us some time to get it working over SPI. If anyone gets it working, we'd love to get in touch!

    • @DaveEtchells
      @DaveEtchells 1 year ago +1

      @@uwgraphics Ah, cool! That should give you a huge boost in frame rate. Fantastic work, it’s astonishing how much better the precision is than with the manufacturer’s own code. I can see this having lots of applications: It’s a very cheap and performant solution for a common need.

  • @chemicburn
    @chemicburn 1 year ago

    Incredible work thanks for sharing!

  • @ethaneveraldo
    @ethaneveraldo 1 year ago +1

    I’ve been longing for an affordable, compact 3D scanner/solid-state lidar for years. So many things you could do with something like this. I use a lot of single-point ToF sensors in my projects; I didn't know there were 9-point measurement ones. Maybe by having a small array of them, each at a small angle, you could create a working 3D sensor?
    Video is incomplete. Need to see more

    • @uwgraphics
      @uwgraphics  1 year ago +4

      Combining data from multiple distributed sensors is an interesting direction for future work.
      There are higher resolution versions of these sensors available! Check out the ST VL53L8, which reports 8x8 zones. The VL53L8CH even reports histograms, but they're lower temporal resolution than the TMF8820 that we use.

  • @pontosinterligados
    @pontosinterligados 1 year ago

    Wow! Kudos guys! 👏

  • @AlaskanInsights
    @AlaskanInsights 1 year ago

    Nice, this is a cool workaround for sure.

  • @matthewsheeran
    @matthewsheeran 1 year ago

    I wonder if these are the new "induction" sensors (formerly PIR) in light fixtures that detect a passing human and turn the light on?

  • @constantinosschinas4503
    @constantinosschinas4503 1 year ago +1

    Will adding more sensors, possibly at different angles, lower the distance error across different materials?

  • @MrFennicus
    @MrFennicus 1 year ago

    Awesome demo 👌🏻

  • @jhoughjr1
    @jhoughjr1 1 year ago

    this is awesome, earned a sub.

  • @syber-space
    @syber-space 1 year ago +1

    A very neat technique! Will the demo code be released publicly so this can be applied to community libraries, or is this method being kept proprietary?

    • @uwgraphics
      @uwgraphics  1 year ago +6

      We would like to release it publicly, it's just a matter of having the time to clean up the code and document it. Reach out to the first author if you're interested in getting your hands on some "research quality" code.

    • @uwgraphics
      @uwgraphics  1 year ago +4

      Code has just been released: github.com/uwgraphics/ProximityPlanarRecovery

    • @syber-space
      @syber-space 1 year ago

      @@uwgraphics awesome!

  • @frosty1433
    @frosty1433 1 year ago

    Stereo cameras are also very interesting.

  • @dmendesf
    @dmendesf 1 year ago

    What's the model of the robotic arm? It's really cool.

  • @royalag007
    @royalag007 1 year ago +1

    Great video 😊 thanks a lot

  • @acatfrompoland5230
    @acatfrompoland5230 1 year ago

    Very inspiring video :3

  • @andybrice2711
    @andybrice2711 1 year ago

    Is this a similar principle to RADAR and ultrasound scanners?

    • @uwgraphics
      @uwgraphics  1 year ago

      Those often also operate on time-of-flight, so yes!

    • @andybrice2711
      @andybrice2711 1 year ago

      @@uwgraphics But also I think they have to try and parse out multiple reflections from complex surfaces and various layers of changing impedance.

  • @FullCircleTravis
    @FullCircleTravis 1 year ago

    Thanks for sharing this.

  • @mumblety
    @mumblety 1 year ago

    Cool!

  • @retinapoliyn7462
    @retinapoliyn7462 1 year ago

    You could easily do the same with the VL53L5X sensor

  • @Joso997
    @Joso997 1 year ago

    what about the Sun?

    • @uwgraphics
      @uwgraphics  1 year ago

      Good question! SNR is lower in sunlight because ambient light crowds out many of the "good" photons. In our testing (separate from this paper) the sensor itself is still able to function up to about half a meter even in direct sunlight. We would expect our plane-finding algorithm to perform more poorly in sunlight because of the low SNR. We haven't tested it thoroughly because it's hard to get a robot arm outside!

  • @MyKidFPV
    @MyKidFPV 1 year ago +1

    Very cool! Can you share your Arduino code?

    • @uwgraphics
      @uwgraphics  1 year ago +2

      yes, send an email to the first author Carter: sifferman@wisc.edu

    • @BeefIngot
      @BeefIngot 1 year ago +4

      @@uwgraphics I'm assuming then that this isn't going to be the sort of thing that gets MIT-licensed in a nice neat library, but is instead licensed back to these companies to improve their sensors?

    • @uwgraphics
      @uwgraphics  1 year ago +2

      @@BeefIngot We have no partnership with a company or plans to license this. We are happy to release it with a permissive license, we just have to find the time to clean it up and document it. If you would like to get your hands on some "research quality" code, please reach out to the first author.

    • @uwgraphics
      @uwgraphics  1 year ago +2

      Our code has been released: github.com/uwgraphics/ProximityPlanarRecovery

  • @ChandrashekarCN
    @ChandrashekarCN 1 year ago

    💖💖💖💖

  • @TouYubeTom
    @TouYubeTom 1 year ago

    the source code is missing

    • @uwgraphics
      @uwgraphics  1 year ago +2

      It has just been released! github.com/uwgraphics/ProximityPlanarRecovery

    • @TouYubeTom
      @TouYubeTom 1 year ago

      @@uwgraphics cool thanks

  • @BitSmythe
    @BitSmythe 1 year ago

    1:55. The blue plane doesn’t match the table. It looks like the bottom-left corner should be further away.

  • @BUrbbable
    @BUrbbable 1 year ago

    This won't be very accurate with translucent materials, I assume?

    • @uwgraphics
      @uwgraphics  1 year ago +2

      We haven't tested it thoroughly but probably not. There may be ways to deal with it, especially if you know the material ahead of time and have a model for its subsurface scattering.

    • @Shinobubu
      @Shinobubu 1 year ago +1

      @@uwgraphics Yeah I was thinking about modeling the material. This might be a good option for us 3D printer users as an alternative for bed leveling similar to Bambu Lab's Lidar bed scanner.

  • @qwertyboguss
    @qwertyboguss 1 year ago

    Cool. Cool cool cool

  • @roninbadger7750
    @roninbadger7750 1 year ago

    When is this going to be in the next Samsung or Pixel, 6 months? The next Apple, 5 years? The next LG, last year?
    "AI-assisted 3D modeling": the next feature on a smartphone.

  • @maxmyzer9172
    @maxmyzer9172 1 year ago

    cool!

  • @1kreature
    @1kreature 1 year ago

    Would just like to point out: Arduino is NOT a microcontroller.
    It is a framework and development environment with certain compatible/compliant development boards containing microcontrollers from different companies such as Atmel and STM.

  • @Sonic_Shroom
    @Sonic_Shroom 1 year ago

    cool

  • @benargee
    @benargee 1 year ago +2

    One small step towards no longer having to pay tips to human waiters.

  • @icebluscorpion
    @icebluscorpion 11 months ago

    Optimize this for bed leveling in 3D printers; other uses are pointless and stupid.

  • @markrix
    @markrix 1 year ago +1

    Transients... Hobos... Travelers.. oh wait no, these are way neater!!

  • @luzookiipcbway
    @luzookiipcbway 1 year ago

    Hello there. We found your video doing PCB-related content very interesting, and were wondering if our PCB(A)-related custom services may help in your future projects? Would love to supply freely and reach any YT collab together! (PCBWay luz)

  • @napent
    @napent 1 year ago

    Nice - would love to play with that data in Python