Is Apple's Lidar Scanning Usable Yet?

  • Published: 10 Jan 2025

Comments • 69

  • @tom9380
    @tom9380 3 years ago +7

    Thanks for an honest, detailed review of the iOS LiDAR tech. Too often "tech reviewers" buy into the PR stuff tech companies publish without second-guessing it or mentioning the downsides. This is an actual balanced and useful review.

  • @jellykid
    @jellykid 3 years ago +3

    For me the LiDAR on the iPad has been amazing for projects I'm working on, for all the little details in the backgrounds of a scene that will be out of focus, since they bring really nice depth! And I don't have to wait that long to see the results, unlike processing in Meshroom.

  • @anastazjapamp
    @anastazjapamp 2 years ago +1

    Thanks for taking the effort to make this video. However, I think you left out an important aspect, which I think is also the reason for your bad scans and reappearing surfaces. LiDAR scanning works with the distances of points from the device and the relationships between those points. This is also mentioned in the instructions Polycam gives: you should scan along a very clear path where you don't aim at the same point many times; I guess that's easier to calculate. In my experience, the more your path (the line that appears while Polycam is calculating) looks like a harmonic curl, not too tangled and without too many big changes in direction, the better the results. By the same logic you should constantly move, because when you hold still the LiDAR collects more info while your hands shake, which also makes it more difficult to calculate. For me that meant that when I tried to scan sculptures in a graveyard I would take some time to plan my route, keep my arms steady, and find myself in some funny positions. But yeah, try it and your scans will get better.

  • @daveindezmenez
    @daveindezmenez 3 years ago +13

    I've wondered if there might be some way to use LiDAR and photogrammetry together, with each filling in the shortcomings of the other.

    • @bemccune7671
      @bemccune7671 2 years ago +1

      Hey, have you heard of EveryPoint? It does just that, actually!

  • @ericenblom6014
    @ericenblom6014 3 years ago +9

    Thanks for the demo. I am going to wait until it can do continuous lidar recording and 3D solid video projection, probably sometime around stardate 72000.

  • @discardedparticles
    @discardedparticles 3 years ago +1

    Photo mode was recently released for Polycam; if you are not capturing a whole scene, it can deliver clean and crisp models now!!

  • @dtvfan24
    @dtvfan24 3 years ago +4

    Thanks for this. I did buy an iPhone 12, but I might consider upgrading to an iPhone 14 Pro once the LiDAR gets better.

  • @brido88
    @brido88 3 years ago +1

    Great video! Another couple of categories to think about when comparing:
    Processing time
    Accuracy of measurements
    Model size

  • @americannomadnewsthecardbo4339
    @americannomadnewsthecardbo4339 2 years ago

    I'd like to see more engineering applications for this LiDAR. An example would be the ability to apply layers to the 3D image from other sensors, including infrared and ultraviolet. Having a false-color infrared or ultraviolet overlay on the wireframe would lend diagnostic capability to mechanics and engineers, who, upon making 3D scans of working components, could simultaneously have a VR representation of the heat signatures. As a person who's worked in engineering in remote locations in Alaska doing a wide variety of tasks, I can tell you that would be a handy-dandy gadget.

  • @richardmccallum2735
    @richardmccallum2735 3 years ago +19

    Oot and aboot

    • @jesusacristo307
      @jesusacristo307 3 years ago

      Lmao, I commented the exact same thing, then scrolled down.

    • @DarkFutureConsolidated
      @DarkFutureConsolidated 3 years ago

      HA! Me too! I was going to ask What’s a people Ootenaboot?

  • @albertocruz5538
    @albertocruz5538 3 years ago

    Thanks. I'm trying to get scans of alien mines in China. Still haven't found a reasonable and cost effective way to do it. I'll try Polycam on my next trip. Some of the underground chambers are quite large, so the 5 m LiDAR range limit is a problem.

  • @PhilEdwardsInc
    @PhilEdwardsInc 2 years ago

    Others have said this, but so nice to have a realistic review of this. Feels vindicating to hear.

  • @jandersen6802
    @jandersen6802 3 years ago +1

    They could use swarms of small drones with cameras and LiDARs to scan the Earth in high detail. And they could crowdsource 3D scans from people with smartphones.
    But just imagine using the map to make a giant World Simulator! Like a mix between GTA and Flight Simulator or something. It could be a subscription game that gets better and more detailed over time.

  • @antoniopepe
    @antoniopepe 3 years ago +2

    What version of the iPad do you have? Is it the latest iPad Pro 3?
    Thanks for the detailed comparison!

  • @haxorcomputerservices
    @haxorcomputerservices 3 years ago +2

    Great video, thank you. Polycam would be awesome if they fix the issues, but it's not very good when the geometry is so broken.

    • @jake-ly
      @jake-ly  3 years ago +1

      They definitely have the nicest design, although like you said, not the best mesh generation.

    • @johnroyal4913
      @johnroyal4913 3 years ago +1

      Polycam is pretty good now

  • @Tactic3d
    @Tactic3d 3 years ago

    Great and informative work. Thanks

  • @SnowyOwlPrepper
    @SnowyOwlPrepper 3 years ago

    I would like to know if there are any good technical videos on the LIDAR tablets or phones.

  • @jotaroandres
    @jotaroandres 3 years ago +3

    Thanks for the comparisons, I just picked up an iPhone 12 Pro. Got any workflow tips on cleaning up the scans?

    • @jake-ly
      @jake-ly  3 years ago +4

      My general workflow in Blender is
      1. Decimate a small amount to fill holes and reduce geometry noise
      2. Remesh with quad remesh to clean it up
      3. UV unwrap the new mesh
      4. Bake the original model texture onto this cleaner mesh
      5. Sculpt out the imperfections

  • @GustavoASMoreira
    @GustavoASMoreira 3 years ago

    Please analyse the iPhone 13 LiDAR enhancements.

  • @davidd9027
    @davidd9027 2 years ago

    I feel like it's a combination of the resolution and the software. Considering there are actually a good number of different apps on the App Store, it really does produce mixed results, because some apps are better at scanning certain types of objects than others; it honestly just depends. As for scanning realistic human models, they would have to stand as still as possible; granted, there would still be some inaccuracies, but there are steps people can take to help mitigate the loss of quality. Overall, great video. The use case I'm trying to apply this to is real estate scanning. Asset scanning has its use cases too.

  • @rhadiem
    @rhadiem 3 years ago

    Thanks for saving me a bunch of money.

  • @americannomadnewsthecardbo4339
    @americannomadnewsthecardbo4339 2 years ago

    The answer to the shiny and the transparent is really easy. It's quite simple: you just use polyvinyl alcohol, or PVA, which is liquid until you spray it on something, and then it's basically a blue plastic cling that's perfectly laminated onto the surface. You'll see it on new products: that blue plastic layer you peel off is polyvinyl alcohol. When you buy a brand new window and have it installed and there's a blue layer on it, that's PVA. So you just take some masking tape, mask off the parts you don't want the PVA on, spray the PVA, and while it's still wet peel the masking tape off real quick. Go ahead and do your scans, and then just as quickly as you peeled off the masking tape you can peel off the PVA, and you're done.

  • @MightyPriest
    @MightyPriest 3 years ago +3

    I don't know what you're doing or how, but my scans aren't as bad as yours.

    • @MightyPriest
      @MightyPriest 3 years ago

      Seriously, how are they that bad? I tried again and again and only the shiny or bright surfaces became glitchy.

  • @Dylan-kw8pz
    @Dylan-kw8pz 3 years ago

    Do you happen to know where to point people who would like to develop apps using the iPhone LiDAR?

  • @metaturnal
    @metaturnal 3 years ago +2

    What good does it do that it's much faster than photogrammetry if the result is totally unusable (except maybe for glitch art)?
    The technology in our phones doesn't seem to be there yet.

    • @MrGTAmodsgerman
      @MrGTAmodsgerman 3 years ago

      Did you watch the video? Background objects, memories, etc...

    • @metaturnal
      @metaturnal 3 years ago

      @@MrGTAmodsgerman Yes, I saw the video and I am NOT impressed; the quality is not even sufficient for background objects yet.
      I also bought it because I was hopeful the LiDAR would help the photogrammetry workflow.
      I'll agree with you on the memories and, as I said, on glitch art, where shitty quality is part of the aesthetic.

    • @metaturnal
      @metaturnal 3 years ago

      @@MrGTAmodsgerman To me it seems like more of a waste of time than photogrammetry. Even though it is very fast, the result is unusable for MOST use cases; at least with photogrammetry you know the invested time actually serves a purpose.

  • @themsquaredgroup
    @themsquaredgroup 3 years ago

    I live in Bellevue. I would love to chat with you more about this. We use Artec scanners but we are exploring more options.

  • @esmokebaby
    @esmokebaby 3 years ago

    I am trying to make a model of a controller but can't get a damn good scan.

  • @thefrub
    @thefrub 3 years ago +1

    I'm not an apple guy, but this is crazy. I can't believe this isn't bigger news

    • @ansonkiek6471
      @ansonkiek6471 3 years ago +1

      Yes, that's why I love to watch Apple keynotes. Many say Apple isn't innovative, but if you actually watch Apple events, they do invent some features, keep them simple, and integrate them well. Most people just like to check the RAM, the 120Hz, the camera, but I don't care. I just like, and look forward to, whatever Apple adds to the devices at each generation or event.

    • @mastergreyskull523
      @mastergreyskull523 3 years ago

      @@ansonkiek6471 not invented lol
      implemented

  • @mcduez4724
    @mcduez4724 3 years ago

    Nice one bro traveller

  • @Richard-hx6mi
    @Richard-hx6mi 1 year ago

    Are you Canadian???
    "We were out and a boot"
    1:28

  • @esmokebaby
    @esmokebaby 3 years ago

    What they should have done was pair the LiDAR with a TrueDepth camera like they have on the front of the iPad Pro M1, cause you can get good face scans using it in skadie, I think that's the app.

  • @oddarneroll
    @oddarneroll 3 years ago

    Polycam video 👀

  • @grrinc
    @grrinc 3 years ago +2

    Very useful video, thanks! I purchased an iPhone 12 Pro Max anticipating the LiDAR would be useful for basic capture, but I've yet to try any apps. It looks as though I shouldn't get my hopes up just yet.

    • @jake-ly
      @jake-ly  3 years ago +3

      It's going to be a bit before it starts working well on the first try. SiteScape has been a decent one since this video.

    • @grrinc
      @grrinc 3 years ago

      @@jake-ly cheers man, I'll check it 👍

    • @kibbycabbit
      @kibbycabbit 3 years ago

      The LiDAR tech is set; it's just that each piece of software processes it differently.
      Every scan you take will get a different result. Don't give up. It works really well!

  • @tylerchesley
    @tylerchesley 3 years ago +1

    Do you know if you can get the raw point clouds out of the APIs?

    • @jake-ly
      @jake-ly  3 years ago +2

      I know there are a few apps that allow you to export the point clouds. Record3D has point cloud videos, 3D Scanner App has point clouds, and I think EveryPoint was another with point cloud export. Real-time point cloud streaming is another thing though.

    • @bryanatwood8401
      @bryanatwood8401 3 years ago +1

      Yes, you can get raw point clouds now. You need to add .sceneDepth to the ARWorldTrackingConfiguration frameSemantics. Documentation here: developer.apple.com/documentation/arkit/arconfiguration/framesemantics/3516902-scenedepth
      Then each frame will include a .sceneDepth which is an ARDepthData object to give you the depth map by pixel: developer.apple.com/documentation/arkit/ardepthdata
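
      In case it's useful to anyone reading this later, here is a minimal Swift sketch of that setup. The class name and the print statement are purely illustrative; the ARKit calls (frameSemantics, sceneDepth, ARDepthData) are the ones from the documentation linked above.

      ```swift
      import ARKit
      import CoreVideo

      // Minimal sketch: enable scene depth and read the per-frame depth map.
      final class DepthCaptureController: NSObject, ARSessionDelegate {
          let session = ARSession()

          func start() {
              // Scene depth needs a LiDAR-equipped device, so check support first.
              guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }

              let configuration = ARWorldTrackingConfiguration()
              configuration.frameSemantics.insert(.sceneDepth)

              session.delegate = self
              session.run(configuration)
          }

          // Each ARFrame now carries an ARDepthData: a depth map (one Float32 distance
          // in metres per pixel) plus an optional per-pixel confidence map.
          func session(_ session: ARSession, didUpdate frame: ARFrame) {
              guard let sceneDepth = frame.sceneDepth else { return }
              let depthMap = sceneDepth.depthMap
              let width = CVPixelBufferGetWidth(depthMap)
              let height = CVPixelBufferGetHeight(depthMap)
              print("depth frame \(width)x\(height), confidence map present: \(sceneDepth.confidenceMap != nil)")
              // Unprojecting each pixel through frame.camera.intrinsics yields a raw point cloud.
          }
      }
      ```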

    • @tylerchesley
      @tylerchesley 3 years ago +1

      @@bryanatwood8401 Thanks! Almost makes me want to get an iPhone.

    • @bryanatwood8401
      @bryanatwood8401 3 years ago

      @@tylerchesley As Jake talks about, the quality is pretty poor for details and for reconstructions where you are rotating around an object. It's really good for flat objects like walls and tables. For scene reconstruction, there is an iOS API that constructs the entire mesh with object classification. It's pretty well made, and presumably what most of these room scanning apps use.
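
      For the scene reconstruction API mentioned here (presumably ARKit's sceneReconstruction option), a similarly minimal Swift sketch; again, the class name and the print statement are illustrative.

      ```swift
      import ARKit

      // Minimal sketch: ask ARKit to rebuild the environment as classified mesh anchors.
      final class MeshScanController: NSObject, ARSessionDelegate {
          let session = ARSession()

          func start() {
              // Mesh reconstruction with per-face classification also requires LiDAR hardware.
              guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) else { return }

              let configuration = ARWorldTrackingConfiguration()
              configuration.sceneReconstruction = .meshWithClassification

              session.delegate = self
              session.run(configuration)
          }

          // The environment arrives as ARMeshAnchors that grow and refine as you move;
          // each one carries vertex/face geometry plus a classification buffer
          // (wall, floor, table, seat, and so on).
          func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
              for case let meshAnchor as ARMeshAnchor in anchors {
                  let geometry = meshAnchor.geometry
                  print("mesh chunk: \(geometry.vertices.count) vertices, classified: \(geometry.classification != nil)")
              }
          }
      }
      ```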

  • @JahKubRu1
    @JahKubRu1 3 years ago +1

    I think the problem is that apps want to show you a mesh instead of point cloud data... the mesh should just be an export option, not the scan result. I work with a Faro scanner and I have the new iPad Pro, so I can tell you people that the iPad scans are not accurate ;( Now I understand why the iPad costs around 1k and a used (not new) Faro costs 20k plus software... I hope developers will target point clouds.

  • @YHK_YT
    @YHK_YT 2 years ago

    8:35 still waiting

  • @vrtotalquest
    @vrtotalquest 3 years ago

    Thanks for the video. Which app do you recommend that actually works?

  • @apollovacademy6066
    @apollovacademy6066 3 years ago +1

    The algorithm is the issue here; how can a camera plus LiDAR produce worse data with more information?

  • @kikupub71
    @kikupub71 3 years ago

    Good report eh?

  • @ValeGoG
    @ValeGoG 3 years ago

    10/10, thanks :)

  • @Jarmahent
    @Jarmahent 2 years ago

    “Oot and aboot”

  • @Frosty2
    @Frosty2 3 years ago

    THICC THUMBNAIL 😩😩

  • @nathanroberson
    @nathanroberson 3 years ago

    You're a lucky man. You have been chosen by a beautiful female.

  • @jesusacristo307
    @jesusacristo307 3 years ago +1

    oot and aboot

  • @sanctuaryplace
    @sanctuaryplace 2 years ago

    Freemason sign? Why?

  • @Kokose
    @Kokose 3 years ago

    It's a bit odd that humans can't be scanned; to a computer it's all just geometry and shapes after all, no different than a car or a building.

    • @joelface
      @joelface 3 years ago +4

      It works, but even the smallest mistakes by the program are very easily noticed, since we have high expectations for what is human and what is "disturbing." If the detail is slightly skewed on a car we barely notice, but when it pulls her face slightly to the side we think "wow, that didn't work at all." So, it DOES work, just not well enough that we'd enjoy the results with humans.

  • @flashflexpro
    @flashflexpro 3 years ago

    Yeah, still no point for the iPhone really.

  • @The_Unobtainium
    @The_Unobtainium 3 years ago

    So to recap: all this 3D scanning by iPhone is shitty quality and inconsistent. It's just a toy, not a usable tool.

  • @tann982
    @tann982 3 years ago

    Apple's LiDAR system is poo poo