Thanks for an honest, detailed review of the iOS LiDAR tech. Too often, "tech reviewers" buy into the PR material tech companies publish without second-guessing it, and they downplay the negative sides. This is an actually balanced and useful review.
For me, the LiDAR on the iPad has been amazing for the projects I'm working on, especially for all the little details in the backgrounds of a scene that will be out of focus, since they bring really nice depth! And I don't have to wait nearly as long to see the results as when processing in Meshroom.
Thanks for taking the effort to make this video. However, I think you left out an important aspect, which I think is also the reason for your bad scans and reappearing surfaces. LiDAR scanning works with the distances of points from the device and the relationships between those points. Polycam's instructions mention this too: you should scan along a very clear path where you don't aim at the same point many times. I guess that makes it easier to calculate. In my experience, the more your path (the line that appears while Polycam is calculating) looks like a smooth curl, not too tangled, without too many big changes in direction, the better the results. By the same logic, you should move constantly, because when you hold still, the LiDAR collects more data while your hands shake, which also makes it more difficult to calculate. For me, that meant that when I tried to scan sculptures in a graveyard, I would take some time to think out my route, keep my arms stable, and find myself in some funny positions. But try it and your scans will get better.
I've wondered if there might be some way to use LiDAR and photogrammetry together, each filling in the shortcomings of the other.
Hey, have you heard of EveryPoint? It does just that, actually!
Thanks for the demo. I am going to wait until it can do continuous lidar recording and 3D solid video projection, probably sometime around stardate 72000.
Photo mode was recently released for Polycam; if you are not capturing a whole scene, it can deliver clean and crisp models now!
Thanks for this. I did buy an iPhone 12, but I might consider upgrading to an iPhone 14 Pro once the LiDAR gets better.
Great video! Another couple of categories to think about when comparing:
Processing time
Accuracy of measurements
Model size
I'd like to see more engineering applications for this LiDAR. One example would be the ability to apply layers to the 3D image from other sensors, including infrared and ultraviolet. A false-color infrared or ultraviolet overlay on the wireframe would give diagnostic capability to mechanics and engineers, who could make 3D scans of working components and simultaneously have a VR representation of their heat signatures. As a person who's worked in engineering in remote locations in Alaska doing a wide variety of tasks, I can tell you that would be a handy-dandy gadget.
Oot and aboot
Lmao i commented the exact same thing, then scrolled down.
HA! Me too! I was going to ask What’s a people Ootenaboot?
Thanks. I’m trying to get scans of alien mines in China. I still haven’t found a reasonable and cost-effective way to do it. I’ll try Polycam on my next trip. Some of the underground chambers are quite large, so the 5 m LiDAR range limit is a problem.
Others have said this, but so nice to have a realistic review of this. Feels vindicating to hear.
They could use swarms of small drones with cameras and LiDAR to scan the Earth in high detail, and they could crowdsource 3D scans from people with smartphones.
But just imagine using that map to make a giant world simulator! Like a mix between GTA and Flight Simulator or something. It could be a subscription game that gets better and more detailed over time.
Which iPad version do you have? Is it the latest iPad Pro (3rd generation)?
Thanks for the detailed comparison!
Great video, thank you. Polycam would be awesome if they fixed the issues, but it's not very good while the geometry is so broken.
They definitely have the nicest design, although, like you said, not the best mesh generation.
Polycam is pretty good now
Great and informative work. Thanks
I would like to know if there are any good technical videos on LiDAR tablets or phones.
Thanks for the comparisons, I just picked up an iPhone 12 Pro. Got any workflow tips for cleaning up the scans?
My general workflow in Blender is
1. Decimate a small amount to fill holes and reduce geometry noise
2. Remesh with quad remesh to clean it up
3. UV unwrap the new mesh
4. Bake the original model texture onto this cleaner mesh
5. Sculpt out the imperfections
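For anyone who wants to script part of that workflow, here is a rough sketch of steps 1–3 using Blender's bpy API, run from Blender's scripting tab with the scan mesh active. This is only a sketch under assumptions: the built-in voxel remesh stands in for the paid Quad Remesher add-on the comment refers to, and the ratio/voxel values are arbitrary guesses to tune per scan; baking and sculpting (steps 4–5) are easier done by hand.

```python
import bpy

obj = bpy.context.active_object  # the imported scan mesh

# 1. Decimate a small amount to reduce geometry noise
dec = obj.modifiers.new(name="Decimate", type='DECIMATE')
dec.ratio = 0.8  # arbitrary; keeps most of the detail
bpy.ops.object.modifier_apply(modifier=dec.name)

# 2. Remesh (built-in voxel remesh substituted for Quad Remesher)
obj.data.remesh_voxel_size = 0.01  # in meters; tune per scan
bpy.ops.object.voxel_remesh()

# 3. UV unwrap the new mesh
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.smart_project()
bpy.ops.object.mode_set(mode='OBJECT')
```

After this, you would bake the original texture onto the new UV layout with a selected-to-active bake and sculpt out the remaining imperfections, as in the last two steps above.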
Please analyse the iPhone 13 LiDAR enhancements.
I feel like it’s a combination of the resolution and the software. Considering there are actually a good number of different apps on the App Store, it really does produce mixed results, because some apps are better at scanning certain types of objects than others; it honestly just depends. As for scanning realistic human models, they would have to stand as still as possible. Granted, there would still be some inaccuracies, but there are steps people can take to help mitigate the loss of quality. Overall, great video. As for use cases, I’m trying to use this for real estate scanning. Asset scanning has its use cases too.
Thanks for saving me a bunch of money.
The answer for shiny and transparent surfaces is really easy: you use polyvinyl alcohol, or PVA, which is a liquid until you spray it onto something, after which it forms a plastic film that is perfectly laminated to the surface. You'll see it on new products as that blue plastic layer you peel off; when you buy a brand-new window and have it installed and there's a blue layer on it, that's PVA. So you take some masking tape and mask off the parts you don't want the PVA on, spray the PVA, and while it's still wet, quickly peel the masking tape off. Do your scans, and then, just as quickly as you peeled off the masking tape, you can peel off the PVA, and you're done.
I don't know what you're doing or how, but my scans aren't as bad as yours.
Seriously, how are they that bad? I tried again and again, and only the shiny or bright surfaces became glitchy.
Do you happen to know where to point people who would like to develop apps using the iPhone LiDAR?
What good does being much faster than photogrammetry do if the result is totally unusable (except maybe for glitch art)?
The technology in our phones doesn't seem to be there yet.
Did you see the video? Background objects, memories, etc.
@@MrGTAmodsgerman Yes, I saw the video, and I am NOT impressed; the quality is not even sufficient for background objects yet.
I also bought it because I was hopeful the LiDAR would help the photogrammetry workflow.
I'll agree with you on the memories, and, as I said, on glitch art, where shitty quality is part of the aesthetic.
@@MrGTAmodsgerman To me it seems like more of a waste of time than photogrammetry. Even though it is very fast, the result is unusable for MOST use cases; at least with photogrammetry you know the invested time actually serves a purpose.
I live in Bellevue. I would love to chat with you more about this. We use Artec scanners but we are exploring more options.
I am trying to make a model of a controller but can't get a damn good scan.
I'm not an apple guy, but this is crazy. I can't believe this isn't bigger news
Yes, that's why I love to watch Apple keynotes. Many say Apple isn't innovative, but if you actually watch Apple events, they take features, turn them into something simple, and integrate them well. Most people like to check the RAM, the 120 Hz display, the camera, but I don't care. I just like seeing and anticipating what Apple adds inside its devices with each generation or event.
@@ansonkiek6471 not invented lol
implemented
Nice one bro traveller
Are you Canadian???
"We were out and a boot"
1:28
What they should have done was pair the LiDAR with a TrueDepth camera like the one on the front of the iPad Pro M1, because you can get good face scans using it in Skadie, I think that's the app.
Polycam video 👀
Very useful video, thanks! I purchased an iPhone 12 Pro Max anticipating the LiDAR would be useful for basic capture, but I haven't tried any apps just yet. It looks as though I shouldn't get my hopes up.
It's going to be a while before it works well on the first try. SiteScape has been a decent one since this video.
@@jake-ly cheers man, I'll check it 👍
The LiDAR tech is set; it's just that different software processes the data differently.
Every scan you take will give a different result. Don't give up. It works really well!
Do you know if you can get the raw point clouds out of the APIs?
I know there are a few apps that allow you to export point clouds. Record3D has point-cloud videos, 3D Scanner App has point clouds, and I think EveryPoint was another with point-cloud export. Real-time point-cloud streaming is another thing, though.
Yes, you can get raw point clouds now. You need to add .sceneDepth to the ARWorldTrackingConfiguration's frameSemantics. Documentation here: developer.apple.com/documentation/arkit/arconfiguration/framesemantics/3516902-scenedepth
Then each frame will include a .sceneDepth, which is an ARDepthData object that gives you the per-pixel depth map: developer.apple.com/documentation/arkit/ardepthdata
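To make that per-pixel depth map concrete, here is a minimal sketch (plain Python, not ARKit code) of how depth samples can be unprojected into a camera-space point cloud with a pinhole camera model. The intrinsics fx, fy, cx, cy and the sample values are hypothetical placeholders; a real app would read the intrinsics from the frame's camera.

```python
# Minimal sketch: unproject a per-pixel depth map into a 3D point cloud
# using a pinhole camera model. Intrinsics here are illustrative only.

def depth_to_points(depth, fx, fy, cx, cy):
    """depth: 2D list of distances in meters, indexed [row][col].
    Returns a list of (x, y, z) points in camera space."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:  # skip missing or invalid depth samples
                continue
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points

# Tiny example: a 2x2 depth map with one invalid sample.
cloud = depth_to_points([[1.0, 2.0], [0.0, 4.0]],
                        fx=100.0, fy=100.0, cx=0.0, cy=0.0)
```

Each valid depth sample becomes one camera-space point; scanning apps then fuse many such per-frame clouds together using the tracked camera pose.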
@@bryanatwood8401 thanks! Almost makes me want to get an iphone.
@@tylerchesley As Jake talks about, the quality is pretty poor for details and for reconstructions where you are rotating around an object. It's really good for flat objects like walls and tables. For scene reconstruction, there is an iOS API that constructs the entire mesh with object classification. It's pretty well made, and presumably what most of these room scanning apps use.
I think the problem is that apps want to show you a mesh instead of the point cloud data. A mesh should just be an export option, not the scan result. I work with a Faro scanner and I have the new iPad Pro, so I can tell you the iPad scans are not accurate :( Now I understand why the iPad costs around $1k while a used (not new) Faro costs $20k plus software. I hope developers will start targeting point clouds.
8:35 still waiting
Thanks for the video. Which app do you recommend that actually works?
The algorithm is the issue here; how can a camera plus a LiDAR produce worse data with more information?
Good report eh?
10/10, thank you :)
“Oot and aboot”
THICC THUMBNAIL 😩😩
You're a lucky man. You have been chosen by a beautiful female.
oot and aboot
free mason sign? why
It's a bit odd that humans can't be scanned; to a computer, it's all just geometry and shapes, after all, no different from a car or a building.
It works, but even the smallest mistakes by the program are very easily noticed, since we have high expectations about what looks human and what is "disturbing." If a detail is slightly skewed on a car, we barely notice, but when it pulls her face slightly to one side, we think, "wow, that didn't work at all." So it DOES work, just not well enough that we'd enjoy the results with humans.
Yeah, still no real point to the iPhone for this.
So to recap: all this 3D scanning by iPhone is shitty quality and inconsistent. It's just a toy, not a usable tool.
Apple's LiDAR system is poo poo.