This is beautiful! People sprint away from photogrammetry rendered as point clouds rather than meshes when they don't have the resolution, but you've found a way to make them 1000% more realistic and human. It's dreamlike and familiar in the best way.
insane bro
I’m just as fascinated. What took my breath away was when I was scanning with LiDAR and the LiDAR pierced through the mirror, adding another layer of this side of reality inside the mirror. It was eye-opening, and I instantly fell in love. I have bought a new phone and a new drone, and soon a new camera, for this art. There is something special about it, and I watch movies like Déjà Vu (2006) and Minority Report that peek into this technology.
Hell yeah man, it's really cool stuff
What is the audio behind this? It's really nice and effective :)
I made it on a Korg Minilogue. Cheers, it's one of my favourite songs I've made.
WOW, that's amazing! How did you do this? Did you just feed a video into Meshroom and export it as a point cloud? Is there any chance you could do some kind of tutorial on how to do this?
I will make a tutorial this month
Great job - what are you using for the "camera" tracking - a VR headset? It has a fantastic hand-held feel.
It's actually just motion extracted from an actual video, then Blender is used to camera track. I plan to show the actual video in future work, as the point cloud and the video show the same thing.
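A minimal sketch of one way that step could look (not necessarily the exact workflow described above), assuming the clip has already been tracked and solved in Blender's Movie Clip Editor: the reconstructed 3D bundles are read through Blender's Python console and written out as an ASCII .ply point cloud. The clip name "shot.mp4" and the output path are placeholders.

```python
# Sketch: export the solved camera-track bundles from Blender as a .ply point cloud.
# Assumes a clip has already been tracked and solved in the Movie Clip Editor.
import bpy

clip = bpy.data.movieclips["shot.mp4"]   # placeholder name for the tracked footage
tracks = clip.tracking.tracks            # 2D tracks; solved ones carry a 3D bundle

# Keep only tracks that received a reconstructed 3D position from the camera solve.
points = [tuple(t.bundle) for t in tracks if t.has_bundle]

with open("/tmp/track_points.ply", "w") as ply:
    ply.write("ply\nformat ascii 1.0\n")
    ply.write(f"element vertex {len(points)}\n")
    ply.write("property float x\nproperty float y\nproperty float z\n")
    ply.write("end_header\n")
    for x, y, z in points:
        ply.write(f"{x} {y} {z}\n")
```

The resulting .ply can then be imported back into Blender (or any point cloud renderer) and played back against the solved camera.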
Top! What are you running all this in, TouchDesigner? For live tracking on a headset? Great work!