Lifecast
  • 46 videos
  • 72,976 views

Videos

lifecast.ai #12 - Synchronizing GoPros for volumetric capture
84 views · 4 months ago
Cameras need to be farther apart for NeRF capture, so it's not practical to connect them with wires. Instead we will use the new QR sync feature of the GoPro 12, plus some tricks in software (like a 4D radiance field).
Bear Sculpture - VR180 - Captured with iPhone - Rendered with Volurama
342 views · 9 months ago
Macro 3D Mushrooms VR180 - Part 1
401 views · 9 months ago
Volurama Stereoscopic 3D Example
188 views · 9 months ago
Volurama Sizzle Reel 2023 - NeRF flythroughs, Looking Glass holograms
287 views · 9 months ago
Grand Canyon VR180 - Created with Volurama
1.2K views · 10 months ago
How much did we spend on this impossibly smooth 3D VR dolly shot?
1.1K views · 1 year ago
Volumetric Video on Holographic Display - Looking Glass + Lifecast
1.2K views · 1 year ago
Volumetric Video Editor for VR and Virtual Production
3.9K views · 1 year ago
Volumetric video in Unreal Engine - Tutorial by Lifecast
2.4K views · 1 year ago
Virtual production with volumetric video and text-to-3D in Unreal Engine - Tutorial by Lifecast
22K views · 1 year ago
A short written by ChatGPT and animated with Stable Diffusion (3D with holovolo.tv)
2.4K views · 1 year ago
Stable Diffusion to 3D Unity scene in 2 minutes
16K views · 1 year ago
Tutorial: Lifecast volumetric video player in Unreal Engine 4.27
2.4K views · 2 years ago
Encode h264 video from a directory of images using ffmpeg on Windows - Lifecast volumetric video
755 views · 2 years ago
Lifecast Volumetric Video Player for Unity - Tutorial
2.8K views · 2 years ago
Lifecast Volumetric Video from VR180 - Version 1.3 vs 1.2 (Deep Learning for Inpainting)
1K views · 2 years ago
Unreal Engine: Converting VR180 photos to 3D triangle mesh
2.6K views · 2 years ago

Comments

  • @ZeroBudgetDevelopments · 3 months ago

    thank you very very much L.O.V.E to the ceo and any team members

  • @Aero3D · 4 months ago

    How's this coming along? Can we stream a person in 3D like a hologram?

  • @thereisnomouse · 4 months ago

    Wouldn’t RecSync (“sub-millisecond synchronization accuracy for multiple Android smartphones”) be more accurate and convenient?

    • @lifecast7823 · 4 months ago

      That sounds cool. It might work as a way of using Android phones as input to our open video nerf engine

    • @thereisnomouse · 4 months ago

      @@lifecast7823 Did you see the water balloon burst in the 2019 paper, “Wireless Software Synchronization of Multiple Distributed Cameras”? This is insane 🤯

    • @lifecast7823 · 4 months ago

      @@thereisnomouse Cool!

  • @SmilingRob · 5 months ago

    How about a video clapper, flash a light in every camera to sync them

    • @lifecast7823 · 4 months ago

      QR code alone is only good to about 1/10 sec in my tests. We'll have to use a few other techniques as well
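
A clapper or flash, as suggested above, can be refined in software by cross-correlating the cameras' audio tracks and shifting each clip by the estimated lag. A minimal NumPy sketch of that idea (illustrative only, not Lifecast's actual sync pipeline):

```python
import numpy as np

def estimate_delay(a, b, rate):
    """Estimate the seconds by which signal b is delayed relative to signal a."""
    corr = np.correlate(a - a.mean(), b - b.mean(), mode="full")
    lag = np.argmax(corr) - (len(b) - 1)  # k maximizing sum(a[n+k] * b[n])
    return -lag / rate                     # negative k means b's event occurs later

# Synthetic example: the same clap, recorded 480 samples (10 ms at 48 kHz) later
# by the second camera.
rate = 48_000
a = np.zeros(rate)
a[1000:1010] = 1.0      # impulsive event (clap / flash) in camera A's track
b = np.roll(a, 480)     # camera B heard the same event 10 ms later
print(estimate_delay(a, b, rate))  # prints 0.01
```

In practice one would band-pass the audio around the clap and interpolate the correlation peak to get below one sample period of error, which is already far tighter than 1/10 s.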

  • @brettcameratraveler · 5 months ago

    How far apart do you feel you need to go? How many cameras? A mechanically geared and robust version of a Hoberman sphere sounds like it could be versatile/scene dependent.

    • @lifecast7823 · 4 months ago

      a couple of meters I think

  • @Yourmomgoestocolledge · 5 months ago

    Lildickgang

  • @sdarkao9297 · 1 year ago

    👏✨👍

  • @violentpixelation5486 · 1 year ago

    Love it. Thanks a lot for the video. Looking into this right now. 🤙

  • @kendrickgreen9778 · 1 year ago

    This is dope

  • @alfski · 1 year ago

    cool. In a video you probably want to show more of how the view changes as the camera pose moves relative to the display, not the mouse-driven interaction. And probably don't need the wind noise in the demo almost breaking my eardrums! lol

  • @jeromelozano · 1 year ago

    Love this! ordered it, thanks!

  • @KuyaQuatro · 1 year ago

    so sick! yall doing the good work, keep it up!

  • @ApexArtistX · 1 year ago

    interesting

  • @KuyaQuatro · 1 year ago

    this looks amazing! can't imagine the render times though hah

  • @RahulGupta1981 · 1 year ago

    What if I have separate color image and depth image in skybox format?

  • @Romeo615Videos · 1 year ago

    Yooooo I'ma try this. I noticed some warping when you moved... also, is this like NeRF where you can't set objects on the sand, just pan around in the scene?

    • @lifecast7823 · 1 year ago

      NeRF requires many (30+) images of an unmoving scene from all different angles to create a model that can later be viewed from any angle. Baseline NeRF doesn't handle video or scenes where anything moves. NeRF is also not fast enough to render on mobile devices currently, and not easy to edit with existing video tools. In contrast, Lifecast's software needs only one or two images, can do video, can be edited, and can render on mobile devices, but this comes with a tradeoff that the 3D reconstruction only looks perfect when the virtual camera is exactly on top of the original camera, and the farther away you move, the more artifacts appear. However, we are always working to improve our reconstruction quality, which is powered by machine learning and 3D geometric computer vision.

  • @Aero3D · 1 year ago

    How do we create a life cast video ourselves?

    • @lifecast7823 · 1 year ago

      Please contact info@lifecastvr.com

    • @Instant_Nerf · 1 year ago

      You can use an iPhone that has LiDAR. There’s a plug-in for blender and you can play the scene in 3D while you move around it. It’s not great .. but it’s good. I think you would need a lot of gear to be able to have a full 360.. or maybe something down to 3 or 4 cameras with LiDAR.

  • @labyjaerith · 1 year ago

    Very interesting! Is there any intention to incorporate these assets into Unity projects aimed at VR headsets, like Oculus Quest? Any plans to create tutorials? Cool stuff!

  • @asciikat2571 · 1 year ago

    sweet!

  • @mendezcakson · 1 year ago

    Woow

  • @lifecast7823 · 1 year ago

    UPDATE for UE5: ruclips.net/video/ekEXxo1neVo/видео.html

  • @jolynefanacount4326 · 1 year ago

    How come when I click on export to Unity for my Holovolo, it gives me around 16 circles in the JPEG instead of the simple 4 circles you have for the image and the depths maps? When I put mine as the texture it just still shows the other circles in the scene. I can't make this method work at all...

    • @lifecast7823 · 1 year ago

      Sorry for the confusion, we updated the software to improve visual quality. Please download an updated Unity example project here: drive.google.com/file/d/197Ea3MHUKMsS4BUy86iVwGukmClV0V_9/view?usp=sharing

    • @inthesameboat · 1 year ago

      @@lifecast7823 what about the web player? same issue

  • @apexvr7007 · 1 year ago

    You're awesome bro!

  • @SoundGuy · 1 year ago

    Are there any plans for this shader to also support URP? There are similar shaders, but nothing that does it like this.

  • @sidremus · 1 year ago

    Can't do much with that scene though. No collision data, no raycasting. Can't even edit the environment as it's just a volumetric image.

    • @DimiArt · 1 year ago

      it's just for the background, nothing is supposed to be interacting with it.

    • @sidremus · 1 year ago

      @@DimiArt consider changing the video title then: Stable Diffusion to 3D Unity background art in 2 minutes

  • @impheris · 1 year ago

    not bad

  • @dpredie · 1 year ago

    Which depth estimation model did you use? MiDaS? Curious how you got absolute depth values, because most depth estimation models don't return absolute values and are temporally inconsistent too.

    • @fbriggs000 · 1 year ago

      We don’t use MiDaS for VR180 input. It’s a stereo depth estimation method using deep learning. We use the distance between the two cameras to produce results in units of meters, accurate to within the limits of calibration and disparity error.
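
For reference, the standard pinhole-stereo relation behind metric depth from a calibrated pair is depth = focal_length × baseline / disparity. A toy sketch (the numbers are illustrative, not Lifecast's calibration):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Metric depth via the pinhole stereo relation z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (point in front of both cameras)")
    return focal_px * baseline_m / disparity_px

# A 63 mm baseline (roughly human IPD, common for VR180 rigs) and a
# 1000 px focal length: a 21 px disparity puts the point 3 m away.
print(depth_from_disparity(disparity_px=21.0, focal_px=1000.0, baseline_m=0.063))  # prints 3.0
```

Because depth varies as 1/disparity, a fixed disparity error of, say, ±0.5 px costs more meters of error at greater depth, which is exactly the calibration/disparity limit mentioned in the reply.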

  • @xecense · 1 year ago

    This is super cool! I was wondering If you knew if there is a way to convert an image and depth map into the file needed?

    • @lifecast7823 · 1 year ago

      We are working on something like this. Stay tuned!

    • @xecense · 1 year ago

      @@lifecast7823 Nice looks really cool so far, i have been really wanting to dive into this and understand how I can tweak it and utilize this even more. Any suggestions of where I should start?

    • @lifecast7823 · 1 year ago

      @@xecense lifecastvr.com/tutorial.html

    • @xecense · 1 year ago

      @@lifecast7823 Yoooo thanks brother !

    • @TeamPapaFils · 1 year ago

      @@lifecast7823 Thanks very much bro! But which parameters do you use to generate the initial image? Which dimensions, etc.?

  • @undrash · 1 year ago

    Damn that's insane! Thanks for sharing

  • @lifecast7823 · 1 year ago

    Created with holovolo.tv (AI-generated 3D art for web and VR)