How to Convert ANY 2D Video to 3D Spatial Video on Vision Pro with AI ft. OpenAI Sora

  • Published: 28 Sep 2024

Comments • 122

  • @hughhou
    @hughhou  7 months ago +4

    So Owl3D switched to a subscription now. But I still have good news for you. Here is a 10% off code you can use: " HUGH10 " on any plan. This is NOT an affiliate code or link (I don't get anything - 100% giving back to our viewers). I feel really bad about the misinformation, so I hope this makes it up to you. As always, I am doing my best to serve the community. A thumbs up and share are greatly appreciated!

  • @charleslorne
    @charleslorne 7 months ago +2

    Dude you are THE most knowledgeable and resourceful person on this topic on the entire web period!!

    • @hughhou
      @hughhou  7 months ago

      Awww thank you I am so glad you think so :)

  • @CarlitoTries
    @CarlitoTries 7 months ago +6

    There is no one-time purchase for that software.

    • @hughhou
      @hughhou  7 months ago +1

      Oh good to know. Let me follow up with the developer to see what is going on.

  • @toralfczerniak8250
    @toralfczerniak8250 7 months ago +3

    I can't find a Lifetime License there. When upgrading, only monthly subscriptions can be found

    • @toralfczerniak8250
      @toralfczerniak8250 7 months ago +2

      I mean Owl3D

    • @toralfczerniak8250
      @toralfczerniak8250 7 months ago +3

      I would have bought the program straight away for a fixed price. But I won't take out subscriptions for ANY program.

    • @christofthuis1834
      @christofthuis1834 7 months ago +1

      My thoughts exactly. We need a competitor with a fixed price

    • @hughhou
      @hughhou  7 months ago +1

      Hey, so when I bought the software, it was a fixed price. But I think they changed it recently (I purchased it a while back). Let me ask what other options are out there and report back. Sorry for the confusion.

    • @toralfczerniak8250
      @toralfczerniak8250 7 months ago

      I heard that Looking Glass Factory is behind Owl3D. I backed the Looking Glass Portrait and also the Looking Glass Go. Maybe it is possible for Looking Glass supporters to get special conditions.

  • @anivedk
    @anivedk 7 months ago +2

    Hi Hugh! How do I make the video passthrough? Do I need a green screen? Not clear

    • @hughhou
      @hughhou  7 months ago

      Passthrough needs a green screen.

  • @md.fahimalam7481
    @md.fahimalam7481 7 months ago +2

    Will it be possible with any 2D video, or only AI-generated videos?

    • @jakemathew
      @jakemathew 7 months ago +1

      Owl3D converts any 2D video to stereo-3D. Not sure why OpenAI Sora is in the title of this video (besides to get clicks)

  • @ksnax
    @ksnax 7 months ago +1

    Almost didn't watch this one as it seemed Apple exclusive. Glad I did though. Owl3D was a tool I did not know about! I just get to skip the spatial video conversion part to work with my Quest 3.

    • @hughhou
      @hughhou  7 months ago

      Hope you enjoy it! Yes, it works for any 2D to 3D conversion, but the AVP made me do it all over again since the 3D looks so great there. If you only have a Quest, Quest spatial video also benefits greatly from this approach.

  • @DillonThomasDigital
    @DillonThomasDigital 7 months ago +1

    Great video! Glad to watch it as it premiered! Question about the export to looking glass hologram - this obviously spits out a more than stereoscopic image - how would one export something for viewing in quest3? I've been anticipating the democratization of volumetric video, and it appears we're encroaching on that now. Thanks again for your work!

    • @hughhou
      @hughhou  7 months ago

      So Looking Glass is basically an RGB video with depth data (depth estimation). It has focus elements to make depth look better on a Looking Glass or any hologram display. Meta or any VR headset requires a higher-resolution depth map to do volume / volumetric video. There are several papers in active development on this topic and none of them is good enough for practical use yet. Kinda like the early days of NeRF. But things will change any day, as it is just science lol

  • @FlexScreen
    @FlexScreen 6 months ago +1

    Hey Hugh, love the tutorials. Where can we find the lifetime license for Owl3d?

    • @hughhou
      @hughhou  6 months ago

      So apparently I was wrong. The lifetime license (the one I got) is a grandfathered account, since the stable release now only has subscriptions. But I got a discount code for you guys in the pinned comment. It might end up being cheaper.

  • @DSK123
    @DSK123 7 months ago +1

    I guess I'll have to wait and see when the video premieres, but I'm left wondering if this is true spatial (in a previous video you mentioned parallax) or "just" 3D. I've been playing around with 2D to 3D conversion for some time now and it's awesome, but I'm curious about spatial.

    • @Instant_Nerf
      @Instant_Nerf 7 months ago +1

      To me, spatial means layers of objects with space between them, space you can go between. What I see on Vision Pro is depth, not spatial. I really believed they were using the LiDAR to create these spatial videos in the beginning 🤦 There is a hologram app on the Vision Pro that does it, and it's cool! But it's a meeting app. It takes the depth camera on the iPhone, creates a 3D persona, and imports that into the app as a hologram. You can actually walk around the person.. pretty wild

    • @joelface
      @joelface 7 months ago

      @@Instant_Nerf That's what I was hoping as well and was feeling pretty disappointed about it. However, the interesting thing about Apple's spatial videos is that they're planted in space and when you move you can see the scene from different angles (I believe). Almost as though you're looking through a window at the scene. With a regular 3D movie it stays in the same perspective no matter how you move.

    • @stephenbard1590
      @stephenbard1590 7 months ago +1

      The extreme parallax between the foreground/background of so-called "spatial videos" in the Apple ads is blatant false advertising. Instead of what is depicted in the ads, the "spatial video" actually produced by the AVP is normal 3D video with a blurry frame-artifact surrounding it, and the only parallax is a false one between the dreamy portal frame and the image. Somehow I had the impression that this annoying blurry format of spatial videos is intended to mitigate the 3D depth so that Apple's hyper-sensitive customers don't suffer motion sickness and throw up in their ridiculously overpriced headset. The AVP and the iPhone 15 Pro attempt to produce the same dreamy-framed format, but the iPhone has less depth because the lenses are so close together. Can you turn off the blurry frame . . . or maybe customize it to be flowers or some such? The only benefit of Apple coining the term "spatial video" is to popularize the idea of saving and methods for viewing the same standard 3D photos/videos that have existed for many years. Even though I think that the blurry frame detracts from the normal crisp 3D video, I am impressed that many users and even reviewers seem to find that the blur instills some sort of mystical "memory" quality to what is a standard file type that they may not routinely work with. Serious Apple cult manifestations!

    • @Instant_Nerf
      @Instant_Nerf 7 months ago

      @@joelface You can look into it like a window, but the angle doesn't change.. I know, I have one. In the other mode you can clearly see the whole image move: when you expand the spatial video to full screen, the whole 3D image moves.. causes a little nausea for me. It's like stereoscopic with added effects. What I was thinking is, the same way they scan and create a persona of you, it would be wild to scan an environment and 3D reconstruct the environment we can walk around in.

    • @joelface
      @joelface 7 months ago

      @@Instant_Nerf thanks for clarifying. I don’t own one. I am hoping that your idea is what we see become available soon. It’s absolutely beyond frustrating that they didn’t use LiDAR at all for spatial video. Seems like such an obvious choice.

  • @leandrogoethals6599
    @leandrogoethals6599 6 months ago +1

    Yo guys, do you know any open-source scripts where you can use a depth model to make SBS?

    • @hughhou
      @hughhou  6 months ago

      Yes here: blog.mikeswanson.com/spatial
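
      For reference, here is a minimal sketch of the depth-model-to-SBS idea (this is not the linked tool's exact pipeline; it assumes PyTorch, OpenCV, and the MiDaS small model from torch.hub, and the pixel-shift amount and file names are only illustrative):

      import cv2
      import numpy as np
      import torch

      device = "cuda" if torch.cuda.is_available() else "cpu"
      midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small").to(device).eval()
      transform = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform

      frame = cv2.imread("frame.png")              # one frame; loop over a video in practice
      rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)

      with torch.no_grad():
          pred = midas(transform(rgb).to(device))  # inverse depth, low resolution
          depth = torch.nn.functional.interpolate(
              pred.unsqueeze(1), size=rgb.shape[:2],
              mode="bicubic", align_corners=False).squeeze().cpu().numpy()

      # Normalize so near objects get the largest horizontal shift (disparity).
      d = (depth - depth.min()) / (depth.max() - depth.min() + 1e-6)
      max_shift = 12                               # illustrative disparity in pixels
      h, w = d.shape
      xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                           np.arange(h, dtype=np.float32))
      right = cv2.remap(frame, xs + d * max_shift, ys, cv2.INTER_LINEAR)  # synthesize right eye
      sbs = np.hstack([frame, right])              # left | right, full side-by-side
      cv2.imwrite("frame_sbs.png", sbs)

      A real converter also fills the occlusion gaps the shift leaves behind and runs this per frame of the video, which is what the dedicated tools handle for you.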

  • @tshuxeng5267
    @tshuxeng5267 7 months ago +1

    I no longer see the 139 buy option. Is this similar to ani3d? Their price is only a $40 one-time fee.

    • @hughhou
      @hughhou  7 months ago

      They cancelled the one-time fee initial offer :(

  • @toralfczerniak8250
    @toralfczerniak8250 7 months ago +1

    Does anyone know if I can run it on two devices at the same time if I have a license? Converting takes a while. It would be good if I could run it on two devices at the same time and convert two videos at the same time.

    • @hughhou
      @hughhou  7 months ago

      Yes, you can, I believe - but I would invest in a powerful GPU instead of investing in 2 computers. A GPU with high VRAM makes all the difference.

  • @anivedk
    @anivedk 7 months ago +1

    Hi Hugh! Can we do this on a Mac M3? Which Apple model do you suggest for making content for the Meta Quest 3?

    • @hughhou
      @hughhou  7 months ago

      I use an M3 Max, but any Apple computer should work. A PC is actually still better for AI.

  • @friedguyfriedguy
    @friedguyfriedguy 7 months ago +1

    How can I use a PC to transform my Owl3D-created VR180 video to spatial video? I only have a PC, no Mac.

    • @benhuson1
      @benhuson1 7 months ago +1

      I simply transferred the video to the headset and it plays in 3D SBS on the headset (EDIT - this was on the Quest 3, just to confirm)

    • @hughhou
      @hughhou  7 months ago

      Do you have an iPhone? If so, you can use Spatialfy or an almost-free web service by SpatialGen (the company that made HaloDepth). See the tutorial here: ruclips.net/video/PBGOpTsOyOA/видео.htmlsi=2UduGyUUDYnSlHj5 (almost at the end, after the Resolve workflow)

  • @VRainier
    @VRainier 7 months ago +2

    I can't find any option for a one-time purchase on Owl3D, maybe they removed it? ):

    • @hughhou
      @hughhou  7 months ago

      That I don't know. I've owned Owl3D for a while now. Let me ask and circle back.

    • @toralfczerniak8250
      @toralfczerniak8250 7 months ago

      Same for me @@hughhou

    • @christofthuis1834
      @christofthuis1834 7 months ago

      I asked. He doesn't. We need a competitor who does

    • @VRainier
      @VRainier 7 months ago

      @@hughhou What did they say to you?

  • @spediegunz
    @spediegunz 7 months ago +1

    Have you compared the output of Owl3D's 2D to 3D conversion to that of Apple's built-in spatial recording done natively on the iPhone 15? Does AI 3D look better than the real thing?

    • @jakemathew
      @jakemathew 7 months ago +1

      The iPhone 15 Pro can only record spatial videos in 1080p at 30fps. Owl3D can create spatial videos up to 8K and 60fps. If you know how to use the Owl3D settings, the quality of the 3D is just as good as, if not better than, the iPhone 15 Pro's.

    • @hughhou
      @hughhou  7 months ago

      You can actually see it in this video side by side. The talking head at the beginning is shot on an iPhone 15 Pro Max and all the AI video is Owl3D (well, some of them are HaloDepth), and they are intercut together fine. Just like in the 2D world, AI is coming hot and fast across the whole video industry. Here is the video if you are curious (using the Vision Pro to examine it goes without saying): ruclips.net/video/qQv6ysfLEjw/видео.htmlsi=ehjdhMAnjl9cGtLs

  • @urbanvr
    @urbanvr 7 months ago

    Hugh you're a genius 🙏🏾 thank you, for what you do.

  •  7 months ago +1

    That's great. ❤

    • @hughhou
      @hughhou  7 months ago

      Thanks! 😊

  • @brettcameratraveler
    @brettcameratraveler 7 months ago +1

    MiDaS was used last year to convert 2D video to 3D. Curious if this method is better quality.
    Might be a while before we can convert 2D 360 and 180 video to 3D using AI, as it might require very large amounts of training data from cameras like that. Hopefully they

    • @hughhou
      @hughhou  7 months ago

      Yes, MiDaS I think is still being used, and there is also TikTok's Depth Anything. I think after this tutorial, I will have several other approaches thanks to our very knowledgeable community's suggestions. To be honest, reading the comments on this video is better than watching the video itself lol.

  • @artenman
    @artenman 7 months ago

    Can I get #2 Pho please thanks, and some chicken spring rolls.

  • @christofthuis1834
    @christofthuis1834 7 months ago

    Let's go

  • @dtaylor7798
    @dtaylor7798 7 months ago +12

    Wow, it would be interesting to compare a 3D video shot with a VR camera and a video shot with a regular camera and converted to 3D using software. If the quality is similar, then this will create an explosive growth in content for VR

    • @jakemathew
      @jakemathew 7 months ago +1

      The 3D conversion quality of Owl3D is amazing. Many 3D videos on YouTube and DeoVR were converted using Owl3D.

    • @hughhou
      @hughhou  7 months ago +5

      Sadly, right now there is still a big gap. Shooting 3D is still 10000x better than GenAI - but the gap is closing every day.

    • @paulct91
      @paulct91 7 months ago +2

      ....it can't be "that" good yet, even Owl3D is kinda meh. It works to make 2D into SBS (VR180), RGBD (Looking Glass Factory) hologram depth maps, and classic 3D red & blue picture/video conversions - I'm referring to Owl3D.

    • @jakemathew
      @jakemathew 7 months ago

      Owl3D definitely does NOT convert 2D to VR180. Not sure where you heard that. @@paulct91

    • @bobbyDig
      @bobbyDig 6 months ago

      It's not so much about the camera, but how you frame your shot for spatial video. I got some great results with 2D video conversion using HaloDepth.

  • @Billwzw
    @Billwzw 7 months ago +5

    How does the 3D conversion quality and artifacts compare to Leia Lume Pad?

    • @hughhou
      @hughhou  7 months ago +1

      Leia Lume Pad is also really good, but I don't think they open-source it for the Vision Pro though, right? I will do the comparison in the next video, as the Lume Pad is an entirely different beast lol. In fact, there are many ways to approach this topic. I should make a 20 min video instead of a 7 min one lol.

  • @stephenbard1590
    @stephenbard1590 7 months ago +3

    The Looking Glass is obsolete technology, so you should be displaying your final 3D product on the Lume Pad 2, which has a larger screen, much higher resolution and deeper depth.

    • @hughhou
      @hughhou  7 months ago

      Yes, the Lume Pad 2 is also a great option. I have it working on the Lume Pad 1 and 2 as well. I cannot include every display in this video, but will touch on that in the next one.

  • @Gorto68
    @Gorto68 7 months ago +4

    Hugh -- here to show my support and learn from your explorations. Excited to watch this stream and share to my circle.

    • @hughhou
      @hughhou  7 months ago

      Awww thank you thank you! Good to see you live!

  • @bujin5455
    @bujin5455 7 months ago +2

    2:58. Wouldn't this work with a 3090? Other than the 4090 being faster, they have basically the same feature set, and the same amount of VRAM.

    • @hughhou
      @hughhou  7 months ago

      The 4090 is just faster - it works even with a lesser GPU - but really, REALLY slowly.

  • @YOCK7
    @YOCK7 2 months ago +1

    Hi Hugh, should you still buy gear for shooting spatial videos when there are already AI tools to help transform 2D videos? What would you suggest? Thanks!

    • @hughhou
      @hughhou  2 months ago

      I would. AI tools are getting better, but they are still no match for a camera designed for the job.

  • @lixinzhao6987
    @lixinzhao6987 7 months ago +4

    These 369 glasses look really good. Love them! ❤

  • @crentist5723
    @crentist5723 4 months ago +1

    Why is HaloDepth so pricey per minute while Owl3D seems to have a better price overall? Is the quality difference even worth it?

    • @hughhou
      @hughhou  4 months ago

      Actually, Owl3D still gets me better quality right now if you use the crazy render mode - Ultra Precise needs 20GB of VRAM (you will need an RTX 4090 to run it locally). It gives you the best 3D, and yes, a better-trained AI model gives you way better depth estimation; you can definitely see the difference in the AVP.

  • @Bil30321
    @Bil30321 5 months ago +1

    Can you make a 180 3D VR video of a Star Wars X Wing fighter going through Death Star trenches using Sora?

    • @hughhou
      @hughhou  5 months ago

      Yes! I am planning

    • @Bil30321
      @Bil30321 5 months ago

      Here is an example of what 180 VR Star Wars could look like, although this was made in Blender, not Sora, which will be way better: ruclips.net/video/gw4AuT2PD34/видео.htmlsi=CrwcGQb4lRRqFy_d

  • @GezginDeneyci
    @GezginDeneyci 3 months ago

    The only thing it does is create a separate black-and-white video from the brightness of the pixels, which is the most classic method there is. I was expecting something more specific. Even my wallpaper app used to do this in the background; I deleted the videos because it was a waste of energy. Shooting with a real 3D camera is better.

  • @ElectricVehicleSpace
    @ElectricVehicleSpace 7 months ago +1

    How do you record the view from inside the camera?

    • @hughhou
      @hughhou  7 months ago

      You can use Xcode. Or just the native recorder under Control Center.

  • @writeforanimation
    @writeforanimation 7 months ago +1

    Do these videos have the same resolution limitation from Apple, or can you play at 4K resolution from the Apple Vision Pro?

    • @hughhou
      @hughhou  7 months ago +1

      What resolution limitation? The only limitation is the M2 chip, as long as it runs. I can play 4K per eye on the Vision Pro and even push 5K per eye (not displaying it, but serving it).

    • @writeforanimation
      @writeforanimation 7 months ago

      @u When you take Spatial videos with the iPhone 15 Pro, they are limited to 1080p.

  • @BlondieSL
    @BlondieSL 1 month ago +1

    The only thing that could possibly interest me in this product would be the availability of an actual over-the-air (OTA) ATSC tuner with a GOOD 2D to 3D converter so as to watch all TV shows in 3D (even if 2D to 3D converted).
    Then, an option to watch 3D Blu-ray movies and even a 2D to 3D converter (in real time, of course) to watch 2D Blu-ray movies in "3D".
    Also, the ability to connect this headset to 3D content via HDMI, like a 3D camcorder, to watch 3D recorded content in full HD.

    • @hughhou
      @hughhou  1 month ago

      So real-time OTA conversion is still not there yet. But if there is one, I will make a new tutorial.

  • @seyittahirozuolmez1026
    @seyittahirozuolmez1026 7 months ago +1

    Great video. But I am sure that the AI can easily make a precise depth map when it is in the creation phase. All other solutions have problematic algorithms for generating eye-strain-free stereoscopic images, just like real-time 2D-to-3D TV signal conversion.

    • @hughhou
      @hughhou  7 months ago +2

      Good to know. Yes, indeed. I hope OpenAI Sora can add this to native generation, but there are not enough 3D media samples out there for the AI to learn what is going on. That is the main problem. The sample size is small.

  • @MrLiree
    @MrLiree 7 months ago +1

    At 3:13 ... where did you get your information about the pricing of Owl3D? The software states a monthly subscription. Or did I misunderstand something here?

    • @hughhou
      @hughhou  7 months ago

      Because I bought it like 6-8 months ago. It was a one-time fee, but I guess they are launching something new; let me circle back on this. Cheers!

  • @kingnt66
    @kingnt66 7 months ago +1

    Thank you Hugh! You are always on top of the game! How does the quality of Owl3D compare with HaloDepth? I use Owl3D but have never used HaloDepth.

    • @hughhou
      @hughhou  7 months ago

      Owl3D is still better right now, I think, but HaloDepth has updates like almost every day, so it will probably pass it soon :)

  • @jamesburland
    @jamesburland 7 months ago +1

    Superb! Thank you.

    • @hughhou
      @hughhou  7 months ago

      Our pleasure!

  • @taosranch
    @taosranch 7 months ago +1

    Thanks! Another great video. But any chance of Owl3D or a similar competitor coming soon to the Mac? I'd prefer to keep my workflow all on my M3 MacBook Pro.

    • @hughhou
      @hughhou  7 months ago

      Yes. HaloDepth is making a desktop service for Mac. And Owl3D is making something called Studio as well, for any computer. So everything in this video is under active development - the workflow can change in a couple of months. I will post updates whenever those are available :)

    • @jakemathew
      @jakemathew 7 months ago +1

      Owl3D released a Beta for their Mac app today.

    • @TV355
      @TV355 7 months ago

      @@hughhou HaloDepth now offers free conversion up to 3 minutes in Full HD.

  • @ted4687
    @ted4687 7 months ago +1

    Great🎉🎉🎉

    • @hughhou
      @hughhou  7 months ago

      Thanks 🤗

  • @criticalkids
    @criticalkids 7 months ago +1

    Love your videos man!

    • @hughhou
      @hughhou  7 months ago

      I appreciate that!

  • @jesusmtz5141
    @jesusmtz5141 7 months ago

    General question: I am trying to convert 3D HSBS movies to spatial video using ffmpeg with your code, but all I am getting is one side, and it is misplaced. Any suggestions?
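
    A common cause is cropping the full frame instead of each half. Below is a minimal, hypothetical sketch (file names and encoder settings are placeholders; it assumes ffmpeg is on your PATH) that splits a half-SBS file into two full-width per-eye files, which can then be handed to an MV-HEVC / spatial muxer (see the blog link shared earlier in this thread):

    import subprocess

    SRC = "movie_hsbs.mp4"  # placeholder input, half-SBS with the left eye on the left

    # Each eye is squeezed to half width in HSBS, so crop one half and
    # scale it back to full width before re-encoding.
    filters = {
        "left":  "crop=iw/2:ih:0:0,scale=2*iw:ih,setsar=1",
        "right": "crop=iw/2:ih:iw/2:0,scale=2*iw:ih,setsar=1",
    }

    for eye, vf in filters.items():
        subprocess.run(
            ["ffmpeg", "-y", "-i", SRC, "-vf", vf,
             "-c:v", "libx265", "-crf", "18", "-an", f"{eye}_eye.mp4"],
            check=True,  # raise if ffmpeg exits with an error
        )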

  • @knmisra
    @knmisra 7 months ago

    Looking Glass vs. Lume Pad 2: which is the better technology/product overall?

    • @stephenbard1590
      @stephenbard1590 7 months ago

      Looking Glass is completely obsolete. The Lume Pad 2 has a much larger screen, much higher resolution, and deeper depth. The Lume Pad has a huge suite of versatile AI-powered 3D apps.

  • @JohnNy-ni9np
    @JohnNy-ni9np 7 months ago

    There is no such thing as using OpenAI Sora to convert 2D to 3D for now.
    Clickbait and fake.

    • @brettcameratraveler
      @brettcameratraveler 7 months ago +1

      Watch the video. Sora is just one step - an optional one at that. The end result is yes, you can choose to take a video that was originally created with Sora and process it with other apps to make it 3D.

    • @jakemathew
      @jakemathew 7 months ago

      Owl3D is a great app that I've been using for months, and it should have gotten its own video. Choosing to make a video (and video thumbnail) combining it with OpenAI Sora and Apple Vision Pro just confuses some viewers. Owl3D existed long before Sora and the AVP - they are just the latest buzzwords YouTubers like to use.

  • @Archway3D
    @Archway3D 7 months ago +1

    Hi, Hugh! Thanks for making this great video. I will pass it on to Leon Lu, who is the CEO of Owl3D. He will be very pleased.
    By the way, I recommend "render right side only" as the render mode. This way, only the left side is converted with the depth map, and the right side is the original image, so you don't feel the distortion so much. Actually, I taught him this!

    • @hughhou
      @hughhou  7 months ago +1

      Where is the "render right side only" option? And why is this better? Sorry, I just want a little bit more technical insight.

    • @Archway3D
      @Archway3D 7 months ago

      @@hughhou Under "Advanced settings", after you select the video to convert in Owl3D, you'll find the Output render mode (by default "Render both sides"). Then choose "Render right side only".
      The converted video may show distortion when the scene's foreground and background are complex. When both the left and right sides each have their own distortion, it's hard to watch the scene as stereoscopic.
      So, one side original and the other side converted works well: half of the scene has no distortion. I prefer to set "Video 3D effect" to Strength 9 because one side is not converted. 75% of people use their right eye as the dominant eye, so that's why I recommend this way; I don't feel any strain.😁