3D Materials & Lighting for The Apple Vision Pro - For Beginners

  • Published: Dec 25, 2024

Comments • 46

  • @haoouyang8257
    @haoouyang8257 10 months ago +3

    Oh wow, the reflection on the metal and wooden textures is so real! Hope the glass and emission textures will work in the future. Thanks for sharing.

    • @michaeltanzillo
      @michaeltanzillo  10 months ago

      My pleasure!! So glad you found it useful.

  • @tysont6529
    @tysont6529 10 months ago

    Incredibly helpful examples to understand the limitations of the platform. Those fabric textures look amazing.

    • @michaeltanzillo
      @michaeltanzillo  10 months ago

      Right!?!? So much better than I anticipated!

  • @SuperGGnoRE
    @SuperGGnoRE 10 months ago +1

    Very cool! I love that wood texture as well. Hopefully those other properties work eventually; glass could look so beautiful.

    • @michaeltanzillo
      @michaeltanzillo  10 months ago

      Eventually, I'm sure they will, but what they've been able to accomplish so far is remarkable.

  • @iknowcoati
    @iknowcoati 10 months ago +1

    Thank you for making such awesome content! Good resources on model & material creation for the Vision Pro are almost non-existent. Very much appreciate your work, and subscribed for more!

  • @snapsjot
    @snapsjot 10 months ago +2

    Hi Michael, your videos really differ from the standard videos about the Vision Pro. You are actually showing what you can do with it on a creative level. What I would like to know is: if a couple of people were standing right next to you in the same room, each with a Vision Pro headset, would you all see the same 3D model, and could you all interact with it? And what program do you use to upload and show the 3D models? Thanks in advance & looking forward to your upcoming videos.

    • @michaeltanzillo
      @michaeltanzillo  10 months ago

      Thanks so much!
      For sharing 3D models with others, right now you have two main options.
      1. I could FaceTime someone and share my screen with them, but they would only see a flat image of the 3D model.
      2. There are apps like Jig Space which are awesome and allow multiple people to log in and see the same 3D model, but that has a subscription cost.
      For uploading and viewing, I'm simply double-clicking the model file (.usdz). The Apple Vision Pro knows to open it in this interactive viewer.
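      (If you wanted that same viewing experience inside your own app rather than the built-in viewer, a minimal visionOS sketch might look like the code below; "Mat" is a placeholder asset name, not a file from the video.)

        import SwiftUI
        import RealityKit

        // Shows a bundled USDZ model, with a spinner while it streams in.
        struct ModelPreview: View {
            var body: some View {
                Model3D(named: "Mat") { model in
                    model
                        .resizable()
                        .scaledToFit()
                } placeholder: {
                    ProgressView()
                }
            }
        }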

  • @BenBoyle1
    @BenBoyle1 10 months ago +1

    Super cool. That's great tech, and an amazing demo for understanding how it's working. Thanks!

  • @victorgomezsa
    @victorgomezsa 10 months ago

    It's a great idea to show what the materials look like, thank you very much for this video!!!! Looking forward to seeing more! Regards!

  • @dirtcreature3d
    @dirtcreature3d 10 months ago +1

    Duuuude, you're making me want a Vision Pro haha. The fact that it builds HDRIs of your surroundings is insane!! Can't wait to see how this progresses; hopefully the Metal render engine will get some big updates at WWDC.

    • @michaeltanzillo
      @michaeltanzillo  10 months ago +1

      I couldn't believe it when I saw it! I was sure it would just be some default rig that would get brighter and dimmer but that's it. So cool.

    • @dirtcreature3d
      @dirtcreature3d 10 months ago +1

      @@michaeltanzillo Yeah, definitely an exciting insight into Apple's thought process for the Vision Pro. If this is what they're thinking about for v1, I can't wait to see v3 or v4.

    • @michaeltanzillo
      @michaeltanzillo  10 months ago

      @@dirtcreature3d 🤯

  • @VIREELY
    @VIREELY 8 months ago +1

    I love your videos

  • @sifudarryl
    @sifudarryl 9 months ago

    Hi Michael, great video with your explanations of materials and lighting. In one of the Apple examples from WWDC 2023, they have a fishbowl that does indeed have a glass material. I haven't dived into how that was done yet, but I did want to let you know that they did have glass materials, with water showing inside the fishbowl.

    • @michaeltanzillo
      @michaeltanzillo  9 months ago

      That's interesting! I'll check that out.
      They must just be playing with the opacity and faking some of the effects, because to get some of those effects you would need a ray-tracing engine with translucency settings, and I just don't think that's in the renderer right now.
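      (A hedged RealityKit sketch of that opacity-based "fake glass" idea: PhysicallyBasedMaterial has no refraction, but transparent blending plus low roughness can read as glass. All numeric values below are illustrative guesses, not settings from the video.)

        import RealityKit
        import UIKit

        // Build a glass-like material by blending rather than refracting.
        var glass = PhysicallyBasedMaterial()
        glass.baseColor = .init(tint: .white)
        glass.roughness = .init(floatLiteral: 0.05)  // keep reflections sharp
        glass.metallic = .init(floatLiteral: 0.0)
        glass.blending = .transparent(opacity: .init(floatLiteral: 0.25))

        // Apply it to any mesh, e.g. a simple sphere.
        let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1),
                                 materials: [glass])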

    • @sifudarryl
      @sifudarryl 9 months ago

      @@michaeltanzillo I'm not sure how they did it, but they have the code available if you want to dig in. That example is called “swiftSplash”.

  • @codywinesettx
    @codywinesettx 9 months ago

    Love the video! What software did you use to make all the models with the different textures?

    • @michaeltanzillo
      @michaeltanzillo  9 months ago +1

      The model is the Mat character that comes standard with Substance Painter. All the materials were either created by me in Substance or pulled from the Substance Asset Library.
      If you are curious about it, I have a whole tutorial series on Substance Painter:
      ruclips.net/p/PLfAG6wq8HC-9CP7VoTHwGvGWJtcScHLYP

  • @saliclothing
    @saliclothing 10 months ago +1

    Hey, how about complete surroundings? Is there a way to use my Cinema 4D or my Unreal Engine scene on the Vision Pro?

    • @michaeltanzillo
      @michaeltanzillo  10 months ago +1

      It does your complete surroundings! It seems to regularly map changes in the scene around you into the HDRI environment around the 3D objects.
      For C4D and Unreal...for sure! Should have more videos about those in the future.
      For C4D, it's all about exporting as a USD/USDz file format with all your animation and texture data baked in.

    • @saliclothing
      @saliclothing 10 months ago

      @@michaeltanzillo very interesting, I am super excited to hear more about this

  • @MaxGeeAparri
    @MaxGeeAparri 6 months ago

    Amazing! For the textures on each model, what size did you use? Is it 2K or 4K?

  • @VIREELY
    @VIREELY 8 months ago

    What program did you use to create the model and add textures?

    • @michaeltanzillo
      @michaeltanzillo  8 months ago

      The model is a default model that comes with Substance Painter, and all the materials were created in Substance Painter as well. You can find lots of Painter tutorials on my channel.

  • @Hobnockers
    @Hobnockers 7 months ago +1

    Yeah, it's a real-time render engine. The reflection comes from a small, blurry cubemap, similar to Unreal Engine.
    That emissive isn't working is strange. That SSS, displacement, and translucency aren't working is also game-engine-like.
    I guess UE 5.4 only just added real-time displacement.
    But what would be interesting to figure out is whether the engine supports some basic lights.
    All the videos I see are just about the AR mode, which captures the environment in real time and uses it for the reflections.
    What I couldn't see in your video is shadows. Are any shadows cast? Are there any features to add ambient occlusion?
    Finally, what if you want to light a tiny room, or any 3D object of your choice, with your own 3D lights?
    I guess in that case you might need to compile your own 3D project from Unreal Engine, similar to what you would do when using the Quest.
    But since the Apple Vision Pro is its own computer and GPU, it would be great to know if it could compute Unreal Engine projects with high-fidelity settings, not mobile render settings.
    Hmmm… if that worked, it would be super powerful.

    • @michaeltanzillo
      @michaeltanzillo  7 months ago

      If you wanted to add custom lighting, you would either need to create a custom IBL setup or take it into software like Unity to add lighting the same way you would in a video game.
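      (A rough sketch of that custom-IBL route in visionOS RealityKit, assuming the async EnvironmentResource(named:) initializer; "StudioHDR" is a placeholder image resource, not an asset from the video.)

        import RealityKit

        // Light one model with a custom image-based light (IBL).
        func addCustomLight(to model: ModelEntity) async throws {
            let environment = try await EnvironmentResource(named: "StudioHDR")

            // An invisible child entity carries the image-based light...
            let light = Entity()
            light.components.set(
                ImageBasedLightComponent(source: .single(environment)))
            model.addChild(light)

            // ...and the model opts in to being lit by it.
            model.components.set(
                ImageBasedLightReceiverComponent(imageBasedLight: light))
        }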

    • @Hobnockers
      @Hobnockers 7 months ago +1

      @@michaeltanzillo Yeah, UE5 would be my choice. Nanite can also be used for VR.
      But one thing I don't get: when deploying a VR app to the Meta Quest (mobile), you need to use forward shading.
      The other option is deferred rendering on a desktop, streamed to the VR headset.
      What I was wondering is whether the Apple Vision Pro can handle deferred rendering, since it was advertised as its own powerful supercomputer with a great CPU & GPU.
      If that's not possible, the Meta Quest is undeniably beating the AVP.
      I would say that's key for VR applications. Imagine being able to use deferred rendering and still get your 90 FPS.

    • @michaeltanzillo
      @michaeltanzillo  7 months ago

      That's a great point! I haven't fully tested that yet but definitely worth a look as I move forward. Thanks! @@Hobnockers

    • @Hobnockers
      @Hobnockers 7 months ago

      @@michaeltanzillo That would be amazing. I think that would be the real deal.
      But in case the Apple Vision Pro doesn't support deferred rendering, or can't handle it like a PC, a workstation, or a MacBook Pro: if you need a fully immersive VR experience, you are then in the same situation as when using the Meta Quest.
      Regardless, if the Apple Vision Pro can't handle deferred rendering, you would also need another PC, laptop, or Mac to do the computing.
      And if that's the case, there is absolutely no advantage over the Meta Quest.
      But if the Apple Vision Pro could handle computing deferred rendering, then that would indeed be a fantastic advantage.
      The shader issues you showed, like translucency not being rendered, emissive not working, and of course displacement not working, suggest the Apple Vision Pro is also just using forward shading.
      That would also explain the shader tests you have been running: the AR mode just uses forward rendering.
      If you get any chance to deploy an Unreal project with PC project settings (Shader Model 6, deferred rendering) to the Apple Vision Pro, and it works, wow, that would indeed be great news.
      Really curious now... keep rocking, your videos are fantastic! Cheers

  • @tufanaydin6340
    @tufanaydin6340 5 months ago +1

    Is it created in Unity?

    • @michaeltanzillo
      @michaeltanzillo  5 months ago +1

      Nope. Created in Substance Painter, exported as a USDz from there, and brought straight into the Apple Vision Pro.

    • @tufanaydin6340
      @tufanaydin6340 5 months ago

      @@michaeltanzillo What app on the Apple Vision Pro???

    • @michaeltanzillo
      @michaeltanzillo  5 months ago +1

      @@tufanaydin6340 No app at all. You can just “double-click” a USDz file in the Apple Vision Pro, and this is how it appears. You can check out my workflow here:
      ruclips.net/video/wuDbkXWSGIc/видео.htmlsi=7zAv55eCTrqhbRZt

  • @jsardone
    @jsardone 3 months ago

    I'm curious if anyone has had any luck imposing emissive light on the XR environment by introducing pseudo-surfaces using the LiDAR scan of the room?

    • @michaeltanzillo
      @michaeltanzillo  3 months ago +1

      me too!

    • @jsardone
      @jsardone 3 months ago

      @@michaeltanzillo I wonder if we can somehow use collision objects as pseudo-surfaces / nulls that accept light, or possibly the plane ID system? TrueDepth camera, etc.
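      (A rough sketch of that direction using visionOS ARKit scene reconstruction. Only the mesh/collision part is established API; whether those surfaces could actually receive or emit virtual light is exactly the open question in this thread. Authorization handling is omitted for brevity.)

        import ARKit
        import RealityKit

        // Turn the room scan into invisible entities with collision shapes.
        let session = ARKitSession()
        let sceneReconstruction = SceneReconstructionProvider()

        func buildRoomSurfaces(into root: Entity) async throws {
            try await session.run([sceneReconstruction])
            for await update in sceneReconstruction.anchorUpdates
                where update.event == .added {
                let anchor = update.anchor
                let surface = Entity()
                // Static collision shape generated from the scanned mesh.
                let shape = try await ShapeResource.generateStaticMesh(from: anchor)
                surface.components.set(CollisionComponent(shapes: [shape]))
                surface.setTransformMatrix(anchor.originFromAnchorTransform,
                                           relativeTo: nil)
                root.addChild(surface)
            }
        }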

  • @cjadams7434
    @cjadams7434 9 months ago

    I suppose the resolution is probably the foveated rendering updating.

    • @michaeltanzillo
      @michaeltanzillo  9 months ago

      Sure, that's playing a role. It's also higher resolution when you are looking through it. This is 4K imaging scaled down and compressed for YouTube, so you will have to fill in some quality gaps there :)