Thank you, your explanation is crystal clear! Your video is still useful even after Unity officially released PolySpatial.
Good job. Clean and objective. That’s what I was looking for. Thanks!
The problem is that there is little to no documentation on Reality Composer Pro. Apple released two WWDC videos on Vision Pro and Reality Composer Pro and essentially said, "Figure it out." The reason I am learning Unity as an iOS developer is that the learning path is much smoother than what Apple has to offer.
That's true, there are so many good Unity resources out there.
This is nothing but the truth.
I found that I will barely be using RCP, but because I've been using other software I've figured it out. Mostly.
Check that Unity license fee before you go too far down that road for Vision Pro.
Thanks for the clear explanation. I wouldn’t mind if you went into more technical detail.
Thank you so much for sharing this! It was very helpful!
Hey man, love your content! Wouldn't mind if the videos get a bit more technical as I am starting out in XR dev/design.
Thanks for the summary
Thank you.
Can we build apps for Apple Vision Pro with Unreal?
Any updated version of this video?
Thank you for making this! I’m a little confused after watching Apple’s WWDC videos. When using URP, we don’t have access to the passthrough camera, right?
Goal is to make a glow-y/bloom-y emissive material in an AR environment, but I’m not seeing any examples of post processing effects from Apple’s side.
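For reference, RealityKit's PhysicallyBasedMaterial does expose emissive properties on the Apple side, which gets you the glow-y look even if it isn't true bloom. A minimal sketch in Swift; the function name and the color/intensity values are illustrative assumptions, and actual screen-space bloom post-processing is a separate question this doesn't cover:

```swift
import RealityKit
import UIKit

// A minimal sketch: a glowing-looking sphere using RealityKit's
// built-in emissive material properties. Values are illustrative.
// Note: this makes the surface emit color; it is not a bloom
// post-processing effect.
func makeGlowingSphere() -> ModelEntity {
    var material = PhysicallyBasedMaterial()
    material.emissiveColor = .init(color: .cyan)   // the glow tint
    material.emissiveIntensity = 2.0               // push past 1.0 for a stronger glow
    material.baseColor = .init(tint: .black)       // keep the base dark so emission dominates

    return ModelEntity(
        mesh: .generateSphere(radius: 0.1),        // 10 cm sphere
        materials: [material]
    )
}
```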
Hey Tassilo, that was a very clear and useful comparison! Thanks. I guess you also have ideas about the further workflow people need to think through to get a full visionOS app together. For example, what I struggle with is that Reality Composer Pro has a nice shader graph editor, but for now I can only apply those materials to the standard primitive objects like the cube and sphere. Or do you agree that RC Pro is still just half-finished as a beta? I started following you. 👍
Glad you got some value out of it. I agree that Reality Composer Pro is still a work in progress. I think Apple wants to get the tools out there as quickly as possible so people can start developing for the platform, which means they might release half-baked features on the developer side. Since the tools are so new, I need some more time to form a deeper opinion on some things.
Hey, the RC Pro Shader Graph tool is much more powerful than it seems at first glance. The custom material shader can be used to do geometry modification, etc. One of the WWDC videos on material shaders has a demo; it's called "Explore materials in Reality Composer Pro".
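On the primitives complaint above: shader graph materials authored in Reality Composer Pro can also be loaded by name in code and applied to arbitrary entities via RealityKit's ShaderGraphMaterial. A sketch, assuming a material at /Root/GlowMaterial inside Materials.usda in the Xcode-generated RealityKitContent package; all of those names are illustrative:

```swift
import RealityKit
import RealityKitContent  // the Swift package Xcode generates for a Reality Composer Pro project

// A sketch: load a Reality Composer Pro shader graph material by its
// scene path and apply it to any ModelEntity, e.g. a mesh imported
// from a USDZ file, not just RCP's primitives. Names are assumptions.
func applyGraphMaterial(to entity: ModelEntity) async {
    do {
        let material = try await ShaderGraphMaterial(
            named: "/Root/GlowMaterial",
            from: "Materials.usda",
            in: realityKitContentBundle  // template-generated bundle accessor
        )
        entity.model?.materials = [material]
    } catch {
        print("Failed to load shader graph material: \(error)")
    }
}
```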
thanks for the video
Hey man, love your YouTube! I remember you had a video like "my strategy for visionOS development" but now I can't find it. You said you'd probably focus on Unity development because it would be nice to test on existing VR headsets instead of waiting for Vision Pro availability. I really wanted to share this video with my colleagues :D
I unlisted the video because I changed my mind about the exact strategy I want to pursue, but you can find the video here ruclips.net/video/m1dAarw_jQA/видео.html
@@tassilovg Thanks man! It would be interesting to hear why you changed it. Did you decide to stick with Swift?
@@alexander99099 I think the right decision comes down to what type of app you want to build. I'm still trying to figure that out, but I wanted to first understand visionOS dev in Swift deeply. There are pros to it, such as having access to SwiftUI, which allows you to create much more sophisticated UI than in Unity ...
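To illustrate the SwiftUI point, here is a tiny hypothetical control panel showing the kind of declarative, animated UI that takes only a few lines in SwiftUI; the view name and contents are made up for illustration, not from the video:

```swift
import SwiftUI

// A hypothetical visionOS window: a slider with an animated,
// formatted readout and the platform glass background.
struct ControlPanel: View {
    @State private var intensity = 0.5

    var body: some View {
        VStack(spacing: 16) {
            Text("Glow intensity")
                .font(.title2)
            Slider(value: $intensity, in: 0...1)
            Text(intensity, format: .percent.precision(.fractionLength(0)))
                .contentTransition(.numericText())   // animate digit changes
                .animation(.snappy, value: intensity)
        }
        .padding(32)
        .glassBackgroundEffect()  // the visionOS glass look for windows
    }
}
```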
@@tassilovg That's definitely a strong signal about where to spend the available learning time.
Pretty informative. Any idea where to start learning the Apple native frameworks (SwiftUI, Reality Composer)?
Apple has some excellent materials teaching SwiftUI. There are also a ton of YouTubers like Hacking with Swift teaching SwiftUI. As for Reality Composer, I would probably watch the past WWDC presentations.
Swift first
Your content is awesome!🎉
🎯 Key points for quick navigation:
00:00 *📚 Overview: Introduction to Unity and Apple native frameworks for Vision Pro app development.*
00:26 *🧐 Clarification: "Immersive" means augmented reality (AR); "fully immersive" means virtual reality (VR).*
01:05 *📐 UI Design: AR apps can operate in full or shared spaces with specific UI constraints.*
01:30 *🎨 Rendering Engine: RealityKit renders AR apps in both the shared space and full space.*
01:57 *⚙️ Unity Translation: Unity uses PolySpatial to translate materials and shaders into RealityKit.*
02:24 *🛠️ Development Tools: Apple's native frameworks provide Xcode, SwiftUI previews, and Reality Composer Pro for development.*
03:05 *🏗️ Integrated Environment: Apple's frameworks offer a seamless, integrated development experience for AR/VR.*
03:20 *🔄 Unity Support: PolySpatial supports Unity features like particle effects, but not handwritten shaders.*
04:14 *📸 Volumetric Cameras: Essential for rendering content in shared spaces.*
04:27 *👀 User Interaction: Vision Pro utilizes eyes, hands, and gestures for user input within Unity.*
04:55 *🚀 Coming Soon: PolySpatial is not available yet, but a beta program is open for applications.*
05:08 *🌐 Cross-Platform: Unity's Universal Render Pipeline supports cross-platform compatibility.*
05:37 *🔀 Decision Factors: Choose technology based on cross-platform needs and development preferences.*
07:12 *🚀 Speed of Execution: Personal familiarity with development tools influences the speed of reaching an MVP.*
07:41 *📢 Viewer Engagement: Request for audience feedback on content preferences and future topics.*
Made with HARPA AI
nice video
Keep up the good work!
Thank you
Great video!
great video, still useful 7mo later
Good video mate. I want to create a Vision Pro digital twin of my 110-acre jungle property. How should I approach that?
Interesting. Let me think about it. I'll DM you
With all existing XR developers using Unity (or other game engines), what incentive do they have to bow down to Apple's ecosystem and pay Unity Pro's $2000-a-year fee and Apple's $100 developer fee, even just to dabble in making something? Apple barely even advertises Unity on their website (it's a small-font sentence at the end of one of the sections). XR/immersive apps are already difficult to profit from, so what incentive do devs have to build them when most people can barely afford the headset, meaning the user base is small?
Best explanation yet.
One big issue is that mobile developers know nothing about Unity 3D/C# development, and vice versa. I think at most we may get 2.5D iPad apps. But since there are no customers for this for at least 2-3 years, this could be as dead in the water, app-wise, as Apple Watch apps or Apple TV apps. The Apple Watch 1 had terrible connectivity between watch and phone, so app devs gave up developing for it. Apple TV apps never caught on. But at least the Apple Watch had customers at launch :) Also, will VR devs bother converting their apps to hand tracking? And since even some devs won't be able to afford this device for years, I think it will morph into a Mac on your face for mobile working, or an Apple TV on your face for content (not some amazing 3D AR device), mostly used by mobile workers and travellers. Well, as long as they only watch movies that are under two hours long :) And I hope they settle with Epic, because what this device really needs is Unreal's Nanite and Lumen :)
Can you show an action game or something? Show people what this device can do.