Chris Young
  • 141 videos
  • 83,835 views
2013 iOS AR / 2024 VisionOS AR
What a difference a decade makes…
In 2013 I tried to develop an animated series about a group of teenagers who hunted monsters using augmented reality glasses. The teens drove around in these cool dekotora-style trucks capturing monsters. To pitch the project I augmented the printed presentation materials using Unity and Qualcomm’s QCAR API (which eventually became Vuforia). Needless to say, I wasn’t able to sell the concept, but it’s still fun to look back on how game engines, real-time rendering, and AR/VR have matured since those early days.
I can’t wait to see what the next decade brings us. :)
23 views

Videos

Apple Vision Pro - third person controller
60 views • 1 day ago
Working out a basic character controller using a simple pinch/drag gesture for both direct and indirect input. Unity/PolySpatial.
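
For flavor, here is a minimal sketch of the same pinch/drag idea in native SwiftUI/RealityKit terms. The video itself was built with Unity/PolySpatial, so this is an analogous visionOS-native approach rather than the project's code; the gesture and component names are Apple's public APIs.

```swift
import SwiftUI
import RealityKit

// Hedged sketch: move an entity with the standard visionOS pinch/drag.
// Works for both direct (touch) and indirect (gaze + pinch) input.
struct CharacterMoverView: View {
    @State private var character = ModelEntity(
        mesh: .generateBox(size: 0.2),
        materials: [SimpleMaterial(color: .cyan, isMetallic: false)]
    )

    var body: some View {
        RealityView { content in
            // InputTarget + collision shapes are required before the
            // entity can receive system gestures.
            character.components.set(InputTargetComponent())
            character.generateCollisionShapes(recursive: true)
            content.add(character)
        }
        .gesture(
            DragGesture()
                .targetedToEntity(character)
                .onChanged { value in
                    // Convert the SwiftUI gesture location into the
                    // entity's parent space and move the character there.
                    guard let parent = value.entity.parent else { return }
                    value.entity.position = value.convert(
                        value.location3D, from: .local, to: parent)
                }
        )
    }
}
```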
Apple Vision Pro OpenAI Interactive Character
106 views • 14 days ago
I created a little interactive character that you can chat with on Apple Vision Pro that leverages the OpenAI API. Latency on the requests is a little painful... but acceptable considering the amount of data being processed in both directions. The only prompt I gave the model was that it's an ancient warrior with a bad temper who loves to work out.
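
As a hedged sketch of what the OpenAI side of a character like this can look like: the endpoint and JSON shape below are the public chat completions API, but the model name, key handling, and response parsing are placeholder assumptions, not the project's actual code.

```swift
import Foundation

struct ChatMessage: Codable { let role: String; let content: String }
struct ChatRequest: Codable { let model: String; let messages: [ChatMessage] }

// Send the user's line to the chat completions endpoint and return the reply.
func askWarrior(_ userText: String, apiKey: String) async throws -> String {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(ChatRequest(
        model: "gpt-4o",  // placeholder model name
        messages: [
            // The only prompt the video description mentions giving the model:
            ChatMessage(role: "system",
                        content: "You are an ancient warrior with a bad temper who loves to work out."),
            ChatMessage(role: "user", content: userText)
        ]))

    let (data, _) = try await URLSession.shared.data(for: request)
    // Pull the first choice's message content out of the response JSON.
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let choices = json?["choices"] as? [[String: Any]]
    let message = choices?.first?["message"] as? [String: Any]
    return message?["content"] as? String ?? ""
}
```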
Spatial Cartoons on Apple Vision Pro
251 views • 2 months ago
In 2015 I was working on a real-time rendered animated pilot in UE4. As part of the development materials I made a companion immersive cartoon experience for the Oculus DK2, with a port to the original HoloLens using Unity... I even jumped out of acting retirement and into a mocap suit to capture the main character's performance. 🤪 With the recent update to AVP 2.0 and the new timeline support in...
Apple Vision Pro -- Unreal USD to Reality Composer Pro
416 views • 2 months ago
While waiting for a solid UE-to-AVP pipeline, I've been playing with the native SDK, and I'm kind of surprised how easy it is to take USD directly from UE into Reality Composer Pro. Looking forward to digging into v2.0 and checking out the new RCP timeline and lighting support. :)
Apple Vision Pro -- Spatial Video
119 views • 2 months ago
14 years ago, I rented a beam splitter and convinced a bunch of good friends to go out into the desert and shoot a 10-minute stereoscopic short film with me... today I encoded that short into the MV-HEVC codec and had my own private little screening in the Vision Pro. :)
Apple Vision Pro Hue API
159 views • 4 months ago
Friday afternoon hack... SwiftUI controlling the Hue API.
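
For context, the Hue side of a hack like this is one small REST call: the classic (v1) bridge API sets a light's state with a PUT to /api/<app-key>/lights/<id>/state. A minimal sketch, with the bridge IP, app key, and light ID as placeholders:

```swift
import Foundation

// Hedged sketch: set a Hue light's state over the local v1 bridge REST API.
func setHueLight(on: Bool, brightness: Int) async throws {
    // Placeholder bridge address, app key, and light ID.
    let url = URL(string: "http://192.168.1.2/api/YOUR-APP-KEY/lights/1/state")!
    var request = URLRequest(url: url)
    request.httpMethod = "PUT"
    let body: [String: Any] = [
        "on": on,
        "bri": max(1, min(254, brightness))  // Hue brightness range is 1...254
    ]
    request.httpBody = try JSONSerialization.data(withJSONObject: body)
    _ = try await URLSession.shared.data(for: request)
}
```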
Spatial Cartoons in Apple Vision Pro
115 views • 4 months ago
In the early 2000s I founded a mobile gaming startup. I wanted to create original IP for wireless devices and then spin it off into traditional media. I designed a mobile game for Nextel, AT&T, Motorola and Nokia called "Wasabi Run" and developed "Wasabi Warriors", a companion 2D animated presentation done in Flash. In 2012, during the stereoscopic boom, I found the project files for the short...
Realtime Mocap Previz XR Unreal Engine Meta Quest
129 views • 4 months ago
Xsens body data is being recorded (for later post-processing) in MVN and simultaneously streamed over Wi-Fi (via Live Link) to Unreal and the packaged Android XR app. iPhone is recording MetaHuman Animator files for high-fidelity facial performance solving/retargeting. iPad is streaming Live Link ARKit Face over Wi-Fi to Unreal and to the packaged Android XR app. Unreal is recording the previz...
Meta Quest Movement SDK
233 views • 4 months ago
I did some reading up on Meta’s Movement SDK and realized that there was a Live Link MetaHuman retargeting component. The documentation for setup is pretty thin, but following along with their Live Link body example I discovered you can easily add a face and eye LL source to access that data. Overall the face tracking on the Quest Pro is pretty satisfying and works surprisingly well right out o...
controlling my coffee maker with Apple Vision Pro.
236 views • 5 months ago
SwiftUI, RealityKit and Webhooks.
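
A minimal sketch of the webhook half: a SwiftUI button that fires an HTTP request at a home-automation endpoint. The video doesn't say which webhook service drives the coffee maker, so the URL below is hypothetical.

```swift
import SwiftUI

struct CoffeeButton: View {
    var body: some View {
        Button("Brew ☕️") {
            Task {
                // Hypothetical webhook endpoint that starts the machine.
                var request = URLRequest(
                    url: URL(string: "https://example.com/webhooks/coffee/brew")!)
                request.httpMethod = "POST"
                _ = try? await URLSession.shared.data(for: request)
            }
        }
    }
}
```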
Mocopi streaming to Apple Vision Pro
390 views • 5 months ago
Apple Vision Pro Spatial Portal Animation Reality Composer Pro
630 views • 6 months ago
#unrealengine5 #quest3 #XR
192 views • 7 months ago
#quest3 #XR #unrealengine
159 views • 9 months ago
Quest 3 Passthrough Advanced Locomotion System
671 views • 9 months ago
#mocopi #arkit #ue5 #uefn
482 views • 10 months ago
Gaussian Splatting in UEFN
2.2K views • 11 months ago
Meet The Voxels
568 views • 11 months ago
Meet The Voxels -- Behind The Scenes
238 views • 11 months ago
🐍SnakeHollywood🐍 - Gameplay 4413-4757-7096
45 views • 11 months ago
🐍SnakeHollywood🐍 - UEFN world scale landscape from LiDAR
231 views • 11 months ago
#metahuman #mocopi #fortnitecreative #uefn
2.5K views • 1 year ago
Mocopi Mocap Metahuman
8K views • 1 year ago
UEFN and Metahumans!
1.5K views • 1 year ago
Live Metahumans - Pixel Streaming / WebXR / CoreWeave
368 views • 1 year ago
Quest Pro Passthrough with Live Metahuman performance
880 views • 1 year ago
#unrealengine #vtuber #animation #tiktok
526 views • 1 year ago
WebXR -- AR, VR, Apple Quick Look.
251 views • 1 year ago
#unrealengine #questpro pixel streaming editor.
462 views • 1 year ago

Comments

  • @filipnguyen561 • 19 hours ago

    Tutorial would be awesome!

  • @jyoung741 • 3 days ago

    That was pretty amazing, based on the few prompts you gave him! I wanted to hear more…

  • @animeFXnstuff • 6 days ago

    Been a while since I've seen this!

  • @yuriythebest • 10 days ago

    Thanks for the comparison. I did some experimentation with the Qualcomm AR stuff back in the day as well; this Apple AR stuff would be fun to develop for if it ever really becomes mainstream. I bought a Gear VR when those were new and ported a game to it, but back then the approval process was super long: I think I spent something like 6 months sending a build, then waiting a week or so for a response, repeat 20 times. By the time the game was published and I was considering making a second one, they made the Gear VR obsolete. I really hate this about the VR world in general, where you buy a "thing" and it's supposed to be your "console", but in a year or two it's no longer supported and you must get the new thing. Sorry for the long and somewhat tangential rant.

  • @alimwilliams5935 • 10 days ago

    Can you share the stack this was built on? Was it Unity, or did you leverage native Xcode? Trying to understand how you got the lip syncing.

  • @grimsk • 13 days ago

    what a shy warrior he is :)

  • @philrynda • 13 days ago

    High protein meals, like grilled meat. LOL

    • @brycelynch • 13 days ago

      Clearly my personal life is creeping into this alternate AI reality.

  • @GigiGartenberg • 1 month ago

    Could this be backdoored into other games, such as Second Life perhaps?

  • @jeromehinds • 1 month ago

    Been watching for a while; I like your content.

  • @creatortrainer9305 • 1 month ago

    How can I buy these tools? And what's the name of the software?

  • @salduchi1785 • 2 months ago

    Has this been dumped? I would love to try this

  • @mannychop • 2 months ago

    Super cool, thanks for sharing your insights too. Can't wait to see more!!

  • @grimsk • 2 months ago

    wow, RCP 2.0 seems more promising than before :)

  • @tomhalpin8 • 2 months ago

    That parallax effect is how I want every old flatscreen game to be ported over to the Vision Pro. Like a window into another world!

  • @hazeyteeg1641 • 2 months ago

    You are a wizard

  • @gaussiansplatsss • 2 months ago

    What camera did you use here?

  • @SteveTalkowski • 2 months ago

    I pretty much flipped out when I recently saw this, as I want to do exactly this with my Sketchbot IP on Vision Pro!

  • @SteveTalkowski • 2 months ago

    Love this!

  • @fr_tripzzy69 • 3 months ago

    can you make a tut video

  • @MrAkradio • 3 months ago

    The best!

  • @Growth-VR • 4 months ago

    could you upload the tutorial for making this.

  • @Leitin • 4 months ago

    How did you put the audio in your MetaHuman?

  • @ilyanemihin6029 • 4 months ago

    Great example! Thanks for sharing demo

  • @adventureskulldraws • 4 months ago

    Fascinating. Any way to track fingers separately? Could a Leap Motion be added to the setup easily? I am highly intrigued and will be subbing for more!

  • @adventureskulldraws • 4 months ago

    Is there any utility to this? Can you record animations with it and export FBX? I'd much rather buy an Apple Vision Pro than a Rokoko suit if it can do the job!!

  • @MV-et1ff • 5 months ago

    What sort of hardware are you using? That's incredibly fast tracking. Impressive.

    • @brycelynch • 5 months ago

      That was with a 2080 Ti running on Linux.

  • @xaby996 • 5 months ago

    Still bloated with bones. Oh well

  • @xaby996 • 5 months ago

    Are they more performant than the UE5 editor?

  • @mikemanzano5363 • 5 months ago

    Is this an app you wrote?

    • @brycelynch • 5 months ago

      Yes, it's just a demo: SwiftUI and RealityKit, using webhooks to send commands to the coffee machine.

  • @bestplayern1ytrox321 • 5 months ago

    Great! But could you please tell me how to import (bring) the MetaHuman into UEFN? So that you can launch the game without errors.

    • @brycelynch • 5 months ago

      If you pause the video at the beginning you can see the skeleton hierarchy. It requires unbinding the meshes, combining the head and body skeletons under the same root, and reskinning to the new skeleton (copying the skin weights back).

  • @bestplayern1ytrox321 • 5 months ago

    How do you bring the MetaHuman into UEFN?

  • @matchdance • 5 months ago

    Oh my god! This looks awesome!!!

  • @hazeyteeg1641 • 5 months ago

    This is super awesome.

  • @tomhalpin8 • 6 months ago

    Imagine if all flatscreen games were given this depth. So even old games could feel new

  • @hazeyteeg1641 • 7 months ago

    Hello Chris. Is it possible to use the mocopi system for capturing the body, while using a Quest Pro for hand, face, and eye tracking, all at the same time? I’m considering buying StretchSense gloves, but I could save a lot of money if I just use my Quest Pro.

    • @brycelynch • 7 months ago

      It's been a while since I looked into this... but I think the issue I was running into was getting mocopi to compile for Android. I don't believe Sony's Unreal plugin will allow this. There might be a way to do it on Windows... but probably not worth the effort. Depending on the level of fidelity you need... the new full-body Movement SDK from Meta is probably the way to go. If you're comfortable building from source, it's straightforward to set up and supports everything you are looking for. The Q3 solves the real-time body a little better than the Pro, but the Pro has the facial tracking. Trade-offs everywhere. 🤪

  • @Instant_Nerf • 7 months ago

    BS, he used a VHS app to record

    • @brycelynch • 7 months ago

      Not sure what you mean by your post... but the VHS source was captured using an AJA Kona card, ~200 frames were extracted and then processed using the official gaussian-splatting reference implementation available through GitHub. I trained the data set for 30k iterations on a 4090. The splat was then edited down using the UnityGaussianSplatting toolset (also available on GitHub), and A-Frame handled the immersive WebXR render to the headset.

    • @jimgunzel • 4 months ago

      @brycelynch What kind of WebXR system did you use?

  • @Reliix1 • 7 months ago

    Looks sweet, do you know if mocopi works with iPhone 15? It's not listed on their site.

    • @brycelynch • 7 months ago

      It did for me. :)

    • @Reliix1 • 7 months ago

      @brycelynch Great. Would really appreciate it if you made a video of how mocopi handles sitting down, standing up and moving around. It seems to do great, but sitting down is a mystery 🪑

  • @freenomon2466 • 7 months ago

    This is great! Maybe mocopi and Live Link ARKit can be used for VTubing!

    • @brycelynch • 7 months ago

      Sure can! I'd imagine there are a lot of examples of people doing exactly that. :)

  • @cj5787 • 7 months ago

    bro missed the chance to do Thriller

  • @shableep • 8 months ago

    Is this all running on the headset or on PC via wireless VR?

    • @brycelynch • 8 months ago

      This is on the headset, through the immersive web browser: WebXR, A-Frame.

  • @rmontti • 8 months ago

    How did you stream everything? Or did you record it first?

    • @brycelynch • 8 months ago

      Body and face are Live Link, real-time, not recorded. A couple of cameras are switched using TouchOSC. NDI is streaming a live video feed into the game, and OBS is recording the game output.

  • @parallax6005 • 8 months ago

    Can you please do a tutorial on how to connect mocopi to a MetaHuman live? Thanks.

  • @khayelihlembele6008 • 9 months ago

    I needed this!! Thank you

  • @xvar.studio • 9 months ago

    How did you import this into the UEFN editor?

    • @brycelynch • 9 months ago

      I used a GS plugin from the Marketplace and imported it into UE5. It creates a Niagara particle system to render the splat, and I migrated that to UEFN.

  • @haikeye1425 • 9 months ago

    Nice!

  • @harwitz • 9 months ago

    That looks fun. What app is that? Or is it your own unreleased thing?

    • @brycelynch • 9 months ago

      Yeah, just an internal (unreleased) project. :)

    • @harwitz • 9 months ago

      @brycelynch As someone who loves playing with puppets and is just getting into VR… it looks great! I like the way the puppets can be customized too. I wish there was a link to "buy the beta version" 😊 I found this video looking for a VR puppet app. I wish it was available!

  • @KyleCypher • 9 months ago

    Could you release this project via git?

    • @alleycatsphinx • 7 months ago

      Seconded! I'm working on a similar implementation - got to the point where things are rendering, but of course everything is flipped... >.< Headache inducing...

    • @brycelynch • 7 months ago

      Hard to know why that's happening; I'm guessing the orientation of the splat doesn't align with your world coordinates. I edited this splat using the UnityGaussianSplatting tool on GitHub, and you can bake out modified world transforms. I did this to place the splat at the world origin. I also parented it to an invisible cube in A-Frame (that had collision) so I could pick it up. When you parent it, you could try changing the rotation if it's not oriented correctly.

    • @brycelynch • 7 months ago

      Here is the original test. If you view the source in your browser you can see everything that is going on: xr.chrisyoung.com/content/test2/

  • @Flowstatecaptain • 9 months ago

    Nice! What is this app?

    • @brycelynch • 7 months ago

      This is a one-off demo I built for myself. I used the VRE plugin for Unreal to handle interactions.

    • @Flowstatecaptain • 7 months ago

      @brycelynch Nice. I've always thought it would be dope to have an AR/VR library with real readable books. I wonder if Apple will do that with the Vision Pro and Apple Books 🤔

  • @realsammyt • 9 months ago

    This is awesome! How did you make it?

    • @brycelynch • 7 months ago

      Unreal Engine. Oculus source build with passthrough. VRE Plugin for interactions.

    • @realsammyt • 7 months ago

      @brycelynch Sweet! Just starting exploration and pre-production on a game; I really like this mechanic.

  • @tobiasmyers3505 • 10 months ago

    omg!