Unreal Engine + Kinect for Immersive Experiences

  • Published: 21 Oct 2024

Comments • 48

  • @shj4166
    @shj4166 8 months ago +1

    I'm trying to make immersive art with this amazing tutorial, thank you so much!

  • @borjonx
    @borjonx 1 year ago +2

    This is an excellent tutorial - thx for sharing!!! Exactly what I was looking for to get into UE

    • @TheInteractiveImmersiveHQ
      @TheInteractiveImmersiveHQ  1 year ago

      Our pleasure! Glad to hear it :) Highly recommend continuing to explore beyond what is covered in the video, you can do a lot with UE!

  • @keshav2136
    @keshav2136 2 months ago

    You are making use of a technology made in 2013. Great!

    • @TheInteractiveImmersiveHQ
      @TheInteractiveImmersiveHQ  2 months ago

      Yes, hard to believe it's been around for that long! You'd be surprised how common it is to still find the Kinect v2 in use. Sometimes, having the dedicated sensors and well-tested and documented software support is beneficial. And the low cost is great for artists on a budget 🙂

  • @Buklen
    @Buklen 1 year ago +1

    Amazing tutorial, thanks very much!

  • @Buklen
    @Buklen 1 year ago

    0:00 Intro
    1:40 TouchDesigner setup
    8:15 Unreal Setup
    13:07 Creating a particle system and camera with view target blueprint (skip to 36:27 if you already know how)
    28:23 BP_CameraView (first blueprint here)
    36:27 Bringing the OSC data into Unreal Engine

  • @bentheremedia3011
    @bentheremedia3011 1 year ago +1

    This is incredible! Thank you so much!

    • @TheInteractiveImmersiveHQ
      @TheInteractiveImmersiveHQ  1 year ago +1

      Our pleasure, hope you found it helpful! :)

    • @bentheremedia3011
      @bentheremedia3011 1 year ago

      @@TheInteractiveImmersiveHQ I sure did! I was able to follow along easily with just one watch and got everything working right away. Excited to play around with this!

  • @heinzmaier3062
    @heinzmaier3062 8 months ago

    Such a nice video, it's so cool. And I'm excited that I understood it, because I have no knowledge about Unreal

  • @VíctorAlonsoGonzález-n1w
    @VíctorAlonsoGonzález-n1w 7 months ago

    Very helpful bro, thank you for your hard work

  • @ScottBarrettihq
    @ScottBarrettihq 1 month ago

    This is a very interesting and unique tutorial. I did not understand why you set up the camera the way that you did. I don't think it needed to be attached to your system, because all you really needed was a simple player view. I was expecting, as you were hooking it all up, that you were going to pipe the head-tracking movement in there from the Kinect, so that the player could not only move his hands around to make the sparks happen, he could also 'look' around the scene with his head. In the end, though, there was none of that. I believe all you really needed to do was possess the player actor and position it where you wanted the view to be. The 'target' setup was either wasted or just inconsequential. Did I miss something?

    • @TheInteractiveImmersiveHQ
      @TheInteractiveImmersiveHQ  20 days ago

      Usually the idea with these tutorials is to provide a foundation that the viewer can build upon, so I'm assuming that the Cinematic Camera was utilized for the ability to add that sort of functionality if desired, or to be able to access the various camera settings to aesthetically impact the output.

    • @ScottBarrettihq
      @ScottBarrettihq 20 days ago

      @@TheInteractiveImmersiveHQ I did take this to a successful conclusion and got head tracking with the camera being the player view. It acts like the 16-year-old Johnny Lee video where he uses a Wii remote in reverse to provide head depth. The issue that keeps it from really working well is the lag. It is present enough to inform the brain that you're not really changing perspective live. I wonder if the Kinect plugins that exist for Unreal would remove a bit of lag, with TouchDesigner removed from the loop.

  • @guillermoandresirrenorodri3049
    @guillermoandresirrenorodri3049 3 months ago

    Thanks in advance for the tutorial. I have a problem: I am receiving the coordinates over the MQTT protocol, and the information reaches the program; however, when I receive the location messages in real time, Unreal crashes. I would like to know if that can happen with OSC, or is it because the Unreal MQTT plugin is experimental? Thank you very much for your help

    • @TheInteractiveImmersiveHQ
      @TheInteractiveImmersiveHQ  2 months ago

      From what I'm seeing online, crashes with the MQTT plugin look to be pretty common, so it might be worth avoiding for now until the bugs have been sorted out. I haven't run into these sorts of issues when using OSC with UE on my machine, so I'd say it's definitely worth giving a try!

  • @craigrooney5459
    @craigrooney5459 2 months ago

    Any tips on using Touchscreens and Unreal, multitouch would be really interesting.

    • @TheInteractiveImmersiveHQ
      @TheInteractiveImmersiveHQ  2 months ago

      Assuming you're looking to develop for desktop and not mobile, it is possible to use a multi-touch display and build control functionality through blueprints. This YouTube series might be helpful to get started with: ruclips.net/video/Wxr-jIA2bUg/видео.html
      Also, a number of pre-built blueprints are available on the Unreal Marketplace to make implementation easier: www.unrealengine.com/marketplace/en-US/product/touch-gesture
      Hope that helps!

  • @ScottBarrettihq
    @ScottBarrettihq 1 month ago

    I'm knee deep in it now.
    I am in fact aiming for head tracking instead of hand tracking. I filtered my results from the Kinect to read: head_tx, head_ty, head_tz. How does that affect what the code in the DAT should be?
    instead of:
    nullChop = op(channel.owner)
    posVals = [nullChop[i].eval() for i in range(6)]
    lPos = posVals[0:3]
    rPos = posVals[3:6]
    oscDat = op('oscout1')
    oscDat.sendOSC('/left', lPos, asBundle=True)
    oscDat.sendOSC('/right', rPos, asBundle=True)
    it should be:
    nullChop = op(channel.owner)
    posVals = [nullChop[i].eval() for i in range(3)]
    headPos = posVals[0:3]
    oscDat = op('oscout1')
    oscDat.sendOSC('/head', headPos, asBundle=True)
    Is this correct? I changed the range to 3 instead of 6 (one head vs. two hands), changed lPos to headPos, and dropped the rPos line and the rPos OSC line. Hope I have this right. If anyone sees an error, please let me know!

  • @csm3535
    @csm3535 1 month ago

    Hi!, do you need the Kinect Adapter to work with the v2 and touchdesigner?

    • @TheInteractiveImmersiveHQ
      @TheInteractiveImmersiveHQ  1 month ago

      Yes, without the Kinect Adapter for Windows there's no way to plug the V2 into the computer -- it uses a proprietary connector that supplies power and data through a single port.

  • @LucasChagas20
    @LucasChagas20 6 months ago

    Hello, how are you? Thank you in advance for the video, nice work. I was wondering if you can help me understand why my connection gets lost when I paste the Branch node, and it gives the following message in the log: LogOSC: Warning: Outer object not set. OSC server may be garbage collected if not referenced.

  • @kb470
    @kb470 3 months ago

    does this work with the original kinect for the 360?
    i am poor

    • @TheInteractiveImmersiveHQ
      @TheInteractiveImmersiveHQ  3 months ago

      Yes! Assuming you've got a Windows machine, TouchDesigner still natively supports the Kinect V1. The CHOP channel names might be slightly different, but you'll have access to the same skeletal tracking features that are required for this tutorial

  • @cristiancamacho5506
    @cristiancamacho5506 3 months ago

    How could I do this same thing, but connecting an Arduino sensor to TouchDesigner, sending the information from that sensor to Unreal, and having the Niagara system react to the sensor data that TouchDesigner is sending?

    • @TheInteractiveImmersiveHQ
      @TheInteractiveImmersiveHQ  3 months ago

      If you set up the Arduino to output the sensor data via Serial, you can receive that data in TouchDesigner via the Serial CHOP. Once it's connected to TouchDesigner, the incoming data will be a CHOP channel which means you can process it in the same way that we do in this video, and then send it to Unreal Engine via OSC with the OSC Out DAT or OSC Out CHOP.
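
      As a rough illustration of that flow, here is a plain-Python sketch (outside of TouchDesigner) of the processing the reply describes: normalizing a raw Arduino serial reading and pairing it with an OSC address, the way the video's CHOP Execute DAT pairs values with an address for sendOSC. The names parse_serial_line, SENSOR_MAX, and the /sensor address are illustrative assumptions, not from the video.

```python
# Hypothetical sketch (plain Python, outside TouchDesigner) of the
# Arduino -> Serial -> OSC flow described above. All names here are
# illustrative assumptions.

SENSOR_MAX = 1023  # Arduino analogRead() returns values in 0-1023

def parse_serial_line(line):
    """Convert one raw serial line (e.g. '512\r\n') to a 0-1 float,
    the kind of normalization you'd do with a Math CHOP in TouchDesigner."""
    return int(line.strip()) / SENSOR_MAX

def to_osc_message(value, address='/sensor'):
    """Pair the normalized value with an OSC address, mirroring
    oscDat.sendOSC(address, [value]) in the video's CHOP Execute DAT."""
    return (address, [value])

msg = to_osc_message(parse_serial_line('512\r\n'))
```

      In TouchDesigner itself, the Serial CHOP handles the parsing and the OSC Out DAT/CHOP handles the sending, so only the scaling step would typically be yours to write.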

  • @adrianfxd
    @adrianfxd 1 year ago

    hey this tutorial was really easy to follow and I love how thorough you are with it. How would I go about tracking the full body and all the points from the kinect to OSC data? would I have to change the values of the chopexec and remove the handtip filters? does that also mean I have to go about adding everything in the UE5 blueprints? is there an easier way for full body tracking??

    • @TheInteractiveImmersiveHQ
      @TheInteractiveImmersiveHQ  1 year ago +1

      Thanks so much, glad to hear you found it helpful! Are you looking to track the body's general position, or do you want access to all of the data channels within Unreal? You can definitely select different channels from the Kinect CHOP and send those over as bundles like we've done in the project, but that sounds like a lot of code to write! There is also an OSC Out _CHOP_ , which could be useful since it will send the CHOP channels you input to it without requiring any code, but you won't be able to send them as bundles like we've done with the OSC Out DAT (meaning you'll need some slight changes in the Unreal BP since you won't be receiving bundles anymore).

    • @roodri77
      @roodri77 11 months ago

      Have you managed to make a full body tracking from Touchdesigner to UE?
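
      One hedged sketch of the bundling idea from the reply above, in plain Python rather than TouchDesigner: group each joint's tx/ty/tz channels under one OSC address, the way the video groups the hand channels into /left and /right bundles. The channel-name format ('p1/head:tx') and the function name are illustrative assumptions.

```python
# Hypothetical sketch of grouping Kinect CHOP channels into per-joint
# OSC bundles. Channel naming is an assumption, not from the video.

def bundle_joints(channels):
    """channels: dict of Kinect CHOP channel names to values,
    e.g. {'p1/head:tx': 0.1, 'p1/head:ty': 0.5, ...}.
    Returns {'/head': [tx, ty, tz], ...}, one list per sendOSC bundle."""
    joints = {}
    for name, value in channels.items():
        joint = name.split('/')[-1].split(':')[0]  # 'p1/head:tx' -> 'head'
        axis = name.split(':')[-1]                 # 'p1/head:tx' -> 'tx'
        joints.setdefault(joint, {})[axis] = value
    return {'/' + j: [v['tx'], v['ty'], v['tz']] for j, v in joints.items()}
```

      With something like this, a full skeleton becomes one loop over the selected channels instead of a hand-written list per joint, at the cost of the Unreal blueprint needing to handle one OSC address per joint.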

  • @VíctorAlonsoGonzález-n1w
    @VíctorAlonsoGonzález-n1w 7 months ago

    Hi! I'm working on a live installation and I was wondering if you could tell us, or make a tutorial for, a multiplayer version of this, as in TouchDesigner being able to transfer multiple player inputs to Unreal. Thank you!

    • @TheInteractiveImmersiveHQ
      @TheInteractiveImmersiveHQ  7 months ago +1

      You could filter out the channels for additional players by adding a few more Select CHOPs and changing the *Channel Names* and *Rename from* parameters to read p2/handtip* and p2/handtip_*:*, replacing the number in p2 with the number of the player you want to use (Kinect v2 supports up to six). You might consider choosing a slightly different renaming structure in the *Rename to* parameter, as you'll be sending more channels and would need a way to differentiate them.
      The code in the CHOP Execute DAT would need to be updated to parse out the additional channels, which would follow the same technique (creating lists for left and right hands of each player). Then you'd need to send additional OSC Bundles for each player's left and right hands (again choosing different names for the address so you can differentiate between players).
      In Unreal, you'd need to add additional Niagara systems for the new players, and then add to the BP_OSCIn blueprint to parse the additional OSC channels. Then follow the same method of modifying that data and using it to Set Actor Relative Locations for each of the new Niagara systems. It's a bit of work, but definitely possible! Hope that helps :)
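
      To make the multi-player parsing concrete, here's a hedged plain-Python sketch of how the CHOP Execute DAT logic might generalize. The six-values-per-player layout (left tx/ty/tz, then right tx/ty/tz), the address scheme, and all names are assumptions; inside TouchDesigner the values would come from the Null CHOP and go out via oscDat.sendOSC.

```python
# Hypothetical sketch of parsing several players' hand positions into
# per-player OSC bundles. Layout and names are assumptions.

NUM_PLAYERS = 2  # Kinect v2 tracks up to six bodies

def player_bundles(pos_vals, num_players=NUM_PLAYERS):
    """pos_vals: flat list of 6 values per player
    (left tx/ty/tz, then right tx/ty/tz), as in the video's single-player code.
    Returns a list of (osc_address, [x, y, z]) bundles."""
    bundles = []
    for p in range(num_players):
        base = p * 6
        left = pos_vals[base:base + 3]
        right = pos_vals[base + 3:base + 6]
        bundles.append(('/p%d/left' % (p + 1), left))
        bundles.append(('/p%d/right' % (p + 1), right))
    return bundles
```

      The per-player addresses (/p1/left, /p2/left, ...) are what would let the BP_OSCIn blueprint route each stream to its own Niagara system.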

  • @UfukAslan-g9g
    @UfukAslan-g9g 1 year ago

    Unfortunately, the playback in Unreal doesn't show the movement recorded with the Kinect v2. I've already gone through the video twice. Really don't know whyyyyy. Hope somebody can help me with it.

    • @TheInteractiveImmersiveHQ
      @TheInteractiveImmersiveHQ  11 months ago

      Could you provide a little more info about the issues you're running into? Are you successfully receiving the Kinect sensor data in TouchDesigner? Have you been able to get the OSC receiver functionality working in Unreal?

  • @4rzky536
    @4rzky536 5 months ago

    can i use Kinect v1 ?

  • @aubreymorgan9763
    @aubreymorgan9763 1 year ago

    can this do real time tracking like for live streaming?

  • @MotsGamingChannel
    @MotsGamingChannel 1 year ago

    Can this work with the kinect v1

    • @TheInteractiveImmersiveHQ
      @TheInteractiveImmersiveHQ  1 year ago

      Yes! You'll have to set the Hardware Version parameter in the Kinect CHOP to Kinect V1. The CHOP channel names may be slightly different than the V2, so you might have to choose different channels to send. Otherwise, you can follow the same procedure to send the data over to UE. Hope that helps!