Kinect Azure Point Cloud in TouchDesigner Tutorial

  • Published: Nov 17, 2024

Comments • 61

  • @gregderivative2647 • 2 years ago +2

    Hey Elburz, this is a great tour through the Azure.

    • @TheInteractiveImmersiveHQ • 2 years ago

      Thanks Greg! We'll have a bunch more Azure pieces now that I finally got my hands on one :)

  • @Psybernetiks • 3 years ago +7

    Hey! Following the same steps, when I right-click my Kinect TOP to view as points, all the points inside the TOP go mad and the image just randomly shifts around, and the same issue happens with the Geo SOP. What could be the issue?

    • @TheInteractiveImmersiveHQ • 3 years ago +1

      Before you go into view-as-points mode, do your textures look similar to mine, or are the flat textures also going wild? What kind of GPU do you have?

    • @kblinse06 • 2 years ago

      @@TheInteractiveImmersiveHQ I'm getting the same issue. There is a red X in the upper right corner of my Math and Null operators, and the picture in the Geo is the depth image, but it's moving around like crazy. I have a 3090, so it shouldn't be the card. But it is the free version of TD, so maybe it has something to do with that?

    • @kblinse06 • 2 years ago +8

      @@TheInteractiveImmersiveHQ Solved my problem by turning 'adaptive homing' off

    • @TheInteractiveImmersiveHQ • 2 years ago +3

      @@kblinse06 Oh great! Yes, with any kind of particle system or point cloud (basically anything that isn't a static model), I recommend turning off adaptive homing. Some folks like it, but I generally prefer it off, and I set it to default off in the main TouchDesigner application preferences.

    • @liveperformance3608 • 2 years ago +1

      @@TheInteractiveImmersiveHQ Thank you for this! Solved my issue :)

  • @kblinse06 • 2 years ago +1

    Subscribed! Thanks so much for this, I'm looking forward to the videos coming out and I'll def check your program out!

    • @TheInteractiveImmersiveHQ • 2 years ago +1

      Great! Our pleasure to dive into a topic you're interested in! More coming soon :)

  • @medialuke2 • 3 years ago +3

    Is this possible with the Intel Realsense? Or the old Kinect?

    • @TheInteractiveImmersiveHQ • 3 years ago

      It absolutely is. A few of the operators might be slightly different, but as long as you have a point cloud texture (which the Kinect 2 does) you can do a similar process for sure.

  • @ВиталийРоздольский • 3 years ago +1

    Tell us how to use the Azure Kinect and calibrate the camera to project a Kinect image!

  • @cjadams7434 • 3 years ago +1

    Very cool effect! - Question...does this require TD-Pro...or can this be done with Commercial?

    • @TheInteractiveImmersiveHQ • 3 years ago

      Absolutely can be done with commercial. It would even work on the free learning version as well, since the resolution of the point clouds is under 1280x720.

  • @beanco5130 • 1 year ago

    Could I ask how you'd go about rigging up the color with just a regular old Kinect 2? There isn't a Kinect 2 Select TOP...

    • @TheInteractiveImmersiveHQ • 1 year ago

      With the Kinect 2, you can just add additional Kinect TOPs to the network and set the Image parameter to the one that you need. Depending on the setup that you're working with, you might need to use the Camera Remap parameter to align depth image textures with color camera textures.

  • @mattsoson • 3 years ago +1

    Great meeting you at LDI! Do you think it possible to output two offset versions of this for left and right eye and send the two images through NDI or whatever into Unity and send each to corresponding eye in connected HMD for 3D viewing? Could do a similar effect with shader graph in Unity natively, but thinking of other possibilities with more efficient processing in TD…

    • @TheInteractiveImmersiveHQ • 3 years ago

      Hi Matt! Our pleasure :) What you could do is set up the instancing like this, then create two Render TOPs and two Camera COMPs; then you move the cameras slightly off from each other, which gives you your left/right eye renders that you can NDI over into Unity. Could you give something like that a try?
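A minimal sketch of that left/right camera offset in plain Python (outside TouchDesigner; the 0.064 m inter-pupillary distance and `stereo_camera_positions` helper are illustrative assumptions, not anything from the video):

```python
# Compute left/right camera positions for stereo rendering by
# offsetting a base camera position along X by half the
# inter-pupillary distance (IPD).

def stereo_camera_positions(base_pos, ipd=0.064):
    """Return (left, right) camera positions as (x, y, z) tuples."""
    x, y, z = base_pos
    half = ipd / 2.0
    return (x - half, y, z), (x + half, y, z)

# Each position would drive the Translate parameters of one Camera COMP,
# with each camera feeding its own Render TOP.
left, right = stereo_camera_positions((0.0, 1.6, 2.0))
```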

    • @mattsoson • 3 years ago +1

      @@TheInteractiveImmersiveHQ yah I think that was the idea, thanks for the wise/affirmative nod! Looking forward to playing with iPhone LiDAR too from your other vids.

  • @thesuperh4992 • 1 year ago

    This is super cool! I have my point cloud looking awesome, but I was wondering what all I could do with it from here. I'd like to add some cool effects to it, but I'm unsure of where to start. I was thinking about adding a delay to where the squares have a little bit of lag getting to the points as it captures movement, any ideas on how I could make this happen?

    • @TheInteractiveImmersiveHQ • 1 year ago +1

      Thanks! 😀 You might try looking into something like the Cache TOP or Time Machine TOP for adding time delay effects to the point cloud TOP texture (before it's used in the Geometry COMP). Also check out our video Generative Point Clouds in TouchDesigner (ruclips.net/video/__dHYGe9bQs/видео.html) for some additional inspiration/a look at some techniques for processing the data. Hope that helps!
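A plain-Python sketch of the frame-delay idea behind the Cache TOP suggestion (`FrameDelay` is a hypothetical helper, not a TouchDesigner class):

```python
from collections import deque

# A tiny frame-delay buffer: push each new frame in, read back the frame
# from N frames ago -- conceptually what a Cache TOP does for textures.

class FrameDelay:
    def __init__(self, delay_frames):
        # maxlen = delay + 1, so once full, buffer[0] is exactly
        # delay_frames behind the most recent push.
        self.buffer = deque(maxlen=delay_frames + 1)

    def push(self, frame):
        self.buffer.append(frame)
        # Until the buffer fills, return the oldest frame we have.
        return self.buffer[0]

delay = FrameDelay(2)
outputs = [delay.push(f) for f in ["f0", "f1", "f2", "f3"]]
```

After the buffer fills, each output lags the input by exactly two frames, which is the "squares lagging behind the movement" look described above.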

    • @thesuperh4992 • 1 year ago

      @@TheInteractiveImmersiveHQ Ooooh thanks so much!!

  • @ianemcdermott5090 • 2 years ago +1

    Great video! Is there any way to import an mkv file from the Kinect DK Recorder and use that for point cloud data in place of a live azure? Thanks!

    • @TheInteractiveImmersiveHQ • 2 years ago

      I haven't done that personally but you could try pointing a Movie File In TOP at the mkv file and seeing if it loads it up. If not, you might need to do an intermediary step of converting the MKV file to another format that supports 32-bit depth, is lossless, and can be read by TouchDesigner (like an exr sequence or similar).

  • @NickFidalgo • 1 year ago

    How would I output from Geometry to an image file (png)? In addition, how could I have it save and overwrite that image file every x seconds?

    • @TheInteractiveImmersiveHQ • 1 year ago +1

      First, you'd need to add a rendering pipeline, which consists of a Camera COMP, Light COMP, some kind of material (a Phong MAT might work here) and a Render TOP. You'll need to make sure that you've assigned the MAT to the Geometry COMP, by dragging the MAT onto the Geo COMP's _Material_ parameter on the Render page.
      Once you've done this, you'll have a rendered view of the network output as a texture in the Render TOP, which you can then save/add post effects to/whatever else you might want to do with it.
      To save the texture to a file, you can use the Movie File Out TOP. Set the _Type_ parameter to Image, and then pick the file type you want via the _Image File Type_ parameter.
      To repeatedly save images after an interval of time, you can use the Timer CHOP. On the Timer page of the Timer CHOP's parameters, set the _Length_ parameter to the number of seconds you want the interval to be, and then turn the _Cycle_ parameter on and the _Cycle Limit_ parameter off. Finally, on the outputs page, turn _Cycle Pulse_ on.
      Then, make a CHOP reference from the newly added cycles_pulse channel to the _Add Frame_ button within the Movie File Out TOP, and it will save the image every time the timer finishes the particular interval! If the _Unique Suffix_ parameter in the Movie File Out TOP is turned off, the file will be overwritten each time, as the file name will stay the same. Hope that helps!
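The timer logic above can be sketched in plain Python (`cycle_pulse` here is a hypothetical stand-in for the Timer CHOP's cycle_pulse channel, not a TouchDesigner API):

```python
# Emit a save pulse once per interval: the pulse is what drives the
# Movie File Out TOP's Add Frame button via the CHOP reference.

def cycle_pulse(frame, fps, interval_seconds):
    """Return True on each frame that completes a full interval."""
    frames_per_cycle = int(fps * interval_seconds)
    return frame > 0 and frame % frames_per_cycle == 0

# With a 0.5 s interval at 60 fps, a pulse fires every 30 frames.
pulses = [f for f in range(1, 121) if cycle_pulse(f, 60, 0.5)]
```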

  • @JasonTopo • 3 years ago +2

    This is not possible with a regular webcam right? 😅

    • @TheInteractiveImmersiveHQ • 3 years ago +2

      Nope, because what the Kinect is doing is actually giving you all of the 3D information of the scene, then using that data to put little boxes all over the 3D environment and colour them using its normal RGB camera. A webcam on its own only contains RGB information; it doesn't have any 3D data streams in it.
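The depth-to-3D step that a depth sensor enables can be sketched with a pinhole camera back-projection in plain Python (the intrinsics below are made-up illustrative values, not real Azure Kinect calibration):

```python
# Back-project a depth pixel (u, v, depth) into a 3D point using a
# pinhole camera model: this is the extra information a depth sensor
# provides that a plain RGB webcam cannot.

def depth_to_point(u, v, depth, fx, fy, cx, cy):
    """Convert a pixel + depth (metres) to a camera-space 3D point."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A pixel 100 px right of the principal point, 2 m away.
point = depth_to_point(u=420, v=240, depth=2.0,
                       fx=500.0, fy=500.0, cx=320.0, cy=240.0)
```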

  • @PashkovGS • 1 year ago

    Please tell me how to add a video texture to a point cloud if I have a Kinect 2? Thanks

    • @TheInteractiveImmersiveHQ • 1 year ago +1

      Sure, you'd follow the same approach described in the video, but instead of using the Kinect Azure/Kinect Azure Select TOPs, you can use the Kinect TOP and set the Image parameter to Depth Point Cloud or Color Point Cloud. Everything else should be the same from there. Hope that helps!

  • @lee_sung_studio • 1 year ago

    Thank you. 감사합니다. (Thank you.)

  • @unnikrishnankalidas967 • 2 years ago

    Is it possible to connect two sensors to TD? I was trying to make a similar project with two sensors for more detailed depth data.

    • @TheInteractiveImmersiveHQ • 1 year ago

      Great question! If you're using the Azure Kinect, you can connect as many to your computer as you have bandwidth for. If you're using the Kinect v2, however, you're limited to just one per computer. Hope that helps!

  • @Adenistuff • 2 years ago

    Hi, I have a project I need help with. Can I use two D415 RealSense cameras to make one point cloud mesh?

    • @TheInteractiveImmersiveHQ • 2 years ago

      You can, but you'll have to do the sensor fusion manually, unfortunately, unless you find another app that does the combination for you. Otherwise, do your best to line up the point clouds using something like the Point Transform TOP to manually orient everything together.
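A plain-Python sketch of that manual alignment: rotate and translate the second camera's points into the first camera's space, then concatenate (the 180° yaw, the 2 m offset, and the `transform_point` helper are illustrative assumptions, not calibration values):

```python
import math

# Apply a rigid transform (yaw rotation about Y, then translation) to
# one camera's points so both clouds share a coordinate space -- the
# kind of adjustment a Point Transform TOP applies to a texture.

def transform_point(p, yaw_degrees, translation):
    x, y, z = p
    a = math.radians(yaw_degrees)
    xr = x * math.cos(a) + z * math.sin(a)
    zr = -x * math.sin(a) + z * math.cos(a)
    tx, ty, tz = translation
    return (xr + tx, y + ty, zr + tz)

# Two cameras facing each other, 2 m apart, each seeing a point 1 m ahead.
cam1_points = [(0.0, 0.0, 1.0)]
cam2_points = [(0.0, 0.0, 1.0)]
merged = cam1_points + [
    transform_point(p, 180.0, (0.0, 0.0, 2.0)) for p in cam2_points
]
```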

  • @Jhj11755 • 1 year ago

    Can you tell me what the salary range might be for a TouchDesigner career?

    • @TheInteractiveImmersiveHQ • 1 year ago

      This is dependent on a lot of factors, including location, experience, whether full-time or freelance, etc. It's worth checking out our blog (interactiveimmersive.io/blog/) for some materials on the topic, or joining the Interactive & Immersive HQ PRO (interactiveimmersive.io/lp/hq-pro-full-trial/) to get assistance from industry pros to help you find the appropriate range.

  • @RusticRaver • 2 years ago

    Could you show us how to do closed-loop Kinect scanning, if it's possible in TouchDesigner, like in this video: Large-scale real-time mapping example using Azure Kinect. Thx

    • @RusticRaver • 2 years ago

      Actually, I would not recommend buying a Kinect Azure for anyone. I'm surprised nobody points out that it has terrible latency, actually worse than the previous version. It is so bad it defeats the purpose of it; I sold it after a week. Maybe if they fix the SDK it will be usable. Serious rip-off if you ask me.

  • @tiporight • 3 years ago +1

    Great! Is it possible to do the same with the Kinect v2?

    • @TheInteractiveImmersiveHQ • 3 years ago +2

      Yup! A few operators might be slightly different, but as long as you get the point cloud texture from the Kinect 2, you can use almost the same setup as here.

    • @tiporight • 3 years ago +1

      @@TheInteractiveImmersiveHQ Could you recommend a tutorial for the Kinect 2? Thanks

    • @TheInteractiveImmersiveHQ • 3 years ago

      @@tiporight Inside of The HQ PRO we have a full course about using Kinect 2 and the different ways you can use all the data (similar to this video). I'd recommend giving the free trial a try, I think you'll really enjoy it:
      interactiveimmersive.io/lp/hq-pro-full-trial/

  • @drrobotsir • 1 year ago

    How can we save the point cloud to a file, so it can be used by other apps such as Meshlab?

    • @TheInteractiveImmersiveHQ • 1 year ago +2

      Great question! To save the point cloud to a file, you can use the Movie File Out TOP. The Movie File Out TOP allows you to save .exr files using the OpenEXR format.
      To do this, connect the null1 TOP to a Movie File Out TOP. Then, in the Movie File Out set the Type parameter to Image, and the Image File Type parameter to OpenEXR. On the EXR parameter page, turn on the “Save As Point Cloud” setting. Below that, you have the ability to choose how you want to save the data from the null1 TOP.
      In the default configuration, you’ll only get the point positions and colors found within null1. However, you can also save the colors from null2 into the same .exr file by clicking the small “+” icon below the alpha parameter, which allows you to add additional data from a separate TOP. Drag the null2 TOP into the Additional Input TOP 1 parameter, and then make sure to rename the RGBA channels to something else, because channels with the same name will overwrite each other. I usually rename the first set of channels (directly below the “Save As Point Cloud” switch) to X, Y and Z, as I use them for position data, and then use the channels below Additional Input TOP 1 as R, G, B, and A, as I use them for colour.
      Back on the Move Out page, make sure you've set the file name you want to use (under the File parameter), and then click Add Frame to save the file. If you want to save a sequence of images instead, you can change the Type parameter from Image to Image Sequence, and then turn the Record switch on (it'll record an .exr for every frame until you turn the switch off).
      Hope that helps!
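The channel-renaming step above can be sketched in plain Python (`combine_channels` is a hypothetical helper, just to show why identically named channels must be renamed before combining):

```python
# Combine two RGBA channel sets into one EXR-style channel dict.
# Channels with the same name would overwrite each other, so the
# position set is renamed X/Y/Z and the colour set keeps R/G/B/A.

def combine_channels(position_rgba, color_rgba):
    """Merge two channel dicts, renaming position channels to X/Y/Z."""
    rename = {"R": "X", "G": "Y", "B": "Z"}
    combined = {rename[k]: v for k, v in position_rgba.items() if k in rename}
    combined.update(color_rgba)  # colour channels keep their names
    return combined

channels = combine_channels(
    {"R": [0.1], "G": [0.2], "B": [0.3], "A": [1.0]},  # positions (null1)
    {"R": [255], "G": [128], "B": [0], "A": [1.0]},    # colours (null2)
)
```

Without the rename, the second set's R/G/B/A would clobber the first; with it, all seven channels survive in the one file.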

  • @shanukagayashan • 1 year ago

    Is it possible to read the diameter of a ring?

    • @TheInteractiveImmersiveHQ • 1 year ago

      Could you clarify what sort of ring you're looking to measure? Are you looking to take the measurement from the Azure's point cloud?

  • @kkjoky • 3 years ago +1

    Yooo that's hard!!!

    • @TheInteractiveImmersiveHQ • 3 years ago +1

      Haha I know right? How many developers does it take to make a colour point cloud? :)

  • @yuan-dongzhuang5971 • 1 year ago

    great!

  • @-303- • 3 years ago +1

    Azure rhymes with measure. “Azhurr”.

    • @TheInteractiveImmersiveHQ • 2 years ago

      Is that the official ruling? I feel like everyone says it a bit different haha, makes my life difficult!

  • @marioscharalambous7907 • 2 years ago +1

    Hello, I would like to ask how I can get this into Unreal Engine 5, with the OWL plugin or something else?