Whoa! That tip about recording and looping!!!!!
You were explaining it, and I zoned out like... who cares about a simple record?
But then you explained how you can use this recording, and that it captures ALL that data. Holy cow! My apologies for such a rude dismissal!
Amazing. GAME CHANGER. Thank you for pointing this out!
Our pleasure, glad you found it helpful! :) Sometimes the simplest features can be surprisingly useful -- and this is definitely one of those! Super helpful for testing out your project
Sick, wish I could do the same but I own Kinect Zero and Kinect None
😂😂😂
I have a Kinect 1 - I'll be watching this more in depth after a few other patches.
Great! Although this is more focused around the Kinect v2 and Azure, there might be some tips here that you find helpful. We've also posted a number of videos over the years that cover creating effects with the v2 and Azure, so be sure to check those out for some additional creative inspiration!
Very interested in content for the Kinect v2. Will join the pro courses also if there is more (up-to-date) content around the Kinect v2, workflows, etc.
Thanks for the feedback! It's on our list of upcoming topics - stay tuned :)
Have you ever tried a Sonnet Allegro 4-port USB-A 3.2 PCIe card? It has dual controllers (2x USB-A 3.2 ports per controller), so you would plug into Ports 1 and 3 (or 2 and 4) respectively. Fair warning though, they're not the cheapest, but they've always allowed me to use several cameras on a single, powerful machine. Sorry I'm late to the party lol!
That's a great tip, thanks for sharing! 🙂
Will I get better results if I use dual Kinect v2s instead of a single Azure Kinect DK?
This is pretty situation-dependent, as there are many factors contributing to the quality of the results you get out of the Kinect, including lighting, number of users, mounting position of the Kinect, etc. I've found the Azure to be more than adequate in a variety of situations. Another thing to consider about the Kinect v2 is that you can only connect one sensor per computer, so you'd need a second computer to use two.
Can you use both Kinects with either Windows 10 or 11? I've read conflicting things about the Azure and Windows 10, where it does not work outside of TD but does work inside of TD. Thank you, and thank you for the video!
Our pleasure! Yes, either OS should be fine for the Azure. The issues I’m seeing talked about online were reported when Windows 11 first came out, but were fixed with a firmware update. Hope that helps!
It's getting hard to get an Azure. I have two RealSense cameras from Intel and they just straight up don't work with TouchDesigner. The forum says they should be on a USB 3 port, but when plugged in it says it's 2.1, crazy. It's getting really stressful. Any ideas? I know it's kinda unrelated to the video.
One really important thing to check when working with RealSense is that you're using the version of the firmware recommended for the particular version of the RealSense SDK used in TouchDesigner (see the RealSense TouchDesigner wiki page for more info: wiki.derivative.ca/RealSense).
The most recent version of TouchDesigner (2022.32120) is using RealSense SDK v2.50.0. Check out the GitHub page for the recommended firmware version for your particular hardware: github.com/IntelRealSense/librealsense/releases/tag/v2.50.0
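If you want to double-check what the SDK itself is seeing, a quick script run outside of TouchDesigner can report both the firmware version and the negotiated USB link speed. This is just a minimal sketch using the pyrealsense2 Python package (not anything TouchDesigner-specific), so adjust it for your setup:

```python
# Minimal sketch: list connected RealSense devices with their firmware version
# and negotiated USB link, using the pyrealsense2 package (pip install pyrealsense2).
import pyrealsense2 as rs

ctx = rs.context()
for dev in ctx.query_devices():
    name = dev.get_info(rs.camera_info.name)
    firmware = dev.get_info(rs.camera_info.firmware_version)
    # The USB type descriptor reports the negotiated link (e.g. "3.2" vs "2.1"),
    # which helps catch the USB 2.1 enumeration issue mentioned above.
    usb = "unknown"
    if dev.supports(rs.camera_info.usb_type_descriptor):
        usb = dev.get_info(rs.camera_info.usb_type_descriptor)
    print(f"{name}: firmware {firmware}, USB {usb}")
```

If the camera keeps enumerating at USB 2.1 even on a USB 3 port, that's often a cable or port issue rather than a TouchDesigner issue.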
@TheInteractiveImmersiveHQ Yes! That too, and still no luck. I've loaded the 2.50 and the 2.38, both are unresponsive. There are a couple of people in the forums with the same issues. Thanks for replying!
Is there any chance Mac users will be able to use the Kinect without Microsoft Boot Camp? Isn't there any Kinect hardware for Mac?
Unfortunately there’s no Mac version ☹️ As a Microsoft product, the Kinect has traditionally been Windows-only (and it seems likely to stay that way moving forward). Some Mac apps like MadMapper have added limited support for the older Kinect v2 recently, but you’re only able to access the RGB, IR, and depth sensors - no body tracking or player index functionality. The Kinect v2 has been out of production since 2017, so it doesn't seem likely that support will be added to TouchDesigner, but never say never!
@@TheInteractiveImmersiveHQ too bad. Thanks anyway for your quick answer
Are there any sensors that work with Mac?
There is support for a variety of tracking devices, ranging from optical tracking to laser scanners, but these tend to be commercial-oriented products (i.e. $$$ compared to Kinect).
An option could be to use ZIGCAM, which lets you send camera data from iPhone/iPad to TouchDesigner and other software - allowing you to use it much like a Kinect. You can access the LiDAR depth sensor, face tracking features, people segmentation, etc.
More info: apps.apple.com/us/app/zigcam/id1570755292
github.com/Tech-Directors-Association/ZIGCAM_Samples
How do you use the Kinect for interaction with more than one person using the Kinect CHOP?
When the Kinect detects more than one person, it’ll automatically add new channels to the Kinect CHOP. Each channel will start with “p” and then a number, which allows you to filter out the movements of that person (i.e. p1 for person #1, p2 for person #2, etc.). You can then use a Select CHOP to select the channels that are associated with that person. Pattern matching comes in handy here - using a wildcard expression like p2/* for the Channel Names parameter will allow you to select all the channels that start with “p2/”. Hope that helps!
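If it helps, here's a rough Python sketch of how that could be wired up inside TouchDesigner. The operator names ('kinect1', 'select_p1', etc.) are just placeholders for this example, not anything required:

```python
# Minimal sketch: create one Select CHOP per detected person and filter the
# Kinect CHOP's channels with a wildcard pattern. Operator names are placeholders.
kinect = op('kinect1')

for i in range(1, 3):  # persons 1 and 2
    sel = parent().create(selectCHOP, f'select_p{i}')
    sel.inputConnectors[0].connect(kinect)
    # Grab every channel belonging to that person, e.g. p1/*, p2/*
    sel.par.channames = f'p{i}/*'
```

Each Select CHOP then carries only that person's channels, which you can reference from other operators as usual.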
@TheInteractiveImmersiveHQ Thanks, but how do we use these multiple values as references for one particular TOP? Say we have a Transform TOP and we use the multi-person values for tx - then how do we create multiple references?
I think the Kinect 2 is better than the Kinect Azure.
There are definitely some compelling arguments for it, especially with the cost being so low these days! Still a useful tool even in 2024.