You are viewing an archive of the TroikaTronix/Isadora Forum.
The active forum can be found at this link

Kinect + Isadora Tutorials Now Available

edited January 2016 in How To

Dear Community,

I'm happy to present to you a three-part series on working with the Kinect using Isadora. These tutorials, written by Montgomery Martin, were based on techniques presented by Mark Coniglio during Troika Ranch’s Live-I Workshop in July 2015. They now live on the main TroikaTronix site.

Over the three-part tutorial, we discuss how to use the open-source sketchbook application Processing to act as a communication bridge between the Kinect hardware and Isadora.

Part 1 serves as an introduction to the Kinect and what it can offer, along with some important points about particular versions of the camera:

In Part 2 of this tutorial, we will learn how to download, install, and configure Processing to receive full body motion data and infrared video from an Xbox Kinect motion sensor. 

http://troikatronix.com/support/kb/kinect-tutorial-part2/

In Part 3, you will learn how to set up Isadora to receive the OSC and Syphon feeds that Processing broadcasts from the Xbox 360 Kinect camera.

http://troikatronix.com/support/kb/kinect-tutorial-part3/


Thank you especially to the rest of the TroikaTronix team for their valuable feedback and assistance in compiling, reviewing, and editing these tutorials: Mark Coniglio, Jamie Griffiths, and Ryan Webber, Michel Weber and Graham Thorne.

Cheers, and enjoy!

Sincerely,

Monty Martin and The TroikaTronix Team

[http://www.montycmartin.com] Located in Toronto, Canada.


Comments

  • I've also done a video walkthrough (unofficially)


    Graham Thorne | http://www.grahamthorne.co.uk |
    RIG 1: Windows 10, Quad Core i7 4.0GHz OC, GTX 1080, 512 m.2 SSD, 64gig Hyper X RAM.
    RIG 2: Apple rMBP i7, 8gig RAM 256 SSD, OS X 10.12.12
    Located in Doncaster/York, UK.

  • edited March 2016
    @mark_m: add...

    int kCameraInitMode = 6;

    ...at line 40. Looks like I missed defining a global variable when I did the Windows file. Soz! I presume that means you're the first to actually use the Windows files as no-one else has picked up on it! Will fix it in the repo when I get home later.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • Hello,

    I tried to open the tutorial but there is only the top of the text. I tried in Safari, Firefox and Chrome, and it's the same.
    Jacques

    Jacques Hoepffner website http://hoepffner.info MBP i7 2.6Ghz 16Go / MacOs10.11.6 / Izzy 2.5

  • If you're on Mac, you can open the link with the right mouse button and then select "Open in New Window".
    Best.
    Javi

  • it works, thanks

    Jacques Hoepffner website http://hoepffner.info MBP i7 2.6Ghz 16Go / MacOs10.11.6 / Izzy 2.5

  • Hi and thank you very very much for these tutorials. Well done, great job!

    I have a little question about the last part

    on "SETTING UP THE SYPHON FEED"

    "Windows: Add a SpoutReceiver2 actor to the Scene Editor, and click on its ‘server’ input"

    there is no server input in my SpoutReceiver2


    thanks!

    www.circuslumineszenz.com | www.leobettinelli.com | www.ninosconsentidos.eu

    i5 M520 @ 2.40GHz / 4Gb Ram / Win 7

  • edited January 2016
    It's the 'Select' input (trigger type).

    It should open a little SpoutPanel dialog (if you put spoutpanel.exe beside isadora.exe in the Isadora folder).
    In that panel you can select the input source.
    If you want it 'hardcoded' to the SpoutReceiver, you will need to type the name shown in the SpoutPanel into the first input of the SpoutReceiver (Sender Name).
    It is case sensitive. Also beware of whitespace (trailing spaces etc.).
    Once set and working, save your Isadora file, and the Sender Name will remain.
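The exact-match behaviour can be illustrated in a couple of lines of plain Java (illustrative only: `matches` is a hypothetical name, and the real comparison happens inside Spout itself):

```java
// Illustrates why case and trailing whitespace matter when typing a
// Sender Name by hand: the name must match the Spout sender exactly.
public class SenderNameCheck {
    static boolean matches(String configured, String actual) {
        return configured.equals(actual); // no trim, no case folding
    }

    public static void main(String[] args) {
        System.out.println(matches("depth 640x480", "depth 640x480"));  // true
        System.out.println(matches("depth 640x480 ", "depth 640x480")); // false: trailing space
        System.out.println(matches("Depth 640x480", "depth 640x480"));  // false: wrong case
    }
}
```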

    [Troikatronix Technical Support Staff] Bug Report/Feature Request FORM
    aka: Ryan Webber | http://www.dusxproductions.com | Win 8.1/10 64bit, i7-4810mq, 16gb, nVidia GTX 870m | located in Ontario Canada.


  • ok thanks dusx

    I was actually expecting a "drop down dialog, displaying all the active Syphon servers on your computer", as the tutorial states, but I only see the "depth 640*480"


    I did, however, manage to modify the sketch in Processing in order to send either the RGB camera image, the infrared camera image, the depth image without colored bodies, or the depth image with colored bodies of tracked bodies.

    www.circuslumineszenz.com | www.leobettinelli.com | www.ninosconsentidos.eu

    i5 M520 @ 2.40GHz / 4Gb Ram / Win 7

  • edited January 2016
    @camilozk

    It seems you might be confusing Spout with Syphon.
    If you are on a PC you will be dealing exclusively with Spout.
    If on Mac it will be Syphon.
    If you wish to share your code modifications regarding the video sent to Isadora, I am sure other users would appreciate the information.
    (If you post code wrapped in HTML PRE tags, the forum will auto-format the code. It must be done in the HTML edit mode.)

    [Troikatronix Technical Support Staff] Bug Report/Feature Request FORM
    aka: Ryan Webber | http://www.dusxproductions.com | Win 8.1/10 64bit, i7-4810mq, 16gb, nVidia GTX 870m | located in Ontario Canada.

  • Am I right in thinking this will only track one person at a time?

    Graham Thorne | http://www.grahamthorne.co.uk |
    RIG 1: Windows 10, Quad Core i7 4.0GHz OC, GTX 1080, 512 m.2 SSD, 64gig Hyper X RAM.
    RIG 2: Apple rMBP i7, 8gig RAM 256 SSD, OS X 10.12.12
    Located in Doncaster/York, UK.

  • Dear @Skulpture,

    Well, yes and no. I had at least six people in the frame at one point and they all had Skeletons. But I didn't revise the code to _handle_ those multiple skeletons. We should make this change. I can't do it at the moment with my coding responsibilities for the main program, but maybe someone here has the desire to make it happen?
    Best,
    Mark

  • Hello,

    I am running Izzy version 2.1 on a MBP Retina on Yosemite. I have followed all the steps for this tutorial, I have run the Processing sketch and have an image coming from the Kinect in the viewing box, but there is no skeleton image... am I missing something?
    thanks

  • Hello,

    It works great... thanks a lot for everything.
    Best,
    Pascal

  • @ leben

    It can take some time. You need to stand in front of the Kinect and allow it to detect your figure (it will color your form a solid colour).
    After a little while (if your form is clear) it should switch to adding the skeleton. (This process is very CPU intense, and on my PC almost freezes things for a couple of seconds, but once in this mode it returns to performing well.)

    [Troikatronix Technical Support Staff] Bug Report/Feature Request FORM
    aka: Ryan Webber | http://www.dusxproductions.com | Win 8.1/10 64bit, i7-4810mq, 16gb, nVidia GTX 870m | located in Ontario Canada.

    Yes, it works great! I was just wondering if there would be a possibility through Processing to get a kind of ghost image and/or switch between different possibilities?

    Running MBP late 2012 / Osx 10.10.5 / 8Gb Ram / Latest Isadora Version / www.gapworks.at

  • @gapworks

    Like @mark mentioned in regards to tracking multiple skeletons, switching the video feed would have to be added to the Processing code.
    I imagine that OSC control of the video type could be added.
    Perhaps a Processing Pro in the forum might take this one and share the final script.

    [Troikatronix Technical Support Staff] Bug Report/Feature Request FORM
    aka: Ryan Webber | http://www.dusxproductions.com | Win 8.1/10 64bit, i7-4810mq, 16gb, nVidia GTX 870m | located in Ontario Canada.

  • @ DusX

    All is working fine. Thanks for the help, and to those who wrote the tutorial. This is all very exciting.

  • Thanks all for the wonderful comments and helpful feedback. Get out there and make some cool art :D

    [http://www.montycmartin.com] Located in Toronto, Canada.

  • I am guessing from the screenshots and from my own patch that all the red text (errors?) that show up in Processing patch are normal?

    Graham Thorne | http://www.grahamthorne.co.uk |
    RIG 1: Windows 10, Quad Core i7 4.0GHz OC, GTX 1080, 512 m.2 SSD, 64gig Hyper X RAM.
    RIG 2: Apple rMBP i7, 8gig RAM 256 SSD, OS X 10.12.12
    Located in Doncaster/York, UK.

  • edited January 2016

    "I am guessing from the screenshots and from my own patch that all the red text (errors?) that show up in Processing patch are normal?"

    Copy & paste a sample - all library initialisation output is generally red by default... so _shouldn’t_ be anything to worry about, but I can take a look.

    "Like @mark mentioned in regards to tracking multiple skeletons, switching the video feed would have to be added to the Processing code.
    I imagine that OSC control of the video type could be added.
    Perhaps a Processing Pro [user] in the forum might take this one and share the final script."
    Can do... but probs not til the weekend.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • @Marci 

    It would be a greatly appreciated contribution.

    [Troikatronix Technical Support Staff] Bug Report/Feature Request FORM
    aka: Ryan Webber | http://www.dusxproductions.com | Win 8.1/10 64bit, i7-4810mq, 16gb, nVidia GTX 870m | located in Ontario Canada.

  • edited January 2016
    Have hoisted the code over to my GitHub repo, so just keep an eye on that... feel free to report issues via that too as we go along, or fork / contribute etc. Not done anything to it at the moment, just preparing the ground so to speak.

    @Mark / @mc_monte / @dusx / @skulpture / @michel et al. - if you want some specific text putting in the README.md acknowledging original sources etc., just fire it over to me via message or similar and I'll swap it in. At the minute I've left the license type as "none"... I suggest 'CC0' seeing as it's all open source anyways. Thoughts?

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • @skulpture - Added link to README.md

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • Cheers @Marci 

    Graham Thorne | http://www.grahamthorne.co.uk |
    RIG 1: Windows 10, Quad Core i7 4.0GHz OC, GTX 1080, 512 m.2 SSD, 64gig Hyper X RAM.
    RIG 2: Apple rMBP i7, 8gig RAM 256 SSD, OS X 10.12.12
    Located in Doncaster/York, UK.

  • edited January 2016
    OK - 2 things y'all need to be aware of when using these tutorials...

    1: natural limitations of SimpleOpenNI
    The SimpleOpenNI library used in Processing is one of a few frameworks for interfacing with a Kinect. It wraps the old way of doing things, where one had to manually install OpenNI v1 and NiTE etc. to get skeleton / limb and user tracking, thus simplifying the process. The Kinect itself doesn't do any skeleton or limb identification or tracking.
    If on a USB3 host (MacBook Retina, for instance), running anything other than the Depth camera at the same time as skeleton tracking may randomly throw an iso_callback() error and/or trigger 'Isochronous transfer error' log messages... this is inherent to SimpleOpenNI, can't be avoided, and will render everything unstable. It could go at any time, whenever it feels like it. It will either plain bomb the Processing sketch (in the case of the iso_callback() error), or cause _everything_ to lock up (in the case of the Isochronous transfer error) until eventually the sketch bombs (which occasionally you have to force by simply pulling the USB cable).

    The only way to get round this is to use a USB2 host (older Macbook Pro, MacMini), or just chance your luck. (I’ve documented this over on the GitHub Repo, yonder: https://github.com/PatchworkBoy/isadora-kinect-tutorial-files/issues/1)

    I've documented previously on here that skeletons should only be used and implemented within a sketch when absolutely necessary / desired... as they are the only feature unique to the SimpleOpenNI libraries, and they introduce a lot of CPU weight. If not using skeletons, turn to the lighter-weight OpenKinect libraries etc. for simple camera feeds & depth point tracking / blob tracking.

    From the Isadora end of things, dealing with skeletons is the easiest way of handling Kinect data in an obvious way... but from the Kinect middleware point of view, it produces the most unstable results. This is purely a result of SimpleOpenNI being out of development (hence no Kinect v2 support, and no fix to this particular issue: when it went out of development [i.e. the PrimeSense technology & software rights were snapped up by Apple, then passed to Occipital, and are now part of the Structure.IO SDK], USB3 hadn't yet been released - OpenNI v1 and v2 are now in a complete code freeze with no further development). Ultimately, that means this all has rather a limited lifespan - sorry!

    2: natural limitations of the Kinect v1 hardware
    You can either:
    - Start in RGB mode, and switch between that and User & Depth mode.
    OR
    - Start in IR mode and switch between that and User & Depth mode.
    You can't:
    - Start in RGB mode and switch between that and IR mode.
    - Start in IR mode and switch between that and RGB mode.
    Once the camera has been initialised in either RGB or IR mode, it can't be switched to the other mode: to change from RGB to IR (or from IR to RGB), the sketch must be stopped and restarted.
    Also note: mirror mode only affects the RGB / IR output. It has no impact on depth or user output.
    Bearing all of that in mind... the GitHub Repo is now updated with an updated Processing sketch and Izzy file (for Mac) with stream switching enabled via OSC and a few other bits.
    Please read the Warnings on the GitHub page, the OSC MultiTransmit Actor notes in the Izzy file, & comments in the processing sketch code.
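The switching rules above can be sketched as a tiny state check in plain Java (a hypothetical helper for illustration; the real sketch just exposes numbered modes, and restarts are manual). Mode numbers mirror the tutorial's kCameraImage_* constants: 1=RGB, 2=IR, 3=Depth, 4=User, 5=Ghost.

```java
// Models the Kinect v1 limitation described above: once initialised in
// RGB or IR, the camera can switch to Depth/User/Ghost and back, but
// never to the *other* of RGB/IR without stopping the sketch.
public class KinectModeRules {
    static final int RGB = 1, IR = 2, DEPTH = 3, USER = 4, GHOST = 5;

    static boolean canSwitch(int initMode, int target) {
        if (target == DEPTH || target == USER || target == GHOST) return true;
        return target == initMode; // RGB <-> IR requires a restart
    }

    public static void main(String[] args) {
        System.out.println(canSwitch(RGB, USER)); // true
        System.out.println(canSwitch(RGB, IR));   // false: restart required
    }
}
```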

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • this is great, thanks

    17"MBP 2.93GHZ Core2Duo OSX10.10.5 8GB, 1TBCrucial_SSD, izzy 2.4.5b3 (2.2.2)

  • Thanks for the detailed reply @Marci.

    We are semi-aware that this method has a lifespan. But we get a lot of questions on the forum, inbox messages, and emails asking how to get the Kinect sensor working with Isadora; I must get two emails a week to my personal address alone. So we thought it was best to come up with an 'official' method.

    I must admit I was not aware of the iso_callback() error. I have had the occasional lock-up - but nothing too bad. I often wondered why, and this may be it.

    Like you've said, the lifespan of this method is limited, with the Apple / PrimeSense takeover, etc. Which is a shame of course. Similarly, we get a lot of questions about the Kinect v2 and why it won't work, etc.

    This is a tough nut to crack really.... But thanks for your help and notes on github, etc.

    Cheers.

    Graham Thorne | http://www.grahamthorne.co.uk |
    RIG 1: Windows 10, Quad Core i7 4.0GHz OC, GTX 1080, 512 m.2 SSD, 64gig Hyper X RAM.
    RIG 2: Apple rMBP i7, 8gig RAM 256 SSD, OS X 10.12.12
    Located in Doncaster/York, UK.

  • edited February 2016
    It's the same discussion as we had previously. The first question should be: skeletons and limbs, or blobs? What tool do I specifically need...?

    If blobs: the OpenKinect framework, which is still maintained and supports Kinect ONE & USB3 fully, IIRC. If skeletons: the OpenNI framework, as there is no other choice. Skeletons are quicker and easier to get an idea going with, but usually the same can be achieved with blobs also.
    If you must use skeletons and it's going to be mission-critical to a show or installation, use an older MacMini / MBPro / laptop that only has USB2 support, dedicated to handling the Kinect side of things, and fire the OSC over the network to your 'main' system. Avoids all the risk.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

    @Marci a ghost image would be simply great!! I mainly need the shape of the performer(s), and I'm not tracking a skeleton nor OSC data. Only a simple shape, trying to get close to infrared-cam quality.

    Running MBP late 2012 / Osx 10.10.5 / 8Gb Ram / Latest Isadora Version / www.gapworks.at

  • Ghost support added... https://github.com/PatchworkBoy/isadora-kinect-tutorial-files/blob/master/Isadora_Kinect_Tracking_Mac/Isadora_Kinect_Tracking_Mac.pde

    Run the sketch, let it get your skeleton, hit the 5 key on the keyboard to switch to ghost view, and the s key to disable the skeleton.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • edited February 2016
    To change ghost color, in the source code, search for...

         // set Ghost color here
    ...and change the color(255,255,255) values (R,G,B), where each value can be between 0 and 255.
    Add some blur via Isadora if needed.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • @Marci What do you think would be involved in porting this over to PC?

    [Troikatronix Technical Support Staff] Bug Report/Feature Request FORM
    aka: Ryan Webber | http://www.dusxproductions.com | Win 8.1/10 64bit, i7-4810mq, 16gb, nVidia GTX 870m | located in Ontario Canada.

  • Errrr.... Good question. On PC, IIRC, we're reliant on the proper MS Kinect SDK, aren't we? I thought that had nuked the opportunity to use anything but the proper Kinect for Windows models, which would preclude me from testing...?

    Will need to read up on it. I'll look into it...!

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • First, I've been so busy working on Isadora v2.2 that I haven't been looking at the forum much in the last weeks. I was excited to see all the energy here among the community for this Kinect solution. Thanks to everyone for contributing, and especially @Marci for updates and setting up the GitHub.

    I wanted to note to @Marci that I tested the Windows version using a Kinect 1473. So you definitely are not _forced_ to use Kinect for Windows models.
    For a recent workshop, I did a quick and dirty implementation of multiple skeletons. Once v2.2 is out, I'll add that to the repo so everyone can have it.
    Best Wishes,
    Mark

  • Ah cool - in which case I'll look to reproduce it for Windows. Never really bothered hooking it up to a PC, as it was always simpler to get things going on OS X.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • Just skimmed through the windows sketch - yeah it'll be no bother to port. Will get it done this week.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • edited February 2016
    Something I cannot find is the way to change the Processing sketch to start with the camera in RGB mode or IR mode.

    I tried with no success to change it in the OSC MultiTransmit of the "Isadora Skeleton Test Mac" supplied at
    https://github.com/PatchworkBoy/isadora-kinect-tutorial-files

    and in the Processing sketch from 9 days ago and today's one.

    I searched the Processing sketch for something I would be able to change, but no... It starts on kCameraImage_Depth = 3.

    I would like to try the other kCameraImage modes.

    Thanks for enlightening me.

    17"MBP 2.93GHZ Core2Duo OSX10.10.5 8GB, 1TBCrucial_SSD, izzy 2.4.5b3 (2.2.2)

  • edited February 2016
    Dear @bruper,

    Look at this part of the code:
    // --------------------------------------------------------------------------------
    //  CAMERA IMAGE SENT VIA SYPHON
    // --------------------------------------------------------------------------------
    int kCameraImage_RGB = 1;                // rgb camera image
    int kCameraImage_IR = 2;                 // infra red camera image
    int kCameraImage_Depth = 3;              // depth without colored bodies of tracked bodies
    int kCameraImage_User = 4;               // depth image with colored bodies of tracked bodies
    int kCameraImage_Ghost = 5;

    int kCameraImageMode = kCameraImage_IR; // << Set this value to one of the kCameraImage constants above
                                             // for purposes of switching via OSC, we need to launch with 
                                             // EITHER kCameraImage_RGB, or kCameraImage_IR
    You should be able to set it to kCameraImage_RGB to show the RGB channel.

    Best,
    Mark

  • @mark

    Thanks, yes, how silly of me... totally overlooked it...

    17"MBP 2.93GHZ Core2Duo OSX10.10.5 8GB, 1TBCrucial_SSD, izzy 2.4.5b3 (2.2.2)

  • @Marci thanks for the fast reply! I will give it a try on Wednesday when I return from Venice, as I don't travel with my Kinect!

    Running MBP late 2012 / Osx 10.10.5 / 8Gb Ram / Latest Isadora Version / www.gapworks.at

  • edited February 2016
    "I tried with no success to change it in the OSC MultiTransmit of the "Isadora Skeleton Test Mac" supplied in https://github.com/PatchworkBoy/isadora-kinect-tutorial-files and the Processing sketch of 9 days ago and the today one."


    Like I said, if you start in RGB you can't switch to IR. If you start in IR you can't switch to RGB. Limitation of the Kinect hardware, not Processing. Nothing anyone can do about it, regardless of what software you're working in.

    Both can’t be activated and switched between in a Processing sketch. You must choose one to work with. You can have (IR _OR_ RGB) & Depth & User & Ghost. You can’t have IR & RGB & Depth & User & Ghost.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • edited February 2016
    The confusion everyone is making here is in passing the camera images to Isadora, full stop.

    If you're creating and affecting visuals in Isadora, pass OSC data to Isadora, and do everything visual in Isadora.
    If you're creating visual effects in Processing, just use Processing. Then only pass the Syphon feed to Isadora to integrate that into an Isadora scene. Pass the minimal OSC data needed to control anything additional you may want to control in Isadora, if anything at all.
    The majority of folks dabbling in Kinect at the moment here seem to want to produce visual effects that are done wholly in Processing (i.e. stuff from openprocessing.org, but substituting the hand/Kinect for the mouse).
    Isadora exists so people don't have to learn code to create visuals. When using Processing, you're just using Isadora as a camera / media player / switcher / sequencer, and hardcoding all your visuals and interaction within Processing.
    You have to abstract how you think about it. Work out what you want to achieve and WHERE to achieve it...

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • **UPDATES**

    Have brought the Windows Processing sketch up to date with the Mac Processing sketch. Both are now identical bar the Syphon / Spout bits (i.e. Windows uses Spout, Mac uses Syphon).

    Also, both sketches now output all detected skeletons via OSC to the address:
    /skeleton/[useridnumber]/[limbpart]
    I've updated the Mac Isadora file to reflect this in the stream setup FOR THE 1st SKELETON ONLY! I can't do the Windows Isadora file as I don't have a Windows machine handy, I'm afraid. (If someone wants to open up the Windows Izzy file, replicate the OSC MultiTransmit actor and comments from the Mac file, prefix all the OSC streams in stream setup with /skeleton/1, and attach it here, I'll add that to the git repo also.)
    To receive multiple skeletons in Isadora, users will need to head to stream setup and add the channels for /skeleton/2/ through to /skeleton/6/, and then duplicate the various Kinect Point actors and set their channels correctly (again, if anyone wants to do that and send me the files, I'll happily add them to the git repo).
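The address scheme can be illustrated with a small plain-Java helper (hypothetical names for illustration; the actual sketch formats these strings when sending each joint over OSC):

```java
// Builds and parses OSC addresses of the form /skeleton/[userid]/[limbpart],
// as described in the update above. Illustrative helper only.
public class SkeletonOsc {
    static String address(int userId, String limb) {
        return "/skeleton/" + userId + "/" + limb;
    }

    // Extracts the user id from an address, or -1 if it doesn't match.
    static int userIdOf(String addr) {
        String[] parts = addr.split("/");
        if (parts.length < 4 || !parts[1].equals("skeleton")) return -1;
        try { return Integer.parseInt(parts[2]); }
        catch (NumberFormatException e) { return -1; }
    }

    public static void main(String[] args) {
        System.out.println(address(1, "head"));               // /skeleton/1/head
        System.out.println(userIdOf("/skeleton/2/lefthand")); // 2
    }
}
```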

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • Marci -- when I pull the Izzy file (Isadora Skeleton Test Mac.izz) from GitHub, it tells me that the file can't be opened because it was written in a later version... I have 2.1 -- what can I do? I have a Mac and just got 2.1 in Nov...
  • Download the file from GitHub again... sounds like an interrupted upload, so I have just reuploaded it.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • (For reference, it was written in 2.1)

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • edited February 2016
    @Marci

    I also have this problem with both the Mac and Windows files, and I don't understand it.
    I will pm you.

    [Troikatronix Technical Support Staff] Bug Report/Feature Request FORM
    aka: Ryan Webber | http://www.dusxproductions.com | Win 8.1/10 64bit, i7-4810mq, 16gb, nVidia GTX 870m | located in Ontario Canada.

  • Well, I haven't touched the Windows Izzy file, just downloaded it and uploaded it to GitHub... the Mac file works for me, and I've just redownloaded it from GitHub; that one works for me also.

    Will fire it over momentarily...

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • files now in hands of the Izagods!

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • @Marci --

    Thank you so much for your AMAZING contributions to the Kinect project!

    [http://www.montycmartin.com] Located in Toronto, Canada.

  • edited February 2016
    No probs - all work I'd already done in Sensadora: an open-source, CC0-licensed Processing sketch specifically for getting all the various Kinect usage cases into Isadora (not finished yet). A single sketch with options for blobs, skeleton, depth thresholds, pointclouds, gestures, 2D & 3D physics engines, & multiple Kinects. Also with added Leap Motion support - the idea being to allow the obvious Minority Report display interaction with a media library of audio, video, stills and live streams, along with generic OSC, serial, & HTTP outputs for controlling PTZ cameras (VISCA / IP / Galileo) and a link to Domoticz home automation for physical control over mains sockets, inputs from contact sensors, & other smart devices.

    I got sidetracked by homebridge-edomoticz development, but that's launched & support requests slowing down now so should be able to get back onto it in a week or two.

    S'one of those ideas that just keeps expanding! Joys of a severely autistic son who demands LOTS of sensory input!

    The playlist_sketch at https://www.dropbox.com/sh/lyxj60kf0k6jke7/AAAF5MszGamDSuiSP4fgH9t5a?dl=0 is the experimental playground for a lot of it.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • So there is an issue with downloading individual files from GitHub.

    If you save a .izz from a link it will not open.

    Individual files must be downloaded from the 'view raw' option.
    Otherwise, downloading the ZIP package is also fine.

    For example, right-clicking to save the file from this link:
    https://github.com/PatchworkBoy/isadora-kinect-tutorial-files/tree/master/Isadora_Kinect_Tracking_Mac
    will deliver an unusable Isadora file.


    To get a working copy, you need to click the filename, and then 'View Raw'.

    [Troikatronix Technical Support Staff] Bug Report/Feature Request FORM
    aka: Ryan Webber | http://www.dusxproductions.com | Win 8.1/10 64bit, i7-4810mq, 16gb, nVidia GTX 870m | located in Ontario Canada.

  • edited February 2016
    Ah, yeah. Standard / by design. GitHub is a version manager / online source viewer. Not designed for pulling individual files. Download via provided Download as Zip button top right of the repo frontpage, via view as raw / open binary link on individual files, or use any git client (https://desktop.github.com) to properly clone the repo... which will then handle incremental updating etc within the client, along with ability to roll back to previous versions, switch to alternative branches (eg: master, beta, stable), allow end-users to submit proposed changes back to the repo, issue tracking etc.

    Any right-click > Save As action will just download the JS/HTML source for the file viewer, at a guess (open it in Notepad to verify).

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • I'm sending here an updated version of my very old Isadora Kinect Skeleton Part user actor. (For history: I put it in the NI mate forum a long time ago, while I was still using their public beta before 1.0. They liked it so much that they offered me a licence of version one when it came out.)

    It is essentially what Mark has done for the new tutorial files, but you can see in the actor's outputs the actual name of the part, which I find useful, and there is an additional input I called 'osc offset'. I added it because the Kinect might not be your ONLY OSC input: if you have fed Izzy with another device (like an iPad with TouchOSC) BEFORE you send the Kinect data in, you're screwed, and the Kinect Point actor won't work unless you change your OSC input inside the actor or inside the Stream Setup window.
    With my actor, you just give the 'osc offset' input the last OSC port number you already have.
    The actor will automatically shift all three OSC listener channels inside, so you have the right value for the right body part.
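The offset arithmetic the actor performs might be sketched roughly like this in plain Java (a guess at the idea, with hypothetical names; the real logic lives inside the user actor):

```java
// Rough sketch of the channel-offset idea described above (hypothetical):
// if other OSC devices already occupy channels 1..offset, each Kinect body
// part's three listener channels (x, y, z) are shifted past them.
public class OscChannelOffset {
    // partIndex is 0-based; each body part occupies three consecutive channels.
    static int[] channelsFor(int partIndex, int offset) {
        int base = offset + partIndex * 3;
        return new int[] { base + 1, base + 2, base + 3 };
    }

    public static void main(String[] args) {
        // With 8 channels already used by e.g. TouchOSC, part 0 lands on 9,10,11.
        int[] ch = channelsFor(0, 8);
        System.out.println(ch[0] + "," + ch[1] + "," + ch[2]); // 9,10,11
    }
}
```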

    Kinect Skeleton Part.iua
    36K

    Armando Menicacci www.danse.uqam.ca www.digitalflesh.org www.emdl.eu

    15' i7 2,7 Ghz Macbook pro Retina Display 16 GB OSX 10.12.1 Latest Isadora version

  • Hey!

    Great discussion here. Have the files in the tutorial been updated?

    www.circuslumineszenz.com | www.leobettinelli.com | www.ninosconsentidos.eu

    i5 M520 @ 2.40GHz / 4Gb Ram / Win 7

  • The updated files are at https://github.com/PatchworkBoy/isadora-kinect-tutorial-files - follow the link, click the “Download Zip” button top right corner.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • Am working my way through this on Windows 10 with a Kinect that I previously had working with the original tutorial files.

    I'm trying to run the latest, updated files, downloaded from GitHub. When I run the Processing sketch Isadora_Kinect_Tracking_Win.pde, I get an error (unexpected token: =) at this line:
                    if (userList.length !== 0) {

    Can anyone proffer any help?

    Thanks

    Mark

    Dell Precision m3800 16GB RAM, Quadro K1100, Win10x64 X99 Desktop, i7 5930, GeForce GTX980, 64GB RAM, Win7x64

  • Try changing

    if (userList.length !== 0) {

    to

    if (userList.length != 0) {

    (!== is JavaScript syntax; Processing sketches are Java, which uses !=.)

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • Thanks, Marci, yes I did try that and it gave another error further down:

    Cannot find anything named kCameraInitMode
    with this line highlighted:

            kCameraInitMode = 1;
    
    Thanks
    
    Mark 
    

    Dell Precision m3800 16GB RAM, Quadro K1100, Win10x64 X99 Desktop, i7 5930, GeForce GTX980, 64GB RAM, Win7x64

  • edited March 2016
    @Marci thank you for your added ghost image on Kinect! One more question: is there a way to smooth the ghost image? As it is, it's, let's call it, quite nervous. Apart from this it works great.

    But I had to delete the first couple of lines (all the ones up until "import...") because they created an error. Info that might also be useful to others here in this forum.
    Best
    PS: and if I press 1 (case 1) on my laptop it crashes my computer completely!! But I don't need it, so I just avoid it.

    Screen Shot 2016-03-01 at 13.41.28.png
    508 x 590 - 75K

    Running MBP late 2012 / Osx 10.10.5 / 8Gb Ram / Latest Isadora Version / www.gapworks.at

  • edited March 2016
    The opening line should begin with /* . If it doesn't, it'll cause errors: /* denotes the start of a comment block, */ denotes the end of one. Looks like a case of clumsy fingers at your end I'm afraid, as the files in the repo all have it correctly in place.

    Pressing 1 will try to init RGB mode. If the sketch wasn't launched in RGB mode then this is expected behaviour... like I said, you can't switch between IR & RGB once launched.

    Smoothing: add a Blur actor in Isadora after the Syphon receiver, before however you're converting it to a mask (assuming that's what you're doing).

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • @Marci thank you for the quick response. Blur in Isadora was my solution to the "nervousness". I was just curious if there was an option within Processing.

    best

    Running MBP late 2012 / Osx 10.10.5 / 8Gb Ram / Latest Isadora Version / www.gapworks.at

  • edited March 2016
    @gapworks: lots of options in Processing: add canvas.filter(BLUR, 6); after line 44, or something similar. See https://processing.org/reference/filter_.html or https://processing.org/examples/blur.html
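What filter(BLUR) does is conceptually simple: each pixel is averaged with its neighbours (Processing's version is a Gaussian variant). A minimal 1-D sketch of the idea in Python, purely for illustration, not Processing's actual implementation:

```python
def box_blur(row, radius=1):
    """1-D box blur: replace each value with the mean of its neighbourhood.

    Illustrative only; Processing's filter(BLUR, n) applies the 2-D
    equivalent to the canvas pixels.
    """
    n = len(row)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

# A single noisy spike gets spread out and damped:
print(box_blur([0, 0, 9, 0, 0]))  # [0.0, 3.0, 3.0, 3.0, 0.0]
```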

    Ultimately, doing it in Isadora is simpler for you to maintain tho.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • edited March 2016
    Hi Marci,
    Maybe I am the only person using this on Windows!

    So I just downloaded the updated file that you posted at about 18:00 on 01 March.
    This still has the !== issue (see above). I fixed that, but now I get an error further down when I try to run the Processing files.

    I don't know if this is user error or what (my programming skills are sadly lacking), but when you have a mo' I'd be grateful if you could have a look.

    Cheers

    Mark M - lone Windows user!

    When I run the Processing file the error says:
    cannot find anything named "inUserId"
    and line 187 says:

        sendOSCSkeletonPosition("/skeleton/"+inUserId+"/head", inUserID, SimpleOpenNI.SKEL_HEAD);
    
    
    

    Dell Precision m3800 16GB RAM, Quadro K1100, Win10x64 X99 Desktop, i7 5930, GeForce GTX980, 64GB RAM, Win7x64

  • Oops - grab the latest zip from the repo... have corrected both issues. Any mention of inUserId should be inUserID


    (Or you could just do a find and replace in Processing / any text editor)

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • Thanks for all the work that's gone in here: the Kinect tutorials all worked for me. Thanks also to Marci. Great stuff.


    I also had to make the following correction on Mac (Retina), and change lowercase Id to ID, but then it seems to work, as in the sample pic below.
    if (userList.length != 0) {

  • The 5 & s keys solve the Ghost riddle. Brilliant!

    Now, on Ghost and without skeleton, the IR image keeps cutting out and then re-appearing. Any ideas on why this is? Struggling to reboot it.

    Is Alpha Mask the best way to turn this into a mask, along with an Izzy blur to project into? Will check the follow-mask tutorial too. Sorry for the sketchy posts; I am mid-production with this, and unfortunately it is not getting the time it deserves to implement. Ghost working in Processing is great. Now to turn that into a mask and I am a happy bunny for now.

    Many thanks for any and all help

    G

  • edited March 2016
    On ghost without skeleton there should be no IR image, just the ghost...? Do you mean the ghost keeps vanishing and re-appearing? This will be down to OpenNI not being able to detect a user... ghost / skeleton needs as much of the entire person, from feet to head, in the shot as possible, as OpenNI works out that a person is stood there by recognising an entire stick figure. It can’t do a tight shot where the user is only there from the waist upwards, for instance... or if 50% of the person’s stick figure moves out of shot to the side.

    Easier way to diagnose: video your screen on your mobile phone or summat whilst it’s happening and bash it on youtube and post me the link so I can see what’s going on...
    Could also be USB bandwidth which would cause OpenNI to drop and have to timeout & reinitialise the sensor... which sometimes it can’t do without a complete shutdown of processing / reopening the sketch. I’d only expect this to be an issue if you were using USB external disks or cameras hooked up by USB that’d start saturating your total available USB bus bandwidth.
    And yes, alpha mask and izzy to blur...
    At some point I’ll add a blur option into the processing sketch, but can’t give any indication as to when that may be - at the moment the bill-paying work’s having to take priority!

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • Ah sorry, my bad on the naming issue - yes, I can get ghost without skeleton, thanks to the shortcut keys.

    I think you are right as to why: it's not been getting the full me in shot. I still have to work on distancing, but this week I am working from home, so I just need to step back more.
    Thanks for the offer of the YouTube clip review; at the moment I am struggling to get it working again where it was last night. I saved the Processing file of your Ghost code, which was working with the keys etc. too, but it is now just beachballing and not picking me up. Hmmm, will have to re-assess...
    I will get back on this when I have figured out what that could be.
    Could you possibly point me to that mask tutorial you mentioned? Is it the 'how do you get mask content to follow the mask' one?
    Do prioritise what you need to keep the roof over your head warm. But all this help is much appreciated. I am no coder, so I have to pick up on what language I do understand; it is like being an English tourist abroad, only I am Scottish!

    Thanks
    G

  • edited March 2016
    Was just looking for it and can't find now. Will try n throw an example Izzy file up later this evening.

    NB: if it cuts out and fails to start, and you end up stuck with a beachball, it generally means you should unplug / replug your Kinect... a resource hasn’t been correctly freed. Or you’ve forgotten to plug in the Kinect PSU.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • If all you're wanting is a ghost mask there are other ways that don't involve OpenNI / user / skeleton at all, so can accept waist up etc... and reduces USB bandwidth & CPU load. Will see what I've got in my toolbox and start a new thread for it once I've found it.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • Wow! Thanks so much. Am checking the other thread too. Ideally I wanted to use 3 or 4 instances of the ghost, and one of the skeleton, as I had wanted to do something where it started in the axis centre (solar plexus) in the chest and spread from there. But I will see where I get to with this first, as anything approximating that would be great, and I may be able to opt for a bigger centralised blur etc. instead for now.

    I would be more than happy with one scene with 2-3 adjust keys, so it can all be fired up in one sitting.
    Great to have the mask Izzy setup pic too.
    If I were to use separate scenes for separate FX using the ghost or skeleton, does the signal take a while to load, and does it interfere with the process if I am not always in view of the Kinect IR sensor?
    As in, is it better to have one complicated scene where the signal stays established even if I move in and out of shot, rather than jump between scenes, each of which has the setup active as it needs to be, user actors etc.?
    Ideally I would have more complicated scenes which a keyboard watcher can trigger between. At the moment, having achieved skeleton, and being blown away by the possibilities of all the information received and relayed, my head nearly popped; but for this version, with so little time, a ghost IR will be more than great.
    One issue I have is my costume: in one scene it's a straitjacket, and it's lighting up the IR sensor (like it's alive, why I don't know!!), but obviously my arms aren't available to provide the skeleton info feed. So again I think I will have to use the ghost-only feed. I'm guessing your Kinect Ghost Mask is going to save my proverbial. Thank you once again @Marci

  • edited March 2016
    Anything reflective... Buckles etc ...will reflect and scatter the infrared pattern that the Kinect projects and then detects. Use talc to dull off reflective fabrics, Vaseline to dull off metals.

    My glasses always cause problems like this... have to wear contact lenses if I'm working with Kinect.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • Aah, smart, thanks for the tip. It may work in my favour if I can get it to read the blob and use IR when I am not in it. Thanks, dude.
  • Hi!

    Thank you for the great work you do!
    Following the tutorial, I ran into a problem running the Processing file on Mac.
    Error says:
    "The method enableUser(int) in the type SimpleOpenNI is not applicable for the arguments()"
    What is the problem here?
    Igor

  • Having trouble linking my Kinect to my Mac. After I hit Run in Processing, this is what happens (see screenshots).
    I'm very new to this. If anyone can help I would really appreciate it.

    Screen Shot 2016-04-12 at 2.52.42 PM.png
    1352 x 658 - 88K
    Screen Shot 2016-04-12 at 2.52.55 PM.png
    1370 x 1054 - 232K
    Screen Shot 2016-04-12 at 3.44.36 PM.png
    1660 x 1192 - 194K
  • edited April 2016
    Dear @Bunker,

    Are you using Processing v2.2? As noted in the tutorial, SimpleOpenNI only works with v2.2. It does not work with version 3.
    As for the error in the third picture, it should be "!=" instead of "!==".
    Best Wishes,
    Mark 

  • Last error (third picture) is now fixed in the repo (must’ve been doing a lot of JS work that day!)... however, it’ll still error in Processing3 regardless as per @Mark above.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • edited April 2016
    Dear @Marci,

    As it says in Step 2 of the second tutorial you cannot use Processing v3, you must use Processing v2.2.1. To wit:
     IMPORTANT! You must install Processing v2.2.1 because SimpleOpenNI has not been updated to support the latest version of Processing!

    So please try v2.2.1 and let us know how it goes.
    Best Wishes,
    Mark

  • After making the changes it worked; my Mac now recognizes the Kinect device. Thank you!!

    This is not really for this area of the forum, but since it seems I'm enjoying this program: is there a site or a place where I can get more tutorials besides the basic 13 episodes on YouTube?

    Once again thank you for your help and such prompt response.

  • @Mark ...??? That's what I reinforced in the post above your response... confused

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • @Marci

    I think the confusion is the word "regardless". It reads as "regardless of what Mark said". Better would have been: "it’ll still error in Processing 3, as per @Mark above".

    Best Michel

    Bug Report/Feature Request FORM
    Michel Weber | http://www.filmprojekt.ch | rMBP i7, 16gig, OS X 10.11 | located in Winterthur Switzerland.

  • @Marci,

    Sorry, I thought you were the original user who had the problem (i.e., @Bunker). That's why I addressed you the way I did. Sorry for the confusion, and thanks for the update to the code. ;-)

  • Would someone mind giving me some guidance? I've followed the instructions in the tutorial to the letter, and I'm still getting nothing showing up in my canvas when I run the setup, plus an error message in Processing that I don't even begin to understand. I'm using the 1414 model Kinect with a MBP Retina 15, bought last year. I understand that the USB 3 port may be an issue; is there a workaround? Am I screwed?

    Screen Shot 2016-04-28 at 11.18.41 am.png
    1402 x 1714 - 289K
  • Dear @George_A,

    From what I can see, there is a crash in the Java runtime environment when attempting to get a list of USB Devices. The crash is happening in libusb, which is the open source USB input/output library, in a call named darwin_get_device_list.
    What version of Mac OS are you using?
    I am unsure how to guide you on this. Many users are using these tools successfully. The only thing I can suggest is to somehow update libusb. The way to do that would be to install Homebrew (see here) and then enter this command in Apple's "Terminal" program:
    brew install libusb

    I guess that's the only thing I can suggest here as a possible solution.

    Best Wishes,
    Mark

  • Wandering off on a tangent... a tip for Kinect concept development work based on some playing around this weekend:

    Get some Lego, some of these: https://www.ebay.co.uk/ulk/itm/121953917174, and apply some judicious use of gaffa tape and Scalextric cars & track... ;) Start small-scale in your studio / kitchen, then scale up. Much easier than leaping about in front of the camera / directing a small child or significant other... and you can map the whole 'stage' with a single Kinect and projector. Allows for small-scale testing of scenarios. :D

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • @george_a might also be worth removing and reinstalling latest JRE to see if it has any impact... I'm running v8... you look to still be on v7.

    http://twitter.com/marcisshadow || rMBP A1398 (EMC 2881) / Kinect + Leap / TH2Go || West Yorkshire, UK || Aspie Warning!

  • hi guys

    Do I need to upgrade to version 2 to make the Izzy patch work?
    I'm currently getting an error message that the following actors could not be loaded:

    Syphon Receiver (ID = '53795243')

    Projector (ID = '5F50524A')

    izzy 2.2.2. mbp 2.66 osx mavericks 10.9.5 i am dyslexic.

  • @particlep Hi! Yes, the tutorials were made in version 2.

    Have you upgraded? If not, we can try and suggest a workaround.

    Graham Thorne | http://www.grahamthorne.co.uk |
    RIG 1: Windows 10, Quad Core i7 4.0GHz OC, GTX 1080, 512 m.2 SSD, 64gig Hyper X RAM.
    RIG 2: Apple rMBP i7, 8gig RAM 256 SSD, OS X 10.12.12
    Located in Doncaster/York, UK.

  • hi @Skulpture 


    I've not upgraded yet, as I've been busy with another aspect of my practice of late.

    Would upgrading be a better option than the workaround?

    izzy 2.2.2. mbp 2.66 osx mavericks 10.9.5 i am dyslexic.

  • @particlep version 2 has lots of amazing new features - I'm sure you've read about most of them (http://troikatronix.com/about-isadora-2-0/).


    But (hopefully) you know we/I am not here to push sales. We don't do that.

    The errors are basically missing actors. Version 2 now includes a Syphon actor as standard, so no Quartz, FreeFrame, or third-party actors are needed. There is also a new projector that accepts all video types (texture/video/image, etc.), and that's the second error.

    It may be worth downloading the version 2 demo and giving it a go. You can still run both versions side by side. I can talk you through the process if that's any help.

    Regards,
    Graham :) 

    Graham Thorne | http://www.grahamthorne.co.uk |
    RIG 1: Windows 10, Quad Core i7 4.0GHz OC, GTX 1080, 512 m.2 SSD, 64gig Hyper X RAM.
    RIG 2: Apple rMBP i7, 8gig RAM 256 SSD, OS X 10.12.12
    Located in Doncaster/York, UK.

  • @particlep

    If you don't want to upgrade, you have to replace the Syphon receiver with, for example, THIS one (if you already have one, you can use that), and also replace all the projectors with ones you have available.

    Best Michel

    Bug Report/Feature Request FORM
    Michel Weber | http://www.filmprojekt.ch | rMBP i7, 16gig, OS X 10.11 | located in Winterthur Switzerland.

  • edited May 2016
    Anyone running on Windows: you can use the Kinect V2 with Isadora.

    This project gets you OSC values from the skeleton tracking:
    https://github.com/microcosm/KinectV2-OSC
    Very easy to set up, and it runs great.
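If you ever need to sanity-check an OSC stream outside Isadora, the wire format is simple enough to build or decode by hand. A minimal sketch in Python; the address pattern and float layout below are assumptions modelled on the tutorial sketch's "/skeleton/<userID>/head" messages, not the exact output of KinectV2-OSC:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Pad with NULs to a multiple of four bytes, as OSC requires.

    OSC strings are NUL-terminated, so at least one NUL is always added.
    """
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Build a binary OSC message whose arguments are all float32."""
    msg = osc_pad(address.encode("ascii"))                    # address pattern
    msg += osc_pad(("," + "f" * len(args)).encode("ascii"))   # type tag string
    for value in args:
        msg += struct.pack(">f", value)                       # big-endian float32
    return msg

# e.g. the head position of user 1 (the x, y, z values here are made up):
packet = osc_message("/skeleton/1/head", 0.5, -0.25, 1.0)
```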

    [Troikatronix Technical Support Staff] Bug Report/Feature Request FORM
    aka: Ryan Webber | http://www.dusxproductions.com | Win 8.1/10 64bit, i7-4810mq, 16gb, nVidia GTX 870m | located in Ontario Canada.

  • @DusX my Kinect V1 walked off of its own accord sometime this week. Thought I might move on up to Kinect V2. Is there any particular model that I need for Windows? Can I just buy any old Kinect V2 and the Xbox One Kinect Adapter for Windows, or do I need something more particular?

    Thanks

    Mark

    Dell Precision m3800 16GB RAM, Quadro K1100, Win10x64 X99 Desktop, i7 5930, GeForce GTX980, 64GB RAM, Win7x64

  • @mark_m

    They have made two models: the Xbox Kinect 2, and the Kinect 2 for Windows (no longer offered).
    Both are the same, and the SDK works for both. Additionally, both need the power adapter.

    Note: this is Windows only.
    On OS X there is the ability to get a depth image, but I don't know much about that.

    [Troikatronix Technical Support Staff] Bug Report/Feature Request FORM
    aka: Ryan Webber | http://www.dusxproductions.com | Win 8.1/10 64bit, i7-4810mq, 16gb, nVidia GTX 870m | located in Ontario Canada.

  • Hi all. Just tried to download SimpleOpenNI v1.96 (112MB) from the link in the tutorial, but the link has expired. Does anyone know where I can find the version for the Kinect model 1473 camera (which I just got)?

    Thanks for your help,
    Eric

    Isadora 2.0 / Macbookpro / Mac OS 10.6.8 / QT 7.6.6 / 2.2 Ghz Intel Core Duo / 2 GB 667 MHz RAM / SSD (ext. esata)

  • @ursullivision

    You'll know this by now, but for future reference, SimpleOpenNI v1.96 is here:

    http://troikatronix.com/files/SimpleOpenNI_1.96_osx_kinect_1473.zip

    Dell Precision m3800 16GB RAM, Quadro K1100, Win10x64 X99 Desktop, i7 5930, GeForce GTX980, 64GB RAM, Win7x64
