Comparison of different mouse emulators for Microsoft Kinect | Sociotechnical Integration

With the release of Microsoft Kinect as an accessory for the Xbox 360 game console in November 2010, a low-cost infrared depth sensor became available to a broad user base for the first time. This opened the door to applications whose gesture-based interaction, requiring no additional input devices, promises to revolutionize the design of human-computer interaction. A community therefore formed in short order that enabled connecting the sensor to a PC, at first with self-developed drivers and a few weeks later with drivers and a Software Development Kit (SDK) from PrimeSense, the company that was itself involved in developing Kinect, and published the first applications across a wide range of use cases. Kinect mouse emulators covered below: KinEmote; focus areas of FAAST.
Kinect - Medien Wiki

The Microsoft XBOX 360 Kinect is a motion controller developed by PrimeSense. It projects a pattern with infrared light and calculates a depth image using a camera. It also has a color camera and four microphones. The Y axis of the sensor is remote-controllable with a built-in motor. The color of the LED can be set by software as well.

Software and applications:
- CocoaKinect App
- Freenect by Robert Pointon
- Synapse generates skeleton data and provides it as OSC
- ofxFaceTracker provides face metrics (orientation, eye and mouth open/closed) over OSC
- codelaboratories.com/kb/nui
- TUIO Kinect lets you define a depth range where multiple blobs can be detected

SDKs, frameworks and libraries:
- openkinect.org (drivers, installation)
- OpenKinect, and an open-source implementation of KinectFusion in the Point Cloud Library
- pix_freenect for Pd (the included binaries work without any compiling)
- fux_kinect object for Pd
- ofxKinect, the openFrameworks Kinect integration
- vvvv Kinect integration
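The "depth range with multiple blobs" idea behind TUIO Kinect can be sketched in plain Java: threshold the depth image to the configured range, then count connected regions with a flood fill. This is an illustrative sketch, not TUIO Kinect's actual code; the class name, grid, and millimeter values are made up for the example.

```java
import java.util.ArrayDeque;

public class DepthBlobs {
    // Count 4-connected regions whose depth falls inside [minMm, maxMm].
    static int countBlobs(int[][] depth, int minMm, int maxMm) {
        int h = depth.length, w = depth[0].length;
        boolean[][] in = new boolean[h][w];
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                in[y][x] = depth[y][x] >= minMm && depth[y][x] <= maxMm;
        boolean[][] seen = new boolean[h][w];
        int blobs = 0;
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                if (in[y][x] && !seen[y][x]) { blobs++; flood(in, seen, x, y); }
        return blobs;
    }

    // Iterative flood fill marking one blob as visited.
    static void flood(boolean[][] in, boolean[][] seen, int x, int y) {
        ArrayDeque<int[]> stack = new ArrayDeque<>();
        stack.push(new int[]{x, y});
        while (!stack.isEmpty()) {
            int[] p = stack.pop();
            int px = p[0], py = p[1];
            if (py < 0 || py >= in.length || px < 0 || px >= in[0].length) continue;
            if (!in[py][px] || seen[py][px]) continue;
            seen[py][px] = true;
            stack.push(new int[]{px + 1, py});
            stack.push(new int[]{px - 1, py});
            stack.push(new int[]{px, py + 1});
            stack.push(new int[]{px, py - 1});
        }
    }

    public static void main(String[] args) {
        // Toy depth image in millimeters: two hands held in front of a wall.
        int[][] depth = {
            {500, 900, 3000, 900},
            {900, 900, 3000, 900},
            {3000, 3000, 3000, 3000},
        };
        // With the detection range set to 600-1500 mm, two blobs remain.
        System.out.println(countBlobs(depth, 600, 1500)); // prints 2
    }
}
```

A real implementation would additionally track each blob's centroid across frames and emit it as a TUIO/OSC cursor, but the range-threshold-then-label step above is the core of the technique.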
Daniel Shiffman: Getting Started with Kinect and Processing

The Microsoft Kinect sensor is a peripheral device (designed for Xbox and Windows PCs) that functions much like a webcam. However, in addition to providing an RGB image, it also provides a depth map: for every pixel seen by the sensor, the Kinect measures distance from the sensor. This makes a variety of computer vision problems, like background removal and blob detection, easy and fun! The Kinect sensor itself only measures color and depth.

What hardware do I need? First you need a "stand-alone" Kinect (Standalone Kinect Sensor v1). Some additional notes about different models: Kinect 1414: this is the original Kinect and works with the library documented on this page in the Processing 3.0 beta series.

SimpleOpenNI: You could also consider using the SimpleOpenNI library and reading Greg Borenstein's Making Things See book.

Further sections: I'm ready to get started right now; What is Processing?; What if I don't want to use Processing?; What code do I write? A sketch begins by importing the library and declaring a Kinect object:

import org.openkinect.processing.*;
Kinect kinect;
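The "background removal is easy" claim above can be sketched in plain Java, independent of the Processing library: once you have the per-pixel distances, keeping only pixels closer than a threshold is a single pass. The array values and threshold here are invented for the example; a real Kinect v1 delivers its depth stream through a driver library such as the one from openkinect.org.

```java
import java.util.Arrays;

public class DepthMask {
    // Keep only pixels with a valid reading closer than maxDepthMm;
    // everything else (background, or 0 = no reading) becomes 0.
    static int[] removeBackground(int[] depthMm, int maxDepthMm) {
        int[] mask = new int[depthMm.length];
        for (int i = 0; i < depthMm.length; i++) {
            mask[i] = (depthMm[i] > 0 && depthMm[i] < maxDepthMm) ? 1 : 0;
        }
        return mask;
    }

    public static void main(String[] args) {
        // Toy 1x6 "depth image" in millimeters (0 = no reading).
        int[] depth = {800, 1200, 3500, 0, 900, 4000};
        int[] mask = removeBackground(depth, 2000);
        System.out.println(Arrays.toString(mask)); // prints [1, 1, 0, 0, 1, 0]
    }
}
```

In a Processing sketch the same loop would run over the depth array each frame and drive which camera pixels get drawn.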
The Kinect effect: how Harmonix mastered Dance Central's menus Dance Central has quickly emerged as one of the best-received titles for Microsoft's Kinect. Not only is the game great, but the smaller details like the way the menu system works are also superior to the other current Kinect offerings. Since gesture-based controls are such a new frontier, the developers at Harmonix had a difficult time creating an intuitive control scheme for navigating menus. The solution? Creating lots of prototypes and simply seeing what worked. The major problem for Harmonix was that, aside from that one scene in Minority Report, there weren't any real solid examples that the team could look at for inspiration. The main goal was to create a menu system that didn't make players wish they were using a controller instead. Another early prototype featured a virtual scroll wheel, something along the lines of the big wheel from The Price Is Right. Things went on like this for around two months. All that, just to create a menu.
openFrameworks Skeleton Tracking using Kinect: Overview - Voratima

I've been wanting to experiment with skeleton tracking with Microsoft Kinect forever. Finally, that time has arrived. And, boy! There are many ways you can do this. I did a lot of research at the beginning trying to understand how all the pieces fit together, which frameworks/SDKs to use, etc. Somewhat confused at first, I thought I should start with a list of requirements:

- Open source
- Runs on Mac (and maybe Windows too)
- Widely adopted (which also implies a large community and support that is easy to find)
- Gives access to low-level APIs
- Lends itself to interactive wall projection

There are five elements you will need to do the development: a development environment, the programming language, a Kinect driver or library, a driver/library wrapper, and the Kinect itself.

Development Environments
First let's start with the development environment, i.e. the place where you actually do the coding. Xcode is a development environment for creating apps for Mac or iOS devices. Visual Studio will get the job done for Windows users.