Daniel Shiffman
The Microsoft Kinect sensor is a peripheral device (designed for the Xbox and Windows PCs) that functions much like a webcam. However, in addition to providing an RGB image, it also provides a depth map: for every pixel seen by the sensor, the Kinect measures its distance from the sensor. This makes a variety of computer vision problems, like background removal and blob detection, much easier (and more fun). The Kinect sensor itself only measures color and depth; however, once that information is on your computer, a lot more can be done, like "skeleton" tracking (i.e. detecting a model of a person and tracking his or her movements). What hardware do I need? First you need a "stand-alone" Kinect (Standalone Kinect Sensor v1). Some additional notes about different models: Kinect 1414 is the original Kinect and works with the library documented on this page in the Processing 3.0 beta series. The page also covers SimpleOpenNI, how to get started right away, what Processing is, and what to do if you don't want to use Processing.
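
To make "getting started" concrete, here is a minimal depth-viewer sketch. It is a sketch under stated assumptions, not part of the quoted tutorial: it assumes the Open Kinect for Processing library (org.openkinect.processing) is installed and a Kinect v1 (model 1414) is plugged in.

import org.openkinect.processing.*;

Kinect kinect;  // handle to the Kinect v1 device

void setup() {
  size(640, 480);
  kinect = new Kinect(this);   // connect to the first Kinect found
  kinect.initDepth();          // start streaming the depth map
}

void draw() {
  // Show the depth map as a grayscale image; each pixel encodes distance.
  image(kinect.getDepthImage(), 0, 0);
}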

OpenKinect

Kinect - Medien Wiki The Microsoft® Xbox 360 Kinect is a motion controller built on depth-sensing technology developed by PrimeSense. It projects a pattern with infrared light and calculates a depth image using a camera. It also has a color camera and four microphones. The wiki collects blogs and portals, software, and applications: the CocoaKinect App, Freenect by Robert Pointon, Synapse (which generates skeleton data and provides it as OSC, i.e. Open Sound Control), ofxFaceTracker (which provides face metrics such as orientation and eye/mouth open or closed over OSC), codelaboratories.com/kb/nui, and TUIO Kinect (which lets you define a depth range where multiple blobs can be detected). It also lists SDKs, frameworks, and libraries for the depth image: openkinect.org drivers and installation; pix_freenect for Pd (Pure Data, a dataflow programming environment; the included binaries work without any compiling); the fux_kinect object for Pd; ofxKinect, the openFrameworks Kinect integration; and a vvvv Kinect integration. Further sections cover skeleton data, running the depth image and skeleton data on the workstation (change the computer's user to an admin via the Apple menu: log out the current user), the depth image in Pd, skeleton data in Pd, and successors/competitors.
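
As a hedged illustration of reading Synapse's skeleton output in Processing, the sketch below listens for OSC messages with the oscP5 library. The port (12345) and the address pattern (/righthand_pos_world carrying three floats) follow Synapse's commonly documented defaults but are assumptions to check against your setup; Synapse also expects periodic "track joint" keep-alive messages, which are omitted here for brevity.

import oscP5.*;
import netP5.*;

OscP5 osc;
float rx, ry, rz;   // last reported right-hand position

void setup() {
  size(400, 200);
  osc = new OscP5(this, 12345);   // listen for incoming OSC on UDP port 12345
}

// Called by oscP5 whenever an OSC message arrives.
void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/righthand_pos_world")) {
    rx = msg.get(0).floatValue();
    ry = msg.get(1).floatValue();
    rz = msg.get(2).floatValue();
  }
}

void draw() {
  background(0);
  fill(255);
  text("right hand: " + rx + ", " + ry + ", " + rz, 10, 20);
}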

PCL - Point Cloud Library (PCL) Kinect Hacks - Supporting the Kinect hacking news and community OpenKinect Kinect Hacking using Processing About Processing, from Processing.org: Processing is an open source programming language and environment for people who want to create images, animations, and interactions. Initially developed to serve as a software sketchbook and to teach fundamentals of computer programming within a visual context, Processing has also evolved into a tool for generating finished professional work. Today, there are tens of thousands of students, artists, designers, researchers, and hobbyists who use Processing for learning, prototyping, and production. About the Kinect: the Kinect is a stereo camera (actually a triple camera, including the IR sensor) with fairly sophisticated firmware algorithms that can output a wide variety of depth and motion-tracking data. About this tutorial: "Kinect for Processing" involves configuring a set of libraries that can be compiled with the Processing programming environment to parse and manipulate Kinect data. The tutorial also includes updated notes on certain Kinect models.
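
To show what "parse and manipulate Kinect data" can look like in Processing, here is a sketch of simple depth thresholding (a crude form of background removal). It assumes the Open Kinect for Processing library and a Kinect v1; the raw depth range (roughly 0 to 2047 for v1) and the threshold values are illustrative and will need tuning.

import org.openkinect.processing.*;

Kinect kinect;
int minDepth = 400;   // raw readings nearer than this are ignored
int maxDepth = 800;   // raw readings farther than this count as background

void setup() {
  size(640, 480);
  kinect = new Kinect(this);
  kinect.initDepth();
}

void draw() {
  int[] depth = kinect.getRawDepth();   // one raw value per depth pixel
  loadPixels();
  for (int x = 0; x < kinect.width; x++) {
    for (int y = 0; y < kinect.height; y++) {
      int d = depth[x + y * kinect.width];
      // White where the reading falls inside the depth band, black elsewhere.
      pixels[x + y * width] = (d > minDepth && d < maxDepth) ? color(255) : color(0);
    }
  }
  updatePixels();
}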

libfreenect/OpenNI2-FreenectDriver at master · OpenKinect/libfreenect openFrameworks Skeleton Tracking using Kinect: Overview - Voratima I've been wanting to experiment with skeleton tracking with the Microsoft Kinect forever. Finally, that time has arrived. And, boy! Somewhat confused at first, I thought I should start with a list of requirements: open source; runs on Mac (and maybe Windows too); widely adopted (which also implies a large community and easy-to-find support); gives access to low-level APIs; lends itself to interactive wall projection. There are five elements you will need to do the development: a development environment, the programming language, a Kinect driver or library, a driver/library wrapper, and the Kinect itself. Development Environments: first let's start with the development environment, i.e. a place where you actually do the coding. Xcode is a development environment for creating apps for Mac or iOS devices. Processing is an open source IDE running on its own language, also called Processing. Visual Studio will get the job done for Windows users. Programming Languages: if you're brand new to programming, I suggest Processing.
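
For the Processing route, skeleton tracking is usually done through the SimpleOpenNI wrapper. The following is a minimal sketch of that approach, assuming SimpleOpenNI 1.96-style callbacks (the API differs between SimpleOpenNI versions) and a working OpenNI/NITE install; it draws the depth image and marks each tracked user's head.

import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth();   // depth stream
  context.enableUser();    // user/skeleton tracking (older versions: enableUser(SimpleOpenNI.SKEL_PROFILE_ALL))
}

void draw() {
  context.update();
  image(context.depthImage(), 0, 0);

  int[] users = context.getUsers();
  for (int i = 0; i < users.length; i++) {
    if (context.isTrackingSkeleton(users[i])) {
      PVector head = new PVector();
      context.getJointPositionSkeleton(users[i], SimpleOpenNI.SKEL_HEAD, head);
      PVector screenPos = new PVector();
      context.convertRealWorldToProjective(head, screenPos);   // 3D joint to 2D image coordinates
      fill(255, 0, 0);
      ellipse(screenPos.x, screenPos.y, 20, 20);   // mark the head
    }
  }
}

// SimpleOpenNI callback: start tracking as soon as a new user is detected.
void onNewUser(SimpleOpenNI curContext, int userId) {
  curContext.startTrackingSkeleton(userId);
}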

Setting up Kinect on Mac | black label creative Update 27/04/2013: the latest test of the OpenNI 2.1.0 beta and NITE2 was good, but it's not working with the SimpleOpenNI library yet. I'll keep watching for updates and let you know when it's all running. Thanks to open source projects like OpenNI and OpenKinect, you can now use Microsoft's Kinect on more than just Windows. This guide is for those running OS X 10.6.8 or newer, but might also be applicable to anyone still running older versions, or Linux. The main parts involved here are OpenNI, SensorKinect, and NITE. Which Kinect? The hardware and interface differences mean that, for this guide and my experience at least, you will need an Xbox Kinect. Before we begin: there are some things you'll need to install before we can start with the Kinect utilities. First up is Xcode. Secondly, you'll need to get MacPorts. It's worth restarting at this point (if MacPorts doesn't get you to do it anyway) just to make sure any dependencies that are loaded at startup are there.
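
Once OpenNI, SensorKinect, and NITE are in place, one quick way to check the setup from Processing is a minimal SimpleOpenNI sketch like the one below. This is a sketch under the assumption that the SimpleOpenNI library is installed in Processing; if initialization fails, the drivers are the usual suspect.

import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  if (!context.isInit()) {
    // Initialization failed: check that the Kinect is plugged in and OpenNI/NITE/SensorKinect are installed.
    println("SimpleOpenNI could not initialize the Kinect.");
    exit();
    return;
  }
  context.enableDepth();
}

void draw() {
  context.update();
  image(context.depthImage(), 0, 0);   // a live depth image means the install works
}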

kinect_openni The OpenNI framework (the APIs) and the drivers (Sensor) are licensed under the GNU GPL. Creating the working environment (by default, in /home/$USER/): mkdir ~/kinect && cd ~/kinect. Fetching the files with git: git clone. Compilation and installation: the compilation parameters default to SSE3. Run cat /proc/cpuinfo; if you see sse3, msse3 or ssse3 among the flags, you do not have to change the compilation parameters. Then cd OpenNI/Platform/Linux/Build and make && sudo make install. The content of this wiki is licensed under CC BY-SA v3.0.

kinect_calibration/technical Description: technical aspects of the Kinect device and its calibration. Tutorial Level: ADVANCED. Authors: Kurt Konolige, Patrick Mihelich. Imager and projector placement: the Kinect device has two cameras and one laser-based IR projector (the referenced teardown image is provided by iFixit). All the calibrations done below are based on IR and RGB images of chessboard patterns, using OpenCV's calibration routines. Depth calculation: the IR camera and the IR projector form a stereo pair with a baseline of approximately 7.5 cm. Depth is calculated by triangulation against a known pattern from the projector. Disparity-to-depth relationship: for a normal stereo system, the cameras are calibrated so that the rectified images are parallel and have corresponding horizontal lines. In that case the depth of a point is z = b*f / d, where z is the depth (in meters), b is the horizontal baseline between the cameras (in meters), f is the (common) focal length of the cameras (in pixels), and d is the disparity (in pixels). The Kinect instead reports a raw disparity kd, related to the normalized disparity by d = 1/8 * (doff - kd), where doff is an offset particular to a given device; calibration against known targets is used to find b and doff.
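
As a worked example of these formulas in Processing-style code, the sketch below converts a raw disparity reading into depth. The baseline b = 0.075 m matches the text; the focal length f = 580 px and the offset doff = 1090 are illustrative, commonly quoted Kinect v1 values, not calibrated constants for any particular device.

float b    = 0.075;   // baseline between IR camera and projector, in meters (from the text)
float f    = 580.0;   // focal length in pixels (illustrative value)
float doff = 1090.0;  // device-specific disparity offset (illustrative value)

// Convert a raw Kinect disparity reading kd into depth in meters.
float depthFromRawDisparity(float kd) {
  float d = (doff - kd) / 8.0;   // normalized disparity in pixels: d = 1/8 * (doff - kd)
  if (d <= 0) return 0;          // no valid depth for this reading
  return b * f / d;              // z = b*f / d
}

void setup() {
  // Example: with these numbers a raw disparity of 600 maps to roughly 0.71 m.
  println(depthFromRawDisparity(600));
}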
