Light Leaks - Filling a room with projected light / by @kcimc + @halfdanj

Created by Kyle McDonald and Jonas Jongejan for the CLICK Festival 2013 in Elsinore, Denmark, Light Leaks is a light installation composed of fifty mirror balls projecting controlled light into the room. The general idea was to make use of found objects, in this case mirror balls, which, as Kyle explains to CAN, have a fairly chaotic structure compared to the perfect grid of a projected image we are accustomed to. Influenced by Kyle's work with Joanie Lemercier at ScreenLab in Manchester last year, where he learned how important peripheral vision can be in creating an immersive experience, Light Leaks is an attempt to fill a room with projected light in a way that can't be achieved with projectors alone. The pile of fifty balls sits in the centre of the room, with three projectors pointed at it. Optically, this is closely related to Kyle's work with Elliot Woods earlier this year, ExR3.

Kyle McDonald | Jonas Jongejan | More images on Flickr | Code on GitHub
Vizual Invaders - blog

U.F.O. by VJ ZERO. Here is the new creation by visual artist ZERO (www.zero.com): U.F.O (Unknown Flashing Object), produced by the label and certainly the most beautiful project we have taken part in.

∞INFINITY∞ by Vizual Invaders. Here is the label's new stage design; polymorphic, it is built to adapt to stages of any size.

MirrorFugue - Music collaboration across space and time / by @xiaosquared @medialab

MirrorFugue is a Ph.D. research project by Xiao Xiao at the MIT Media Lab, exploring the communication of gesture in musical collaboration across space and time. The project comprises a set of interfaces for the piano that visualise the gestures of a performance. Based on the idea that the visibility of gesture contributes to learning and synchronisation, MirrorFugue displays the hand and body movements of piano playing using metaphors from the physical world to connect musicians from disparate spaces and times – you can even play along with your past self. "We designed two modes to visualize the hand gesture of a performance, which we term 'Reflection' and 'Organ'. Inspired by the reflective surface on a lacquered grand piano that mirrors the keyboard and player's hands, Reflection mode shows the mirrored keyboard and hands of a performance." MirrorFugue can also be used in remote lessons to enable teachers and students to see each other's playing.

Project Page | Xiao Xiao
Getting Started with Kinect and Processing

So, you want to use the Kinect in Processing. Great. This page will serve to document the current state of my Processing Kinect library, with some tips and info.

The current state of affairs
Since the Kinect launched in November 2010, several models have been released.
Kinect 1414: This is the original Kinect and works with the library documented on this page in Processing 2.1.
Kinect 1473: This looks identical to the 1414, but is an updated model.
Now, before you proceed, you could also consider using the SimpleOpenNI library and reading Greg Borenstein's Making Things See book.

I'm ready to get started right now
What hardware do I need? First you need a "stand-alone" Kinect (model 1414 only for now!):
Standalone Kinect Sensor
Kinect Sensor Power Supply: if you have a Kinect that came bundled with an Xbox, it will not include the USB adapter, so you will also need this power supply.

Um, what is Processing?
What if I don't want to use Processing? Alternatives include ofxKinect (openFrameworks) and the Kinect CinderBlock (Cinder). More resources from: The OpenKinect Project.

So now what? A minimal depth-image sketch is shown below.
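As a rough illustration of the library, here is a minimal Processing sketch that displays the Kinect depth image. The calls used here (Kinect, start(), enableDepth(), getDepthImage()) follow the Processing 2.x era of the Open Kinect for Processing library described on this page; newer releases renamed some of these methods, so compare with the examples bundled with your version.

import org.openkinect.*;
import org.openkinect.processing.*;

Kinect kinect;  // handle to the connected Kinect (model 1414)

void setup() {
  size(640, 480);
  kinect = new Kinect(this);
  kinect.start();            // open the device
  kinect.enableDepth(true);  // request the depth stream
}

void draw() {
  // draw the most recent depth frame as a grayscale image
  image(kinect.getDepthImage(), 0, 0);
}

void stop() {
  kinect.quit();             // release the device when the sketch closes
  super.stop();
}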
Kinect - Medien Wiki

The Microsoft® Xbox 360 Kinect is a motion controller developed by Microsoft, built on depth-sensing technology from PrimeSense. It projects a pattern of infrared light and calculates a depth image using a camera. It also has a color camera and four microphones. The tilt (Y axis) of the sensor can be controlled remotely via a built-in motor, and the color of the LED (light-emitting diode) can be set in software as well.

About
Blogs and portals
Software

Applications
CocoaKinect App
Freenect by Robert Pointon
Synapse generates skeleton data and provides it as OSC (Open Sound Control); see the OSC sketch at the end of this page for receiving such messages
ofxFaceTracker provides face metrics (orientation, eye and mouth open/closed) over OSC (Open Sound Control)
codelaboratories.com/kb/nui
TUIO Kinect lets you define a depth range in which multiple blobs can be detected.

SDKs, Frameworks and Libraries
Depth image
Skeleton data

Running depth image and skeleton data on the workstation "Rafael"
Change the computer's user to admin (Apple menu: Log Out User).

Successors/Competitors
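Several of the tools listed above (Synapse, ofxFaceTracker, TUIO Kinect) hand their data to other applications over OSC. As a minimal sketch of the receiving side, here is a Processing example using the oscP5 library that simply prints every incoming message; the port number 12345 is an assumption for illustration, so check the documentation of the sending application for its actual port and address patterns.

import oscP5.*;
import netP5.*;

OscP5 osc;  // OSC receiver

void setup() {
  size(200, 200);
  // listen for incoming OSC messages on port 12345 (placeholder port)
  osc = new OscP5(this, 12345);
}

void draw() {
  background(0);  // nothing to draw; we only log messages
}

// oscP5 calls this for every OSC message that arrives
void oscEvent(OscMessage msg) {
  println("address: " + msg.addrPattern() + "  typetag: " + msg.typetag());
}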