Kinect + Arduino | Tanner's Website
With an Arduino Ethernet, Processing, and a Kinect, I was able to easily create this little demo where hand movement controls a servo. This is just a tiny step in my master plan to create a robot clone so that I don't have to leave my chair.

The following libraries and drivers made this work and also made it super easy to build:
- OpenKinect
- Daniel Shiffman's Processing Kinect library (he knows his stuff and has great examples on his site)
- Arduino Ethernet UDP send / receive string
- Servo: EMAX ES08A

How it works (see the sketch below):
1. The Arduino Ethernet acquires an IP address and waits for UDP packets on a certain port.
2. The machine with the Kinect sends packets to the Arduino containing hand coordinate data.
3. The Arduino takes this data (an integer) and maps it to the range 0 to 180 degrees.
4. The mapped value is sent to the servo.
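For reference, here is a minimal Arduino sketch of the receive-and-map side. This is a hedged reconstruction, not the original code: the MAC address, listening port 8888, servo pin 9, the 0-640 input range, and the ASCII-integer packet format are all assumptions.

    #include <SPI.h>
    #include <Ethernet.h>
    #include <EthernetUdp.h>
    #include <Servo.h>

    byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED };
    const unsigned int LOCAL_PORT = 8888;  // assumed listening port

    EthernetUDP Udp;
    Servo servo;
    char packetBuffer[16];

    void setup() {
      Ethernet.begin(mac);    // acquire an IP address via DHCP
      Udp.begin(LOCAL_PORT);  // wait for UDP packets on that port
      servo.attach(9);        // assumed servo signal pin
    }

    void loop() {
      if (Udp.parsePacket() > 0) {
        int len = Udp.read(packetBuffer, sizeof(packetBuffer) - 1);
        packetBuffer[len] = '\0';
        int handX = atoi(packetBuffer);          // hand coordinate as ASCII integer
        int angle = map(handX, 0, 640, 0, 180);  // map Kinect X range to degrees
        servo.write(constrain(angle, 0, 180));   // drive the servo
      }
    }

On the Kinect side, the Processing sketch would simply send each hand coordinate as a small UDP packet to the Arduino's IP address and port.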
Kinect - Medien Wiki
The Microsoft® XBOX 360 Kinect is a motion controller built on depth-sensing technology developed by PrimeSense. It projects a pattern with infrared light and calculates a depth image using a camera. It also has a color camera and four microphones.

Software applications:
- CocoaKinect App
- Freenect by Robert Pointon
- Synapse generates skeleton data and provides it as OSC (Open Sound Control)
- ofxFaceTracker provides face metrics (orientation, eye and mouth open/closed) over OSC (Open Sound Control)
- codelaboratories.com/kb/nui
- TUIO Kinect lets you define a depth range where multiple blobs can be detected

SDKs, frameworks and libraries (depth image):
- openkinect.org drivers and installation
- pix_freenect for Pd (Pure Data), a dataflow programming environment (incl. binaries that work without any compiling)
- fux_kinect object for Pd (Pure Data)
- ofxKinect, the openFrameworks Kinect integration
- vvvv Kinect integration
Mobile Autonomous Robot using the Kinect
Given a priori knowledge of the environment and the goal position, mobile robot navigation refers to the robot's ability to safely move towards the goal using its knowledge and sensory information about the surrounding environment. In practice, for a mobile robot operating in an unstructured environment, knowledge of the environment is usually absent or partial. Therefore, obstacle detection and avoidance are essential for mobile robot missions. The Kinect is not only a normal camera sensor but also a special device that can provide a depth map. The depth map is acquired through the OpenNI library and then processed by the Point Cloud Library to extract accurate information about the environment. Here is the link to the full project: (code + references in English, others in Vietnamese but still easy to understand from the source code) Some fun stuff using the Kinect is available on my channel.
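As a rough illustration of that pipeline, here is a minimal C++ sketch using PCL's OpenNI grabber. It follows the standard PCL tutorial pattern rather than the project's actual code, and the callback body is a placeholder.

    #include <pcl/io/openni_grabber.h>
    #include <pcl/point_types.h>
    #include <boost/bind.hpp>
    #include <iostream>

    // Called once per frame with an organized point cloud built from the depth map.
    void cloudCallback(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& cloud) {
      std::cout << "received cloud with " << cloud->points.size() << " points\n";
      // ...obstacle detection / segmentation would go here...
    }

    int main() {
      pcl::OpenNIGrabber grabber;   // wraps the OpenNI driver for the Kinect
      boost::function<void (const pcl::PointCloud<pcl::PointXYZ>::ConstPtr&)> f =
          boost::bind(&cloudCallback, _1);
      grabber.registerCallback(f);  // PCL invokes the callback as frames arrive
      grabber.start();
      std::cin.get();               // stream until Enter is pressed
      grabber.stop();
      return 0;
    }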
Vizual Invaders - blog
U.F.O. by VJ ZERO. Here is the new creation by visual artist ZERO (www.zero.com): U.F.O (Unknown Flashing Object), produced by the label and quite possibly the most beautiful project we have taken part in. CLICK HERE TO SEE THE VIDEO! ∞INFINITY∞ by Vizual Invaders. Here is the label's new stage design; polymorphic, it is designed to adapt to venues of any size.
rosnodejs - Program robots with JavaScript
Rosnodejs is currently deprecated. Most of my efforts in JavaScript and robotics have shifted to the Robot Web Tools project. I highly recommend taking a look at Robot Web Tools if you are interested in putting your robot on the web. Features include:
- a JavaScript interface to ROS functionality
- 2D tools for mapping and more
- 3D tools for robot visualization in a 3D environment

I still feel a Node.js interface to ROS is important. Rosnodejs is a Node.js module that lets you use JavaScript to interact with the Robot Operating System (ROS), an open-source robot framework used by many of the top universities and research programs around the world and one of the top frameworks for programming robots today. It lets you perform a range of robotic tasks, from controlling the motors on an Arduino to processing Kinect sensor data, using JavaScript and Node.js. The goal is to make the field of robotics more accessible to the countless intelligent web developers out there.
Kinect | Doc-Ok.org
I just read an interesting article, a behind-the-scenes look at the infamous "Milo" demo Peter Molyneux did at 2009's E3 to introduce Project Natal, i.e., Kinect. This article is related to VR in two ways. First, the usual progression of overhyping the capabilities of some new technology and then falling flat on one's face (because not even one's own developers know what the new technology's capabilities actually are) should be very familiar to anyone working in the VR field. But here's the quote that really got my interest (emphasis is mine): "Others recall worrying about the presentation not being live, and thinking people might assume it was fake." Gee, sounds familiar? With the "Milo" demo, the problem was similar. The take-home message here is that mainstream games are slowly converging towards approaches that have been embodied in proper VR software for a long time now, without really noticing it, and are repeating old mistakes along the way.
OpenGL - Tutorial 09: Blending
Introduction. Blending is commonly used to make objects translucent. To view and understand some blending effects, it requires some background on how OpenGL computes blending. That discussion is a little long, so I put it in Lesson 3; reading it is highly recommended for an accurate understanding of this tutorial.

Sample uses of blending. In this tutorial we will see some applications of blending (the technical part is covered in Lesson 3):
- making an object translucent
- mixing pictures
- filter effects
Many other effects can be created with blending.

Translucent objects. Translucency is the most common use of blending. Without blending, when an object is rendered, all pixels drawn replace the existing pixels in the frame buffer. With blending enabled, the formula defined with glBlendFunc combines the two: srcColor * srcFactor + destColor * destFactor. In the case of multiple translucent objects, disable writing to the depth buffer (Lesson 3). You can control how translucent the object is, as in the sketch below.

Mixing pictures. In the first method, the alpha value is 0.75.
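As a hedged sketch of the setup described above (the draw call is hypothetical, and the 0.75 alpha mirrors the picture-mixing example):

    glEnable(GL_BLEND);
    // standard translucency factors: srcColor*srcAlpha + destColor*(1 - srcAlpha)
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    glDepthMask(GL_FALSE);               // stop depth writes for translucent objects
    glColor4f(1.0f, 1.0f, 1.0f, 0.75f);  // alpha 0.75 controls how translucent it is
    drawTranslucentObject();             // hypothetical draw call
    glDepthMask(GL_TRUE);                // restore depth writes for opaque geometry
    glDisable(GL_BLEND);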
Basic OpenGL Lighting.
by Steve Baker

Introduction. Many people starting out with OpenGL are confused by the way that OpenGL's built-in lighting works, and consequently by how colour functions. I hope to be able to clear up some of the confusion. What is needed to explain this clearly is a flow chart:

Lighting ENABLED or DISABLED? The first, and most basic, decision is whether to enable lighting or not:

    glEnable ( GL_LIGHTING ) ;
    ...or...
    glDisable ( GL_LIGHTING ) ;

If it's disabled, then all polygons, lines and points will be coloured according to the setting of the various forms of the glColor command. For example:

    glColor3f ( 1.0f, 0.0f, 0.0f ) ;

...gets you a pure red triangle no matter how it is positioned relative to the light source(s). With GL_LIGHTING enabled, we need to specify more about the surface than just its colour: we also need to know how shiny it is, whether it glows in the dark, and whether it scatters light uniformly or in a more directional manner. Later sections of the article cover glMaterial and glLight, glColorMaterial (and why it's slow), and glNormal; a combined sketch follows.
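Putting those pieces together, a minimal fixed-function setup might look like this. The light position and material values here are made up for illustration and are not from the article:

    GLfloat lightPos[]   = { 1.0f, 1.0f, 1.0f, 0.0f };  // w=0: directional light
    GLfloat matDiffuse[] = { 1.0f, 0.0f, 0.0f, 1.0f };  // red, diffusely scattered

    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightfv(GL_LIGHT0, GL_POSITION, lightPos);
    glMaterialfv(GL_FRONT, GL_DIFFUSE, matDiffuse);

    // With lighting enabled, glColor is ignored unless GL_COLOR_MATERIAL is
    // enabled; correct shading also needs per-vertex normals:
    glNormal3f(0.0f, 0.0f, 1.0f);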
Invisible Piano (Keyboard Anywhere, a Kinect Piano)
After writing my previous instructable, I was asked about installing some slightly different software to use with the Kinect. Since I'd already done it, I figured it wouldn't take too long to retrace my steps and write the instructable. After much frustration, I figured out a really easy process to get everything installed and talking. This instructable will walk you through getting a virtual keyboard working with the current release (11.04) of Ubuntu. There are other ways of doing this (which I've done in the past), but in retracing the task I found many shortcuts compared to what I did on the command line previously. If you have any questions about the command line or getting around in Ubuntu, please see my previous instructable. Also, don't type anything between [ ] into the terminal; it's there for reference.