
Arduino + Servo + openCV Tutorial [#openFrameworks] by Joshua Noble

One of my favorite things about creativeapplications.net has always been the small tags beneath the name of an application indicating, among other things, the technology used to create it. That little nod to the process, and to all the work that went into creating the libraries and techniques an artist or designer uses, not only helps contextualize the work but also gives recognition to everyone who has contributed their time and expertise to building tools for creative expression in code. Figuring that some readers might be interested in learning a little more about these frameworks, I've put together a quick walk-through of how to connect two of the tools one so often sees attached to the names of the projects profiled here: openFrameworks and Arduino.

Arduino

For this tutorial you'll need a few things:

1 x Arduino-compatible device
1 x Trossen Servokit
1 x USB cable
1 x Breadboard and wires to connect the servos to the Arduino

OpenCV Tutorials and Source-Code by Shervin Emami
Last updated on 3rd October, 2010. Posted originally on 2nd June, 2010.

OpenCV is a great library for creating Computer Vision software using state-of-the-art techniques, and is freely available for Windows, Linux, Mac and even the iPhone. OpenCV was originally designed by Intel in 1999 to show how fast Intel CPUs can run. Today, however, OpenCV is mainly used for tasks that are complex in nature, often requiring postgraduate experience in the fields of Computer Vision or Artificial Intelligence (AI). If you have never used OpenCV before, you should first read my Introduction To OpenCV. OpenCV is for creating futuristic applications that perhaps no one else has built before, so it's important that you are good at computer programming BEFORE you start using OpenCV!

3dtracking - jbarandiaran

Abstract
A method for real-time 3D object tracking. During the tracking process, the algorithm continuously projects the 3D model onto the current frame using the pose estimated in the previous frame. Once projected, control points are generated along the visible edges of the object.

Introduction
The implemented model-based tracking system follows a recursive scheme. The result of each iteration is a 3x4 transformation matrix V = [ R | t ] (composed of a 3x3 rotation matrix and a 3x1 translation vector), called the motion matrix, which transforms the pose calculated in the previous frame into the pose of the object in the new frame. In order to obtain the 2D data, a CAD model of the object is employed.

Publications
Barandiaran, J., Borro, D., Basogain, X., and Izkara, J.L., "Mobile Augmented Reality for Providing Guide in Maintenance Tasks", Poster Contribution of Laval Virtual, 9th International Conference on Virtual Reality 2007 (VRIC 2007).

ilab.cs.ucsb

Overview
Handy AR presents a vision-based user interface that tracks a user's outstretched hand and uses it as the reference pattern for augmented reality (AR) inspection, providing 6-DOF camera pose estimation from the tracked fingertip configuration. A hand pose model is constructed in a one-time calibration step by measuring the fingertip positions relative to each other in the presence of ground-truth scale information. Through frame-by-frame reconstruction of the camera pose relative to the hand, 3D graphics annotations can be stabilized on top of the hand, allowing the user to inspect such virtual objects conveniently from different viewing angles in AR.

Fingertip Detection
Fingertips are detected using a curvature-based algorithm on the contour of the user's hand.

Interaction
Handy AR can be used for interacting with AR objects, such as world-stabilized objects, in combination with marker-based AR libraries such as ARTag.
