The next information revolution will be 100 times bigger than the Internet.
Glass. Chromecast Is Google's Miracle Device. At an event where everyone was expecting a new Android tablet (and got one), Google announced something far more interesting and important: the Chromecast, a small stick that jacks into the back of your television via HDMI and allows you to sling content via Wi-Fi from your phone, tablet or computer to the big screen.
It costs $35, and comes with three months of free Netflix (even for existing subscribers), which means it effectively costs $11 plus shipping.* On paper at least, it’s the best device Google has ever announced.
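To give a feel for how that slinging starts, here is a rough sketch of discovering Cast-style receivers on the local network over mDNS. It assumes the third-party Python `zeroconf` package and the `_googlecast._tcp` service type that Cast devices are generally understood to advertise; it is an illustration, not Google's SDK.

```python
# Rough sketch: find Chromecast-style receivers on the LAN via mDNS/DNS-SD.
# Assumes the third-party `zeroconf` package and the "_googlecast._tcp"
# service type; details are assumptions, not Google's official Cast SDK.
import socket
import time

from zeroconf import ServiceBrowser, Zeroconf


class CastListener:
    def add_service(self, zc, service_type, name):
        info = zc.get_service_info(service_type, name)
        if info and info.addresses:
            address = socket.inet_ntoa(info.addresses[0])
            print(f"Found cast device {name} at {address}:{info.port}")

    def remove_service(self, zc, service_type, name):
        print(f"Cast device {name} left the network")

    def update_service(self, zc, service_type, name):
        pass  # required by newer zeroconf versions; nothing to do here


zc = Zeroconf()
browser = ServiceBrowser(zc, "_googlecast._tcp.local.", CastListener())
try:
    time.sleep(10)  # listen for announcements for a few seconds
finally:
    zc.close()
```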
Adobe announces first hardware, the Project Mighty smart stylus and Napoleon ruler. Adobe has just announced its first hardware initiative, a pressure-sensitive stylus and an electronic ruler that will tightly integrate with its software applications.
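The appeal of pairing a stylus with a digital straightedge is easy to illustrate: freehand stroke points get snapped onto the guide's line. The sketch below is purely illustrative geometry, not Adobe's actual algorithm, and `snap_to_guide` is a made-up helper.

```python
# Illustrative geometry only: snap freehand stroke points onto a straight
# guide line, the basic trick a digital ruler enables. Not Adobe's algorithm.
def snap_to_guide(points, guide_start, guide_end):
    """Project each (x, y) stroke point onto the line through the guide."""
    (x0, y0), (x1, y1) = guide_start, guide_end
    dx, dy = x1 - x0, y1 - y0
    length_sq = dx * dx + dy * dy or 1e-9  # guard against a degenerate guide
    snapped = []
    for x, y in points:
        t = ((x - x0) * dx + (y - y0) * dy) / length_sq
        snapped.append((x0 + t * dx, y0 + t * dy))
    return snapped


# A wobbly stroke drawn along a horizontal ruler edge becomes a clean line.
stroke = [(0.0, 0.2), (1.1, -0.1), (2.0, 0.3), (3.2, 0.1)]
print(snap_to_guide(stroke, (0.0, 0.0), (4.0, 0.0)))
```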
The company's Project Mighty stylus and Napoleon ruler have been showcased connecting to an iPad and iPhone over Bluetooth. The pen works much like existing styli, but when working alongside Napoleon, the two tools can be used to create curved and angled shapes in a way that would be difficult to do with a third-party stylus. So far, the tools have only been demonstrated working with an unreleased app, which Adobe told us was created specifically for the hardware. How Adobe Reinvented The Pen To Draw On The Internet.
This week, Adobe announced that the Creative Suite was becoming the subscription-based Creative Cloud.
It didn’t go so well. But amidst the bad news, we may have lost sight of Adobe’s rationale, beyond profits, for pushing the cloud. And you can see that rationale hiding inside Project Mighty. On one hand, it’s just an aluminum stylus that can replace your finger on the iPad screen. A New Flexible Keyboard Features Clickable Buttons. A very thin keyboard that uses shape-changing polymers to replicate the feel and sound of chunky, clicking buttons could be in laptops and ultrabooks next year.
Strategic Polymers Sciences, the San Francisco-based company that developed the keyboard, is working on transparent coatings that would enable this feature in touch screens. Today’s portable electronics provide rudimentary tactile feedback—many cell phones can vibrate to confirm that the user has pressed a button on a touch screen, for example. These vibrations are produced by a small motor, meaning the entire phone moves rather than just the spot on the screen where the button is, and there can be a lag in response time. Tangible Media Group. inFORM is a Dynamic Shape Display that can render 3D content physically, so users can interact with digital information in a tangible way. inFORM can also interact with the physical world around it, for example moving objects on the table’s surface.
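Rendering content on a pin-based shape display boils down to resampling a height field onto the pin grid and quantizing it to each actuator's travel. A toy sketch of that general idea follows; the grid size and step count are assumptions, and this is not the Tangible Media Group's software.

```python
# General idea behind a pin-based shape display: quantize a height field into
# per-pin actuator positions. A toy sketch, not the Tangible Media Group's code.
import numpy as np

PIN_GRID = (30, 30)      # pins across the table surface (assumed size)
ACTUATOR_STEPS = 100     # discrete positions each pin can take (assumed)


def heights_to_pin_commands(height_field: np.ndarray) -> np.ndarray:
    """Map arbitrary height values onto the actuators' 0..ACTUATOR_STEPS range."""
    lo, hi = height_field.min(), height_field.max()
    normalized = (height_field - lo) / max(hi - lo, 1e-9)
    # Resample to the pin grid by simple nearest-neighbour indexing.
    rows = np.linspace(0, height_field.shape[0] - 1, PIN_GRID[0]).astype(int)
    cols = np.linspace(0, height_field.shape[1] - 1, PIN_GRID[1]).astype(int)
    sampled = normalized[np.ix_(rows, cols)]
    return np.round(sampled * ACTUATOR_STEPS).astype(int)


# e.g. render a gentle bump in the middle of the table
y, x = np.mgrid[0:240, 0:320]
bump = np.exp(-((x - 160) ** 2 + (y - 120) ** 2) / 5000.0)
print(heights_to_pin_commands(bump).shape)  # -> (30, 30)
```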
Remote participants in a video conference can be displayed physically, allowing for a strong sense of presence and the ability to interact physically at a distance. inFORM is a step toward the group’s vision of Radical Atoms. Project Tango Shows a Future Filled with 3-D Mapping. Four months after Google unveiled Project Tango—a prototype Android smartphone with cameras and sensors that capture the phone’s environment in 3-D—developers are using the device to make cheap drones for surveying zones, more immersive video games, and even a body-part scanning system that could lead to a better-fitting suit.
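Capturing the environment in 3-D usually means back-projecting each depth pixel through the camera intrinsics into a point cloud. The sketch below shows that standard step with placeholder intrinsics rather than Tango's real calibration or pipeline.

```python
# Standard pinhole back-projection: turn a depth image into a 3-D point cloud.
# The intrinsics below are placeholders, not Project Tango's real calibration.
import numpy as np

FX, FY = 520.0, 520.0    # assumed focal lengths in pixels
CX, CY = 320.0, 240.0    # assumed principal point


def depth_to_point_cloud(depth_m: np.ndarray) -> np.ndarray:
    """Return an (N, 3) array of X, Y, Z points in metres for valid pixels."""
    h, w = depth_m.shape
    v, u = np.mgrid[0:h, 0:w]
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading


fake_depth = np.full((480, 640), 1.5)  # a flat wall 1.5 m away
print(depth_to_point_cloud(fake_depth).shape)
```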
As well as a normal front-facing camera and sensors that measure orientation and motion, the device has a fish-eye camera on its rear that sees 180 degrees and a depth-sensing camera that uses infrared to capture a 3-D representation of the surrounding world. Microsoft’s Perceptive Pixel premise: The future of touch computing isn’t stuck in your pocket. When Microsoft purchased the Perceptive Pixel company, maker of 55- and 82-inch touchscreen displays that go by the eponymous acronym PPI, what it intended to do with the firm wasn’t completely clear.
At the time, Microsoft stated the following: The acquisition of PPI allows us to draw on our complementary strengths, and we’re excited to accelerate this market evolution [...] PPI’s large touch displays, when combined with hardware from our OEMs, will become powerful Windows 8-based PCs and open new possibilities for productivity and collaboration. For more context, here’s how TNW reported Microsoft CEO Steve Ballmer’s announcement of the purchase: According to Ballmer, Microsoft will [utilize] its research, development and production of multi-touch technologies to further upcoming software and hardware, with the company showing off its huge 82-inch touch-enabled screen at the WPC event. This Amazing 3-D Desktop Was Born at Microsoft. SpaceTop, a 3-D desktop environment you can reach into, was shown at the TED conference today by Jinha Lee, who developed the system during and after his internship at Microsoft Applied Science.
LONG BEACH, California – The history of computer revolutions will show a logical progression from the Mac to the iPad to something like this SpaceTop 3-D desktop, if computer genius Jinha Lee has anything to say about it. Imogen Heap Performance with Musical Gloves Demo: Full Wired Talk 2012. Gadget iGeak – NFC-powered Ring. WiSee. Eye Tracking Research and Human-Computer Interaction. Eye tracking has long been used to analyze user behavior and user interface usability in a wide range of human-computer interaction (HCI) research and practices and as an actual control medium in a human–computer dialogue.
Eye tracking to analyze user behavior and usability. When analyzing user behavior and usability, user eye movements during systems interaction are recorded and later analyzed. Eye movements provide objective data on the physiological and perceptual impact of interaction. Eye tracking measures are seldom used in isolation, but together with other physiological measures and qualitative methods. Eye tracking is commonly used to test the usability of websites, software, computer games, interactive TV, digital map interfaces, mobile devices and other physical devices.
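Gaze data from such studies is often reported as a heatmap, which can be built by accumulating fixations on a screen-sized grid and smoothing. The sketch below is illustrative only, with assumed screen dimensions; it is not any particular lab's tooling.

```python
# Sketch of how gaze recordings are commonly turned into a heatmap: accumulate
# fixation points on a grid, then blur. Illustrative, not any lab's tooling.
import numpy as np
from scipy.ndimage import gaussian_filter

SCREEN_W, SCREEN_H = 1280, 720  # assumed display resolution


def gaze_heatmap(fixations, sigma=30):
    """fixations: iterable of (x, y, duration_ms) samples in screen pixels."""
    grid = np.zeros((SCREEN_H, SCREEN_W))
    for x, y, duration in fixations:
        if 0 <= x < SCREEN_W and 0 <= y < SCREEN_H:
            grid[int(y), int(x)] += duration  # weight by fixation length
    return gaussian_filter(grid, sigma=sigma)


demo = [(640, 360, 420), (650, 355, 180), (200, 500, 300)]
heat = gaze_heatmap(demo)
print(heat.argmax() // SCREEN_W, heat.argmax() % SCREEN_W)  # hottest pixel
```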
One example is a heatmap of the interactive TV format The Space Trainees, which the iDTV Lab at Åbo Akademi University in Finland tested using eye tracking. Head Tracking for Desktop VR Displays using the WiiRemote. Oculus Rift - Virtual Reality Headset for 3D Games. Microsoft HoloLens - Possibilities.
LED Lights Make Augmented Vision a Reality. Okay, this is just freaky.
We know LED lights are versatile enough to be used for practically anything, but LED contact lenses? Really?! Yes, as it turns out, really. University of Washington researchers have figured out how to implant semitransparent red and blue LED lights in contact lenses, for the purpose of receiving and displaying data in sharp visual images and video.
Once miniature green LEDs are developed (and they’re in the works as of now), full-color displays will be possible. Lead researcher Babak Parviz comments, “You won’t necessarily have to shift your focus to see the image generated by the contact lens”; it would just appear in front of you, and your view of the real world would be completely unobstructed when the display is turned off. Contact Lens Computer: Like Google Glass, without the Glasses. For those who find Google Glass indiscreet, electronic contact lenses that outfit the user’s cornea with a display may one day provide an alternative. Built by researchers at several institutions, including two research arms of Samsung, the lenses use new nanomaterials to solve some of the problems that have made contact-lens displays less than practical.
A group led by Jang-Ung Park, a chemical engineer at the Ulsan National Institute of Science and Technology, mounted a light-emitting diode on an off-the-shelf soft contact lens, using a material the researchers developed: a transparent, highly conductive, and stretchy mix of graphene and silver nanowires. The researchers tested these lenses in rabbits—whose eyes are similar in size to humans’—and found no ill effects after five hours. The animals didn’t rub their eyes or grow bloodshot, and the electronics kept working. Elon Musk shapes a 3D virtual rocket part with his hands — and Leap Motion. In his typical fashion, SpaceX and Tesla founder Elon Musk has been teasing his Twitter followers for a few weeks about an Iron Man-like system to design rocket parts with hand gestures before 3D printing them. A video documenting the process just went live, and it’s nothing too groundbreaking.
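At its core, this kind of gesture-driven modeling maps tracked hand coordinates onto model transforms. The toy sketch below scales a part by the change in distance between two palms; `read_palm_positions()` is a hypothetical stand-in for a hand-tracking SDK, not the Leap Motion API.

```python
# Toy illustration of gesture-driven CAD manipulation: map the distance between
# two tracked palms onto an object's scale. `read_palm_positions()` is a
# hypothetical stand-in for a hand tracker SDK, not the Leap Motion API.
import math


def read_palm_positions():
    # Placeholder: a real tracker would return live (x, y, z) palm coordinates.
    return (-120.0, 200.0, 30.0), (140.0, 210.0, 25.0)


def palm_spread(left, right):
    return math.dist(left, right)


baseline = palm_spread(*read_palm_positions())
# ...later in the interaction loop...
current = palm_spread(*read_palm_positions())
scale_factor = current / baseline if baseline else 1.0
print(f"apply uniform scale of {scale_factor:.2f} to the selected part")
```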
SpaceX paired a Leap Motion gesture reader with its Siemens NX computer-aided design software and added 3D glasses, allowing a designer to shape the part with their hands in a 3D environment. They can’t build a design from scratch, but they can take actions like modifying the shape of an object. Musk demonstrated in the video that it is also a useful way to examine a design in three dimensions. SpaceX has spent a few months working with a Leap Motion controller, during which time it shifted from displaying the designs on a simple computer screen to 3D environments. A Gestural Interface for Smart Watches. If just thinking about using a tiny touch screen on a smart watch has your fingers cramping up, researchers at the University of California at Berkeley and Davis may soon offer some relief: they’re developing a tiny chip that uses ultrasound waves to detect a slew of gestures in three dimensions.
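The core of ultrasonic gesture sensing is timing echoes: distance follows from time of flight, and several receivers allow rough triangulation. The toy calculation below shows the time-of-flight step only and is not Chirp's actual signal processing.

```python
# The core of ultrasonic gesture sensing is timing echoes: distance follows
# from time of flight. A toy calculation, not Chirp's actual signal processing.
SPEED_OF_SOUND = 343.0  # metres per second in air at ~20 °C


def echo_to_distance(round_trip_s: float) -> float:
    """Distance to the reflecting hand, given the echo's round-trip time."""
    return SPEED_OF_SOUND * round_trip_s / 2.0


# An echo returning after ~580 microseconds puts the hand about 10 cm away.
print(f"{echo_to_distance(580e-6) * 100:.1f} cm")
```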
The chip could be embedded in wearable gadgets. The technology, called Chirp, is slated to be spun out into its own company, Chirp Microsystems, to produce the chips and sell them to hardware manufacturers. Welcome to Project Soli. How An Intelligent Thimble Could Replace the Mouse In 3D Virtual Reality Worlds. The way in which humans interact with computers has been dominated by the mouse since it was invented in the 1960s by Doug Engelbart.
A mouse uses a flat two-dimensional surface as a proxy for a computer screen. Speech Recognition Breakthrough for the Spoken, Translated Word. Invention Awards: A Real-Life Babel Fish For the Speaking Impaired. The Audeo captures electronic signals between the brain and the vocal cords, picked up by electrodes on the throat, and synthesizes clear, spoken words. Brain Hacking: Scientists Extract Personal Secrets With Commercial Hardware. The Wildly Ambitious Quest to Build a Mind-Controlled Exoskeleton by 2014. A Brain-Controlled Robotic Arm. I was about 15 minutes late for my first phone call with Jan Scheuermann. A sensational breakthrough: the first bionic hand that can feel.
A Prosthetic Hand That Sends Feelings to Its Wearer. There have been remarkable mechanical advances in prosthetic limbs in recent years, including rewiring nerve fibers to control sophisticated mechanical arms (see “A Lifelike Prosthetic Arm”), and brain interfaces that allow for complicated thought control (see “Brain Helps Quadriplegics Move Robotic Arms with Their Thoughts”).
Steered by thoughts, drone flies through hoops. Harvard creates brain-to-brain interface, allows humans to control other animals with thoughts alone. This Mind-Reading Headset Gives You The Power Of The Force. Five years ago, Vietnamese-Australian inventor and Emotiv CEO Tan Le released the Emotiv EPOC neuroheadset, billed as the world’s first commercial brain-computer interface. The product, which still sells for $300, proved to be a hit, making it clear that the public craved this new kind of wearable technology. David Eagleman: Can we create new senses for humans?