Texas Instruments (TI) is collaborating with SoftKinetic, a leading provider of end-to-end 3D sensor and gesture recognition middleware solutions, in a move designed to grow adoption of gesture control in televisions (TVs), personal computers (PCs), and a wide variety of other consumer and industrial devices.
At the 2013 Consumer Electronics Show (CES), TI will demonstrate its new 3D time-of-flight (ToF) image sensor chipset, which integrates SoftKinetic’s DepthSense pixel technology and runs SoftKinetic’s iisu middleware for finger, hand and full-body tracking.
The TI chipset powers 3D cameras that let users control a laptop and a smart TV, accessing and navigating movies, games and other content with the wave of a hand. The TV demonstration also features TI’s OMAP 5 processor, which powers a natural user interface with gesture recognition and full-HD graphics.
“SoftKinetic has long believed that motion control and gesture recognition are the future of user interfaces and digital interactivity,” said Michel Tombroff, chief executive officer of SoftKinetic, “and we are pleased to collaborate with TI to help bring this technology into the mass market.”
Current 3D gesture recognition solutions lack real-time tracking and tend to suffer from poor sensitivity, which can cause sluggish performance. TI’s ToF chipset, featuring a 3D sensor employing SoftKinetic’s DepthSense pixel technology, aims to overcome these problems. The solution enables precise tracking of finger, hand and full-body gestures. TI plans to follow its initial products with a complete portfolio of solutions suitable for various applications and form factors.
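For readers unfamiliar with the underlying sensing technique, the general time-of-flight principle can be sketched in a few lines: a ToF camera measures how long emitted light takes to bounce back from a scene and converts that delay into per-pixel depth. This is a simplified illustration of the physics only; the function names and modulation-frequency value below are hypothetical, and the sketch does not reflect the internals of TI’s chipset or SoftKinetic’s DepthSense pixels.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def depth_from_round_trip(delay_s: float) -> float:
    """Distance to an object given the measured round-trip delay
    of a light pulse: the light travels to the object and back,
    so the one-way distance is half of (speed * delay)."""
    return C * delay_s / 2.0

def depth_from_phase(phase_rad: float, mod_freq_hz: float) -> float:
    """Continuous-wave ToF sensors infer the delay from the phase
    shift between emitted and received modulated light:
    depth = c * phase / (4 * pi * f_mod)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

# An object roughly 2 m away adds about 13.3 ns of round-trip delay.
print(round(depth_from_round_trip(13.34e-9), 2))

# With a (hypothetical) 15 MHz modulation frequency, a phase shift of
# pi radians corresponds to roughly 5 m of depth.
print(round(depth_from_phase(math.pi, 15e6), 2))
```

Per-pixel depth maps produced this way are what middleware such as iisu consumes to segment and track fingers, hands and bodies in real time.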
“There are a plethora of applications that can benefit from the accuracy and resolution of this technology,” said Gaurang Shah, vice president of Audio and Imaging Products at TI. “Imagine an end equipment designer tilting, rotating, compressing and expanding a new product in 3D to inspect and evaluate it on their PC before committing to a hardware prototype. We believe our collaboration with SoftKinetic will ignite more applications like this, and foster further technology innovation to simplify the way we interact with machines.”