eyeSight, a developer of intuitive gesture control technology, has, in partnership with ARM, completed extensive work to optimise its gesture recognition solution for ARM Mali-T600 series Graphics Processing Units (GPUs). Manufacturers using Mali GPUs will now be able to use eyeSight’s advanced gesture control capabilities, optimised using GPU Compute for improved accuracy and energy efficiency.
The improved efficiency of gesture computation on the GPU also enables a variety of new use cases, such as face and emotion detection, long-distance finger tracking, and even 3D motion recognition (such as finger pointing for selection).
eyeSight’s engine, optimised for ARM Mali-T600 GPU Compute, now provides a solution for enabling gesture control in mobile phones, tablets, TVs and a range of other devices. Products featuring ARM Mali GPUs with eyeSight’s gesture solution will allow users to control user interfaces (UIs) and content such as music and movies, activate usability applications, play games, or browse menus with easy yet powerful hand- and fingertip-level gestures.
Using eyeSight’s technology, Mali devices will now be able to recognise a richer language of gestures, including directional gestures (such as up, right and wave), hand signs (such as a ‘thumbs-up’), and tracking of hands and even fingertips (with mouse-cursor accuracy).
“ARM is excited to be working closely with eyeSight to offer our mutual customers advanced gesture recognition technology,” commented Pete Hutton, executive vice president and general manager, Media Processing Division, ARM. “The optimisation of gesture middleware solutions using Mali GPU Compute in combination with ARM Cortex-A processors using NEON technology is an industry first. It enables impressive performance, accuracy, robustness and efficiency. Developers no longer need to worry about processing or ambient limitations as they create the gesture-enabled applications of the future.”
“For ARM to have enabled us to optimise our software for use as part of its ARM Mali GPU software stack shows that ARM understands the huge market appeal of our software and of gesture technology more generally, which is quickly becoming a prominent differentiating feature in devices of all kinds,” commented Gideon Shmuel, CEO of eyeSight. “By making it easier to integrate gesture control into ARM-powered devices, eyeSight and ARM are providing a route to really bring gesture to the masses. We look forward to developing our partnership with ARM.”
Low-quality video, whether caused by low-light conditions or slow CPUs, usually results in poor or compromised gesture recognition. However, eyeSight’s technology is particularly efficient, and Mali GPUs take processing load away from the CPU. What’s more, eyeSight’s video pre-processing sits between the camera and the gesture-control engine, ‘cleaning up’ the image so that the shapes and movements of hands and fingers can be recognised even when the original image is substandard. This improves overall performance in low-light conditions or when using low-resolution cameras.
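To illustrate the kind of ‘clean-up’ such a pre-processing stage can perform (this is a generic sketch, not eyeSight’s actual algorithm), the following snippet applies histogram equalisation to a dark grayscale frame, spreading its crowded grey levels across the full range so that hand shapes become easier to distinguish:

```python
# Generic illustration of low-light pre-processing: histogram
# equalisation of a small grayscale frame (list of rows of 0-255 values).

def equalise(frame, levels=256):
    """Spread a dark image's grey levels across the full 0..levels-1 range."""
    flat = [p for row in frame for p in row]
    # Histogram of pixel values.
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    # Cumulative distribution of pixel values.
    cdf, total = [], 0
    for count in hist:
        total += count
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    n = len(flat)
    def remap(p):
        return round((cdf[p] - cdf_min) / max(n - cdf_min, 1) * (levels - 1))
    return [[remap(p) for p in row] for row in frame]

# A 'low-light' frame: every pixel crowded near the bottom of the range.
dark = [[10, 12, 12], [14, 10, 16], [12, 16, 14]]
bright = equalise(dark)  # values now span the full 0..255 range
```

Real pipelines typically combine steps like this with denoising and contrast adjustment, but the principle is the same: improve the input before the recognition stage ever sees it.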
By using eyeSight’s technology, Mali GPUs will also be able to process gestures in three dimensions, via two-camera (‘stereoscopic’) devices and devices with infrared (IR) depth perception.
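The stereoscopic case rests on a standard triangulation idea (again a generic sketch, not eyeSight’s implementation, with hypothetical camera parameters): the horizontal shift, or ‘disparity’, of a fingertip between the left and right camera images determines its distance from the device.

```python
# Generic stereo triangulation: depth = focal_length * baseline / disparity.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Distance (metres) of a point from a calibrated two-camera rig.

    focal_px     -- camera focal length in pixels
    baseline_m   -- distance between the two cameras in metres
    disparity_px -- horizontal shift of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 700 px focal length, cameras 6 cm apart.
# A fingertip shifted 70 px between the two views is 0.6 m from the device.
distance = depth_from_disparity(700, 0.06, 70)
```

Nearby objects produce large disparities and distant ones small disparities, which is why two-camera devices can distinguish a finger pointing toward the screen from one merely moving across it.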