It’s been exciting to track the buzz at SxSW for Leap Motion, probably the hottest NUI controller to hit the market since Kinect for Windows. As with Kinect, the potential of Leap Motion lies not in the hardware (which isn’t anything radical) but in the software and the potential for game-changing, gesture-controlled applications.
Adapptor was lucky enough to be invited into Leap Motion’s Developer Program back in December 2012. Since then, and after a short wait for the hardware to arrive, we have spent some time familiarising ourselves with the SDK and racking our collective brains for useful applications of the technology.
As we collected these ideas we quickly identified that applications fell into one of three buckets: utility, education and play. Using just hand gestures to accurately control an interface opened up some exciting opportunities but, in our minds, any such use case must still present an improvement over the trusty mouse or touch screen.
It may be the mobile app blood pumping through our veins but this got us thinking about maps and more specifically way-finding in open spaces (think university campuses, shopping centres, theme parks and train stations).
As these types of destinations migrate towards a digital signage strategy, how will they allow public interaction with their maps? Giant touch screens? Maybe, but the hardware is currently cost-prohibitive and requires users to be up close and personal with the screen, which in turn puts the screens at perfect vandal height and makes them hard to weatherproof.
Enter Leap Motion, an $80 controller that works at a distance from the screen it’s connected to, and claims millimetre-level accuracy to make for a compelling user experience. The Kinect device struggles with this type of precision tracking, so we were interested to see how Leap Motion would fare when put to the task. The answer, as you can see from the prototype mapping app we created, is: well, very well indeed.
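To give a flavour of how gesture data can drive a map, here is a minimal sketch of the kind of mapping logic involved. This is our own illustration, not code from the prototype: it assumes a palm position reported as `[x, y, z]` in millimetres (as in the leapjs SDK, where y is height above the controller), and hypothetical tuning constants. Horizontal palm movement pans the map; palm height sets the zoom level.

```javascript
// Hypothetical sketch: turning Leap Motion palm positions into map controls.
// Assumes palm positions as [x, y, z] arrays in millimetres (leapjs-style,
// y = height above the controller). Constants and function names are our own.

// Pan: convert palm movement between two frames into a pixel offset.
// x (left/right) and z (towards/away from the user) map to screen x/y.
function panDelta(prevPalm, currPalm, pixelsPerMm) {
  return {
    dx: (currPalm[0] - prevPalm[0]) * pixelsPerMm,
    dy: (currPalm[2] - prevPalm[2]) * pixelsPerMm,
  };
}

// Zoom: map palm height to a zoom level, clamped to a working range
// so the map doesn't jump when the hand leaves the tracked volume.
function zoomForHeight(palmHeightMm, minMm, maxMm, minZoom, maxZoom) {
  const t = Math.min(1, Math.max(0, (palmHeightMm - minMm) / (maxMm - minMm)));
  return minZoom + t * (maxZoom - minZoom);
}
```

In a real app these helpers would be called once per tracking frame (e.g. inside a leapjs `Leap.loop` callback), with some smoothing applied before updating the map, since raw frame-to-frame palm positions are jittery at millimetre precision.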