Windows 7 Touch API

Microsoft is definitely pushing the concept of a “Natural User Interface” with the development of their Surface product as well as some related projects (like a much more cost-effective rear-projection / laser-based unit), and the new Windows 7 Touch API furthers the concept by moving multi-touch controls away from specialized devices like in-store kiosks and putting them on every PC running Windows 7. Of course you’ll still need the hardware to take advantage of the multi-touch features, but the cool thing is Microsoft has made it pretty easy for developers to integrate the technology. Here’s a cheat sheet Willson found on the MSDN site:

From the MSDN Documentation

As you can see, with the exception of a few gestures, the Windows Touch API handles translating user input into common mouse and navigation events, which makes multi-touch coding as simple as listening for a few mouse events on a container. Of course, if you want to handle more complex gestures you’ll have access to the raw input data, but for most application needs this API will do. You can even code your own gesture handlers, which would translate the raw input and throw your own events – for example, translating an “S” movement into an edit command. I think the events will also translate to Surface apps with little to no change required, which is a plus if you’re planning on targeting multiple platforms.
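For anyone curious what the default path looks like in native code, here’s a minimal sketch of a Win32 window procedure handling WM_GESTURE – just an illustration, not production code. The key point is that anything you don’t consume falls through to DefWindowProc, which is where the gesture-to-mouse/navigation translation from the cheat sheet above happens.

#define WINVER 0x0601  // target Windows 7 so the gesture APIs are declared
#include <windows.h>

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_GESTURE:
    {
        GESTUREINFO gi = { 0 };
        gi.cbSize = sizeof(GESTUREINFO);

        // lParam is an opaque handle; GetGestureInfo fills in the details.
        if (GetGestureInfo((HGESTUREINFO)lParam, &gi))
        {
            switch (gi.dwID)
            {
            case GID_ZOOM:
                // ullArguments holds the distance between the touch points;
                // compare successive values to compute a zoom factor.
                break;
            case GID_PAN:
                // ptsLocation tracks the pan position; pans you don't
                // handle get translated to scrolling by the system.
                break;
            case GID_ROTATE:
                // ullArguments encodes the rotation angle
                // (see GID_ROTATE_ANGLE_FROM_ARGUMENT).
                break;
            }
            CloseGestureInfoHandle((HGESTUREINFO)lParam);
            return 0;
        }
        break;
    }
    }
    // Unhandled messages fall through to the default behavior, i.e. the
    // built-in gesture-to-mouse/navigation mapping.
    return DefWindowProc(hwnd, msg, wParam, lParam);
}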
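And if you do want the raw data for a custom gesture handler, you opt in with RegisterTouchWindow, after which the window receives WM_TOUCH instead of WM_GESTURE. Below is a rough sketch of collecting the points of a stroke and handing them to a recognizer when the finger lifts – note that RecognizeStroke is a hypothetical helper standing in for whatever shape-matching (like the “S” example) your app would do; it’s not part of the API.

#define WINVER 0x0601
#include <windows.h>
#include <vector>

std::vector<POINT> g_stroke;  // points collected for our hypothetical recognizer

void RecognizeStroke(const std::vector<POINT>& stroke)
{
    // Hypothetical: match the stroke against known shapes (e.g. an "S")
    // and raise an application-defined event/command when it matches.
}

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_CREATE:
        // Once registered, this window gets WM_TOUCH instead of WM_GESTURE.
        RegisterTouchWindow(hwnd, 0);
        return 0;

    case WM_TOUCH:
    {
        UINT cInputs = LOWORD(wParam);
        std::vector<TOUCHINPUT> inputs(cInputs);
        if (GetTouchInputInfo((HTOUCHINPUT)lParam, cInputs,
                              inputs.data(), sizeof(TOUCHINPUT)))
        {
            for (const TOUCHINPUT& ti : inputs)
            {
                // Coordinates arrive in hundredths of a physical pixel,
                // in screen space.
                POINT pt = { ti.x / 100, ti.y / 100 };
                ScreenToClient(hwnd, &pt);

                if (ti.dwFlags & TOUCHEVENTF_DOWN)
                    g_stroke.clear();
                if (ti.dwFlags & (TOUCHEVENTF_DOWN | TOUCHEVENTF_MOVE))
                    g_stroke.push_back(pt);
                if (ti.dwFlags & TOUCHEVENTF_UP)
                    RecognizeStroke(g_stroke);  // fire our own "gesture" here
            }
            CloseTouchInputHandle((HTOUCHINPUT)lParam);
            return 0;
        }
        break;
    }
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}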
