The Browser Touch Events API
Lately I have been remembering my first project after grad school: creating plant-floor systems that used both touch and voice recognition. The monitors we used cost around $2,000 each, but they supported touch. The voice recognition was another ball game entirely, but it was wireless. This was way back in 1996, and I believe the systems were built on top of Windows 3.1, so think about the technology and cost challenges compared to what is available today.
I have been thinking about touch interfaces in particular because touch UIs are now available to the masses with the latest generation of mobile devices. This means that not only do we as developers need to start incorporating this natural form of user input, but for web applications we also need a standard touch API to program against.
Currently the W3C has the Touch Events Specification in draft; as I write this, the latest version is dated May 5, 2011. The API specifies a set of events for capturing a user touching the screen, along with the event data they generate. This data consists of various X and Y coordinates, the touch radius and the rotation angle. There is also a collection of touch points called a TouchList, which lists the points touched during a touch event such as touchmove.
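To make the event data concrete, here is a minimal sketch that reads the per-touch properties from the draft spec (clientX/clientY, radiusX/radiusY, rotationAngle) and walks the TouchList on each move. The describeTouch helper is my own illustrative name, and radius/rotation support varies by browser, so treat this as a sketch rather than production code.

```javascript
// Pull the interesting fields off a single Touch object.
// Property names follow the W3C draft; support varies by device.
function describeTouch(touch) {
  return {
    x: touch.clientX,           // X coordinate within the viewport
    y: touch.clientY,           // Y coordinate within the viewport
    radiusX: touch.radiusX,     // horizontal radius of the contact area
    radiusY: touch.radiusY,     // vertical radius of the contact area
    angle: touch.rotationAngle  // rotation of the contact ellipse, in degrees
  };
}

// e.touches is the TouchList of every point currently on the screen.
if (typeof document !== "undefined") {
  document.addEventListener("touchmove", function (e) {
    for (var i = 0; i < e.touches.length; i++) {
      console.log(describeTouch(e.touches[i]));
    }
  });
}
```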
There are six touch events defined: touchstart, touchend, touchcancel, touchmove, touchenter and touchleave. When touchstart, touchend and touchmove fire should be obvious. touchenter fires when a touch point moves into an 'interactive area' or specified DOM element; imagine a user touching the screen and swiping across to a DIV or button element. When the user swipes out of an element, the touchleave event fires. The touchcancel event fires when something 'cancels' the touch action, which can have a variety of causes, such as a second touch point in some cases. The following is the current definition for touchcancel:
A user agent must dispatch this event type to indicate when a touch point has been disrupted in an implementation-specific manner, such as a synchronous event or action originating from the UA canceling the touch, or the touch point leaving the document window into a non-document area which is capable of handling user interactions. (e.g. The UA's native user interface, plug-ins) A user agent may also dispatch this event type when the user places more touch points on the touch surface than the device or implementation is configured to store, in which case the earliest Touch object in the TouchList should be removed.
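The practical upshot of that definition is that touchcancel should be treated as a reset, not as a finished touch. The following sketch wires up three of the events and throws away in-progress state on cancel; the element id "canvas" and the handler names are placeholders of my own.

```javascript
// Track active touch points keyed by their spec-defined identifier.
var tracking = {};

function onTouchStart(e) {
  // changedTouches lists only the points added by this event.
  for (var i = 0; i < e.changedTouches.length; i++) {
    var t = e.changedTouches[i];
    tracking[t.identifier] = { x: t.clientX, y: t.clientY };
  }
}

function onTouchEnd(e) {
  for (var i = 0; i < e.changedTouches.length; i++) {
    delete tracking[e.changedTouches[i].identifier];
  }
}

function onTouchCancel() {
  // The UA interrupted the touch (native UI, plug-in, etc.),
  // so discard any in-progress state rather than completing it.
  tracking = {};
}

if (typeof document !== "undefined") {
  var el = document.getElementById("canvas");
  el.addEventListener("touchstart", onTouchStart);
  el.addEventListener("touchend", onTouchEnd);
  el.addEventListener("touchcancel", onTouchCancel);
}
```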
Safari on the iPhone/iPad extends the touch API with a series of gesture events: gesturestart, gesturechange and gestureend. These events fire once a user has two fingers touching the screen. These extensions are helpful when defining pinch, expand, rotate and other actions involving more than one finger.
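Safari's gesture event object carries a scale value (1.0 at gesture start) and a rotation value in degrees, which is enough to distinguish the common two-finger actions. Here is a small sketch; classifyGesture and its thresholds are illustrative choices of mine, not part of the API.

```javascript
// Name a two-finger gesture from Safari's scale and rotation values.
// scale: 1.0 = the distance between fingers at gesturestart.
// rotation: degrees of rotation since gesturestart.
function classifyGesture(scale, rotation) {
  if (Math.abs(rotation) > 15) return "rotate";
  if (scale > 1.1) return "expand"; // fingers moving apart
  if (scale < 0.9) return "pinch";  // fingers moving together
  return "none";
}

if (typeof document !== "undefined") {
  document.addEventListener("gesturechange", function (e) {
    e.preventDefault(); // stop Safari's default zoom/rotate handling
    console.log(classifyGesture(e.scale, e.rotation));
  });
}
```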
Luke Wroblewski has created a Touch Gesture Reference Guide that I recommend you download. Luke's page does a great job of listing all sorts of great touch reference data for many platforms. The reference guide and the other research Luke has collected are a treasure trove of information.
As a web developer who uses jQuery in all my solutions, I have found the jQuery TouchSwipe plugin to be an easy way to incorporate touch into my applications. The plugin abstracts the touch API events into a simple plugin call; my first attempt at incorporating it took 15 minutes to produce a working touch solution. I will post more on using this plugin in the coming days.
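To give a flavor of the plugin call, here is a sketch based on my reading of TouchSwipe's options; the selector "#gallery", the handler, and the threshold value are all illustrative, so check the plugin's own documentation for the exact option names in your version.

```javascript
// Handler signature per the TouchSwipe docs: the plugin passes the
// event, a direction string ("left", "right", "up", "down"), and the
// swipe distance, among other arguments.
function onSwipe(event, direction, distance) {
  return "swiped " + direction + " by " + distance + "px";
}

if (typeof jQuery !== "undefined") {
  jQuery(function ($) {
    $("#gallery").swipe({
      swipe: onSwipe, // fires once a swipe completes
      threshold: 75   // minimum distance in pixels to count as a swipe
    });
  });
}
```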
As a client developer you need to start incorporating appropriate responses to touch gestures, because touch has become a ubiquitous user input technique. Last week Google announced they are currently activating 400,000 Android devices a day, and Apple is activating around 250,000 iOS devices a day. All these Android and iOS devices support touch in their native applications as well as their browsers. To put that in perspective, that is roughly 20 million new touch-enabled phones and tablets activated each month!
A web solution that does not leverage the touch API is missing out on a large portion of your visitors and potential customers. Sure, these touch devices will fire mouse events in response to a user touching the screen, but I have found those abstractions to be somewhat inconsistent. A real touch solution also means a more appealing user experience, which should translate into a more satisfied customer base.
Supporting touch in web sites and native client applications is a must to remain competitive in today's world. Devices offering touch interfaces must likewise implement the touch API in their browsers if they want to stay viable. Failing to do so is effectively waving the white flag for a business.