Interactivity: The Way of Touch

In transitioning to using touch events, let’s first take a look at the event-naming conventions, as displayed in Table 7–1.

Table 7–1. How Touch Events Relate to Respective Mouse Events

Interaction Style    Start Event    "Continue" Event    End Event
Mouse                mousedown      mouseover           mouseup
Touch                touchstart     touchmove           touchend

CHAPTER 7: Exploring Interactivity

The naming of these events gives us a clue to the differences between working with touch and mouse input. Both mouse and touch interactions have events to signify that the interaction has started (mousedown, touchstart) and ended (mouseup, touchend), respectively.

The primary difference, however, is between the mouseover and touchmove events. A touch event has no concept of hovering, and thus we have no touchover event, so it is replaced with the touchmove event, signifying that a touch event has started and the touch points are changing. This is an important point to note, as familiar web concepts such as “hover states” have no effect on mobile devices, so it’s important to consider alternative mechanisms to provide feedback to your app users.
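One such alternative mechanism is to toggle a CSS class from the touch events themselves, giving a "pressed" look where a desktop site would use :hover. The sketch below is not from the chapter's code; the bindPressedState helper and the pressed class name are illustrative assumptions:

```javascript
// Hypothetical helper: emulate a hover-style "pressed" state using touch
// events, since :hover never fires on a touch screen.
function bindPressedState(button) {
    button.addEventListener("touchstart", function(evt) {
        button.className = "pressed";   // style .pressed { ... } in your CSS
    }, false);

    button.addEventListener("touchend", function(evt) {
        button.className = "";          // revert when the finger lifts
    }, false);
}
```

You would call something like bindPressedState(document.getElementById("save")) once the DOM is ready, for each element that needs touch feedback.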

We will now create our touchcanvas.html and touchcanvas.js files. As per the mouse canvas example, the HTML file is very simple, so just make a copy of the previous mousecanvas.html file and tweak the references.
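Assuming your mousecanvas.html looked something like the skeleton below, the only change the touch version needs is the script reference; the exact markup and script filenames in your earlier file may differ:

```html
<!DOCTYPE html>
<html>
<head>
<title>Touch Canvas</title>
<!-- the one real change from mousecanvas.html: reference touchcanvas.js -->
<script src="jquery.js"></script>
<script src="touchcanvas.js"></script>
</head>
<body>
<!-- the id "simple" is what touchcanvas.js looks up -->
<canvas id="simple"></canvas>
</body>
</html>
```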

Our touchcanvas.js file is more or less the mousecanvas.js code with the mouse event handlers replaced by their touch equivalents:

(function() {
    var canvas = null,
        context = null;

    function resetCanvas() {
        canvas = document.getElementById("simple");

        // set the canvas width and height to the window width and height
        canvas.width = window.innerWidth;
        canvas.height = window.innerHeight;

        // get a reference to our drawing context
        context = canvas.getContext("2d");
    } // resetCanvas

    $(window).bind("resize", resetCanvas).bind("reorient", resetCanvas);

    $(document).ready(function() {
        window.scrollTo(0, 1);
        resetCanvas();

        document.body.addEventListener("touchstart", function(evt) {
            // start a new path at the first touch point
            context.beginPath();
            context.moveTo(evt.touches[0].pageX, evt.touches[0].pageY);

            // stop the browser from scrolling in response to the touch
            evt.preventDefault();
        }, false);

        document.body.addEventListener("touchmove", function(evt) {
            // extend the path to the new touch position and draw it
            context.lineTo(evt.touches[0].pageX, evt.touches[0].pageY);
            context.stroke();
        }, false);

        document.body.addEventListener("touchend", function(evt) {
        }, false);
    });
})();


With the preceding code implemented, you should be able to draw using touch on your Android device and simulate touch events in the emulator. Figure 7–3 shows an example.

Figure 7–3. More advanced drawings are possible given the intuitive nature of the touch interface.

The primary differences between this code and the mousecanvas.js file are:

With mouse events, mouse button information is included to signify whether the left, right, or other button was pressed. When it comes to touch events, we have no concept of varying buttons, and as such there is no need to monitor button states. Given this situation, the touchstart handler has no code to do this, and the touchend event handler does nothing and could quite simply be removed.

References to evt.pageX and evt.pageY are replaced with references to the touches array of the event object. In our example, we reference evt.touches[0].pageX and evt.touches[0].pageY to get the screen coordinates of the first touch.
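The touches list holds one entry per finger currently on the screen, so handling multi-touch is just a matter of iterating over it. The touchPoints helper below is an illustrative assumption rather than part of the chapter's code, shown here against a mock event object instead of a real browser event:

```javascript
// Hypothetical helper: collect the page coordinates of every active
// touch point, so a handler could, say, draw one line per finger.
function touchPoints(evt) {
    var points = [];
    for (var i = 0; i < evt.touches.length; i++) {
        points.push({ x: evt.touches[i].pageX, y: evt.touches[i].pageY });
    }
    return points;
}

// a mock of the event object the browser passes to a touchmove handler
var mockEvent = {
    touches: [
        { pageX: 10, pageY: 20 },  // first finger
        { pageX: 30, pageY: 40 }   // second finger
    ]
};

console.log(touchPoints(mockEvent));
// → [ { x: 10, y: 20 }, { x: 30, y: 40 } ]
```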

The touchstart handler makes a call to the preventDefault method of the event object to tell the browser not to take any further action with this event. Without this call, the browser will initiate scrolling on the window; this is not desirable behavior, as it would interfere with our attempts to draw in the canvas area.

With the touch canvas example complete, you should now have a basic understanding of how to use both the HTML5 canvas and touch interactivity to create some simple interactive mobile web apps. Time now to take this further.


NOTE: In the last few chapters, we have been exploring components of the emerging HTML5 spec. As such, it might be natural to expect that touch is part of that specification; however, it isn’t.

A separate W3C working group has been set up to standardize touch interaction, so we can expect the way we implement touch interfaces to change slightly over time as the organizations involved converge on a standard implementation.

If you are interested, the URL for the working group is www.w3.org/2010/07/touchinterface-charter.html.