


Sample files

When talking about features of iPhones, iPads, or Android devices, the first thing that comes to mind is their support for multi-touch events. Multi-touch refers to the ability of a touch-sensing surface to recognize the presence of two or more points of contact with the surface. This plural-point awareness is often used to implement advanced functionality such as zoom, or to activate predefined programs. Here, I explore the concepts of touch events and gestures in reference to normal HTML and JavaScript rendered in the browsers (Safari/WebKit) of iPhone, iPad, or Android devices.

Touch events supported by iPhones, iPads, and Android devices

All touch-enabled devices provide support for touch events—a set of events that let you take advantage of the touch screen interface. When you put a finger down on the touch-enabled screen (say, an iPhone or Android screen), it starts the lifecycle of a touch event. Within such a device's web browser, each time a new finger touches the screen, a new DOM touchstart event happens. As each finger lifts up, a touchend event happens. If, after touching the screen, you move any of your fingers around, touchmove events happen.

So we have the following touch events in the DOM:

touchstart : Initiated when a finger is placed on the screen

touchend : Kicked off every time a finger is removed from the screen

touchmove : Triggered when a finger already placed on the screen is moved across the screen

touchcancel : Triggered when the system cancels the touch event (for example, when an alert or incoming call interrupts the interaction)
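The four events above can be wired up with addEventListener. The following is a minimal sketch; the element id "touchArea" and the log callback are assumptions for illustration, not part of the original sample files.

```javascript
// Wire up all four DOM touch events on one element and report how many
// fingers are currently on the screen after each event.
function attachTouchHandlers(el, log) {
  ["touchstart", "touchmove", "touchend", "touchcancel"].forEach(function (type) {
    el.addEventListener(type, function (event) {
      event.preventDefault(); // stop default behavior such as scrolling
      log(type + ": " + event.touches.length + " finger(s) on screen");
    }, false);
  });
}

// In a browser you would call, for example:
// attachTouchHandlers(document.getElementById("touchArea"), console.log);
```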

And the good thing is that the WebKit engine (the HTML rendering engine inside the Safari and Android native browsers) supports all of these events. So, by implementing touch events in your Edge composition, you can take advantage of touch-enabled screens to give users a richer experience on iPhone and Android devices.

Differences between touch and mouse events

When you compare touch events with mouse events, you will notice some differences:

A touch is very hard to hold steady at one point, whereas a mouse can rest at a fixed position. With a touch on the screen, you go from a touchstart event to a touchmove event and then to a touchend . In the case of a mouse, a mousedown event is not required to happen before a mousemove .

There is no mouseover equivalent in touch events. So if you have any functionality in your application that uses mouseover to trigger something, you need to change that part in the context of a touch-enabled user experience.

iPhone, iPad, and Android devices are designed to be operated by touch from human fingers, so a touch point on the surface is an averaged point computed from the surface area in contact with the pointing device (the finger) and translated to pixel coordinates, much like finding the center of a circle. A mouse position, by contrast, is precise, and no averaging is needed.
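Where desktop code relies on mouseover, a touch equivalent is usually bound to touchstart. A minimal sketch of that substitution follows; the element and class names here are assumptions for illustration.

```javascript
// Make an element highlight both on desktop hover and on touch:
// mouseover never fires on a touch screen, so touchstart is bound as well.
function makeHighlightable(el) {
  function highlight() { el.className = "highlighted"; }
  el.addEventListener("mouseover", highlight, false);  // desktop hover
  el.addEventListener("touchstart", highlight, false); // touch equivalent
}
```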

The WebKit Event Object for touch events

With a mouse, there is only one point of contact: where the cursor is positioned in the screen. But things are different in the case of touch events. In the real world, it is possible for the user to keep two or more fingers held down on the left of the screen while at the same time tapping the right side of the screen. For that reason, the TouchEvent object in WebKit has a list (an array) called touches containing information for each and every finger currently in touch with the screen. There are two more lists in the event object. One, named targetTouches , contains the information for fingers that originated from the same node or target element on screen. The other list, changedTouches , contains only information for fingers associated with the current event.

touches : an array of touch information created upon touchstart and touchmove , but not touchend events

targetTouches : an array of information for touches originating from the same target element

changedTouches : an array of touch information regarding the current event

Each item stored in the above lists is an object with the following properties:

target : The element in the HTML DOM from which the touch event originated

identifier : An identifying number, unique to each touch point, that can be used to track it across events

clientX : The x coordinate of the touch relative to the browser viewport (the visible viewing area). This excludes the scroll offset in the browser.

clientY : The y coordinate of the touch relative to the browser viewport (the visible viewing area). This excludes the scroll offset in the browser.

screenX : The x coordinate relative to the screen.

screenY : The y coordinate relative to the screen.

pageX : The x coordinate relative to the full page, which includes scrolling.

pageY : The y coordinate relative to the full page, which includes scrolling.
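A small helper makes these properties concrete. This is a sketch for display purposes; the formatting is an assumption, but the property names are the WebKit Touch object properties listed above.

```javascript
// Format the coordinate properties of a single Touch object for display,
// grouping the client (viewport), page, and screen coordinate pairs.
function describeTouch(touch) {
  return "id " + touch.identifier +
         " client(" + touch.clientX + "," + touch.clientY + ")" +
         " page(" + touch.pageX + "," + touch.pageY + ")" +
         " screen(" + touch.screenX + "," + touch.screenY + ")";
}
```

In a touch handler you would typically loop over event.touches and call describeTouch on each entry.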

Enough concepts! Now you'll do some experimenting to see these ideas in action and make things clearer.

Setting up your environment

You will now create and edit some HTML and JavaScript files outside of Edge to understand certain concepts related to touch and gesture events. To test these on a device, you should be able to access them on a device browser through a URL.

Set up a local web server

For this, you need to set up a local server on your PC to host your HTML and associated files; then you can access them from your iPhone, iPad, or Android device. First, run a local web server on your PC: you can use IIS (if you are on a Windows system) or Apache. Many free, lightweight web servers are also available for download.

If you want to install an Apache server, I would recommend XAMPP, an easy-to-install Apache distribution that contains MySQL, PHP, and Perl. XAMPP is a good server for first-timers and is available for Windows, Mac, Linux, and Solaris.

Run the xampp-control file to open a window listing the different servers to run (see Figure 1).

Figure 1. The XAMPP control panel provides an easy way to start and stop the server.

Click the Start button for Apache, and you have your Apache server running!

By default, Apache server uses port 80, so you can access this server in your browser through the localhost URL, like so: http://localhost:80/.

To serve your HTML pages through this server so that they can be accessed through the localhost URL, place them in the folder called htdocs found inside the xampp folder.

You can place your files directly in that folder, or alternatively place a project folder inside htdocs. For example, if you place index.html inside htdocs, you can access it through the URL http://localhost:80/index.html. If instead you place a project folder named project1 (containing a file index.html) inside htdocs, you can access it through the URL http://localhost:80/project1/index.html. See Figure 2.

Figure 2. Files should be placed inside htdocs to be served by the server through the localhost URL.

Editing files on your web server

To make changes, you can directly edit the files in the server folders. Then, on the device, you refresh the browser to see the effect of the changes you have made to your files.

Deploy the sample files on your local server:

Download and unzip the sample file referenced at the top of this article.

Copy the TouchEventsWithOutEdge folder into the web page folder of your local web server (see Figure 3). For example, if you used the XAMPP installation as I've described, you would copy the TouchEventsWithOutEdge folder into htdocs.

Note: When you create or edit the Edge composition files using the Edge IDE, you don't need to run a separate web server. The Edge IDE has a web server enabled that serves the HTML pages to the browser when you preview the project in the browser.

Connect your device

In case you do not have a WiFi router or hub, you can download and run a virtual WiFi hotspot. For example, running Connectify on your PC creates a WiFi hotspot through which your device can connect to your server (see Figure 4).

Figure 4. If you don't have WiFi, create a hotspot with a program like Connectify.

Connect your device to the same WiFi network as the PC that is hosting the local web site. Once your PC is connected to the WiFi network, it will be assigned an IP address. The same localhost URL you used (for example, http://localhost:80/project1/index.html) can now be accessed via that IP address, like so: http://192.168.0.100:80/project1/index.html. In other words, the "localhost" part of the URL is replaced with the IP address of the PC hosting the web server. Any device connected to the same network, whether over the same WiFi or by other means (such as a LAN), can use this URL to access the file from the server. So the important thing here is to get the IP address of the PC running the web server.

To obtain the IP address of the development computer:

On Windows, start a command-line session and run the ipconfig command. Look for the IPv4 entry in the results displayed (see Figure 5).

On Mac OS, open the Apple menu and select System Preferences. In the System Preferences window, click the Network icon to open the Network preferences window, which displays the IP address.

On your device, open the browser. (On iPhone or iPad, use Safari. On an Android device, use the native WebKit-based browser.) Type in the web server address prefixed with the IP address of the system. You will be now able to access the HTML files you placed on your server directly from your device.

Now access the folder from your device. You will be able to see the list of all the files in this folder. Select TouchEventsWithOutEdge.html to open it in your device's browser (see Figure 6).

Note: I am using an iPad to load it, so the screenshots are based on the iPad.

Figure 6. Open TouchEventsWithOutEdge.html on your device browser.

Viewing touch events

Once TouchEventsWithOutEdge.html is loaded on your device, you will be able to see a green rectangular area on the page (created by a <div> element in the code). Now touch it with just one finger. You will see the information regarding the finger touch within the rectangle itself (see Figure 7).

Similarly, if you touch with three fingers, you will see three different lists with information related to each respective finger. Notice that in each case, the identifier is unique, and the coordinates for each finger differ based on that finger's actual contact point.

Next, check the objects associated with touchmove. Open the file TouchEventsWithOutEdgeWithMove.html. This file has a few additional lines of code in the init() function that record the details associated with touchmove events.
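The sample's exact listing isn't reproduced here, but such an init() addition typically looks like the following sketch. The element id "touchArea" and the output format are assumptions; the actual sample code may differ.

```javascript
// Track touchmove events on the green rectangle and display the
// identifier and page coordinates of every finger on each move.
function init() {
  var box = document.getElementById("touchArea"); // assumed element id
  box.addEventListener("touchmove", function (event) {
    event.preventDefault(); // keep the page from scrolling while tracking
    var lines = [];
    for (var i = 0; i < event.touches.length; i++) {
      var t = event.touches[i];
      lines.push("finger " + t.identifier + " at (" + t.pageX + "," + t.pageY + ")");
    }
    box.innerHTML = lines.join("<br>");
  }, false);
}
```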

Now, take a look at the information stored in the three different lists of the touch event. To do this, open TouchEventsWithOutEdgeWithMoveAndEnd-targetTouches-ChangedTouches.html on your device browser.

When you touch with one finger, all three lists will have the same information. Here changedTouches will have the information for the same finger as that which caused the change in the event.

If you touch the green rectangle with two fingers at exactly the same time, the changedTouches object will have information about both fingers. But if you touch the second finger after the first one, touches will have two sets of information, one for each finger. The targetTouches list will have two items as both fingers are placed in the same area; however, changedTouches will have the information related to the second finger, because the second finger triggered this event.

If you move your fingers, then only changedTouches will have the updated information. Depending on the number of fingers moving, it will have different sets of information related to each respective finger.

When you remove your finger, it will be removed from touches and targetTouches , and the updated information will appear in changedTouches , because it's what caused the event.

Removing all fingers will remove all the information from the touches and targetTouches lists, leaving them empty, and changedTouches will contain the information for the last finger removed.
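The behavior described above is easy to observe with a helper that reports the size of each of the three lists on every event. This is an illustrative sketch, not code from the sample files.

```javascript
// Summarize the three touch lists of a TouchEvent so their different
// behaviors (as fingers land, move, and lift) can be compared at a glance.
function summarizeTouchLists(event) {
  return "touches: " + event.touches.length +
         ", targetTouches: " + event.targetTouches.length +
         ", changedTouches: " + event.changedTouches.length;
}
```

Binding this to touchstart, touchmove, and touchend and watching the counts change reproduces the observations above.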

Note: Apple's WebKit implementation differs from the Android implementation in a few ways. The touchend event removes the current touch from the event.touches array; with Apple's implementation, you have to look inside the event.changedTouches array instead.

Viewing gesture events

Gestures are a special set of events where more than one finger is used on a multi-touch screen. A gesture event occurs any time two or more fingers are touching the screen. If either finger lands in the node to which you've connected any of the gesture handlers ( gesturestart , gesturechange , gestureend ), you'll start receiving the corresponding events.

The event object for gesture events looks very different from that for touch events. It contains scale and rotation values and no touch objects.

The scale and rotation values are the two important keys of this event object. scale gives you the multiplier by which the user has pinched (values below 1) or spread (values above 1) their fingers, relative to 1, and rotation gives you the amount, in degrees, that the user has rotated their fingers.
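Because scale and rotation arrive as running values relative to the start of the gesture, they can be applied directly as a CSS transform. The sketch below assumes WebKit gesture events and a hypothetical target element; it is not code from the sample files.

```javascript
// Scale and rotate an element live as the user performs a two-finger
// gesture, by feeding the event's scale and rotation into a CSS transform.
function attachGestureHandler(el) {
  el.addEventListener("gesturechange", function (event) {
    event.preventDefault();
    el.style.webkitTransform =
      "scale(" + event.scale + ") rotate(" + event.rotation + "deg)";
  }, false);
}
```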

Copy the GestureEventsWithOutEdge folder (from the sample files) and place it in your local web server to access it from your multi-touch device (see Figure 9).

Once it is loaded in your device browser, you will see a similar rectangular shape (the <div> element) on the page. Now, with two fingers, try to scale it and rotate it at the same time. Notice as you move both fingers that information about gesturestart and gesturechange , along with their associated values such as scale and rotation , will appear (see Figure 11).

Figure 11. Track gesture events as you scale and rotate the green rectangle.

If you open this file in a text editor, you will see the lines of code that read this information from the gesture events.
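The sample's exact listing isn't reproduced here, but the gesture-logging code is typically along these lines. The element, log callback, and output format are assumptions for illustration.

```javascript
// Report a gesture event's type together with its scale and rotation values.
function describeGesture(event) {
  return event.type + " scale=" + event.scale + " rotation=" + event.rotation;
}

// Bind all three WebKit gesture events on an element and log each one.
function attachGestureLogging(el, log) {
  ["gesturestart", "gesturechange", "gestureend"].forEach(function (type) {
    el.addEventListener(type, function (event) {
      event.preventDefault();
      log(describeGesture(event));
    }, false);
  });
}
```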

Where to go from here

Adobe Edge implements some of the custom events of jQuery Mobile, which offers several custom events built on native events to create useful hooks for development. These events employ various touch, mouse, and window events, and they can be bound to different elements and to the window for use in both handheld and desktop environments. You can bind to these events as you would with other jQuery events, using live() or bind() .