User agents that run on devices with touch input typically let users interact with web applications through interpreted mouse events synthesized from the physical touch input. However, because these interpreted events are normalized data derived from the raw touch input, they limit how faithfully an application can deliver the intended user experience. Additionally, mouse events cannot represent concurrent input regardless of the device's capability, due to both system-level limitations and legacy compatibility constraints.
Meanwhile, native applications are capable of handling both cases through the system APIs provided by the platform.
The Touch Events specification addresses this by providing an interface that allows web applications to directly handle touch events, including multiple concurrent touch points on capable devices.
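As a minimal sketch of the direct-handling model described above, a web application can register a touch event listener and read each contact point from the event, rather than waiting for interpreted mouse events. The helper function below is illustrative, not part of the specification:

```javascript
// Sketch: handling multiple concurrent touch points directly.
// event.changedTouches is the array-like list of Touch objects
// that changed in this event.
function describeTouches(event) {
  const points = [];
  for (let i = 0; i < event.changedTouches.length; i++) {
    const touch = event.changedTouches[i];
    points.push({ id: touch.identifier, x: touch.clientX, y: touch.clientY });
  }
  return points;
}

// In a browser, the handler would be registered like this:
if (typeof document !== 'undefined') {
  document.addEventListener('touchstart', (e) => {
    console.log('touch points:', describeTouches(e));
  });
}
```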
This specification defines conformance criteria that apply to a single product: the user agent that implements the interfaces that it contains.
Implementations that use ECMAScript to implement the APIs defined in this specification must implement them in a manner consistent with the ECMAScript Bindings defined in the Web IDL specification [[!WEBIDL]] as this specification uses that specification and terminology.
A conforming implementation is required to implement all fields defined in this specification.
This interface defines an individual point of contact for a touch event.
The algorithm for determining the identifier value is as follows: when a touch point becomes active, it must be assigned a number (for example, starting at 0) that is distinct from the identifier of every other currently active touch point, and all events that refer to it while it remains active must use that same identifier.

radiusX: the radius along the x-axis of the ellipse that most closely circumscribes the area of contact with the touch surface; 1 if no value is known. The value must be positive.

Issue: What are the units of radiusX/radiusY? CSS pixels?

radiusY: the radius along the y-axis of the ellipse that most closely circumscribes the area of contact with the touch surface; 1 if no value is known. The value must be positive.
rotationAngle: the angle (in degrees) that the ellipse described by radiusX and radiusY is rotated clockwise about its center; 0 if no value is known. The value must be greater than or equal to 0 and less than 90.
If the ellipse described by radiusX and radiusY is circular, then rotationAngle has no effect. The user agent may use 0 as the value in this case, or it may use any other value in the allowed range. (For example, the user agent may use the rotationAngle value from the previous touch event, to avoid sudden changes.)
force: a relative value of pressure applied, in the range 0 to 1, where 0 is no pressure, and 1 is the highest level of pressure the touch device is capable of sensing; 0 if no value is known. In environments where force is known, the absolute pressure represented by the force attribute, and the sensitivity in levels of pressure, may vary.
Issue: Consider aligning with other "channels" and values from Ink Markup Language (InkML), in addition to force, e.g. adding angle, clientZ, rotation, etc.
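Since radiusX/radiusY default to 1 and force defaults to 0 when the hardware reports no value, an application reading contact geometry should treat those defaults as "unknown". The following sketch (an illustrative helper, not part of the specification) derives the circumscribing ellipse's area under that assumption:

```javascript
// Sketch: summarizing a touch point's contact geometry.
// radiusX/radiusY default to 1 and force to 0 when no value is known,
// so those defaults are treated here as "geometry unknown".
function contactInfo(touch) {
  const { radiusX = 1, radiusY = 1, rotationAngle = 0, force = 0 } = touch;
  return {
    // Area of the circumscribing ellipse, in the same units as the radii.
    area: Math.PI * radiusX * radiusY,
    rotationAngle,
    force,
    geometryKnown: !(radiusX === 1 && radiusY === 1),
  };
}
```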
This interface defines a list of individual points of contact for a touch event.
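A TouchList is an array-like collection of touch points, and applications commonly correlate entries across successive events by their identifier. A minimal illustrative helper (not part of the specification) for that lookup:

```javascript
// Sketch: find the Touch with a given identifier in an array-like
// TouchList; returns null if no active touch point matches.
function touchById(touchList, identifier) {
  for (let i = 0; i < touchList.length; i++) {
    if (touchList[i].identifier === identifier) return touchList[i];
  }
  return null;
}
```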
This interface defines the touchstart, touchend, touchmove, touchenter, touchleave, and touchcancel event types.
altKey: true if the alt (Alternate) key modifier is activated; otherwise false.
metaKey: true if the meta (Meta) key modifier is activated; otherwise false. On some platforms this attribute may map to a differently-named key modifier.
ctrlKey: true if the ctrl (Control) key modifier is activated; otherwise false.
shiftKey: true if the shift (Shift) key modifier is activated; otherwise false.
Issue: define behavior of preventDefault() method.
A user agent must dispatch this event type to indicate when the user places a touch point on the touch surface.
A user agent must dispatch this event type to indicate when the user removes a touch point from the touch surface, including cases where the touch point physically leaves the touch surface, such as when it is dragged off the edge of the screen.
A user agent must dispatch this event type to indicate when the user moves a touch point along the touch surface, even outside the interactive area of the target element.
If the values of radiusX, radiusY, rotationAngle, or force are known, then the user agent also must dispatch this event type to indicate when any of these attributes of a touch point have changed.
Note that the rate at which the user agent sends touchmove events is implementation-defined, and may depend on hardware capabilities and other implementation details.
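Because the touchmove rate is implementation-defined, applications typically accumulate positions per touch point rather than assuming a fixed cadence, keying stored state on the identifier, which stays stable while the touch point remains active. An illustrative sketch:

```javascript
// Sketch: tracking each touch point's path across touchmove events,
// keyed by Touch.identifier (stable for the lifetime of the touch).
const paths = new Map(); // identifier -> array of {x, y} samples

function recordMove(event) {
  for (let i = 0; i < event.changedTouches.length; i++) {
    const t = event.changedTouches[i];
    if (!paths.has(t.identifier)) paths.set(t.identifier, []);
    paths.get(t.identifier).push({ x: t.clientX, y: t.clientY });
  }
}
```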
A user agent must dispatch this event type to indicate when a touch point moves onto the interactive area defined by a DOM element. Events of this type must not bubble.
A user agent must dispatch this event type to indicate when a touch point moves off the interactive area defined by a DOM element. Events of this type must not bubble.
A user agent must dispatch this event type to indicate when a TouchPoint has been disrupted in an implementation-specific manner, such as a synchronous event or action originating from the UA canceling the touch, or the touch point leaving the document window into a non-document area that is capable of handling user interactions (e.g., the UA's native user interface or a plug-in). A user agent may also dispatch this event type when the user places more touch points on the touch surface than the device or implementation is configured to store, in which case the earliest TouchPoint object in the TouchList should be removed.
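Since both touchend and touchcancel end a touch point's lifecycle, an application should release per-touch state in both handlers; the difference is that a cancelled gesture is discarded rather than completed. A sketch under that assumption (the commit callback is a hypothetical application hook, not part of the specification):

```javascript
// Sketch: shared cleanup for touchend and touchcancel.
// state maps identifier -> per-touch entry; entry.commit is a
// hypothetical application callback run only on a normal touchend.
function endTouches(event, state, { cancelled }) {
  for (let i = 0; i < event.changedTouches.length; i++) {
    const id = event.changedTouches[i].identifier;
    const entry = state.get(id);
    if (entry && !cancelled) entry.commit(); // complete the gesture
    state.delete(id); // always release per-touch state
  }
}
```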
The user agent may dispatch both touch events and mouse events [[!DOM-LEVEL-2-EVENTS]] in response to the same user input. If the user agent dispatches both touch events and mouse events in response to a single user action, then the touchstart event type must be dispatched before any mouse event types for that action. The ordering of any further touch events and mouse events is left to the implementation, except as specified elsewhere.
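Because the same user action may produce both touch and mouse events, a common application-level pattern is to call preventDefault() from the touch handler so the action is not processed twice; whether and how this suppresses the compatibility mouse events varies by implementation (see the open issue on preventDefault() above). An illustrative sketch:

```javascript
// Sketch: handling a tap once via touch events, calling
// preventDefault() to ask the UA not to send the follow-on
// mouse/click sequence for the same action (implementation-dependent).
function handleTap(event, onTap) {
  event.preventDefault();
  onTap(event.changedTouches[0]);
}
```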
Many thanks to the WebKit engineers for developing the model used as a basis for this spec, Neil Roberts (SitePen) for his summary of WebKit touch events, Peter-Paul Koch (PPK) for his writeups and suggestions, Robin Berjon for developing the ReSpec.js spec authoring tool, and the WebEvents WG for their many contributions.
Many others have made additional comments as the spec developed, which have led to steady improvements. Among them are Matthew Schinckel and others. If I inadvertently omitted your name, please let me know.