ISO/IEC 30113-1:2015
Information technology — User interface — Gesture-based interfaces across devices and methods — Part 1: Framework
INTERNATIONAL STANDARD ISO/IEC 30113-1
First edition
2015-04-15
Information technology — User
interface — Gesture-based interfaces
across devices and methods —
Part 1:
Framework
Technologies de l’information — Interface utilisateur — Interfaces
fondées sur la gestuelle entre dispositifs et méthodes —
Partie 1: Cadre
Reference number: ISO/IEC 30113-1:2015(E)
© ISO/IEC 2015
All rights reserved. Unless otherwise specified, no part of this publication may be reproduced or utilized otherwise in any form
or by any means, electronic or mechanical, including photocopying, or posting on the internet or an intranet, without prior
written permission. Permission can be requested from either ISO at the address below or ISO’s member body in the country of
the requester.
ISO copyright office
Case postale 56 • CH-1211 Geneva 20
Tel. + 41 22 749 01 11
Fax + 41 22 749 09 47
E-mail copyright@iso.org
Web www.iso.org
Published in Switzerland
Contents
Foreword
Introduction
1 Scope
2 Conformance
3 Terms and definitions
4 Overview of gesture-based interface
4.1 General
4.2 User’s actions for gesture input
4.3 Gesture input device
4.4 ICT system
4.5 Cultural adaptability
4.6 Accessibility
5 Requirements and recommendations
5.1 Activating/finishing a gesture
5.2 Performing a gesture
5.3 Feedback for confirming a gesture
5.4 Feed forward
5.5 Cancelling a gesture
5.6 Criteria of gesture size
5.7 Controlling the criteria
5.8 Changing correspondence of a gesture to a gesture command
5.9 Descriptions of individual gestures within the part
Annex A (informative) Outline for describing the ISO/IEC 30113 series
Bibliography
Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are
members of ISO or IEC participate in the development of International Standards through technical
committees established by the respective organization to deal with particular fields of technical
activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international
organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the
work. In the field of information technology, ISO and IEC have established a joint technical committee,
ISO/IEC JTC 1.
The procedures used to develop this document and those intended for its further maintenance are
described in the ISO/IEC Directives, Part 1. In particular the different approval criteria needed for
the different types of document should be noted. This document was drafted in accordance with the
editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives).
Attention is drawn to the possibility that some of the elements of this document may be the subject
of patent rights. ISO and IEC shall not be held responsible for identifying any or all such patent rights.
Details of any patent rights identified during the development of the document will be in the Introduction
and/or on the ISO list of patent declarations received (see www.iso.org/patents).
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation on the meaning of ISO specific terms and expressions related to conformity
assessment, as well as information about ISO’s adherence to the WTO principles in the Technical Barriers
to Trade (TBT), see the following URL: Foreword — Supplementary information.
The committee responsible for this document is ISO/IEC JTC 1, Information technology, Subcommittee
SC 35, User interfaces.
ISO/IEC 30113 consists of the following parts, under the general title Information technology — User
interfaces — Gesture-based interfaces across devices and methods:
— Part 1: Framework
— Part 11: Single-point gestures for common system actions
Introduction
Gestures are used for performing a variety of commands (such as scrolling a Web page up) as an
alternative input method (to typing or using a mouse to select objects).
Given the limited number of basic gestures, the same gesture is often used for a variety of different
commands in different situations. It is important that wherever possible, these different commands are
similar to one another (i.e. by having a similar effect on different objects) so that users are not confused
about what a gesture will do in a given situation.
Standardized gesture descriptions and commands minimize user confusion when interacting with
various software systems and applications on various ICT devices. This International Standard is aimed
at designers and developers of software applications.
This International Standard is intended to help users to more easily navigate and control application
software on various ICT devices by standardizing gestures and gesture commands.
This part of ISO/IEC 30113 defines a framework of gesture-based interfaces to support interoperability
among gesture-based interfaces with various input devices and methods.
Subclause A.1 gives informative description about the structure of ISO/IEC 30113 in detail.
INTERNATIONAL STANDARD ISO/IEC 30113-1:2015(E)
Information technology — User interface — Gesture-based
interfaces across devices and methods —
Part 1:
Framework
1 Scope
This part of ISO/IEC 30113 defines a framework and guidelines for gesture-based interfaces across
devices and methods in supporting interoperability.
NOTE Some of these devices include mice, touch screens, touch pads, 3D mice, joysticks, game controllers,
wired gloves, depth-aware cameras, stereo cameras, and web cameras.
This part of ISO/IEC 30113 does not define or require specific technology for recognizing the gestures
of users. It focuses on the description of a gesture and its functions for utilizing ICT systems.
NOTE Operation of a physical keyboard is not addressed in this part of ISO/IEC 30113.
2 Conformance
A gesture-based interface is conformant to this part of ISO/IEC 30113 if it meets all requirements of Clause 5.
3 Terms and definitions
For the purposes of this document, the following terms and definitions apply.
3.1
gesture
movement or posture of the whole body or parts of the body
3.2
gesture-based interface
gesture interface
user interface that provides information and controls for a user to accomplish specific tasks with the
interactive system by his/her gestures
[SOURCE: ISO 9241-171: 3.29]
3.3
gesture command
instruction to the system resulting from a gesture input by the user, e.g. select, move, delete
[SOURCE: ISO/IEC 14754:1999, 4.5]
3.4
gesture software
software for implementing gesture-based interface functionality including gesture recognition,
command processing, and feedback generation
Note 1 to entry: Gesture recognition software is usually contained within the operating system and specific
device drivers. Information on gestures that are recognized is made available to the operating system and/or the
application software, so that the intended command(s) are performed in response to the gesture.
4 Overview of gesture-based interface
4.1 General
Users can use gestures to interact with interface objects. Interface objects have representational
properties (e.g. how they are rendered to the user) and operational properties (e.g. what they do) that
can be affected by gestures.
Human-machine interaction involves a loop of execution and evaluation. The machine offers feed forward
and the user manipulates interface objects (execution). The machine displays feedback and new feed
forward (evaluation), the user adjusts the manipulation, and so on. The user produces gestures and the
machine interprets them based on the properties of the gestures that it recognizes.
For a successful interaction, the machine needs an input device to collect gesture properties. The
gesture software analyses those properties, compares them to the properties of pre-defined gesture
commands, and then invokes the associated functions.
Figure 1 illustrates a model of human-machine interaction based on a gesture-based interface. It presents
a schematic diagram of relationships among the user, gesture command, input device and machine
(ICT system) when the user utilizes a gesture-based interface during human-machine interaction. The
gesture-based interface includes hardware (physical) and software (logical) components. The input
device is the hardware which recognizes the gesture and sends its associated input signal to the ICT
system. The gesture software finds a command which is pre-defined and mapped to the input signal.
The application software generates its feedback to the user using the output device.
[Figure 1 depicts the user producing a gesture that is captured by the input device, which sends an input signal to the gesture software within the ICT device; the gesture software issues a command to the application software, which responds to the user through the output device.]
Figure 1 — Loop of human-machine interaction with a gesture-based interface
4.2 User’s actions for gesture input
A user generates actions for gesture input such as two-dimensional motions relative to a supporting
surface, two-dimensional or three-dimensional finger/hand/body postures or motions in space, postures
or motions of fingers on a surface, and so on. A gesture can also be generated with a tool acting as an
extension of the body (such as a wand, a pen, a mouse, a remote control, or a glove).
Some gestures are controlled by a discrete body part, such as one finger, several fingers, or a hand
movement, or by fingers associated with a hand movement. Facial expressions, eye gaze, and eyelid blinking
can also provide a user’s action for gesture input. Other gestures might be generated with the whole body
or with the coordination of several body parts, such as arms, hands, and fingers.
Physiological constraints that apply to gesture generation are important to take into account before
defining gestures. For example, some gestures are difficult to produce with a mouse in the hand on
a 2D surface, yet easy to produce with a finger on a 2D surface.
All gestures invol
...