Gesture Alphabet

Concept
We propose a gesture alphabet to describe gestures used to interact with an elastic display. The alphabet contains 7 basic hand postures, which can be classified into single finger, multiple finger, and hand postures, reflecting the area touched on the display when performing an interaction. According to Kammer et al.'s classification in Investigating Gestures on Elastic Tabletops, elastic gestures also differ in the type of deformation they cause on the display (push, touch, or pull). Furthermore, gestures can be static, dynamic, or circular, and can be performed with one or both hands. In real application scenarios, several users often interact with the application simultaneously.
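To make the structure of the alphabet concrete, the following sketch models its dimensions as TypeScript types. The type and field names (PostureClass, Deformation, GestureDescriptor, GestureEvent, and so on) are illustrative assumptions; only the dimensions themselves (posture class, deformation type, movement, handedness, multi-user attribution) come from the alphabet described above.

```typescript
// A minimal sketch of the gesture alphabet as a TypeScript type model.
// All names are illustrative assumptions, not part of the alphabet's
// formal definition.

/** Posture classes grouping the 7 basic hand postures by touched area. */
export type PostureClass = "single-finger" | "multiple-finger" | "hand";

/** Type of deformation the gesture causes on the elastic surface. */
export type Deformation = "push" | "touch" | "pull";

/** Movement character of the gesture over time. */
export type Movement = "static" | "dynamic" | "circular";

/** One gesture of the alphabet, described along its four dimensions. */
export interface GestureDescriptor {
  posture: PostureClass;
  deformation: Deformation;
  movement: Movement;
  hands: "one" | "both";
}

/** In multi-user scenarios, each observed gesture is attributed to a user. */
export interface GestureEvent {
  gesture: GestureDescriptor;
  userId: number;
  timestamp: number; // e.g. milliseconds since session start
}
```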
Research Background
Gestures performed on an elastic display are a subset of the gestures described in HCI research. For example, the interaction does not include gestures performed in mid-air or beat gestures that mark the rhythm of speech. Elastic displays are instead suited to gestures that can be performed by physically touching and shaping a malleable surface. In the research literature, such gestures are most often referred to as deictic gestures (pointing), iconic gestures (gestures that follow conventions to illustrate speech, such as a thumbs-up), and symbolic/emblematic gestures (gestures independent of speech, such as sign language).
The gesture alphabet is derived from the 27 single- and two-handed gestures defined by Troiano et al. in User-Defined Gestures for Elastic, Deformable Displays and the collection of 10 gestures by Dand and Hemsley in Obake: Interactions on a 2.5D Elastic Display. When combining the two collections, we adopted only those gestures that have proven useful in our experience.
Examples
Two illustrative gestures expressed in the alphabet's terms: a single finger pushed into the surface and held still (single finger, push, static, one hand), and both hands grabbing the fabric and pulling it while moving (hand, pull, dynamic, both hands).
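These two gestures could be encoded with the GestureDescriptor sketch from the Concept section as follows; the module path and constant names are assumptions for illustration.

```typescript
// Hypothetical encodings using the GestureDescriptor sketch above;
// the module path "./gesture-alphabet" is assumed for illustration.
import { GestureDescriptor } from "./gesture-alphabet";

// A single finger pressed into the surface and held still.
const pressAndHold: GestureDescriptor = {
  posture: "single-finger",
  deformation: "push",
  movement: "static",
  hands: "one",
};

// Both hands grabbing the fabric and pulling it while moving.
const twoHandedPull: GestureDescriptor = {
  posture: "hand",
  deformation: "pull",
  movement: "dynamic",
  hands: "both",
};
```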