Intuitive Gesture Responses to Public Walk-Up-and-Use Interactions
Description
Technological advances over the past decade are driving changes in how devices are designed for usability. Physical human interaction has become a popular way to communicate with user interfaces, ranging from touch-based devices such as an iPad or tablet to free-space gesture systems such as the Microsoft Kinect. With the rise in popularity of these devices comes their increased presence in public areas. Public areas frequently use walk-up-and-use displays, which give many people the opportunity to interact with them. Walk-up-and-use displays are intended to be simple enough that any individual, regardless of prior experience with similar technology, can successfully operate the system. While this should make the displays easy for users, it is a more complicated task for the designers, who must create an interface that is simple to use while still accomplishing the tasks it was built to complete. A central issue I address in this thesis is how a system designer knows which gestures the interface should be programmed to respond to. Gesture elicitation is one widely used method for discovering common, intuitive gestures that can be used with public walk-up-and-use interactive displays. In this paper, I present a study to elicit common, intuitive gestures for various tasks, an analysis of the responses, and suggestions for future designs of public walk-up-and-use interactions.
Date Created
The date the item was originally created (prior to any relationship with the ASU Digital Repositories).
2015-05
Agent
- Author (aut): Van Horn, Sarah Elizabeth
- Thesis director: Walker, Erin
- Committee member: Danielescu, Andreea
- Contributor (ctb): Economics Program in CLAS
- Contributor (ctb): Department of Finance
- Contributor (ctb): Barrett, The Honors College