Intuitive Gesture Responses to Public Walk-Up-and-Use Interactions

Description
Technological advances in the past decade have prompted changes in how devices are designed for usability. Physical human interaction is becoming a popular way to communicate with user interfaces, ranging from touch-based devices such as an iPad or tablet to free-space gesture systems such as the Microsoft Kinect. With the rising popularity of these devices comes their increased presence in public areas. Public areas frequently feature walk-up-and-use displays, which give many people the opportunity to interact with them. Walk-up-and-use displays are intended to be simple enough that any individual, regardless of prior experience with similar technology, can successfully operate the system. While this should be easy for the people using it, it is a more complicated task for the designers, who must create an interface that is simple to use while still accomplishing the tasks it was built to complete. A central question addressed in this thesis is how a system designer knows which gestures the interface should be programmed to recognize. Gesture elicitation is a widely used method for discovering common, intuitive gestures that can be used with public walk-up-and-use interactive displays. In this paper, I present a study to elicit common, intuitive gestures for various tasks, an analysis of the responses, and suggestions for future designs of interactive, public, walk-up-and-use displays.
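Gesture elicitation studies commonly quantify how much participants converge on the same gesture for a task using an agreement rate: for each task, proposals are grouped by identical gestures and the squared proportion of each group is summed. The abstract does not give its analysis method, so the snippet below is only an illustrative sketch of that standard metric; the gesture labels and counts are invented examples, not data from the study.

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate for one task: sum over groups of identical
    gestures of (group size / total proposals) squared.
    Ranges from near 0 (all different) to 1.0 (all identical)."""
    counts = Counter(proposals)
    total = len(proposals)
    return sum((c / total) ** 2 for c in counts.values())

# Hypothetical proposals from 15 participants for a "scroll" task
scroll = ["swipe"] * 9 + ["point-drag"] * 4 + ["wave"] * 2
print(round(agreement_rate(scroll), 3))  # → 0.449
```

A higher score suggests a gesture that many walk-up users would guess on their own, which is the property a public display designer is after.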
Date Created
2015-05
Efficient Gestures In Users' Preference, Health, And Natural Inclination For Non-Touch-Based Interface

Description
With the many technologies introduced in recent years, gesture-based human-computer interaction is becoming a new avenue for users to communicate and interact with devices. Because the way free-space gestures are defined influences users' preferences and the long-term usability of gesture-driven devices, it is necessary to identify low-stress, intuitive gestures for interacting with gesture recognition systems. To measure stress, a Galvanic Skin Response instrument was used as the primary indicator; it provided evidence of the relationship between stress and intuitive gestures, as well as of user preferences toward certain tasks and gestures during performance. Fifteen participants created and performed their own gestures for tasks that would be required when using free-space gesture-driven devices: "activation of the display," scroll, page, selection, undo, and "return to main menu." They were also asked to repeat each gesture for around ten seconds, giving them time and further insight into whether the gesture was appropriate for them and for the given task. Surveys were administered at two points: once after participants had defined their gestures and again after they had repeated them. In the surveys, participants ranked their gestures by comfort, intuition, and ease of communication. From these user-ranked gestures, the highest-ranked ones were selected as health-efficient gestures, given that the rankings were based on comfort and intuition.
Date Created
2015-05