Title:
Novel gestures for wearables

dc.contributor.advisor Inan, Omer T.
dc.contributor.author Zhang, Cheng
dc.contributor.committeeMember Starner, Thad E.
dc.contributor.committeeMember Harrison, Chris
dc.contributor.committeeMember Plötz, Thomas
dc.contributor.department Interactive Computing
dc.date.accessioned 2019-05-29T13:59:04Z
dc.date.available 2019-05-29T13:59:04Z
dc.date.created 2018-05
dc.date.issued 2018-04-09
dc.date.submitted May 2018
dc.date.updated 2019-05-29T13:59:04Z
dc.description.abstract Wearable computing is an inevitable part of the next generation of computing [abowd2016beyon]. Compared with traditional computers (e.g., laptops, smartphones), wearable devices are much smaller, creating new challenges for the design of both hardware and software. Providing appropriate input capabilities for wearables is one such challenge. Input techniques that have proven efficient on traditional computing devices (e.g., the keyboard, the touchscreen) are not appropriate for wearables for several reasons. One is the inherently small size of wearables; for instance, it is impossible to place a physical keyboard on a wearable device. Most commodity wearable devices, such as the smartwatch and Google Glass, adopt touch-based input, which suffers from a small operation area and a limited input vocabulary. The other reason is the more dynamic environment in which wearables are used; for instance, wearable devices are expected to remain functional and efficient even while the user is in motion (e.g., walking). Compared with input on a physical keyboard or touchscreen, gesture-based input gives the user much greater freedom of operation, which can improve the interaction experience. In this thesis, I explore the design and implementation of novel gestures to address the input challenges on wearables: from using the built-in sensors of an off-the-shelf device to building customized hardware; from 2D on-body interaction to 3D input with greater freedom of operation; and from recognizing predefined, discrete hand or finger gestures with machine learning to providing continuous input tracking through a deeper understanding of the underlying physics. I start by exploring the natural and novel input gestures that can be supported using only the built-in sensors of a smartwatch. I describe WatchOut and TapSkin, which allow user input on the watch case, the band, and the skin around the watch. Although using only the built-in sensors is more practical, the richness of the input gestures and the recognition performance are limited by the available sensors. To better address the input challenge on wearables, I designed and implemented another set of wearable input techniques using customized hardware (e.g., a thumb-mounted ring), which provide new input gestures not available on commodity devices, such as entering number digits and Graffiti-style characters, menu selection, and quick responses that protect users' privacy. However, these complementary gestures can only partially improve the interaction experience. To fundamentally address the input challenge on wearables, new interaction paradigms are needed to replace touch-based on-device interaction. Such an interaction paradigm is usually composed of low-level, high-resolution input events; for instance, the press and release of the mouse buttons are the low-level input events of the WIMP interface. This leads to the last part of my dissertation work: providing low-level input events for wearables, such as continuously tracking the 3D position of different body parts of interest using wearable sensors. I also demonstrate how such low-level input events can be used to design interactions on wearables.
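The abstract describes recognizing discrete gestures from a smartwatch's built-in sensors with machine learning. The sketch below is a minimal, hypothetical illustration of that kind of pipeline, not the dissertation's actual implementation: the sensor layout (3-axis accelerometer plus 3-axis gyroscope), window length, feature set, and classifier choice are all assumptions made for the example.

```python
# Hypothetical sketch: classifying discrete wearable gestures (e.g., taps on the
# watch case or the skin around the watch) from windows of smartwatch IMU data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def extract_features(window):
    """Summarize one window of IMU samples (shape: samples x 6, for a
    3-axis accelerometer plus 3-axis gyroscope) with simple statistics."""
    return np.concatenate([
        window.mean(axis=0),                      # average motion per axis
        window.std(axis=0),                       # variability per axis
        window.max(axis=0) - window.min(axis=0),  # peak-to-peak range per axis
    ])

# Synthetic stand-in data: 200 labeled gesture windows of 50 samples each.
rng = np.random.default_rng(0)
windows = rng.normal(size=(200, 50, 6))
labels = rng.integers(0, 4, size=200)             # e.g., 4 gesture classes

X = np.array([extract_features(w) for w in windows])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, labels)

# At run time, each new sensor window would be featurized the same way and
# passed to clf.predict() to trigger the corresponding input event.
```

In a deployed system, the window segmentation and labeled training data would come from the device's actual sensor stream and a user study; the synthetic arrays here only stand in so the example runs end to end.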
dc.description.degree Ph.D.
dc.format.mimetype application/pdf
dc.identifier.uri http://hdl.handle.net/1853/61142
dc.language.iso en_US
dc.publisher Georgia Institute of Technology
dc.subject Wearables
dc.subject Gestures
dc.title Novel gestures for wearables
dc.type Text
dc.type.genre Dissertation
dspace.entity.type Publication
local.contributor.advisor Inan, Omer T.
local.contributor.corporatename College of Computing
local.contributor.corporatename School of Interactive Computing
relation.isAdvisorOfPublication fb82ce90-ad3a-45a6-b0e2-f1ee6fe6f744
relation.isOrgUnitOfPublication c8892b3c-8db6-4b7b-a33a-1b67f7db2021
relation.isOrgUnitOfPublication aac3f010-e629-4d08-8276-81143eeaf5cc
thesis.degree.level Doctoral
Files
Original bundle
Name: ZHANG-DISSERTATION-2018.pdf
Size: 17.85 MB
Format: Adobe Portable Document Format
License bundle
Name: LICENSE.txt
Size: 3.86 KB
Format: Plain Text