
dc.contributor.author: Moran Ledesma, Marco Aurelio
dc.contributor.author: Schneider, Oliver
dc.contributor.author: Hancock, Mark
dc.date.accessioned: 2021-12-21 15:31:42 (GMT)
dc.date.available: 2021-12-21 15:31:42 (GMT)
dc.date.issued: 2021-11-05
dc.identifier.uri: https://doi.org/10.1145/3486954
dc.identifier.uri: http://hdl.handle.net/10012/17792
dc.description: ©2021 Association for Computing Machinery. This is the author’s version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in Proceedings of the ACM on Human-Computer Interaction, https://doi.org/10.1145/3486954.
dc.description.abstract: When interacting with virtual reality (VR) applications like CAD and open-world games, people may want to use gestures as a means of leveraging their knowledge from the physical world. However, people may prefer physical props over handheld controllers to input gestures in VR. We present an elicitation study where 21 participants chose from 95 props to perform manipulative gestures for 20 CAD-like and open-world game-like referents. When analyzing this data, we found existing methods for elicitation studies were insufficient to describe gestures with props, or to measure agreement with prop selection (i.e., agreement between sets of items). We proceeded by describing gestures as context-free grammars, capturing how different props were used in similar roles in a given gesture. We present gesture and prop agreement scores using a generalized agreement score that we developed to compare multiple selections rather than a single selection. We found that props were selected based on their resemblance to virtual objects and the actions they afforded; that gesture and prop agreement depended on the referent, with some referents leading to similar gesture choices, while others led to similar prop choices; and that a small set of carefully chosen props can support multiple gestures.
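The abstract's idea of measuring agreement over sets of selected props (rather than a single selection per participant) can be illustrated with a small sketch. The snippet below is not the measure defined in the paper; it assumes a pairwise formulation in which each participant's choice for a referent is a set of props and each pair of participants is scored by Jaccard overlap, which is one natural way to generalize exact-match pairwise agreement to multiple selections. The function names and example props are hypothetical.

```python
# Minimal sketch (not the authors' published formula): a set-valued
# generalization of pairwise agreement for elicitation studies.
# Classical agreement compares pairs of proposals for exact equality;
# here each pair is scored by Jaccard similarity, so partially
# overlapping prop sets contribute partial agreement.
from itertools import combinations

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity |a ∩ b| / |a ∪ b|; 1.0 if both sets are empty."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def generalized_agreement(proposals: list) -> float:
    """Mean pairwise Jaccard similarity over all participant pairs for one
    referent. Reduces to the usual exact-match pairwise agreement rate when
    every proposal is a single-element set."""
    pairs = list(combinations(proposals, 2))
    if not pairs:
        return 1.0
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# Hypothetical example: three participants choosing props for a
# "scale object" referent.
if __name__ == "__main__":
    proposals = [
        {"foam cube", "ruler"},
        {"foam cube"},
        {"ruler", "rubber band"},
    ]
    print(f"Agreement for referent: {generalized_agreement(proposals):.3f}")
```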
dc.description.sponsorship: NSERC, Discovery Grant 2016-04422 || NSERC, Discovery Grant 2019-06589 || NSERC, Discovery Accelerator Grant 492970-2016 || NSERC, CREATE Saskatchewan-Waterloo Games User Research (SWaGUR) Grant 479724-2016 || Ontario Ministry of Colleges and Universities, Ontario Early Researcher Award ER15-11-184
dc.language.iso: en
dc.publisher: ACM
dc.relation.ispartofseries: Proceedings of the ACM on Human-Computer Interaction
dc.subject: human-computer interaction
dc.subject: similarity measures
dc.subject: immersive interaction
dc.subject: agreement score
dc.subject: elicitation technique
dc.subject: virtual reality
dc.subject: gestural input
dc.subject: 3D physical props
dc.subject: Research Subject Categories::TECHNOLOGY::Information technology::Computer science::Computer science
dc.title: User-Defined Gestures with Physical Props in Virtual Reality
dc.type: Article
dcterms.bibliographicCitation: Marco Moran-Ledesma, Oliver Schneider, and Mark Hancock. 2021. User-Defined Gestures with Physical Props in Virtual Reality. Proc. ACM Hum.-Comput. Interact. 5, ISS, Article 488 (November 2021), 23 pages. DOI: https://doi.org/10.1145/3486954
uws.contributor.affiliation1: Faculty of Engineering
uws.contributor.affiliation2: Games Institute
uws.contributor.affiliation2: Management Sciences
uws.contributor.affiliation2: Systems Design Engineering
uws.typeOfResource: Text
uws.peerReviewStatus: Reviewed
uws.scholarLevel: Faculty
uws.scholarLevel: Graduate



