Augmented reality views for occluded interaction

Klemen Lilija, Henning Pohl, Sebastian Boring, Kasper Hornbæk

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

23 Citations (Scopus)

Abstract

We rely on our sight when manipulating objects. When objects are occluded, manipulation becomes difficult. Such occluded objects can be shown via augmented reality to re-enable visual guidance. However, it is unclear how to do so to best support object manipulation. We compare four views of occluded objects and their effect on performance and satisfaction across a set of everyday manipulation tasks of varying complexity. The best performing views were a see-through view and a displaced 3D view. The former enabled participants to observe the manipulated object through the occluder, while the latter showed the 3D view of the manipulated object offset from the object’s real location. The worst performing view showed remote imagery from a simulated hand-mounted camera. Our results suggest that alignment of virtual objects with their real-world location is less important than an appropriate point-of-view and view stability.

Original language: English
Title of host publication: CHI 2019 - Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems
Number of pages: 12
Publisher: Association for Computing Machinery
Publication date: 2019
Article number: 446
ISBN (Electronic): 9781450359702
DOIs
Publication status: Published - 2019
Event: 2019 CHI Conference on Human Factors in Computing Systems, CHI 2019 - Glasgow, United Kingdom
Duration: 4 May 2019 – 9 May 2019

Conference

Conference: 2019 CHI Conference on Human Factors in Computing Systems, CHI 2019
Country/Territory: United Kingdom
City: Glasgow
Period: 04/05/2019 – 09/05/2019
Sponsor: ACM SIGCHI

Keywords

  • Augmented reality
  • Finger-camera
  • Manipulation task
