On the precision of third person perspective augmented reality for target designation tasks
University of Gävle, Faculty of Engineering and Sustainable Development, Department of Industrial Development, IT and Land Management, Computer science. Centre for Image Analysis, Uppsala University, Uppsala, Sweden. (Geospatial informationsteknik)
University of Gävle, Faculty of Engineering and Sustainable Development, Department of Industrial Development, IT and Land Management, Computer science. Centre for Image Analysis, Uppsala University. (Geospatial informationsteknik) ORCID iD: 0000-0003-0085-5829
2017 (English) In: Multimedia Tools and Applications, ISSN 1380-7501, E-ISSN 1573-7721, Vol. 76, no. 14, p. 15279-15296. Article in journal (Refereed). Published.
Abstract [en]

The availability of powerful consumer-level smart devices and off-the-shelf software frameworks has tremendously popularized augmented reality (AR) applications. However, since the built-in cameras typically have rather limited field of view, it is usually preferable to position AR tools built upon these devices at a distance when large objects need to be tracked for augmentation. This arrangement makes it difficult or even impossible to physically interact with the augmented object. One solution is to adopt third person perspective (TPP), with which the smart device shows in real time the object to be interacted with, the AR information and the user herself, all captured by a remote camera. Through mental transformation between the user-centric coordinate space and the coordinate system of the remote camera, the user can directly interact with objects in the real world. To evaluate user performance under this cognitively demanding situation, we developed such an experimental TPP AR system and conducted experiments which required subjects to make markings on a whiteboard according to virtual marks displayed by the AR system. The same markings were also made manually with a ruler. We measured the precision of the markings as well as the time to accomplish the task. Our results show that although the AR approach was on average around half a centimeter less precise than the manual measurement, it was approximately three times as fast as the manual counterpart. We also found that subjects could quickly adapt to the mental transformation between the two coordinate systems.
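In geometric terms, the mental transformation the abstract describes corresponds to a rigid transform between the remote camera's coordinate frame and the user-centric (world) frame. The following is a minimal sketch of such a mapping, not the paper's actual implementation; the rotation, translation, and target values below are hypothetical examples chosen for illustration.

```python
import math

def rot_z(theta):
    """3x3 rotation matrix about the z-axis (angle in radians)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def cam_to_world(R, t, p):
    """Map point p from the camera frame to the world frame: R @ p + t."""
    return [sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]

# Hypothetical setup: the remote camera is rotated 90 degrees about z and
# offset 2 m along x relative to the world (whiteboard) frame.
R = rot_z(math.pi / 2)
t = [2.0, 0.0, 0.0]

# A virtual mark observed at (1, 0, 0) in the camera frame lands at
# (2, 1, 0) in the world frame, where the user would place the marking.
target_world = cam_to_world(R, t, [1.0, 0.0, 0.0])
```

In the experiment, the user performs this mapping mentally rather than computationally; the sketch only makes explicit the geometry being internalized.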

Place, publisher, year, edition, pages
2017. Vol. 76, no 14, p. 15279-15296
Keywords [en]
Augmented reality, Third person perspective, Target designation, Precision study, Experiment
National Category
Human Computer Interaction; Computer Sciences
Research subject
Sustainable Urban Development; Intelligent Industry
Identifiers
URN: urn:nbn:se:hig:diva-22349
DOI: 10.1007/s11042-016-3817-0
ISI: 000404609900004
Scopus ID: 2-s2.0-84984843529
OAI: oai:DiVA.org:hig-22349
DiVA id: diva2:957928
Available from: 2016-09-05. Created: 2016-09-05. Last updated: 2022-09-19. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Liu, Fei; Seipel, Stefan
