Abstract:
Mobile Augmented Reality (AR) is most commonly implemented using a camera and a flat screen. Such an implementation removes binocular disparity from the user's view. To compensate, people rely on alternative depth cues (e.g., depth ordering). However, these cues may also be distorted in certain AR implementations, creating depth distortion. One such example is virtual tracing: creating a physical sketch on a 2D or 3D object given a virtual image on a mobile device. When the user's hands and drawn contours are introduced into the scene, rendering the virtual contour with the correct depth order is difficult, as it requires real-time scene reconstruction. In this paper we explore how depth distortion affects 3D virtual tracing by implementing a first-of-its-kind 3D virtual tracing prototype and running an observational study. Contrary to our initial expectations, drawing performance was high, suggesting that the lack of visual depth cues during 3D virtual tracing is not as important as initially expected. We attribute this to the positive impact of proprioception on drawing performance, which was enhanced by holding the object in hand while drawing. As soon as participants were asked to hold the mobile device in their hands while drawing, their performance decreased drastically.