ABSTRACT
In this paper we investigate user performance and behavior related to the question of who should control the point of view in a remote assistance scenario. We describe an experiment in which users completed two different tasks with the aid of a remote gesturing device under two conditions: with control of the camera and gesturing point of view in the hands of the remote helper, and with control in the hands of the local worker. The results indicate that, in general, when most of the task knowledge resides with the helper, it is preferable to leave control in the helper's hands. However, these results may depend on the situation and the task at hand.
Index Terms
- Ownership and control of point of view in remote assistance