ABSTRACT
Many mobile phones integrate services such as personal calendars. Given the social nature of the stored data, however, users often need to access such information as part of a phone conversation. In typical non-headset use, this requires users to interrupt their conversations to look at the screen.
We investigate a counter-intuitive solution: to avoid the need for interruption we replace the visual interface with one based on auditory feedback. Surprisingly, this can be done without interfering with the phone conversation. We present blindSight, a prototype application that replaces the traditionally visual in-call menu of a mobile phone. Users interact using the phone keypad, without looking at the screen. BlindSight responds with auditory feedback. This feedback is heard only by the user, not by the person on the other end of the line.
We present the results of two user studies of our prototype. The first study verifies that useful keypress accuracy can be obtained for the phone-at-ear position. The second study compares the blindSight system against a visual baseline condition and finds a preference for blindSight.
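The interaction model described above can be sketched in code: keypad presses are mapped to auditory feedback that is mixed only into the local earpiece, never into the call's outgoing audio stream. The sketch below is purely illustrative; all names (`KEY_ACTIONS`, `LocalAudioChannel`, `play_local`, and the specific key assignments) are hypothetical and not taken from the paper.

```python
# Hypothetical sketch of the blindSight-style interaction loop: a keypad
# press triggers spoken feedback on the earpiece only, while the uplink
# (what the remote caller hears) carries just the microphone signal.

KEY_ACTIONS = {
    "1": "previous appointment",
    "3": "next appointment",
    "5": "read current appointment",
    "0": "exit calendar",
}

class LocalAudioChannel:
    """Plays feedback on the local earpiece only; nothing is mixed
    into the call's outgoing audio."""
    def __init__(self):
        self.played = []

    def play_local(self, phrase):
        # On a real phone this would synthesize speech to the earpiece.
        self.played.append(phrase)
        return phrase

def handle_keypress(key, channel):
    """Map an eyes-free keypad press to auditory feedback."""
    phrase = KEY_ACTIONS.get(key, "unrecognized key")
    return channel.play_local(phrase)

channel = LocalAudioChannel()
print(handle_keypress("3", channel))  # next appointment
print(handle_keypress("5", channel))  # read current appointment
```

The key design point the sketch captures is the split between the local feedback channel and the call audio: feedback is private to the user, so navigation does not interrupt or leak into the conversation.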
Index Terms
- Blindsight: eyes-free access to mobile phones