DOI: 10.1145/2556288.2556955

Duet: exploring joint interactions on a smart phone and a smart watch

Published: 26 April 2014

ABSTRACT

The emergence of smart devices (e.g., smart watches and smart eyewear) is redefining mobile interaction from the solo performance of a smart phone, to a symphony of multiple devices. In this paper, we present Duet -- an interactive system that explores a design space of interactions between a smart phone and a smart watch. Based on the devices' spatial configurations, Duet coordinates their motion and touch input, and extends their visual and tactile output to one another. This transforms the watch into an active element that enhances a wide range of phone-based interactive tasks, and enables a new class of multi-device gestures and sensing techniques. A technical evaluation shows the accuracy of these gestures and sensing techniques, and a subjective study on Duet provides insights, observations, and guidance for future work.
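The abstract's idea of coordinating motion input across two paired devices can be made concrete with a small sketch. The Python snippet below is not from the paper; the device names, thresholds, and event format are assumptions made purely for illustration. It shows one simple way to decide that a phone and a watch experienced a sharp acceleration peak at nearly the same moment, the kind of cross-device cue that joint gestures can build on.

```python
# Hypothetical sketch (not the authors' code): detect a near-simultaneous
# acceleration peak on a paired phone and watch. Thresholds, device names,
# and the MotionEvent format are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class MotionEvent:
    device: str             # "phone" or "watch"
    timestamp_ms: int       # assumed already synchronized across devices
    accel_magnitude: float  # |a| in m/s^2, gravity removed

PEAK_THRESHOLD = 15.0  # assumed: magnitude that counts as a sharp bump
MAX_SKEW_MS = 100      # assumed: peaks closer than this count as joint

def detect_joint_bump(phone_events, watch_events):
    """Return True if both devices report a near-simultaneous peak."""
    phone_peaks = [e.timestamp_ms for e in phone_events
                   if e.accel_magnitude > PEAK_THRESHOLD]
    watch_peaks = [e.timestamp_ms for e in watch_events
                   if e.accel_magnitude > PEAK_THRESHOLD]
    return any(abs(p - w) <= MAX_SKEW_MS
               for p in phone_peaks for w in watch_peaks)

if __name__ == "__main__":
    phone = [MotionEvent("phone", 1000, 3.2), MotionEvent("phone", 1050, 18.4)]
    watch = [MotionEvent("watch", 1020, 2.1), MotionEvent("watch", 1060, 17.9)]
    print(detect_joint_bump(phone, watch))  # True: peaks are 10 ms apart
```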


Supplemental Material

pn0142-file3.mp4 (mp4, 88.7 MB)
p159-sidebyside.mp4 (mp4, 171.6 MB)

References

  1. Amento, B., Hill, W., and Terveen, L. The sound of one hand. CHI '02, 724--725.
  2. Ballagas, R., Borchers, J., Rohs, M., and Sheridan, J.G. The Smart Phone. IEEE Pervasive Computing 5, 1 (2006), 70--77.
  3. Baudisch, P. and Chu, G. Back-of-device interaction allows creating very small touch devices. CHI '09, 1923--1932.
  4. Butler, A., Izadi, S., and Hodges, S. SideSight. UIST '08, 201--204.
  5. Buxton, W. Integrating the periphery and context: A new taxonomy of telematics. GI '95, 239--246.
  6. Buxton, W.A.S. Chunking and phrasing and the design of human-computer dialogues. IFIP '86, 494--499.
  7. Chen, G. and Kotz, D. A survey of context-aware mobile computing research. Technical Report TR2000-381, Dept. of Computer Science, Dartmouth College, 2000.
  8. Crossan, A., Williamson, J., Brewster, S., and Murray-Smith, R. Wrist rotation for interaction in mobile contexts. MobileHCI '08, 435--438.
  9. Falk, J. The conversational duet. Proceedings of the Annual Meeting of the Berkeley Linguistics Society, 2011.
  10. Fitzmaurice, G.W. Situated information spaces and spatially aware palmtop computers. CACM 36, 7 (1993), 39--49.
  11. Gratier, M. Grounding in musical interaction: Evidence from jazz performances. Musicae Scientiae 12, 1 Suppl (2008), 71--110.
  12. Harrison, C. and Hudson, S.E. Abracadabra. UIST '09, 121--124.
  13. Harrison, C., Schwarz, J., and Hudson, S.E. TapSense. UIST '11, 627--634.
  14. Hinckley, K., Pierce, J., Horvitz, E., and Sinclair, M. Foreground and background interaction with sensor-enhanced mobile devices. TOCHI 12, 1 (2005), 31--52.
  15. Hinckley, K., Pierce, J., Sinclair, M., and Horvitz, E. Sensing techniques for mobile interaction. CHI '00, 91--100.
  16. Hinckley, K., Ramos, G., Guimbretiere, F., Baudisch, P., and Smith, M. Stitching. AVI '04, 23--30.
  17. Hinckley, K. Synchronous gestures for multiple persons and computers. UIST '03, 149--158.
  18. Holmquist, L.E., Mattern, F., Schiele, B., Alahuhta, P., Beigl, M., and Gellersen, H. Smart-Its Friends. Ubicomp '01, 116--122.
  19. Hudson, S.E., Harrison, C., Harrison, B.L., and LaMarca, A. Whack gestures. TEI '10, 109--112.
  20. Ishiguro, Y., Mujibiya, A., Miyaki, T., and Rekimoto, J. Aided eyes. AH '10, 1--7.
  21. Jones, B., Sodhi, R., Forsyth, D., Bailey, B., and Maciocci, G. Around device interaction for multiscale navigation. MobileHCI '12, 83--92.
  22. Kim, D., Hilliges, O., Izadi, S., et al. Digits. UIST '12, 167--176.
  23. Kim, J., He, J., Lyons, K., and Starner, T. The Gesture Watch. ISWC '07, 1--8.
  24. Kortuem, G., Kray, C., and Gellersen, H. Sensing and visualizing spatial relations of mobile devices. UIST '05, 93.
  25. Kray, C., Rohs, M., Hook, J., and Kratz, S. Group coordination and negotiation through spatial proximity regions around mobile devices on augmented tabletops. 3rd IEEE International Workshop on Horizontal Interactive Human Computer Systems, 1--8.
  26. Lee, S.C., Li, B., and Starner, T. AirTouch. ISWC '11, 3--10.
  27. Lucero, A., Keränen, J., and Korhonen, H. Collaborative use of mobile phones for brainstorming. MobileHCI '10, 337.
  28. Lyons, K., Nguyen, D., Ashbrook, D., and White, S. Facet. UIST '12, 123--130.
  29. Mann, S. Smart clothing. CACM 39, 8 (1996), 23--24.
  30. Mann, S. 'WearCam' (The Wearable Camera). ISWC '98, 124--131.
  31. Marquardt, N., Ballendat, T., Boring, S., Greenberg, S., and Hinckley, K. Gradual engagement. ITS '12, 31--40.
  32. Maurer, U., Rowe, A., Smailagic, A., and Siewiorek, D.P. eWatch. BSN '06, 142--145.
  33. McCallum, D.C. and Irani, P. ARC-Pad. UIST '09, 153.
  34. Merrill, D., Kalanithi, J., and Maes, P. Siftables. TEI '07, 75--78.
  35. Ni, T. and Baudisch, P. Disappearing mobile devices. UIST '09, 101--110.
  36. Oney, S., Harrison, C., Ogan, A., and Wiese, J. ZoomBoard. CHI '13, 2799--2803.
  37. Pebble. Pebble E-Paper Watch. http://getpebble.com/.
  38. Post, E.R. and Orth, M. Smart fabric, or 'wearable clothing.' ISWC '97, 167--168.
  39. Rahman, M., Gustafson, S., Irani, P., and Subramanian, S. Tilt techniques. CHI '09, 1943--1952.
  40. Rekimoto, J. Tilting operations for small screen interfaces. UIST '96, 167--168.
  41. Rekimoto, J. Pick-and-drop. UIST '97, 31--39.
  42. Rekimoto, J. GestureWrist and GesturePad. ISWC '01, 21--27.
  43. Ruiz, J. and Li, Y. DoubleFlip. CHI '11, 2717--2720.
  44. Santosa, S. and Wigdor, D. A field study of multi-device workflows in distributed workspaces. UbiComp '13, 63--72.
  45. Schilit, B., Adams, N., and Want, R. Context-Aware Computing Applications. First Workshop on Mobile Computing Systems and Applications, 85--90.
  46. Schmidt, D., Seifert, J., Rukzio, E., and Gellersen, H. A cross-device interaction style for mobiles and surfaces. DIS '12, 318--327.
  47. Siek, K.A., Rogers, Y., and Connelly, K.H. Fat finger worries. INTERACT '05, 267--280.
  48. Streitz, N.A., Konomi, S., and Burkhardt, H.-J. Roomware for cooperative buildings. Cooperative Buildings: Integrating Information, Organization, and Architecture, 1998, 4--21.
  49. Strohmeier, P., Vertegaal, R., and Girouard, A. With a flick of the wrist. TEI '12, 307--308.
  50. Vogel, D. and Baudisch, P. Shift. CHI '07, 657--666.
  51. Wigdor, D., Forlines, C., Baudisch, P., Barnwell, J., and Shen, C. Lucid touch. UIST '07, 269--278.
  52. Yee, K.-P. Peephole displays. CHI '03, 1--9.


    Reviews

    JOSE CARLOS MORENO UBEDA

    The main idea of this paper is very interesting. A connection is established between a smart watch and a smart phone so that the two devices share a common interface and act as a single, unified platform, using the accelerometers present in both devices to recognize user gestures. These gestures are proposed as a way to overcome the limitations of direct touch on wrist-worn devices. The paper begins with a very good review of the literature on interaction techniques for handheld and wrist-worn devices, both individually and in combination. Some of these ideas already appear in individual commercial devices: the Pebble smart watch can be shaken to turn on the backlight, iOS devices can be shaken to undo the last operation, and similar actions can initiate file transfers between a pair of Android devices. In fact, Pebble (and similar watches) should appear in the paper (in Table 1) as devices used for context and activity sensing (with applications such as Morpheuz, SwimIO, and others), operating in the background according to Buxton's framework [1].

    The authors provide a set of six gesture types for interaction between the watch and the phone, along with gesture recognition methods based on the movement, orientation, and touch of both devices. Their recognition system is implemented with machine learning techniques and hard-coded heuristics. More than 7,000 data points were collected and analyzed, yielding over 90 percent mean accuracy in detecting the gestures across different people. A set of applications (Duet) is used to test the usability of the proposal. The paper is highly applicable to contemporary mobile devices.

    Online Computing Reviews Service
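As a companion to the review's description of the recognizer (joint motion, orientation, and touch features from both devices, machine learning plus hard-coded heuristics, and over 90 percent mean accuracy on more than 7,000 labelled samples), here is a hedged sketch of how such a gesture classifier could be trained and cross-validated. It is not the authors' pipeline: the feature layout, the random-forest model, and the synthetic data are assumptions made purely for illustration.

```python
# Hypothetical sketch (assumptions throughout): train a classifier on joint
# phone+watch motion features, in the spirit of the machine-learning
# recognizer the review describes. Feature layout, model choice, and the
# synthetic data below are illustrative, not the authors' pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

N_SAMPLES = 600   # stand-in for the ~7,000 labelled points mentioned
N_CLASSES = 6     # the review mentions six gesture types
# Assumed feature vector: summary statistics of acceleration and orientation
# from each device, concatenated (phone features + watch features).
N_FEATURES = 12

X = rng.normal(size=(N_SAMPLES, N_FEATURES))
y = rng.integers(0, N_CLASSES, size=N_SAMPLES)
# Shift each class's features so the synthetic problem is learnable.
X += y[:, None] * 0.8

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

In a real deployment the heuristic checks (e.g., the peak-correlation test sketched after the abstract) would handle simple synchronous events, with the learned classifier reserved for the harder joint gestures.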

    • Published in

      CHI '14: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
      April 2014
      4206 pages
      ISBN: 9781450324731
      DOI: 10.1145/2556288

      Copyright © 2014 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 26 April 2014


      Qualifiers

      • research-article

      Acceptance Rates

      CHI '14 Paper Acceptance Rate: 465 of 2,043 submissions, 23%
      Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%
