ABSTRACT
Tactile displays have predominantly been used for information transfer via patterns or as assistive feedback during interactions. With recent advances in hardware for conveying increasingly rich tactile information that mirrors visual information, and the growing viability of wearables that remain in constant contact with the skin, there is a compelling argument for exploring tactile interactions as rich as their visual counterparts. Direct Manipulation underlies many of the advances in visual interaction. In this work, we introduce the concept of a Direct Manipulation-enabled Tactile display (DMT). We define the concepts of a tactile screen, tactile pixel, tactile pointer, and tactile target, which together enable tactile pointing, selection, and drag & drop. We build a proof-of-concept tactile display and study its precision limits. We further develop a performance model for DMTs based on a tactile target acquisition study. Finally, we study user performance in a real-world DMT menu application. The results show that users are able to use the application with relative ease and speed.
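To make the abstract's vocabulary concrete, the following is a minimal illustrative sketch (not the authors' implementation): a one-dimensional "tactile screen" of discrete tactors ("tactile pixels"), a movable "tactile pointer", and a "tactile target", supporting pointing, selection, and drag & drop. All class and method names, and the screen size, are hypothetical.

```python
class TactileScreen:
    """A row of n tactors; one tactor renders the pointer, one the target."""

    def __init__(self, n_tactors: int = 8):
        self.n = n_tactors
        self.pointer = 0              # index of the tactor rendering the pointer
        self.target = n_tactors - 1   # index of the tactor rendering the target
        self.dragging = False         # True while the target is "picked up"

    def move(self, delta: int) -> int:
        """Move the pointer, clamped to the screen; carry the target if dragging."""
        self.pointer = max(0, min(self.n - 1, self.pointer + delta))
        if self.dragging:
            self.target = self.pointer
        return self.pointer

    def select(self) -> bool:
        """Selection succeeds when the pointer sits on the target."""
        return self.pointer == self.target

    def grab(self) -> bool:
        """Begin a drag if the pointer is on the target."""
        self.dragging = self.select()
        return self.dragging

    def drop(self) -> None:
        """End the drag, leaving the target at its current tactor."""
        self.dragging = False


# Pointing, selection, then drag & drop on an 8-tactor screen:
screen = TactileScreen(n_tactors=8)
screen.move(7)             # point at the rightmost tactor
assert screen.select()     # pointer coincides with the target
screen.grab()
screen.move(-3)            # drag the target three tactors to the left
screen.drop()
print(screen.target)       # 4
```

The three interaction phases (out of range, tracking, dragging) mirror a three-state graphical-input model; here state is reduced to a single `dragging` flag for brevity.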
Index Terms
- Direct Manipulation in Tactile Displays