DOI: 10.1145/2556288.2557090
MixFab: a mixed-reality environment for personal fabrication

Published: 26 April 2014

ABSTRACT

Personal fabrication machines, such as 3D printers and laser cutters, are becoming increasingly ubiquitous. However, designing objects for fabrication still requires 3D modeling skills, which renders these technologies inaccessible to a wide user group. In this paper, we introduce MixFab, a mixed-reality environment that lowers the barrier to engaging in personal fabrication. Users design objects in an immersive augmented-reality environment, interact with virtual objects through direct gestures, and can effortlessly incorporate existing physical objects into their designs. We describe the design and implementation of MixFab and the user-defined gesture study that informed it, show artifacts designed with the system, and report a user study evaluating the system's prototype.

Supplemental Material: pn0736-file3.mp4 (MP4, 95.2 MB)


Published in
CHI '14: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2014, 4206 pages
ISBN: 9781450324731
DOI: 10.1145/2556288

      Copyright © 2014 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


      Qualifiers

      • research-article

Acceptance Rates

CHI '14 Paper Acceptance Rate: 465 of 2,043 submissions, 23%. Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%.
