DOI: 10.1145/3306214.3338548

Poster

VRProp-net: real-time interaction with virtual props

Published: 28 July 2019

ABSTRACT

Virtual and Augmented Reality (VR and AR) are two fast-growing media, not only in the entertainment industry but also in health, education and engineering. A good VR or AR application seamlessly merges the real and virtual worlds, making the user feel fully immersed. Traditionally, a computer-generated object is interacted with using controllers or hand gestures [HTC 2019; Microsoft 2019; Oculus 2019]. However, these motions can feel unnatural and do not accurately represent the motion of interacting with a real object. On the other hand, a physical object can be used to control the motion of a virtual object. At present, this can be done by tracking purely rigid motion with an external sensor [HTC 2019]. Alternatively, a sparse set of markers can be tracked, for example with a motion capture system, and their positions used to drive the motion of an underlying non-rigid model. However, this approach is sensitive to changes in marker position and to occlusions, and often involves costly non-standard hardware [Vicon 2019]. In addition, these approaches often require a virtual model to be manually sculpted and rigged, which can be a time-consuming process. Neural networks have been shown to be successful tools in computer vision, with several key methods using networks to track rigid and non-rigid motion in RGB images [Andrychowicz et al. 2018; Kanazawa et al. 2018; Pumarola et al. 2018]. While these methods show potential, they are limited by requiring multiple RGB cameras or large, costly amounts of labelled training data.
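The marker-driven approach described above — sparse tracked marker positions driving an underlying non-rigid model — can be sketched as a least-squares fit of blendshape weights to the observed markers. This is only an illustrative assumption about how such a pipeline might work, not the poster's method; the function names and data layout (`base` rest-pose vertices, `shapes` blendshape deltas, `marker_idx` vertex indices carrying markers) are hypothetical:

```python
import numpy as np

def solve_weights(base, shapes, marker_idx, marker_pos):
    """Least-squares fit of blendshape weights to observed marker positions.

    base:       (V, 3) rest-pose mesh vertices
    shapes:     (K, V, 3) blendshape displacement deltas
    marker_idx: (M,) indices of vertices with physical markers attached
    marker_pos: (M, 3) tracked marker positions for the current frame
    """
    # A: (3M, K) — each column is one blendshape's effect on the marker vertices.
    A = shapes[:, marker_idx, :].reshape(len(shapes), -1).T
    # b: observed marker displacement from the rest pose, flattened to (3M,).
    b = (marker_pos - base[marker_idx]).reshape(-1)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

def deform(base, shapes, w):
    """Reconstruct the full deformed mesh from fitted weights."""
    return base + np.tensordot(w, shapes, axes=1)
```

As the abstract notes, a fit like this is sensitive to marker occlusion: dropping rows of `A` for lost markers can quickly make the system under-determined, which is one motivation for learning-based alternatives.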


Supplemental Material

a31-taylor.mp4 (mp4, 20.1 MB)

References

  1. M. Andrychowicz, B. Baker, M. Chociej, R. Józefowicz, B. McGrew, J. W. Pachocki, A. Petron, M. Plappert, G. Powell, A. Ray, J. Schneider, S. Sidor, J. Tobin, P. Welinder, L. Weng, and W. Zaremba. 2018. Learning Dexterous In-Hand Manipulation. CoRR (2018).
  2. R. D. Cook. 1989. Concepts and Applications of Finite Element Analysis (3rd ed.).
  3. K. He, X. Zhang, S. Ren, and J. Sun. 2016. Identity Mappings in Deep Residual Networks. In European Conference on Computer Vision. 630--645.
  4. HTC. 2019. Discover Virtual Reality Beyond Imagination. https://www.vive.com/uk/.
  5. A. Kanazawa, M. J. Black, D. W. Jacobs, and J. Malik. 2018. End-to-End Recovery of Human Shape and Pose. In Proceedings of CVPR. 7122--7131.
  6. Microsoft. 2019. Microsoft HoloLens | Mixed Reality Technology for Business. https://www.microsoft.com/en-us/hololens.
  7. Oculus. 2019. Oculus Rift. https://www.oculus.com/rift/.
  8. A. Pumarola, A. Agudo, L. Porzi, A. Sanfeliu, V. Lepetit, and F. Moreno-Noguer. 2018. Geometry-Aware Network for Non-Rigid Shape Prediction from a Single View. In Proceedings of CVPR. 4681--4690.
  9. Vicon. 2019. Motion Capture Systems. https://www.vicon.com/.
  10. S. Zagoruyko and N. Komodakis. 2016. Wide Residual Networks. arXiv preprint arXiv:1605.07146 (2016).

Published in
          SIGGRAPH '19: ACM SIGGRAPH 2019 Posters
          July 2019
          148 pages
          ISBN:9781450363143
          DOI:10.1145/3306214

          Copyright © 2019 Owner/Author

          Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

          Publisher

          Association for Computing Machinery

          New York, NY, United States


Acceptance Rates

Overall Acceptance Rate: 1,822 of 8,601 submissions, 21%
