
2020 | Book

Virtual, Augmented and Mixed Reality. Industrial and Everyday Life Applications

12th International Conference, VAMR 2020, Held as Part of the 22nd HCI International Conference, HCII 2020, Copenhagen, Denmark, July 19–24, 2020, Proceedings, Part II


About this book

The 2 volume-set of LNCS 12190 and 12191 constitutes the refereed proceedings of the 12th International Conference on Virtual, Augmented and Mixed Reality, VAMR 2020, which was due to be held in July 2020 as part of HCI International 2020 in Copenhagen, Denmark. The conference was held virtually due to the COVID-19 pandemic.
A total of 1439 papers and 238 posters were accepted for publication in the HCII 2020 proceedings from a total of 6326 submissions.
The 71 papers included in these VAMR 2020 proceedings were organized in topical sections as follows:
Part I: design and user experience in VAMR; gestures and haptic interaction in VAMR; cognitive, psychological and health aspects in VAMR; robots in VAMR.
Part II: VAMR for training, guidance and assistance in industry and business; learning, narrative, storytelling and cultural applications of VAMR; VAMR for health, well-being and medicine.


Correction to: Sampling Electrocardiography Confirmation for a Virtual Reality Pain Management Tool
Jessie Y. C. Chen, Gino Fragomeni

VAMR for Training, Guidance and Assistance in Industry and Business

Navigating a Heavy Industry Environment Using Augmented Reality - A Comparison of Two Indoor Navigation Designs
The fourth industrial revolution seeks to enhance and optimize industrial processes through digital systems. However, such systems need to meet special criteria for usability and task support to ensure users’ acceptance and safety. This paper presents an approach to support employees in heavy industries with augmented reality based indoor navigation and instruction systems. An experimental study examined two different user interface concepts (navigation path vs. navigation arrow) for augmented reality head-mounted displays. In order to validate a prototypical augmented reality application that can be deployed in such production processes, a simulated industrial environment was created. Participants walked through the scenario and were instructed to work on representative tasks, while the wearable device offered assistance and guidance. Users’ perception of the system and task performance were assessed. Results indicate superior performance of the navigation path design, as it granted participants significantly higher perceived support in the simulated working tasks. Nevertheless, the distance covered by the participants was significantly shorter in the navigation arrow condition than in the navigation path condition. Since the navigation path design resulted in higher perceived support, this design approach appears more suitable for assisting personnel at industrial workplaces.
Alexander Arntz, Dustin Keßler, Nele Borgert, Nico Zengeler, Marc Jansen, Uwe Handmann, Sabrina C. Eimler
A GPU Accelerated Lennard-Jones System for Immersive Molecular Dynamics Simulations in Virtual Reality
Interactive tools and immersive technologies, designed to benefit training and education, make teaching more engaging and complex concepts easier to comprehend. Molecular Dynamics (MD) simulations numerically solve Newton’s equations of motion for a given set of particles (atoms or molecules). Improvements in computational power and advances in virtual reality (VR) technologies and immersive platforms may in principle allow the visualization of the dynamics of molecular systems, letting the observer experience first-hand elusive physical concepts such as vapour-liquid transitions, nucleation, solidification, diffusion, etc. Typical MD implementations involve a relatively large number of particles N = O(\(10^4\)), and the force models imply a pairwise calculation which scales, in the case of a Lennard-Jones system, as O(\(N^2\)), leading to a very large amount of work per integration step. Hence, running such a computation on the CPU alongside GPU-intensive virtual reality rendering often limits the system size and also leads to a lower graphical refresh rate. In the model presented in this paper, we have leveraged the GPU for both data-parallel MD computation and VR rendering, thereby building a robust, fast, accurate and immersive simulation medium. We have generated state-points with respect to the data of real substances such as CO\(_2\). In this system the phases of matter, viz. solid, liquid and gas, and their emergent phase transitions can be interactively experienced using an intuitive control panel.
Nitesh Bhatia, Erich A. Müller, Omar Matar
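The pairwise O(\(N^2\)) force evaluation the abstract refers to can be illustrated with a minimal sketch. The following Python code (not the authors' GPU implementation; reduced Lennard-Jones units with assumed parameters ε = σ = 1) shows the naive double loop over all unique particle pairs that makes the cost scale quadratically with N:

```python
import math

def lj_force(r, epsilon=1.0, sigma=1.0):
    # Magnitude of the Lennard-Jones force, F(r) = -dU/dr for
    # U(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6):
    # F(r) = 24*eps*(2*(sigma/r)**12 - (sigma/r)**6)/r
    sr6 = (sigma / r) ** 6
    return 24.0 * epsilon * (2.0 * sr6 * sr6 - sr6) / r

def pairwise_forces(positions, epsilon=1.0, sigma=1.0):
    """Naive O(N^2) force loop over all unique pairs of 3D positions."""
    n = len(positions)
    forces = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            # Separation vector from j to i and its length
            dx = [positions[i][k] - positions[j][k] for k in range(3)]
            r = math.sqrt(sum(d * d for d in dx))
            f = lj_force(r, epsilon, sigma)
            # Apply equal and opposite forces along the pair axis
            for k in range(3):
                forces[i][k] += f * dx[k] / r
                forces[j][k] -= f * dx[k] / r
    return forces
```

At the potential minimum r = 2^(1/6)σ the force vanishes, and closer pairs repel; a GPU version as described in the paper would parallelize this pair loop across threads rather than change the physics.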
User Interface for an Immersive Virtual Reality Greenhouse for Training Precision Agriculture
In the past 50 years, farm producers have increased their use of electronic control systems for controlled environment agriculture (CEA) in greenhouses and animal production. These systems use environmental sensors to automate environmental management of greenhouses and can lead to significantly larger yields. However, mismanagement of a CEA system can also lead to the loss of entire crops. Due to the sensitivity of crops in CEA and the potential losses, students are rarely allowed to work directly with CEA systems. In order to increase opportunities for students to interact with CEA systems, an immersive virtual greenhouse simulation was created. Implementing realistic interfaces for physical tasks and for the CEA system provides students with ‘hands-on’ training in the virtual greenhouse. However, there are challenges that limit the implementation of physical actions and realistic interfaces in virtual reality. We consider interfaces for three types of interactions: system interactions, physical greenhouse interactions, and interactions with the CEA system. Potential interface designs are presented with a discussion of the benefits and costs associated with each design.
Daniel W. Carruth, Christopher Hudson, Amelia A. A. Fox, Shuchisnigdha Deb
A Comparison of Augmented and Virtual Reality Features in Industrial Trainings
Short-term qualification of temporary workers is a constant challenge for manufacturing companies. Cycle times of machines often have to be reduced for training, which demands time and financial resources. This increases the need for near-the-job training without manipulating the cycle times of the machine. Digital visualization tools using Mixed Reality (MR) promise opportunities for application-oriented practical training. However, especially for industrial applications where procedural knowledge has to be transferred, it is not clear which MR technology should be used for which purpose. In order to answer this question, this paper examines the underlying MR features of the technology. In an experimental setting, the same virtual training for the assembly of a pneumatic cylinder is examined with an augmented reality/augmented virtuality (AR/AV) based application in comparison to a virtual reality (VR) based application. The study carried out shows significant differences in the evaluation of system usability, but no differences in the evaluation of ergonomics and perceived task load during the task. Of the 16 test persons, 14 would choose the VR system in the final analysis. The results are discussed in the paper and recommendations for the design of MR based systems in an industrial context are given.
Lea M. Daling, Anas Abdelrazeq, Ingrid Isenhardt
AR Assisted Process Guidance System for Ship Block Fabrication
The hull structure of a ship block is complex and the number of its constituent components is very large. In order to reduce rework during the construction process, an augmented reality (AR) assisted information visualization and process guidance system is proposed. Based on the 3D model and the fabrication process of the ship block, a fabrication process information (FPI) model of the ship block is established for the augmented reality system. With the help of the Microsoft HoloLens, the human-computer interface is designed by defining gestures and voice commands. A prototype system is developed, and a case study and user test are conducted to verify the effectiveness and feasibility of the system.
Jiahao Ding, Yu Zhu, Mingyu Luo, Minghua Zhu, Xiumin Fan, Zelin Zhou
The Virtual Dressing Room: A Return Rate Study
This paper presents an evaluation of a virtual dressing room’s impact on the return rate of garments purchased online. First, we introduce our recently developed prototype of a virtual dressing room. Next, we present the research and test design. This work involves a comparative study of the virtual dressing room with a traditional web shop and a real physical fitting room. Results show that, although the test participants (n = 75) preferred the real physical fitting room, the virtual dressing room performed significantly (p = 0.003) better than the traditional web shop at delivering information on shape/fit. While the virtual dressing room did not perform significantly differently from the web shop with respect to delivering information on fabric, color, or size (p = 0.08), help in choosing the correct size and shape/fit was qualitatively identified as the main advantage of the virtual dressing room. Wrong size and shape/fit were also found to be the two most important reasons for returning online purchases.
Michael Boelstoft Holte
A Context-Aware Assistance Framework for Implicit Interaction with an Augmented Human
The automotive industry is currently facing massive challenges. Shorter product life cycles together with mass customization lead to high complexity in manual assembly tasks. This induces the need for effective manual assembly assistance which guides the worker faultlessly through different assembly steps while simultaneously decreasing completion time and cognitive load. While a simulation-based assistance visualizing an augmented digital human has been proposed in the literature, it lacks the ability to incorporate knowledge about the context of an assembly scenario through arbitrary sensor data. Within this paper, a general framework for the modular acquisition, interpretation and management of context is presented. Furthermore, a novel context-aware assistance application in augmented reality is introduced which enhances the previously proposed simulation-based assistance method with several context-aware features. Finally, a preliminary study (N = 6) is conducted to give a first insight into the effectiveness of context-awareness for the simulation-based assistance with respect to subjective perception criteria. The results suggest that context-awareness improves the user experience in general, and the developed context-aware features were overall perceived as useful in terms of error, time and cognitive load reduction as well as increased motivation. However, the developed software architecture offers potential for improvement, and future research considering performance parameters is needed.
Eva Lampen, Jannes Lehwald, Thies Pfeiffer
A Literature Review of AR-Based Remote Guidance Tasks with User Studies
The future of work is increasingly mobile and distributed across space and time. Institutions and individuals are phasing out desktops in favor of laptops, tablets and/or smartphones, as much work (assessment, technical support, etc.) is done in the field and not at a desk. There will be a need for systems that support remote collaboration, such as remote guidance. Augmented reality (AR) is praised for its ability to show the task at hand within an immersive environment, allowing for spatial clarity and greater efficiency, and thereby showing great promise for collaborative and remote guidance tasks; however, there are no systematic reviews of AR-based remote guidance systems. This paper reviews the literature describing AR-based remote guidance tasks and discusses the task settings, technical requirements and user groups within the literature, followed by a discussion of further areas of interest for the application of this technology combined with artificial intelligence (AI) algorithms to increase the efficiency of applied tasks.
Jean-François Lapointe, Heather Molyneaux, Mohand Saïd Allili
Development of an Augmented Reality System Achieving in CNC Machine Operation Simulations in Furniture Trial Teaching Course
In schools, due to narrow processing spaces and limited machines, students can only understand the operation and processing procedures of computer numerical control (CNC) machines through the unilateral operation and explanation of technicians, or by operating or observing in groups in turn. Under these teaching restrictions, students obtain only limited operating experience from the partial processing steps of group machine operation. Students are therefore not familiar with the operation of the machines, and it is difficult for them to complete consistent operation procedures, resulting in fear and uncertainty when operating the machines. Therefore, this research used augmented reality (AR) technology to simulate the complete operation flow and processing information of CNC machines, combined with practical teaching courses to deepen students’ understanding of CNC machine operation.
In this study, a total of ten participants were recruited to conduct AR system operation experiments. Follow-up verification was conducted through real CNC machine operation and by filling in the System Usability Scale (SUS) to evaluate experience feedback after using the AR-CNC training system. Finally, the results of this study showed that the AR-CNC training system could not only solve the problem of machining interruptions caused by students taking turns on the CNC machine, but also overcome the hardware limitations of the actual machining space and the machine itself, so that every student could learn how to operate the CNC machine as if operating it in person.
Yu-Ting Lin, I-Jui Lee
Virtual Reality (VR) in the Computer Supported Cooperative Work (CSCW) Domain: A Mapping and a Pre-study on Functionality and Immersion
This pre-study investigates how Virtual Reality (VR) can be used to support collaborative work in a business setting. Based on a literature review, it explores how immersive technologies like Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) fit into the Computer Supported Cooperative Work (CSCW) domain. The immersive technologies are classified using the time and space matrix. Based on the findings, immersive VR supporting synchronous and remote collaboration is chosen for further investigation, and the immersive VR application MeetinVR has been tested. Two small-scale experiments were conducted (n = 3 and n = 10) to test the functionality of the MeetinVR application and the level of immersion when conducting interviews compared to an online communication tool like Skype. The initial results indicate that an immersive VR application is relatively user-friendly and provides a high level of immersion.
Gitte Pedersen, Konstantinos Koumaditis
Measurement Based AR for Geometric Validation Within Automotive Engineering and Construction Processes
We look at the final stages of the automobile design process, focusing on the geometric validation process for a design, in particular for the vehicle front end. A concept is presented showing how this process can be improved using augmented reality. Since the application poses high accuracy requirements, the augmented reality also needs to be highly accurate and of measurable quality. We present a Measurement Based AR approach to overlaying 3D information onto images, which extends the existing process and is particularly suited to the application in question. We also discuss how the accuracy of this new approach can be validated using computer vision methods employed under the appropriate conditions. The results of an initial study are presented, where the overlay accuracy is expressed in image pixels as well as millimeters, followed by a discussion of how this validation can be improved to meet the requirements posed by the application.
Muhammad Ali Shahid, Benjamin-Paul Jesche, Manuel Olbrich, Holger Graf, Andreas Franek, Arjan Kuijper, Ulrich Bockholt, Michael Schmitt
Augmented Instructions: Analysis of Performance and Efficiency of Assembly Tasks
Augmented Reality (AR) technology makes it possible to present information in the user’s line of sight, right at the point of use. This brings the capability to visualise complex information effectively for industrial maintenance applications, which typically rely on paper instructions and tacit knowledge developed over time. Existing research on AR instruction manuals has already shown their potential to reduce the time taken to complete assembly tasks, as well as to improve accuracy [13]. In this study, the outcomes of several aspects of AR instructions are explored and their effects on the chosen Key Performance Indicators (KPIs) of task completion time, error rate, cognitive effort and usability are assessed. A standardised AR assembly task is also described for performance comparison, and a novel AR experimental tool is presented which takes advantage of the flexibility of internet-connected peripherals to explore various aspects of AR app design and isolate their effects. Results of the experiments are given, providing insight into the most effective way of delivering information and promoting interaction between user and computer, in terms of user performance and acceptance.
Eleanor Smith, Gordon Semple, Dorothy Evans, Kenneth McRae, Paul Blackwell
Interactive Mixed Reality Cooking Assistant for Unskilled Operating Scenario
With the further development of virtual reality technology, mixed reality (MR) is driving innovation in human-computer interaction. It can integrate virtual information (objects, pictures, videos, etc.) into the real environment and allow people to interact with it directly. It can greatly improve the efficiency of information reception, reduce the cost of understanding and learning, and improve the fluency and accuracy of operational tasks. In the kitchen, when novices cook according to a recipe, they usually do not perform well in memorizing and learning it, which causes many problems in the cooking process. Therefore, we propose an interactive intelligent cooking system design, which focuses on the five kinds of information novices most need help with during cooking, and replaces the traditional recipe with an MR presentation. An experiment also shows that the intelligent cooking system can provide the guidance to novices that traditional recipes tend to omit, and can effectively improve the user experience of novices.
Ke-yu Zhai, Yi-ming Cao, Wen-jun Hou, Xue-ming Li

Learning, Narrative, Storytelling and Cultural Applications of VAMR

Engaging Place with Mixed Realities: Sharing Multisensory Experiences of Place Through Community-Generated Digital Content and Multimodal Interaction
This paper discusses the motivation and potential methodologies for the use of mixed reality and multimodal interaction technologies to engage communities and members of the public in the active creation and use of urban data. This research has been conducted within the context of a wider research program investigating the use of data dashboard technologies and open data to communicate information to urban authorities and citizens more effectively and enable more evidence-based decision making. These technologies have drawn criticism for promoting objectifying, data-driven approaches to urban governance that have proven insensitive to the specificity of place and the contexts of citizens’ daily lives. Within the digital and spatial humanities, there has been growing interest in ‘deep mapping’ as a means of recovering the sense of place and the nuances of everyday life through the incorporation of spatial narratives and multimedia into mapping practices. This paper considers the ways in which mixed realities can contribute to these efforts, and in particular the unique affordances of virtual reality for evoking an embodied sense of presence that contributes to the communication of a sense of place via rich multisensory experiences. The paper concludes with a discussion of a pilot study conducted with members of the public. This demonstrates how virtual environments can be created in ways that maintain contextual and affective links to the places they represent, through participants’ ‘hands-on’ involvement in mapping via urban sensing and the capture of place-based media.
Oliver Dawkins, Gareth W. Young
Augmented Reality and Microbit for Project-Based Learning
This research proposes an augmented reality system integrated with the BBC Microbit microcontroller for students’ project-based learning. It is a learning tool that helps students understand augmented reality (AR) and Microbit technologies. To increase student motivation, the final project was an AR projectile-based shooting game application. The Microbit was also used as an external remote controller. Each player has an AR marker defining the position and orientation of the player’s avatar, and a Microbit for shooting bullets. The learning goal of the proposed system is to help students understand physics simulation related to projectiles, augmented reality, C# programming, block-based programming with the Microbit, basic electronics, hardware communication, computer graphics, painting, and coordinate systems. Furthermore, the proposed system was designed to support various teaching and learning approaches, including interdisciplinary learning, project-based learning, and STEAM education. This hands-on learning tool assists learners in understanding the integration of core knowledge in science, engineering, arts, and math. The proposed system was used by 294 participants, including Thai high school students, vocational students, and teachers from 3 provinces of Thailand. The experimental results showed that students’ learning achievement improved by 19–36.28% when comparing the average T-scores of the pretest and posttest. In addition, the teachers and students reported a good level of satisfaction with using the proposed system in teaching and learning.
Poonsiri Jailungka, Siam Charoenseang, Chaowwalit Thammatinno
Research on the Perceptual Interaction Model of Virtual Reality Films
Virtual reality technology has three basic characteristics: immersion, interaction and imagination. When combined with film art, it gives new vitality to the film, allowing the audience to gain an immersive audio-visual experience and to interact more with the film. However, in the context of film, narrative and interaction are contradictory. How to integrate interaction into film narrative is the focus of this paper. First, the paper analyzes the audience’s psychological motivation while watching films to explore when and how they want to interact. It then proposes a new concept, “perceptual interaction”, to define the interaction form of VR films, and summarizes the features of “perceptual interaction”, including deep immersion and “suitability to film conditions”. Though there are great differences in the design of interactive forms between different categories of films, “perceptual interaction” has a set of interaction mechanisms at the emotional level, so the interaction mode of “attention - cognition - awakening - interaction - feedback” is proposed, which provides a reference for the interaction design of VR films.
Yunpeng Jia, Ziyue Liu, Chuning Wang, Lei Xu
Interactive Narrative in Augmented Reality: An Extended Reality of the Holocaust
In this research, the author describes a new narrative medium known as the Immersive Augmented Reality Environment (IARE) with HoloLens. Aarseth’s narrative model [17] and all available input designs in IARE were reviewed and summarised. Based on these findings, The AR Journey, a HoloLens app aimed at interactive narrative for moral education, was developed and assessed. Qualitative methods of interview and observation were used and the results were analysed. In general, narrative in IARE proved to be valid for moral education purposes, and findings including a valid narrative structure, an input model, and design guidelines are reported.
Yunshui Jin, Minhua Ma, Yun Liu
Learning in Virtual Reality: Investigating the Effects of Immersive Tendencies and Sense of Presence
The goal of this study is to examine the effects of the sense of presence and immersive tendencies on learning outcomes while comparing different media formats (interactive VR, non-interactive VR and video). An experiment was conducted with 36 students who watched a Biology lesson about human cells. Contrary to expectations, the results demonstrate that non-interactive VR was the most successful format. Sense of presence and immersive tendencies did not have an effect on learning gain, and the latter was not a critical factor in experiencing the sense of presence. The findings provide empirical evidence to help understand the influence of these variables on learning in VR.
Aliane Loureiro Krassmann, Miguel Melo, Bruno Peixoto, Darque Pinto, Maximino Bessa, Magda Bercht
Empeiría*: Powering Future Education Training Systems with Device Agnostic Web-VR Apps
This paper presents Empeiría, which uses cutting-edge technologies and novel virtual reality systems to enhance future in-class education training. Empeiría incorporates JavaScript, WebGL, WebVR, and powerful web graphics engines like Babylon.js to create immersive training experiences. It demonstrates a useful application of computer science concepts and increases the dissemination of edge technologies across academic disciplines. Most importantly, Empeiría improves education technology and professional training through the creation of two major software products: an immersive experience editing system (Empeiría-E) and an immersive experience viewing system (Empeiría-V). We show how these virtual reality systems can lead to more effective training and improve our understanding of trainees.
Matthew E. Miller, Yuxin Yang, Karl Kosko, Richard Ferdig, Cheng Chang Lu, Qiang Guan
Did You Say Buttonless? Exploring Alternative Modes of Sensory Engagement for Augmented Reality Storytelling Experiences
Augmented Reality designers and content creators continue to explore ways to engage audiences. However, studies have yet to focus on how different modes of interaction affect understanding and immersion in AR environments. To address this, a simulation and a focus group were conducted to elicit feedback about five different modes of interaction: sound, touch, haptic feedback, presence, and gesture. Results identified four themes, with gesture interaction garnering more appeal and immersion than the alternatives. Issues of accessibility and self-consciousness in public settings also emerged during the simulation, highlighting barriers related to some modes of interaction.
Richard Olaniyan, Travis Harvey, Heather Hendrixson, Jennifer Palilonis
Using Laser Scans and ‘Life History’ to Remember Heritage in Virtual Environments
When a building is demolished or at risk, it is important to capture more than just a visual record of its brick and mortar: we need to preserve the structure’s community ties, its “life history”, the human element. The stories of design, construction, community activity, and events are lost with the passing of each individual associated with the building. One tool that can be utilized is the terrestrial laser scanner, which can capture an authoritative record of a structure or object at a moment in time. Comprehensively, scans are useful documentation to assist site restoration in the event of a natural disaster, fire, war, or deterioration due to weather. This value has been demonstrated by both the 2019 fire at Notre Dame and the UNESCO/CyArk conservation efforts at the Ananda Ok Kyaung temple, which was impacted by an earthquake in 2016. Working in conjunction with local communities, researchers are able to solicit private collections of memorabilia, photographs, and documentation that cannot be found in archives and libraries. The life history approach combines detailed laser scans with this rich multi-disciplinary documentation to capture the structure’s place in the community, broadening its utility to enhance VR and AR experiences.
Lori C. Walters, Robert A. Michlowitz, Michelle J. Adams
Study on Learning Effectiveness of Virtual Reality Technology in Retail Store Design Course
Information and communication technology is regarded as a crucial tool in the field of education. The Retail Store Design course in Shunde Polytechnic includes industry–education integration and a project-oriented teaching approach. Various types of retail store design projects are introduced through different channels, and project-based teaching is implemented in accordance with a market-oriented design studio workflow. The interactive nature of virtual reality technology can transform conventional teaching from passive into active learning, thus improving teacher–student communication and information transmission in the classroom. This study adopted virtual reality technology as a tool to facilitate teaching and learning.
A 720° virtual reality training system was developed using Photoshop and the 720yun application to provide students with a virtual survey experience for the retail store construction site before remodeling. The Retail Store Design course of Shunde Polytechnic was used as an example to evaluate the system according to three aspects: onsite survey accuracy, onsite survey speed, and learning enjoyment.
Participants were recruited from students enrolled in the Retail Store Design course at Shunde Polytechnic; a set of objective and subjective questionnaire measurements were designed and distributed to the aforementioned students.
The results demonstrated that virtual reality yielded more satisfactory learning achievements compared with conventional teaching methods. Using virtual reality technology for the virtual survey of construction sites in the Retail Store Design course was extremely valuable and effective for students.
Chu-Jun Yang, Chih-Fu Wu

VAMR for Health, Well-being and Medicine

Development and Human Factors Considerations for Extended Reality Applications in Medicine: The Enhanced ELectrophysiology Visualization and Interaction System (ĒLVIS)
With the rapid expansion of hardware options in the extended realities (XRs), there has been widespread development of applications throughout many fields, including engineering, entertainment and medicine. The development of medical applications for the XRs involves a unique set of considerations during development and human factors testing. Additionally, understanding the constraints of the user and the use case allows for iterative improvement. In this manuscript, the authors discuss the considerations when developing and performing human factors testing for XR applications, using the Enhanced ELectrophysiology Visualization and Interaction System (ĒLVIS) as an example. Additionally, usability and critical interpersonal interaction data from first-in-human testing of ĒLVIS are presented.
Jennifer N. Avari Silva, Mary Beth Privitera, Michael K. Southworth, Jonathan R. Silva
Classifying the Levels of Fear by Means of Machine Learning Techniques and VR in a Holonic-Based System for Treating Phobias. Experiments and Results
This paper presents the conceptual design, implementation and evaluation of a VR-based system for treating phobias that simulates stress-provoking real-world situations, accompanied by physiological signal monitoring. The element of novelty is the holonic architecture we propose for the real-time adaptation of the virtual environment in response to biophysical data (heart rate (HR), electrodermal activity (EDA) and electroencephalogram (EEG)) recorded from the patients. In order to enhance the impact of the therapy, we propose the use of gamified scenarios. Four acrophobic patients were gradually exposed to anxiety-generating scenarios (on the ground and on the first, 4th and 6th floors of a building, at different distances from the railing), during which EEG, EDA and HR were recorded. The patients also reported their level of fear on a scale from 0 to 10. The treatment procedure consisted of a VR-based game where the subjects were exposed to the same heights. They had to perform small quests at various distances from the railing and report their in-game stress level, while biophysical data was recorded. The real-life scenarios were then repeated, with the purpose of assessing the efficiency of the VR treatment plan.
Oana Bălan, Gabriela Moise, Alin Moldoveanu, Florica Moldoveanu, Marius Leordeanu
Multi-channel Interaction Design and Implementation of Medical Pendant Based on Virtual Reality Technology
This paper takes a medical pendant as the object of a study on multi-channel interaction design using virtual reality technology. It describes the characteristics of multi-channel human-computer interaction in virtual reality systems and discusses the process of technically realizing it. Current technical difficulties in the visual, sound, and haptic channels are analyzed in light of human-computer interaction principles. The significance of multi-channel interaction design research is further verified, and important principles of interaction design in virtual reality systems are proposed. Simulating the product in a virtual environment improves the usability and safety of medical device products. The design practice process was verified through a user interaction task experiment with the virtual reality system of the medical pendant.
Dini Duan, Zhisheng Zhang, Hao Liu, Zhijie Xia
A Virtual Reality Dental Anxiety Mitigation Tool Based on Computerized Cognitive Behavioral Therapy
Dental anxiety has become one of the most important problems affecting patients' timely consultation and visiting experience. The purpose of this study is to help patients with dental anxiety relieve that anxiety through a virtual reality product based on computerized cognitive behavioral therapy. The study collected MDAS, GSR, and HRV data from 24 adults with dental anxiety to evaluate the degree of dental anxiety, and assessed the user experience with a user experience questionnaire. Data analysis showed that MDAS, GSR, and HRV decreased after the intervention, confirming the effectiveness of the virtual reality dental anxiety mitigation tool based on computerized cognitive behavioral therapy. Direct exposure to the virtual dental environment alone also had some effect, but less than the full intervention.
Ting Han, Hanyue Xiao, Tianjia Shen, Yufei Xie, Zeshi Zhu
Sampling Electrocardiography Confirmation for a Virtual Reality Pain Management Tool
Previous research has shown that Virtual Reality (VR) technology may provide an alternative solution to pain management in clinical applications based on certain psychological intervention strategies. Additional research has suggested that Electrocardiography (ECG) can serve as an objective measure of pain, with evidence showing that as pain increases, heart rate measured via ECG also increases. The aim of this study is to examine the effect of VR on naturally occurring pain when neither pharmacological analgesics nor psychological intervention strategies are applied. This is validated via physiological responses such as ECG, and a correlation between subjective and objective measurements of pain is made. The findings of the present study extend our understanding of the physiological and psychological effects of VR, providing useful insights into the relationship between VR and the levels of pain and discomfort caused by an exhaustive single-limb muscle contraction. The main conclusion is that the use of VR can reduce physiological and psychological responses associated with negative sensations. Specifically, the results suggest that VR technology can significantly reduce heart rate by 6 bpm and perceived pain and exertion by up to 50%. It can also significantly increase pain tolerance by up to three minutes, without the use of any pharmacological analgesics or psychological intervention strategies.
Maria Matsangidou, Alexis R. Mauger, Chee Siang Ang, Constantinos S. Pattichis
VREye: Exploring Human Visual Acuity Test Using Virtual Reality
The human eye is a complex sense organ that enables vision. Vision problems may arise for various reasons, and vision tests help determine the level of vision degradation. Early detection can support timely intervention for vision issues. However, identifying and validating vision issues requires an optometrist (eye specialist), and scheduling an eye check-up can be tedious or unaffordable for people in many parts of the world. Acknowledging these real-world challenges, ranging from affordability to convenience, a simplified solution for vision tests can ease understanding of the severity of vision issues. With this use case in mind, we attempt to build 'VREye', a virtual reality (VR) based vision-testing mechanism for studying vision issues in individuals across all age groups. In this paper, we discuss our journey towards developing a VR-based solution for detecting myopic vision. We detail our challenges and insights in building the overall VR scene design, developing a virtual distance scale, and using real-world test subjects for initial validation. We also discuss our plans to extend this application to cover more vision problems, such as hypermetropia and color blindness.
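The abstract mentions a virtual distance scale but not its formula. One standard relation such a scale could build on (an assumption, not the paper's stated method) is the Snellen convention that a reference optotype subtends 5 arcminutes at the testing distance, which fixes the letter size to render at any simulated distance:

```python
import math

ARCMIN = math.pi / (180 * 60)  # one arcminute in radians

def optotype_height_m(distance_m, arcminutes=5.0):
    """Physical height an optotype (test letter) must have to subtend
    the given visual angle (default 5 arcmin, the 6/6 Snellen
    reference) at the given viewing distance."""
    return 2 * distance_m * math.tan(arcminutes * ARCMIN / 2)
```

At the standard 6 m testing distance this yields roughly 8.7 mm; in a VR scene the same angular size can be reproduced at any simulated distance by scaling the rendered letter accordingly.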
Shivang Shekar, Pranav Reddy Pesaladinne, Sai Anirudh Karre, Y. Raghu Reddy
Virtual, Augmented and Mixed Reality. Industrial and Everyday Life Applications
edited by
Jessie Y. C. Chen
Gino Fragomeni