
Immersive Inspection: Intuitive Material Analysis using X-Ray Computed Tomography Data in AR

  • Open Access
  • Published: 01.09.2025


Abstract

The rapid digitalization of industry has led to an increase in complex, high-dimensional data, particularly in the non-destructive testing of materials. Achieving Industry 4.0 objectives such as real-time monitoring and quality control requires advanced analytical workflows. This article presents a pioneering framework that leverages augmented reality (AR) for the immersive analysis of X-ray computed tomography (XCT) data. By overlaying virtual data onto real objects, AR enables on-site inspections and analyses, increasing the flexibility and efficiency of the workflow. The framework supports intuitive, menu-free interactions and automatic material recognition, allowing experts to explore three-dimensional volumes and their physical counterparts in an immersive environment. A qualitative study with NDT experts evaluates the efficiency and usability of this novel workflow and highlights the potential for future immersive analytical applications. The article also discusses the opportunities and challenges of immersive analytics in materials science, including embodied navigation, spatial immersion, and the visualization of abstract data. The presented framework represents a significant advance in materials characterization, quality control, and metrology, offering a more intuitive and context-aware approach to data analysis.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

1 Introduction

In materials science, the development and continuous optimization of novel and advanced materials is essential. The rapid digitalization of industry has generated increasing volumes of complex, heterogeneous, and high-dimensional data, particularly in non-destructive testing (NDT) of materials and components [1–3]. Meeting key Industry 4.0 objectives, like location-independent working, real-time monitoring, visualization, quality control, and process optimization, requires advanced analytical workflows for these complex datasets.
Immersive analytics (IA) technologies, particularly augmented reality (AR), offer novel approaches to explore heterogeneous datasets and support cognitively demanding analytical tasks. Significant progress has been made implementing digital support in industrial applications including assembly, maintenance, and training [4, 5]. However, materials characterization, quality control, and metrology domains have seen limited adoption of immersive visualization systems for materials science data. This gap highlights the need for immersive systems that facilitate intuitive analysis of complex data from advanced imaging techniques such as X-ray computed tomography (XCT) for investigating defects in spatial renderings or exploring multidimensional secondary data onsite.
AR head-mounted displays (HMDs) extend analytical capabilities beyond traditional settings. By overlaying virtual data onto real objects, AR enables on-site inspections and analyses, enhancing workflow flexibility and efficiency. The technology provides contextual information that bridges the gap between users and their physical environment. Consequently, inspections and analyses are no longer confined to office environments but can be conducted at scanning facilities utilizing the captured material samples. AR HMDs provide the opportunity to stay aware of the local surroundings while maintaining both hands free for other activities, essential in industrial environments [6]. In addition, AR HMDs offer the possibility to use other visible systems without restrictions, thus expanding the possible range of available functions. When used as complementary interface, AR can improve productivity and accuracy in analytical workflows [4, 7]. It enhances the traditional 2D workflow of expert systems with spatial visualization and more natural interaction. Research indicates that AR visualizations can match the accuracy of 2D alternatives while providing more immersive experiences [8, 9]. Specifically, AR’s capability to enable users to walk through and zoom into renderings intuitively improves the exploration and analysis process [10].
Our work presents a framework for immersive analysis of complex materials science data using AR. We explore the positive impact of embodied navigation and interaction, as well as the natural analysis of three-dimensional volumes and their physical counterparts on NDT data. To our knowledge, this is the first framework enabling material experts to perform AR analyses of spatial and abstract NDT data. We evaluated the potential benefits of this novel workflow through expert user studies and identified opportunities for future immersive analytical applications in this domain.
To summarize, the main contributions of this work are seen in the following points:
  • A novel AR-based framework enabling immersive, location-independent analysis of complex spatial and abstract NDT data.
  • An intuitive menu-free workflow enabling automatic material recognition and tangible interaction.
  • A qualitative study with NDT experts to evaluate the efficiency and usability of immersive material analysis.
2 Related Work

Immersive analytics as a scientific field has experienced significant growth since its origins around 2015, as evidenced by substantial research efforts in both AR and virtual reality (VR) [11]. This growth is aligned with the broader Industry 4.0 paradigm, which is characterized by the integration of advanced digital technologies into industrial processes. It also reflects increasing analytical demands for spatial, heterogeneous, and high-dimensional datasets.

2.1 Opportunities of Immersive Analytics with Respect to NDT Analyses

Over the last years, researchers have discovered promising opportunities for IA across diverse analytical domains [6, 12–14]. IA aims to improve data understanding and decision making by bridging the gap between users and their data through more natural, engaging, and immersive analysis. Although IA encompasses a variety of technological modalities, recent frameworks have shown an increasing tendency to leverage AR and VR capabilities. Particularly in the context of Industry 4.0, AR acts as a key enabling technology, offering increased productivity, accuracy, and autonomy in the quality sector [4, 5].
For materials science, a field of research characterized by complex spatial structures and large amounts of high-dimensional data, IA offers significant potential for transforming tasks such as materials characterization, quality control, and metrology. For these tasks, technologies such as AR offer new ways of engaging with data analysis. This may significantly change how we perform and interact with data analysis in the future, as outlined in the following paragraphs:
  • Embodiment: Embodied data exploration replaces traditional input mechanisms (mouse, keyboard) with intuitive body movements, gestures, and tangible interactions. This allows users to navigate and explore complex datasets through natural physical engagement, reducing cognitive load and facilitating more intuitive analytical workflows when examining material structures [15]. It allows experts to virtually "step inside" the micro-structure of a material, providing a unique perspective on its structure.
  • Spatial immersion: IA environments provide expansive three-dimensional (virtual) workspaces that render spatial data with perceptual fidelity. The immersive workspace allows experts to organize visualizations strategically, supporting analytical workflows that better align with the inherently spatial nature of material analysis tasks [16].
  • Using depth & non-flat surfaces: Stereoscopic rendering in immersive environments facilitates improved perception of spatial relationships and enables the development of novel immersive visualization techniques [17]. This allows complex material structures to be examined from arbitrary perspectives previously unattainable through conventional visualization methodologies, thereby illuminating structural characteristics that might otherwise remain obscured.
  • Arranging multiple views: Additional information may be provided in separate views, which can be arranged in different layouts in the environment. In particular, AR enables the extension of existing workflows through complementary interfaces that provide an extended virtual workspace for comprehensive analyses beyond the limits of conventional desktops [7, 17].
  • Engagement: Immersive environments foster increased presence and engagement with analytical tasks, potentially inducing flow states that have a positive impact on user experience and productivity [17].
  • Situated analytics: By augmenting physical objects with contextually relevant information, situated analytics enables on-site, in-place data exploration through direct interaction with specimens of interest. Mobile immersive devices make these capabilities location-independent, facilitating analytical processes across diverse environments [6, 18, 19]. In industrial settings, see-through AR HMDs offer additional advantages, as workers maintain environmental awareness in potentially hazardous workplaces while keeping hands free for sample manipulation [6].
  • Collaboration: Immersive technologies facilitate socially engaged co-located and remote collaboration through embodied exploration in shared environments. This capability can eliminate the requirement for on-site experts during inspection tasks [20] and enhances multi-dimensional data analysis through novel input modalities and virtual avatars [9].
  • Visualising abstract data with a spatial embedding: Information-Rich Virtual Environments (IRVEs) [17, 21, 22] represent a particularly promising concept for material analysis, where inherently spatial data integrates harmoniously with abstract information. This enriches the user’s understanding of the dataset, creating an even more comprehensive and seamless user experience.
Based on these opportunities, we conclude that immersive analytics presents a compelling research direction for materials science, with AR emerging as a particularly promising technology for industrial NDT applications.

2.2 Applications of Immersive Analytics in NDT

Our focus regarding applications is on IA systems that employ spatial data, particularly within the domain of material science, and that facilitate an analysis. Few immersive analytics systems analyze both volumetric and high-dimensional data, and those that exist are divided between AR- and VR-focused approaches. Moreover, systems that allow for the material characterization of generated NDT data by means of immersive techniques are not widely available, which highlights a significant gap in the literature. A large proportion of applications in the industrial environment involves assembly, maintenance, and training tasks, which are not relevant to our research focus. We therefore present the most relevant work on AR- and VR-based analysis and visualization of NDT data.
Wang et al. [23] proposed the manipulation of CT datasets and their transfer functions in VR. Their system allows tweaking various rendering parameters using hand gestures, but it does not offer any analysis options. Tadeja et al. [24] use VR to create an aerospace design environment for digital twinning. Their tool allows loading CAD geometries of aircraft components and exploring their performance parameters through 3D scatterplots. ImNDT by Gall et al. [16] offers material experts an immersive workspace for the analysis of multi-dimensional data from NDT. The authors developed visual metaphors and interaction techniques for their VR system, which can only be applied to secondary data and do not allow a visualization of voxel data. As in any VR system, users are isolated from the real world. Gall et al. [25] extended the ImNDT system to allow for Cross-Virtuality-Analysis (XVA), adding an AR view mode for the VR user to enable collaboration with users of a large touch screen. Their study found XVA beneficial as a form of analysis, but the presented application suffers from the poor resolution of the VR headset's cameras.
Schickert et al. [26] demonstrated the use of AR for material inspection. The authors developed a tablet application to superimpose CAD models, extracted from indications in ultrasound and radar images, onto a concrete specimen. The tool positions CAD models at the location of a QR code, but does not provide the ability to modify the geometry or analyze additional secondary data. The system introduced by Ferraguti et al. [27] supports workers inspecting polished surfaces by projecting measurement data directly onto the component's surface for intuitive quality evaluation. While visual analysis is possible, the system does not facilitate the exploration of the additional data required for our material characterization. Corsican Twin by Prouzeau et al. [28] combines VR and AR devices to create an immersive authoring tool. It allows users to annotate the virtual replica of a real-world environment in VR, and displays charts or CAD models to AR users in the real-world environment through placed QR codes. For material characterization, the approach does not provide opportunities to explore large amounts of high-dimensional data in addition to the real samples.

3 Design & Methods

To outline the design requirements and the structure of the proposed IA system, we first discuss the available datasets and their representation, then the proposed architecture of the framework, and finally the implemented interaction and visualization techniques. The proposed framework serves as a prototype enabling immersive analysis of materials, working independently of traditional desktop settings. For this prototype, we build on promising results from previous work in immersive analysis of spatial data [16, 24–27, 29].

3.1 Material Data

The NDT data used is mainly obtained from composite materials, more precisely from fiber-reinforced polymers. The primary data is obtained by using various XCT imaging devices and consists of 3D reconstructed volume datasets, stored as a binary file (*.raw). Multidimensional data, i.e., secondary data, is derived from the generated primary data. The secondary data used in this framework was derived via the fiber characterization pipeline of Salaberger et al. [30]. In this process a Hessian matrix is used for finding the medial axis of the individual fibers after applying Gaussian blurring for noise reduction. The result is a multi-dimensional table of attributes featuring the individual characteristics for each fiber, such as its length, diameter, start/end point, and volume, among others. This information is stored in a comma-separated file (.csv).
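The ridge-extraction idea behind this pipeline can be sketched in a few lines: smooth the volume, build the voxel-wise Hessian from second derivatives, and use its eigenvalues to highlight tubular (fiber-like) structures. The following Python sketch is a simplified illustration of this principle only, not the actual pipeline of Salaberger et al. [30]; the function name and parameters are illustrative.

```python
import numpy as np
from scipy import ndimage

def fiber_response(volume, sigma=1.5):
    """Toy ridge response: Gaussian smoothing followed by Hessian
    eigenvalue analysis, loosely following the medial-axis idea
    described in the text (not the actual characterization pipeline)."""
    smoothed = ndimage.gaussian_filter(volume.astype(np.float64), sigma)
    # Second-order partial derivatives form the 3x3 Hessian per voxel.
    grads = np.gradient(smoothed)
    hessian = np.empty(volume.shape + (3, 3))
    for i, gi in enumerate(grads):
        second = np.gradient(gi)
        for j in range(3):
            hessian[..., i, j] = second[j]
    eigvals = np.linalg.eigvalsh(hessian)  # sorted ascending per voxel
    # Bright tubular structures: two strongly negative eigenvalues,
    # one near zero (along the fiber axis).
    return -(eigvals[..., 0] + eigvals[..., 1])
```

A thresholded response of this kind would mark candidate fiber centerlines, from which per-fiber attributes could then be measured.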
A major advantage of our framework is that it requires no additional data preparation compared to conventional analysis systems. In accordance with the experts, we have designed the framework to accept the same primary data formats that are used in standard toolkits for materials science, such as open_iA [31] or VG Studio [32]. The primary data, stored as binary files, can also be accompanied by MetaImage header files (*.mhd) consistent with standard toolkits such as ITK [33], facilitating seamless integration into existing workflows. Similarly, secondary data maintains compatibility with the output formats of typical material characterization pipelines, since it is a simple row-based text file. This deliberately standardized approach to data handling ensures that analysts can move between traditional desktop-based analysis and our AR-based analysis without additional data conversion or pre-processing steps, minimizing adoption barriers and enabling our framework to serve as a complementary tool within existing analytical ecosystems.
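To illustrate how little preparation the primary data needs, a minimal reader for such *.mhd/*.raw pairs fits in a few lines. This is a hedged sketch that handles only a handful of header fields and uncompressed data; production code should rely on ITK.

```python
import os
import numpy as np

# Map a few common MetaImage element types to NumPy dtypes.
MHD_DTYPES = {"MET_UCHAR": np.uint8, "MET_SHORT": np.int16,
              "MET_USHORT": np.uint16, "MET_FLOAT": np.float32}

def load_mhd_volume(header_path):
    """Minimal reader for the *.mhd/*.raw pairs described in the text.
    Handles only uncompressed volumes with basic header fields."""
    fields = {}
    with open(header_path) as f:
        for line in f:
            if "=" in line:
                key, value = line.split("=", 1)
                fields[key.strip()] = value.strip()
    dims = [int(d) for d in fields["DimSize"].split()]
    dtype = MHD_DTYPES[fields["ElementType"]]
    raw_path = os.path.join(os.path.dirname(header_path),
                            fields["ElementDataFile"])
    data = np.fromfile(raw_path, dtype=dtype)
    # MetaImage stores x fastest; reshape to (z, y, x) for NumPy.
    return data.reshape(dims[::-1])
```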
Fig. 1
The AR HMD uses the stored shape to detect the material. The image of the sample captured by XCT is displayed alongside the physical sample and moves in sync with it. A traditional analysis can still be carried out via monitors in the background

3.2 Framework Structure

Given the current lack of appropriate AR systems for materials characterization, indicated in Section 2, we developed a new framework to seamlessly integrate complex primary and secondary XCT data with real-world objects. This enables researchers and analysts to inspect material properties and structures in place, offering them a virtual workspace. To facilitate further development, the framework will be released as open source on GitHub [34].
The proposed framework is based on the Unity engine (Version 2021.3) [35] for rendering immersive environments. Unity was chosen for its wide variety of capabilities and its compatibility with numerous immersive devices and platforms. For this reason, it has been adopted for many immersive applications and also serves as an effective framework for InfoVis applications, demonstrating its versatility across different areas [13, 36, 37].
Our framework’s architecture has been deliberately designed for cross-platform compatibility, enabling deployment across various immersive hardware configurations. By leveraging Unity as our development environment, the system can be adapted for deployment on more economically accessible platforms including smartphones, tablets, and lower-cost AR headsets, thus extending its reach to a wider user community. This scalability is particularly important for industrial adoption scenarios where cost considerations may influence implementation decisions. Based on the reasons mentioned in Subsection 2.1, we are using the Microsoft Hololens 2 [38] as our see-through AR HMD. The device provides advanced spatial recognition and eye-tracking, which enhances interaction with objects and virtual representations.
To enable the tracking of objects, we use the Vuforia Engine [39] which enables the robust detection and tracking of multiple targets, such as images and objects. The decision to incorporate Vuforia as a marker-based tracking system was made due to its status as an industry standard, its performance, and its cross-platform compatibility. This tracking solution is optimized for precision in industrial environments and maintains compatibility with all major AR devices, ensuring that our framework remains device-agnostic.

3.3 Visualization Techniques and Interaction Paradigms

Our system enables the analysis of both primary and secondary derived datasets. Users may select and load XCT datasets manually using a file dialog or by physical object recognition using the cameras on the AR HMD. The latter can be performed in two different ways, which are image target recognition or shape recognition. Using image target recognition, the system detects and tracks images attached to a sample that were previously stored and linked to a specific XCT dataset. The shape recognition uses existing shape data, such as a CAD model or a surface mesh created via photogrammetry, to detect the sample and then load and display the corresponding XCT dataset. Figure 1 illustrates the detection process using a CAD model. The primary dataset is linked to the real object, which acts as a proxy for interactions. The XCT dataset is displayed using various rendering techniques, such as direct volume rendering, maximum intensity projection, or similar.
The user can load secondary data, such as lists of attributes of features of interest, for example defects or other spatial objects. This information is extracted from a multi-dimensional table of characteristics stored as a .csv file. In the case of fiber-reinforced polymers, this includes information on fiber length, diameter, and orientation. If spatial information, i.e., the start/end points of fibers, is available in the derived data, an abstracted representation can be used for further exploration. In our application, we use a model-based surface representation to depict the fibers as cylinders. This visualization is created using the fibers' start and end points, along with their radii. This approach allows for a simplified and less cluttered visual representation of the volume, facilitating the exploration of larger datasets.
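The mapping from a fiber's table entry to its cylinder glyph is straightforward geometry. The sketch below, with illustrative names, derives the placement parameters (midpoint, length, axis direction) that a renderer such as Unity would translate into position, scale, and rotation.

```python
import numpy as np

def cylinder_transform(start, end, radius):
    """Derive the placement of a cylinder glyph from a fiber's start/end
    points and radius, as used for a model-based surface representation."""
    start, end = np.asarray(start, float), np.asarray(end, float)
    axis = end - start
    length = np.linalg.norm(axis)
    if length == 0:
        raise ValueError("degenerate fiber: start == end")
    return {
        "center": (start + end) / 2.0,  # cylinder midpoint
        "length": length,               # maps to scale along the axis
        "direction": axis / length,     # unit axis, maps to rotation
        "radius": radius,               # cross-sectional scale
    }
```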
Abstract data is also highly important for a detailed analysis of materials. For records of secondary data in the .csv file which cannot be spatially interpreted, such as stress tensor fields and statistics of characteristics, various charts such as histograms or scatter plots may be generated. Since the framework runs as a standalone application on the Hololens, it does not require a PC in an office or the factory hall; location-independent analysis can therefore be performed anywhere at any time. To extend the analysis, additional systems, such as traditional desktop-based analysis systems, can be included in the inspection (see Fig. 1). This is possible because see-through AR retains the natural environment, so the application can be used as a complementary interface to an existing traditional system.
Fig. 2
Visualization techniques for analyzing abstract data: The user in this example utilizes histograms, bar charts, a distribution plot, and a three-dimensional scatterplot to display different fiber characteristics, and has arranged these visualizations in the workspace
After identifying a sample, the corresponding XCT volume and abstract representation (depending on what is available) are displayed alongside the physical material. Users interact with spatial representations using hand gestures. By grabbing a representation with two hands and pulling the hands apart or together, a zooming metaphor is applied. Similarly, the visualizations are rotated and positioned by intuitive gestures. Furthermore, natural interaction with volumes is also possible by just interacting with the physical sample at hand. This results in a more natural, embodied inspection of the material.
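The two-hand zooming metaphor described above can be modeled as a simple ratio of hand distances. The following sketch assumes hypothetical 3D hand positions as provided by the HMD's hand tracker; the actual gesture handling in the framework may differ.

```python
import numpy as np

def two_hand_scale(left0, right0, left1, right1, initial_scale=1.0):
    """Zoom metaphor sketch: the object's scale follows the ratio of the
    current distance between both hands to the distance at grab time.
    Hand positions are hypothetical 3D points from a hand tracker."""
    d0 = np.linalg.norm(np.asarray(right0, float) - np.asarray(left0, float))
    d1 = np.linalg.norm(np.asarray(right1, float) - np.asarray(left1, float))
    if d0 == 0:
        return initial_scale  # avoid division by zero at grab time
    return initial_scale * d1 / d0
```

Pulling the hands apart (d1 > d0) enlarges the representation; pulling them together shrinks it.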
The framework offers various visualization techniques to analyze the secondary data, i.e., abstract data, in addition to the spatial data. These include histograms, 3D scatterplots, bar charts, and density plots. Figure 2 shows a collection of abstract visualization techniques arranged in an office setting. These charts may be created at arbitrary places in the world and are interactive during the analysis. Interactions with the virtual representations are again carried out using hand gestures through grabbing a handle in the form of a cube placed at the lower left side of each representation. The gestures applied to the handle allow for zooming, rotating, or similar interactions using a single or both hands. Visualizations may be generated for all attributes present in the secondary dataset. The histogram visualization displays the frequency distribution of a selected attribute, allowing for a quick understanding of the most common value ranges. The 3D scatterplot reveals correlations between three attributes and provides a deeper understanding of multi-dimensional relationships. The bar chart allows experts to compare various abstract data, enhancing their understanding of differences or similarities. The distribution plot provides experts with an analysis of the shape of the distribution of an attribute for understanding the variability, central tendency, and the presence of potential outliers in the data. The visual representations provide users with a quick overview of the material’s characteristics and structure, allowing them to conduct analysis in an immersive environment.
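As an illustration of how such charts are backed by the secondary data, the histogram of one fiber attribute reduces to binning one column of the .csv table. Column names in this sketch are illustrative; the real attribute names depend on the characterization pipeline.

```python
import csv
import numpy as np

def attribute_histogram(csv_path, attribute, bins=10):
    """Bin one attribute of the secondary data (the per-fiber .csv
    table) into histogram counts, as backing data for a histogram chart.
    The attribute/column name is illustrative."""
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    values = np.array([float(r[attribute]) for r in rows])
    counts, edges = np.histogram(values, bins=bins)
    return counts, edges
```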

4 User Study

The presented framework is a starting point facilitating further research in AR-based material analysis. For this reason, our initial studies are mainly focused on qualitative evaluation. The aim was to find out if and how immersion helps simplify complex analysis tasks.

4.1 Participants

Four domain experts participated in the study, comprising two specialists in materials science and computed tomography, and two experts in materials science visualization. All participants had extensive experience with conventional material characterization workflows using industry-standard 2D tools like VG Studio [32] and open_iA [31]. The participants stated that they had no prior experience with AR tools for material analysis.
Fig. 3
The material is recognized by tracking the image target, which automatically loads the associated data. The XCT scan and model-based surface representation are both displayed and synchronized with the position and orientation of the material sample
Fig. 4
After recognizing the dataset, different visualization techniques can be utilized. In this case, two histograms (top) and a 3D scatterplot (bottom) are used to analyze, e.g., the lengths and diameters of the fibers

4.2 Dataset

As a use case for this evaluation, we selected a material system with polyethylene terephthalate fibers in a polypropylene matrix material. The primary dataset of this material system was obtained from a respective CT scan, and secondary data was subsequently derived by the experts through the fiber characterization pipeline outlined in Subsection 3.1. The investigated sample was cut out of a standard multi-purpose test specimen manufactured by injection molding. The CT dataset has a size of \(250 \times 250 \times 300\) voxels and contains 214 fibers, for each of which 20 distinct characteristics were computed. The validation of the data is not within the scope of our study. As the material is very thin and features a very homogeneous texture, two image targets were used for tracking. The image target on the front of the sample is visible in Fig. 3. As all study participants possess a background in analyzing fiber-reinforced polymers, an exploration task was defined to ensure the experts' workflow was in no way constrained. This enabled us to assess how the experts utilize our tool and identify any issues with it.

4.3 Procedure

For our study we used the Microsoft Hololens 2 see-through AR HMD as immersive device to run the application. At the beginning of the study, participants received instructions on the general controls and displays available in the application. They were able to follow this tutorial live alongside the instructor on a 2D screen. A cognitive walk-through methodology was employed to assess both interaction design and analytical workflow efficacy. Participants were given the freedom to use the application while sitting or standing as well as to ask about controls. To gather qualitative insights, they were instructed to use the ’Thinking Aloud’ technique during the testing phase, expressing their planned actions and any conclusions they drew.
The analysis starts with fitting and setting up the HMD. Then, the participants pick up the prepared material sample from the table to inspect it. After the detection phase of the sample, the relevant data stored on the Hololens is loaded automatically. The user is presented with the volume rendering of the XCT dataset and its model-based surface representation, both hovering over the sample (see Fig. 3). The virtual representations are manipulated in space using the physical sample. Additionally, pre-selected data visualizations for the secondary data are displayed next to the sample in the environment. In our use case, these are two histograms for different fiber lengths and a three-dimensional scatterplot encoding various spatial dimensions, which can be seen in Fig. 4. The first histogram visualizes the distribution of straight fiber lengths (the distance between the start and end points of the fiber), while the second histogram shows the distribution of actual lengths considering the curvature of each fiber. This approach enables the identification of clusters related to specific fiber lengths and curvature. The scatter plot shows the correlation of the fibers based on their diameter, area and length. This visualization enables users to draw conclusions regarding whether the fibers tend to elongate or thicken more, affecting their surface area. The representations of abstract data in the environment can be freely moved and enlarged by experts to adapt to their current investigation phase or question.
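The two length measures underlying the histograms can be stated precisely: the straight length is the Euclidean distance between a fiber's endpoints, while the actual length sums the segment lengths along its (possibly curved) polyline. A minimal sketch, assuming fibers are given as point lists:

```python
import numpy as np

def straight_length(points):
    """Straight fiber length: Euclidean distance between the first and
    last polyline point (first histogram)."""
    p = np.asarray(points, float)
    return float(np.linalg.norm(p[-1] - p[0]))

def actual_length(points):
    """Actual length along the fiber's (possibly curved) polyline
    (second histogram); always >= the straight length."""
    p = np.asarray(points, float)
    return float(np.sum(np.linalg.norm(np.diff(p, axis=0), axis=1)))
```

The ratio of actual to straight length is a simple curvature indicator: it equals 1 for perfectly straight fibers and grows with bending.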

4.4 Results & Discussion

During the exploration phase, which lasted 25 minutes on average, experts were able to inspect materials and test the application's functionality. Subsequent discussions with the experts provided valuable feedback on the benefits of immersive exploration of material datasets. This feedback indicated that the presented framework is promising for various material characterization tasks, with a significant advantage being the ability to interact directly with the material sample. The synchronization of translations and rotations between the physical samples and virtual representations creates a natural interaction, eliminating the need for traditional mouse and keyboard operations. Additionally, as the data is automatically loaded when the user starts interacting with the real sample, no further input is required, and the analysis is intuitively initiated. Some participants with prior experience in VR systems mentioned that the AR device is much lighter and more comfortable to use for longer data exploration. Additionally, it requires no cables and does not isolate them from their office environment and colleagues.
The experts, however, also identified some disadvantages: The analysis was challenging in bright environments due to low color fidelity, making it difficult to distinguish some representations from bright backgrounds. Further comments referred to current hardware limitations of AR devices. The field of view (FoV) is rather small, which becomes particularly obvious when enlarging volumetric data; in this case it is impossible to see the whole object within the available FoV. This requires more head movement, which can lead to fatigue. Finally, some experts expressed interest in analyzing larger primary datasets exceeding 500 megabytes. Using the Hololens 2 in standalone mode with active object tracking and visible rendering leads to performance bottlenecks in this case, and an interactive analysis without delay or unstable tracking can no longer be guaranteed.
The experts also suggested that the system has significant potential for analyzing samples in combination with conventional analysis systems. Incorporating additional visualization techniques could enable more detailed analysis by highlighting defects in volume representations or providing insights into abstract 4D data.

5 Future Work

Through close collaboration with domain experts, we identified key areas for future enhancements to our framework. These include advanced techniques for defect visualization, the integration of additional modalities, the superposition of scans on real objects, and novel visual representations for time-varying data.
The analysis of time-varying data is another crucial step in materials characterization. Adding novel methods to inspect primary and secondary data will facilitate understanding of changes during in-situ testing. This facilitates a comparison of the severity of changes between data sets and thereby improves the material characterization process. Improvements to rendering algorithms and data compression to handle larger datasets are also of interest, alongside the potential benefits of future hardware improvements for immersive analysis. While our current implementation employs basic downsampling for larger volumes (exceeding 500 MB), we intend to implement wavelet-based compression techniques [40] that enable visually lossless representations, maintaining critical structural information. This approach will significantly enhance the framework’s capacity to process high-resolution volumetric datasets without compromising analytical fidelity. Building on our preliminary qualitative assessment, we plan to conduct comprehensive comparative evaluations between our AR framework and traditional material analysis systems. This comparison will also allow for a quantitative analysis of task completion, accuracy, and efficiency, as well as standardized measures for usability and cognitive demand.
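The basic downsampling currently applied to volumes above roughly 500 MB can be sketched as block averaging, which reduces memory by the cube of the factor; the planned wavelet-based approach would preserve more structural detail. The function name and defaults are illustrative.

```python
import numpy as np

def downsample_volume(volume, factor=2):
    """Basic downsampling as described for large volumes: average
    non-overlapping factor^3 blocks, reducing memory by factor^3.
    A wavelet-based scheme would preserve more fine detail."""
    # Crop to dimensions divisible by the factor, then block-average.
    z, y, x = (s - s % factor for s in volume.shape)
    v = volume[:z, :y, :x].astype(np.float64)
    v = v.reshape(z // factor, factor, y // factor, factor,
                  x // factor, factor)
    return v.mean(axis=(1, 3, 5))
```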

6 Conclusion

Our research resulted in the development of a novel augmented reality framework. The system allows for immersive analysis of materials data through AR HMDs and represents a significant improvement over existing methods. A central element of our proposed framework is the implementation of situated analytics, fostering context-aware visualization and interaction. This approach leverages the user’s gaze direction at real specimens to load the associated data, enhancing usability and creating a tangible link between real material samples and their virtual renderings. The intuitive recognition of real-world material samples, and the control of virtual renderings based on them, provides a new level of interaction for experts. By leveraging natural depth perception as well as embodied navigation and interaction, the framework improves overall orientation and shape understanding in three-dimensional volume analysis. With arrangeable charts, it further provides users with an immersive, customizable, and flexible workspace that supports the analysis of abstract data. Feedback from our user study with materials specialists validates the utility and usability of our tool, marking a significant step forward in this domain.
The insights gained provide researchers with innovative opportunities for future exploration in this important field.

Declarations

Competing interests

The authors declare no competing interests.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Title
Immersive Inspection: Intuitive Material Analysis using X-Ray Computed Tomography Data in AR
Authors
Alexander Gall
Anja Heim
Patrick Weinberger
Bernhard Fröhler
Johann Kastner
Christoph Heinzl
Publication date
01.09.2025
Publisher
Springer US
Published in
Journal of Nondestructive Evaluation / Issue 3/2025
Print ISSN: 0195-9298
Electronic ISSN: 1573-4862
DOI
https://doi.org/10.1007/s10921-025-01220-x
1. Ida, N., Meyendorf, N. (eds.): Handbook of Advanced Nondestructive Evaluation. Springer, Cham, Switzerland (2019). https://doi.org/10.1007/978-3-319-26553-7
2. Heinzl, C., Kirby, R.M., Lomov, S.V., Requena, G., Westermann, R.: Visual computing in materials sciences (Dagstuhl Seminar 19151). Dagstuhl Rep. 9(4), 1–42 (2019). https://doi.org/10.4230/DagRep.9.4.1
3. Heinzl, C., Stappen, S.: STAR: Visual computing in materials science. Comput. Graph. Forum 36(3), 647–666 (2017). https://doi.org/10.1111/cgf.13214
4. Ho, P.T., Albajez, J.A., Santolaria, J., Yagüe-Fabra, J.A.: Study of augmented reality based manufacturing for further integration of quality control 4.0: A systematic literature review. Appl. Sci. 12(4), 1961 (2022). https://doi.org/10.3390/app12041961
5. Evangelista, A., Ardito, L., Boccaccio, A., Fiorentino, M., Petruzzelli, A.M., Uva, A.E.: Unveiling the technological trends of augmented reality: A patent analysis. Comput. Ind. 118, 103221 (2020). https://doi.org/10.1016/j.compind.2020.103221
6. Ens, B., Bach, B., Cordeil, M., Engelke, U., Serrano, M., Willett, W., Prouzeau, A., Anthes, C., Büschel, W., Dunne, C., Dwyer, T., Grubert, J., Haga, J.H., Kirshenbaum, N., Kobayashi, D., Lin, T., Olaosebikan, M., Pointecker, F., Saffo, D., Saquib, N., Schmalstieg, D., Szafir, D.A., Whitlock, M., Yang, Y.: Grand challenges in immersive analytics. In: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, pp. 8–13. ACM, New York, NY, USA (2021). https://doi.org/10.1145/3411764.3446866
7. Zagermann, J., Hubenschmid, S., Balestrucci, P., Feuchtner, T., Mayer, S., Ernst, M.O., Schmidt, A., Reiterer, H.: Complementary interfaces for visual computing. it - Information Technol. 64(4–5), 145–154 (2022). https://doi.org/10.1515/itit-2022-0031
8. Schroeder, K., Ajdadilish, B., Henkel, A.P., Valdez, A.C.: Evaluation of a financial portfolio visualization using computer displays and mixed reality devices with domain experts. In: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. ACM, New York, NY, USA (2020). https://doi.org/10.1145/3313831.3376556
9. Billinghurst, M., Cordeil, M., Bezerianos, A., Margolis, T.: Collaborative immersive analytics. In: Marriott, K., Schreiber, F., Dwyer, T., Klein, K., Riche, N.H., Itoh, T., Stuerzlinger, W., Thomas, B.H. (eds.) Immersive Analytics, pp. 221–257. Springer, Cham, Switzerland (2018). https://doi.org/10.1007/978-3-030-01388-2_8
10. Wang, X., Besançon, L., Rousseau, D., Sereno, M., Ammi, M., Isenberg, T.: Towards an understanding of augmented reality extensions for existing 3D data analysis tools. In: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. ACM, New York, NY, USA (2020). https://doi.org/10.1145/3313831.3376657
11. Marriott, K., Schreiber, F., Dwyer, T., Klein, K., Riche, N.H., Itoh, T., Stuerzlinger, W., Thomas, B.H. (eds.): Immersive Analytics. Lecture Notes in Computer Science, vol. 11190. Springer, Cham (2018). https://doi.org/10.1007/978-3-030-01388-2
12. Chandler, T., Cordeil, M., Czauderna, T., Dwyer, T., Glowacki, J., Goncu, C., Klapperstueck, M., Klein, K., Marriott, K., Schreiber, F., Wilson, E.: Immersive analytics. In: Big Data Visual Analytics (BDVA), pp. 1–8 (2015). https://doi.org/10.1109/BDVA.2015.7314296
13. Fonnet, A., Prie, Y.: Survey of immersive analytics. IEEE Trans. Visual Comput. Graphics 27(3), 2101–2122 (2021). https://doi.org/10.1109/TVCG.2019.2929033
14. Kraus, M., Fuchs, J., Sommer, B., Klein, K., Engelke, U., Keim, D., Schreiber, F.: Immersive analytics with abstract 3D visualizations: a survey. Comput. Graph. Forum 41(1), 201–229 (2022). https://doi.org/10.1111/cgf.14430
15. Büschel, W., Chen, J., Dachselt, R., Drucker, S., Dwyer, T., Görg, C., Isenberg, T., Kerren, A., North, C., Stuerzlinger, W.: Interaction for immersive analytics, pp. 95–138. Springer, Cham, Switzerland (2018). https://doi.org/10.1007/978-3-030-01388-2_4
16. Gall, A., Gröller, E., Heinzl, C.: ImNDT: Immersive workspace for the analysis of multidimensional material data from non-destructive testing. In: Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology. VRST ’21, pp. 1–11. ACM, New York, NY, USA (2021). https://doi.org/10.1145/3489849.3489851
17. Marriott, K., Chen, J., Hlawatsch, M., Itoh, T., Nacenta, M.A., Reina, G., Stuerzlinger, W.: Immersive analytics: Time to reconsider the value of 3D for information visualisation. In: Marriott, K., Schreiber, F., Dwyer, T., Klein, K., Riche, N.H., Itoh, T., Stuerzlinger, W., Thomas, B.H. (eds.) Immersive Analytics, pp. 25–55. Springer, Cham, Switzerland (2018). https://doi.org/10.1007/978-3-030-01388-2_2
18. Willett, W., Jansen, Y., Dragicevic, P.: Embedded data representations. IEEE Trans. Visual Comput. Graphics 23(1), 461–470 (2017). https://doi.org/10.1109/TVCG.2016.2598608
19. Bressa, N., Korsgaard, H., Tabard, A., Houben, S., Vermeulen, J.: What’s the situation with situated visualization? A survey and perspectives on situatedness. IEEE Trans. Visual Comput. Graphics 28(1), 107–117 (2022). https://doi.org/10.1109/TVCG.2021.3114835
20. Gall, A., Fröhler, B., Schwajda, D., Anthes, C., Heinzl, C.: Towards remote analytics in nondestructive testing. In: AVI’22 Workshop Proceedings: "Enhancing Cross-reality Applications and User Experiences". ACM, Frascati, Italy (2022). https://cr-workshop.github.io/papers/gall2022-towards.pdf
21. Polys, N.F., Bowman, D.A., North, C.: The role of depth and gestalt cues in information-rich virtual environments. Int. J. Hum Comput Stud. 69(1–2), 30–51 (2011). https://doi.org/10.1016/j.ijhcs.2010.05.007
22. Skarbez, R., Polys, N.F., Ogle, J.T., North, C., Bowman, D.A.: Immersive analytics: Theory and research agenda. Front. Robot. AI 6, 82 (2019). https://doi.org/10.3389/frobt.2019.00082
23. Wang, S., Zhu, D., Yu, H., Wu, Y.: Immersive WYSIWYG (what you see is what you get) volume visualization. (2020). https://doi.org/10.1109/pacificvis48177.2020.1001
24. Tadeja, S.K., Seshadri, P., Kristensson, P.O.: AeroVR: An immersive visualisation system for aerospace design and digital twinning in virtual reality. Aeronaut. J. 124(1280), 1615–1635 (2020). https://doi.org/10.1017/aer.2020.49
25. Gall, A., Fröhler, B., Maurer, J., Kastner, J., Heinzl, C.: Cross-virtuality analysis of rich x-ray computed tomography data for materials science applications. Nondestruct. Test. Eval. 37(5), 566–581 (2022). https://doi.org/10.1080/10589759.2022.2075864
26. Schickert, M., Koch, C., Bonitz, F.: Prospects for integrating augmented reality visualization of nondestructive testing results into model-based infrastructure inspection. In: NDE/NDT for Highways & Bridges: SMT 2018, pp. 214–223. ASNT, Columbus, OH, USA (2018)
27. Ferraguti, F., Pini, F., Gale, T., Messmer, F., Storchi, C., Leali, F., Fantuzzi, C.: Augmented reality based approach for on-line quality assessment of polished surfaces. Robot. Comput.-Integr. Manuf. 59, 158–167 (2019). https://doi.org/10.1016/j.rcim.2019.04.007
28. Prouzeau, A., Wang, Y., Ens, B., Willett, W., Dwyer, T.: Corsican twin: Authoring in situ augmented reality visualisations in virtual reality. In: Proceedings of the International Conference on Advanced Visual Interfaces. ACM, New York, NY, USA (2020). https://doi.org/10.1145/3399715.3399743
29. Gall, A., Fröhler, B., Heinzl, C.: Cross virtuality analytics in materials sciences. In: ISS’21 Workshop Proceedings: "Transitional Interfaces in Mixed and Cross-Reality: A New Frontier?" (2021). https://doi.org/10.18148/kops/352-2-wugxhv7d696t7
30. Salaberger, D., Kannappan, K.A., Kastner, J., Reussner, J., Auinger, T.: Evaluation of computed tomography data from fibre reinforced polymers to determine fibre length distribution. Int. Polym. Proc. 26(3), 283–291 (2011). https://doi.org/10.3139/217.2441
31. Fröhler, B., Weissenböck, J., Schiwarth, M., Kastner, J., Heinzl, C.: open_iA: A tool for processing and visual analysis of industrial computed tomography datasets. J. Open Source Softw. 4(35), 1185 (2019). https://doi.org/10.21105/joss.01185
32. Volume Graphics: VGSTUDIO MAX website (2024). https://www.volumegraphics.com/de/produkte.html. Accessed 01 May 2025
33. Kitware: ITK — Insight Toolkit (2024). https://itk.org/. Accessed 01 May 2025
34. Gall, A.: Framework implementation on GitHub (2024). https://github.com/GallAlex/AugmeNDT. Accessed 01 May 2025
35. Unity Technologies: Unity Real-Time Development Platform — 3D, 2D, VR & AR Engine. https://unity.com/. Accessed 11 January 2024
36. Wagner, M., Blumenstein, K., Rind, A., Seidl, M., Schmiedl, G., Lammarsch, T., Aigner, W.: Native cross-platform visualization: A proof of concept based on the unity3D game engine. In: 20th International Conference Information Visualisation (IV). IEEE, New York, NY, USA (2016). https://doi.org/10.1109/iv.2016.35
37. De Souza Cardoso, L.F., Mariano, F.C.M.Q., Zorzal, E.R.: A survey of industrial augmented reality. Comput. Ind. Eng. 139, 106159 (2020). https://doi.org/10.1016/j.cie.2019.106159
38. Microsoft: HoloLens 2 — Overview, Features, and Specs. https://www.microsoft.com/en-us/hololens/hardware. Accessed 11 January 2024
39. PTC: Vuforia Developer Website. https://developer.vuforia.com/. Accessed 11 January 2024
40. Stock, A.M., Lang, T., Sauer, T.: Unav: The SCR file format. Fraunhofer-Gesellschaft (2023). https://doi.org/10.24406/PUBLICA-1214
