
Open Access 19.12.2022 | Research Paper

User-Centered Requirements for Augmented Reality as a Cognitive Assistant for Safety-Critical Services

Authors: Julia Bräker, Anna Osterbrink, Martin Semmann, Manuel Wiesche

Published in: Business & Information Systems Engineering | Issue 2/2023


Abstract

Augmented reality (AR) is widely acknowledged to be beneficial for services that are safety-critical and place exceptionally high demands on knowledge and on tasks that must be performed simultaneously. This study explores the user-centered requirements for an AR cognitive assistant in the operations of a large European maritime logistics hub. Specifically, it deals with the safety-critical service process of soil sounding. Based on fourteen think-aloud sessions during service delivery, two expert interviews, and two expert workshops, five core requirements for AR cognitive assistants in soil sounding are derived, namely (1) real-time overlay, (2) variety in displaying information, (3) multi-dimensional tracking, (4) collaboration, and (5) interaction. The study is the first on the applicability and feasibility of AR in the maritime industry and identifies requirements that inform further research on AR use in safety-critical environments.
Notes
Accepted after two revisions by Alexander Maedche.

1 Introduction

Over the years, human-centered service systems have evolved and technology has become smarter, necessitating the development of improved systems to leverage new technologies (Peters et al. 2016). As these new technologies become increasingly important for service development and improvement in various industries, their use has gained attention in both service research and practice (Ostrom et al. 2015; Wirtz et al. 2018). When new technologies are adopted, IT resources either complement or improve the impact of non-IT resources on the performance of a process, or they replace them (Jeffers et al. 2008). Well-known examples of services enhanced by technology are electronic check-in and check-out systems in hotels, electronic money transfer in financial services, and e-tickets in the airline or event industry (Collier 1994; Böhmann et al. 2014). When the focus is on the human factor and its capabilities in improving services, so-called cognitive assistance systems, which explicitly aim at enhancing human capabilities instead of replacing them, play an important role (Engelbart 1962; Peters et al. 2016). In this context, cognitive assistants offer the possibility of significantly improving the functioning of complex service systems (Peters et al. 2016).
Augmented reality (AR) applications are an example of cognitive assistants that have been studied in various research areas. With the support of AR head-mounted displays (HMDs), for instance, current challenges, such as those in technical customer services, can be overcome (Niemöller et al. 2019). HMDs can be used to restructure services by displaying information directly in the user's field of view, enabling hands-free navigation without media breaks or restricted mobility (Niemöller et al. 2019). As a result, AR cognitive assistants, such as HMDs, have a high potential to support and improve services (Elder and Vakaloudis 2015), such as health care (Klinker et al. 2017), technical customer service (Niemöller et al. 2019), and logistics services (Rauschnabel and Ro 2016).
While previous studies have primarily focused on services in which a single task is performed, it is not well understood how AR cognitive assistants can support services in which users frequently switch between two separate tasks. Such task switching characterizes our use case of water depth management at a large European maritime logistics hub. As part of water depth management, deviations in the water depths of the port area must be continuously monitored, and if necessary, harbor areas must be dredged to maintain the waterway infrastructure (Osterbrink et al. 2021; Bräker et al. 2022b). The measurement of water depths is called soil sounding and is provided by special vessels equipped with echo-sounding systems. Floating drones can be used individually or to support vessels by collecting supplementary measurement data, as they are more agile and flexible. Skippers and drone operators are responsible for two parallel tasks during service delivery. First, they must navigate their vessels safely through the harbor area during regular maritime traffic, i.e., taking into account other vessels, water levels, tides, currents, and obstacles in the water. Second, the measurement itself is a task that requires high accuracy and is performed in collaboration with a measurement engineer. This involves continuously coordinating measurement areas and routes with the measurement engineer, operating the echo sounder system, and checking the data quality of the measured water depths in real time. The measurement engineer, who is likewise on board the vessel, guides the measuring route and controls the quality of the measured real-time data. Since mistakes or inaccuracies in the performance of the skippers and drone operators can have serious direct and indirect consequences, the use case is a safety-critical service. Such services can generally be characterized by the fact that a "failure might endanger human life, lead to substantial economic loss, or cause extensive environmental change" (Knight 2002, p. 547). For this reason in particular, effective support and improvement of the soil-sounding process are very important, as the resulting information is crucial for vessels navigating harbor areas. Consequently, soil sounding is a key service for ensuring the safety of all actors that use the harbor infrastructure, as this infrastructure is partially dynamic due to tides and currents. Besides the two parallel tasks, another distinctive feature of our use case is the setting on the water. In contrast to the already difficult implementation of AR in navigation contexts such as cycling or driving a car, navigation on the water brings further challenges for developing an AR application. The instability of the water surface adds a dimension of motion that leads to, among other things, much more complex tracking conditions. AR applications with such an additional dimension of motion have not yet been investigated, which represents another gap in previous research. Against this background, our research is guided by the questions (RQ1) "How amenable is the service process of soil sounding to be supported by AR?" and (RQ2) "What are the requirements for an AR cognitive assistant to improve the service process of soil sounding?". The results demonstrate how AR can improve operations in an edge case of safety-critical maritime navigation with high requirements. We derive five requirements for such applications that inform future AR solutions in a wide range of application areas.
Additionally, we address unique tracking challenges that deepen the understanding of AR and support systems for applications used on the water.
The remainder of the paper is structured as follows: First, we give an overview of the application of AR solutions in different service processes. We distinguish between static settings and those in which the environment adapts to the user's motion, such as when driving a car, and describe their technological challenges. In Sect. 3, we introduce the methodology. We conducted a case study with workshop data, expert interviews, and process tracing using video observations and verbal protocols in the form of think-aloud sessions. In Sect. 4, we present the use case of water depth measurement and first analyze it with regard to its augmentability (RQ1). In the second part of the analysis, we use the think-aloud sessions to derive the user-centered requirements for an AR cognitive assistant for the service process of soil sounding (RQ2). The results and their limitations are discussed in Sect. 5, and in Sect. 6, we summarize the main findings and give an outlook on future research.

2 Related Work

Innovations in the field of service continue to emerge, and many of today's innovations are digital, integrating resources throughout service systems (Lusch and Nambisan 2015). In order to apply service innovations and related innovative technologies that can be used to support services, it is important to understand the innovations within the service for which they are to be used, their potential, and the requirements involved (Matijacic et al. 2013; Jessen et al. 2020). One opportunity to support services lies in cognitive assistants, which are enabled or represented by rapidly evolving technologies and innovations. These assistants are primarily characterized by the aim of enhancing human capabilities rather than replacing them (Engelbart 1962; Peters et al. 2016).
One promising technology that falls into this category of cognitive assistants is AR. As previous research shows, this technology can be used in many different environments. In the context of maintenance and repair, increased efficiency was observed when AR-enabled devices were used (Henderson and Feiner 2011). Another AR application that has been investigated is the use of smart glasses to support the runtime modeling of services, allowing the process to be documented on-site by the service provider during the execution of their activity (Niemöller et al. 2019). Further applications of AR exist in the construction industry (Etzold et al. 2014), education (Kaufmann and Schmalstieg 2002), and healthcare (Klinker et al. 2020). These use cases tend to occur in static environments, i.e., the user does not move significantly, and the environment remains mostly constant.
AR use cases in mobile settings, on the other hand, have rarely been the subject of research. Such environments are characterized by the fact that they are not limited in space and that objects – in contrast to those in static environments – do not necessarily have fixed positions; instead, the positions of these objects can change, so that motion emanates from the user's environment. Examples of mobile environments include navigation contexts, such as driving a car, where the motion of other traffic participants influences the driver's own driving, e.g., when it is necessary to brake because of the car in front. With regard to AR applications in moving environments, Berkemeier et al. (2018) identify requirements for an acceptable smart glasses-based information system to support cyclists in cycling training. Their study aims to augment information such as speed or route details, which would otherwise have to be displayed by other devices, into the cyclist's field of view to promote road safety. There are also studies on cars in which head-up displays or HMDs display relevant information in the driver's field of view to reduce road traffic risks (Heymann and Degani 2016).
There are additional challenges when using AR in a maritime context compared to use cases in the domain of cycling or driving a car. One of the biggest technical challenges relates to tracking: "A ship is not fixed but will roll, pitch and yaw. So – depending on the use case – we have to determine the movement of the ship relative to the earth and the movement of a person (or device) relative to the ship" (von Lukas et al. 2014, p. 466). This means the vessel can move on the water in six degrees of freedom (DOF) (see Fig. 1). Rotational motions are mainly caused by the movement of the water on which the vessel is sailing, e.g., by currents or waves. In addition, a user wearing an HMD can move on the ship in six DOF. The combination of the two makes tracking complex and challenging. Furthermore, harsh weather conditions, reflections from the water, and a lack of GPS reception make accurate tracking difficult. For this reason, our use case presents a unique characteristic and raises a research gap that needs to be investigated.
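To make this tracking challenge concrete, the following minimal sketch illustrates how the two motions described by von Lukas et al. (2014) could be composed: the vessel's 6-DOF pose relative to the world (e.g., from GNSS and an inertial sensor) and the HMD wearer's 6-DOF pose relative to the vessel (e.g., from the headset's inside-out tracking). The code is purely illustrative; all frame names, sensor sources, and numeric values are assumptions and not taken from an existing maritime AR system.

```python
# Minimal sketch (hypothetical values): composing two 6-DOF poses to obtain the
# HMD pose in world coordinates, as required when both the vessel and the user move.
import numpy as np
from scipy.spatial.transform import Rotation as R

def pose_matrix(translation, rpy_deg):
    """Build a 4x4 homogeneous transform from a translation and roll/pitch/yaw in degrees."""
    T = np.eye(4)
    T[:3, :3] = R.from_euler("xyz", rpy_deg, degrees=True).as_matrix()
    T[:3, 3] = translation
    return T

# Vessel pose in the world frame (position e.g. from GNSS, attitude from an IMU);
# the roll/pitch values stand in for wave-induced motion.
world_T_vessel = pose_matrix([565_000.0, 5_930_000.0, 0.5], [3.0, -1.5, 45.0])

# HMD pose relative to the vessel (e.g., from the headset's inside-out tracking).
vessel_T_head = pose_matrix([1.2, -0.4, 1.7], [0.0, 5.0, -10.0])

# Composition: the HMD pose in world coordinates, needed to anchor AR content geographically.
world_T_head = world_T_vessel @ vessel_T_head

position = world_T_head[:3, 3]
yaw, pitch, roll = R.from_matrix(world_T_head[:3, :3]).as_euler("zyx", degrees=True)
print(f"HMD position (world): {position}, heading: {yaw:.1f} deg")
```

The point of the sketch is that neither pose alone suffices: wave-induced vessel motion and the user's movement on board must both be estimated and combined before any world-anchored overlay can be rendered correctly.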
Moreover, for other safety-critical services – such as our use case – there are already some investigations into the support potential of AR and how it can be used. For example, the user requirements for an AR-based refinery training tool were determined for an oil refinery in order to develop usable and safe AR applications (Träskbäck and Haller 2004). Another example is the application of AR in an innovative airport control tower, where the aim was to provide the air traffic control operators with complete head-up information (Balci and Rosenkranz 2014; Bagassi et al. 2020). All safety-critical services require situational awareness of the involved actors. Situational awareness in this regard is defined as "[…] the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future" (Endsley 1995, p. 36). The phenomenon is crucial in many applications and a lever for improved safety (Carroll et al. 2006; Jones et al. 2018; Reuter et al. 2019). In particular, AR has the potential to improve situational awareness, as recent findings conclude (Coleman and Thirtyacre 2021). In a user study, Lee et al. (2016) investigated how multitasking while driving distracts users and causes misbehavior. Study participants had to solve tasks on a classic infotainment system while driving a car. In particular, the risk of accidents increased when participants made mistakes in the parallel task and were distracted by view shifts and long glances at the infotainment system. Our research addresses a similar situation, albeit not in road traffic but in maritime traffic.
Illing et al. (2021) examined to what extent AR can help support multitasking assembly exercises under time pressure. Especially when two or more tasks are performed in parallel, mental stress and reduced attention can occur. The authors showed that AR assistance systems can help reduce the error rate and the cognitive load while increasing the quality and motivation of the actors. The possibility of performing a task hands-free by using AR improves safety and helps users comprehend the actual task faster. Hands-free working also supports collaboration with other actors. Especially in mobile settings, where users move around the environment and have to collaborate and communicate at the same time, HMDs offer an advantage (Johnson et al. 2015). Consequently, in processes that require situational awareness and hands-free working while collaborating, AR support brings great benefits. An example is the security industry, where security personnel need to be aware of the environment and simultaneously collaborate with, for example, the police while performing other activities that require hands-free working (Lukosch et al. 2015).
Theoretical accounts have developed concepts to capture how technologies affect service processes. A well-known theory is process virtualization theory, which is concerned with explaining and predicting whether a process can be performed virtually (Overby 2008). Examples of virtualized processes include e-commerce, online distance learning, and online banking (Overby 2008; Balci and Rosenkranz 2014). The theory has been applied and adapted in many different contexts (Osterbrink et al. 2021; Bose and Luo 2011; Balci and Rosenkranz 2014) and has also been examined with regard to augmentation, leading to the theory of process augmentability (Yeo 2017), which provides the basis for analyzing the augmentability of our use case of the soil-sounding service. In contrast to virtualization, augmentation can be understood as supplementing physical interaction with synthetic elements rather than removing it. Consequently, the main constructs of process virtualization theory concerning the potential removal of physical interactions from the process cannot be applied to augmentation. Therefore, a new main construct – the authenticity requirement – is proposed to positively affect the dependent variable, process augmentability. Process augmentability is described as "how amenable a process is to being conducted in AR environments" (Yeo 2017, p. 5). The need for authenticity influences process augmentability in that AR offers the possibility to provide authentic experiences even though a process is virtualized. An authentic experience is required when sensory genuineness is needed. Because AR can simulate sensory experiences, processes that require this authentic feeling are amenable to being augmented. In addition, three moderating constructs derived from the definition of AR are proposed to create an authentic experience. By definition, AR must meet three criteria: combine the physical and the virtual, be interactive in real time, and be registered in the real world (Azuma 1997). Accordingly, Yeo (2017) proposes 3D visualization, spatial association, and synchronization as moderating constructs affecting authenticity, hereafter referred to as characteristics. The requirement for 3D visualization means that three-dimensional objects and overlays enable a more authentic experience. For the characteristic of spatial association, Yeo (2017) gives examples of how AR uses geospatial environments to enrich process experiences (Henderson and Feiner 2011; Hertel et al. 2021; Bräker et al. 2022a). The spatial registration of objects in space influences augmentability: if objects must be spatially anchored for an authentic experience, the process is suitable for AR. In terms of the synchronization characteristic, most physical processes conducted in the real world tend to be synchronous. Therefore, physical movement needs to be synchronized with the augmentation to maintain the sense of immersion (Overby 2008). This means that synchronization ensures an authentic experience, i.e., the process is more authentic if no delays can be tolerated.
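As an illustrative reading of these constructs, the short sketch below encodes the authenticity requirement and the three moderating characteristics as a simple checklist-style data structure. It is our own simplified interpretation for clarity, not a formalization provided by Yeo (2017); all names and the scoring rule are assumptions.

```python
# Illustrative sketch of the process augmentability constructs (our simplified reading of
# Yeo 2017); the class, its fields, and the decision rule are hypothetical, not the theory's
# own formalization.
from dataclasses import dataclass

@dataclass
class AugmentabilityAssessment:
    process: str
    authenticity_required: bool      # main construct: sensory genuineness is needed
    needs_3d_visualization: bool     # moderating characteristic 1
    needs_spatial_association: bool  # moderating characteristic 2
    needs_synchronization: bool      # moderating characteristic 3

    def is_amenable_to_ar(self) -> bool:
        # Simplified reading: the process is a candidate for augmentation if it requires
        # an authentic experience and the AR-specific characteristics apply to it.
        return self.authenticity_required and all(
            (self.needs_3d_visualization,
             self.needs_spatial_association,
             self.needs_synchronization)
        )

soil_sounding = AugmentabilityAssessment(
    process="soil sounding",
    authenticity_required=True,
    needs_3d_visualization=True,
    needs_spatial_association=True,
    needs_synchronization=True,
)
print(soil_sounding.is_amenable_to_ar())  # True, mirroring the analysis in Sect. 4.1
```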
Against the background of existing research, our case can be delineated as follows: We are in a mobile wide-area environment that is constantly changing. At the same time, we are inside a vessel whose interior initially remains static. Due to the vessel's movements on the water, there is the added challenge – compared to conventional navigation settings – that the water surface is unstable, and thus the vessel can move in six DOF. In addition, our users, the skippers, perform a safety-critical process in which errors can have drastic effects on themselves and others. They must be able to multitask, meaning they perform two parallel tasks and simultaneously collaborate and communicate with other involved parties. The combination of all these characteristics makes our use case a technological edge case, as it is particularly challenging, complex, and thus interesting for research. At the same time, aspects of it are found in many other use cases.

3 Method

We conducted a case study (Yin 2003) to explore the requirements for a user-centered AR cognitive assistant for safety-critical services in the maritime sector. The use case we investigated was the process of soil sounding, i.e., water depth measurement in a harbor environment carried out during navigation on a vessel or with the help of a soil-sounding drone.
In order to analyze how amenable the considered use case of the soil-sounding process is to being supported by AR (RQ1), we evaluated to what extent the aspects of the theory of process augmentability (Yeo 2017) are fulfilled. To assess the degree of fulfillment of the process augmentability criteria, we first studied a use case video of a European harbor operator that gave us a contextual understanding of the soil-sounding process as a foundation for further analysis. In the second step, we collected data by conducting two semi-structured interviews and two workshops with business and IT experts of the same harbor operator (see Table 1). The first interview was held with the port operator's Head of IT Innovation to get an initial overview of the use case. The second interview was conducted with the Deputy Port Hydrographer of the same port operator and gave us a deeper understanding of the use case and the context in which the soil-sounding process is performed. Both the interviews and the workshops were documented by video recording. In addition, the participants provided more detailed documents concerning the use case, such as the depth measurement management cycle (see Fig. 3) and information about and pictures of the soil-sounding vessel and drone (see Figs. 4 and 5). Two workshops further contributed to the understanding of the use case. In the first workshop, with the Deputy Port Hydrographer, researchers, and port operator personnel, we briefly discussed first ideas for initial requirements for an AR cognitive assistant for the service process of soil sounding (RQ2). In the second workshop, the Project Manager R&D, researchers, and further port operator personnel participated. During this workshop, we defined the individual process steps of the soil-sounding service process and discussed which initial requirements might apply to each process step.
Table 1 Data table of sources

# | Source | Format | Duration (hh:mm) | Focus
1 | Harbor TV | Video | 00:14 | Use case context and understanding of the soil-sounding process
2 | Head of IT Innovation | Interview | 01:00 | Overview case strategy
3 | Deputy Port Hydrographer | Interview | 01:00 | Deeper understanding of the use case context
4 | Deputy Port Hydrographer | Workshop | 02:03 | Requirements of the use case
5 | Project Manager R&D | Workshop | 00:57 | Process steps and requirements of the use case
6 | Soil sounding 1 | Think-Aloud | 01:27 | Soil sounding on a shore (SA, MA)
7 | Soil sounding 2 | Think-Aloud | 01:19 | Soil sounding of harbor basin and berths (SA, MA)
8 | Soil sounding 3 | Think-Aloud | 00:19 | Follow-up soil sounding in a relatively small area (SA, MB)
9 | Soil sounding 4 | Think-Aloud | 01:16 | Soil sounding in a sidearm with several bridges and an open lock (SA, MB)
10 | Soil sounding 5 | Think-Aloud | 00:28 | Soil sounding of a berth on a quay (SA, MB)
11 | Soil sounding 6 | Think-Aloud | 00:52 | Supplementary soil sounding of a berth within a control measurement (SA, MB)
12 | Soil sounding 7 | Think-Aloud | 00:47 | Soil sounding of recently dredged fields during heavy traffic (SB, MA)
13 | Soil sounding 8 | Think-Aloud | 00:44 | Soil sounding of a dredging field (SB, MA)
14 | Soil sounding 9 | Think-Aloud | 00:55 | Soil sounding of a widened shipping channel (SB, MA)
15 | Soil sounding 10 | Think-Aloud | 00:36 | Soil sounding of a dredging field (SB, MA)
16 | Soil sounding 11 | Think-Aloud | 00:12 | Soil sounding after removal of a ground obstacle as control (SB, MA)
17 | Soil sounding 12 | Think-Aloud | 02:03 | Intermediate soil sounding for a dredging field (SC, MC)
18 | Soil sounding 13 | Think-Aloud | 02:08 | Intermediate soil sounding for fairway deepening (SC, MC)
19 | Soil sounding 14 | Think-Aloud | 00:57 | Soil sounding using an autonomous drone in a small area (DA, MD)
Since we wanted to identify user-centered requirements, we collected additional data with three skippers and one drone operator who performed the process of soil sounding. For simplicity, the common term "actor" is used hereafter for skippers and drone operators when both are meant. We were particularly interested in safety aspects, the environmental conditions of the vessel, the cognitive and physical demands of the skipper, the type and intensity of collaboration between the actors, and general challenges and problems encountered by the actors in their daily work. To obtain this central data source, we used the process-tracing method (Todd and Benbasat 1987) of thinking aloud (Van Someren et al. 1994), in which the actors were asked to "think aloud" and explain everything they do while simultaneously engaging in the soil-sounding service. We used this method to analyze which user-centered requirements an AR cognitive assistant must meet to support the soil-sounding process. As the situational features of the service are crucial, we extended the traditional implementation of the method by recording videos from different perspectives. For this purpose, we attached a camera to each actor's forehead to retrace their field of view and placed another camera on the monitor to observe the actors directly (see Fig. 2). Exact tracking of the actors' eye movements was not necessary since only the direction of their view was relevant. With the help of the two recording perspectives and the recorded audio track, we were able to trace the process of soil sounding and derive user-centered requirements for an AR cognitive assistant.
As the service is critical to maintaining harbor operation, only experienced actors are considered for soil sounding. The fourteen think-aloud sessions differ in the type of soil-sounding job and its focus as well as the difficulty of the soil-sounding, which depends on factors such as the soil-sounding environment or traffic volume (see Table 1). For each soil sounding, we added anonymized codes for the skipper and the measurement engineer. During the think-aloud sessions, three different skippers (SA–SC), one drone operator (DA) and four measurement engineers (MA–MD) were involved.
For analyzing the video recordings and verbal protocols of the think-aloud sessions, we used the scanning method, which is one of the four major categories of protocol analysis and the most straightforward one (Bouwman 1983). We did not perform a verbatim transcription of everything that was said, since the observation of the actors was the primary object of investigation for our analysis; instead, the actors' statements helped to supplement and explain what was observed. For the identified video sequences in which observed behaviors indicated challenges, we transcribed what was said, which often helped clarify and support the observation. Three researchers independently analyzed the video and audio material, initially focusing on existing challenges and problems in the process of soil sounding. After identifying all difficulties in the process, we derived problem categories by grouping duplicates and similar items. Based on these problem categories, we derived contextual requirements for each problem. Afterward, we abstracted from the contextual requirements and derived five general requirements for augmenting the soil-sounding service. The detailed results are described in Sect. 4.2.
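The derivation chain from observed problems to generalized requirements can be pictured as a small traceability structure. The sketch below is purely illustrative: the entries are abbreviated, hypothetical paraphrases of Table 2, and no such tool was used in the actual analysis.

```python
# Hypothetical sketch of the traceability chain used conceptually in the analysis:
# observed problem -> contextual requirement -> generalized requirement.
from collections import defaultdict

observations = [
    # (think-aloud session, observed problem, contextual requirement, generalized requirement)
    ("SoSo1", "Safety distances hard to estimate", "Overlay real-time safety distances", "Real-time overlay"),
    ("SoSo13", "AIS shows unreliable positions", "Overlay reliable real-time AIS data", "Real-time overlay"),
    ("SoSo14", "Waves disturb position and rotation", "Tracking resistant to wave motion", "Multi-dimensional tracking"),
]

by_generalized = defaultdict(list)
for session, problem, contextual, generalized in observations:
    by_generalized[generalized].append((session, problem, contextual))

for generalized, items in by_generalized.items():
    print(generalized)
    for session, problem, contextual in items:
        print(f"  {session}: {problem} -> {contextual}")
```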

4 Use Case of Water Depth Management in a Harbor Environment

For our analysis, we chose the service process of water depth management in a European maritime logistics hub. Harbor personnel must continuously monitor water depth changes due to sedimentation and erosion. To ensure the safety of vessel traffic and maintain the infrastructure, water depth management must be ensured by permanent soil sounding as well as by finding and recognizing nautically critical obstacles on the harbor bottom, e.g., bikes, cars, or shopping carts. Special soil-sounding vessels and drones are used that have the technical equipment to monitor water depth and generate a digital landscape model of the harbor bottom live on board. Figure 3 illustrates the depth measurement management cycle to give an overview of the use case context. The cycle states that measuring water depths generates knowledge, which in turn triggers action, such as deepening a certain area. Since this action causes a change, the changed area must in turn be controlled by measurement.
The most frequently used technique for measuring water depth is the use of soil-sounding vessels (see Fig. 4). These are special vessels equipped with an echo sounder that enables real-time measurement of the water depths under the vessel. Several soil-sounding vessels operate daily in the harbor area and can optionally be supported by an autonomous floating drone. Each vessel needs to sound different defined areas in the port daily and investigate them for changes and irregularities to ensure safe operations within the harbor. Soil-sounding assignments and allocations are managed centrally by a hydrographic office and prioritized as needed and based on tides. The assignments are usually processed by two people on board each vessel: a skipper and a measurement engineer.
The task of measuring water depth is particularly challenging: The skippers of soil-sounding vessels use echo sounders to measure the water depths in the harbor while simultaneously paying attention to vessel traffic and keeping an eye on a variety of measurement information on monitors, so they have to permanently switch between the monitors and the view out of the window. This leads to limited safety in vessel traffic, extreme exhaustion due to the constant change of view and context, and several health issues. These side effects of the constant shift of perspective can also be observed in other safety-critical services, such as air traffic control in an airport control tower (Bagassi et al. 2020). In addition to simultaneously navigating the vessel and measuring data, the skipper needs to interact with the measurement engineer, who is responsible for controlling the quality of the measured data. If the measurement engineer cannot be on the vessel in person, it is possible to work remotely from home.
To efficiently perform the measuring tasks assigned by the hydrographic office, the skipper and the measurement engineer need to determine the best possible order of the tasks at the beginning of the workday. This depends on external factors, such as tides, currents, or other vessels blocking measurement areas. In addition, the internal prioritization of the hydrographic office is considered. Once the order of the assignments has been coordinated, the skipper navigates to the next measuring area. In consultation with the measurement engineer, the skipper either navigates the area at his own discretion and expertise, or the measurement engineer places a profile with an outline on the skipper's map view so that the skipper has a point of reference. In addition, the measurement engineer can draw a so-called bearing line for the skipper, which specifies the optimal route through the measurement area. Once the soil-sounding data is available in the appropriate quality, the next measurement area is approached.
In addition to the described approach using special soil-sounding vessels, floating drones can be used for certain use cases (see Fig. 5). The soil-sounding drone is approximately 1.65 m long and is controlled either from land or the soil-sounding vessel. It can be operated manually by a drone operator via remote control or autonomously along a predefined route given by the measurement engineer, which he defines on his monitor. On board the drone – analogous to the soil-sounding vessel – an echo sounder is installed and, in addition, a camera whose image shows the environment from the drone’s point of view and is streamed to the monitors of the drone operator and measurement engineer. Compared to the conventional approach of measuring water depth with a soil-sounding vessel, the drone is used for tasks where it would otherwise be too dangerous to use a soil-sounding vessel, for example, because the water is too shallow or maneuvering is difficult. In addition, the drone can be used as an escort vessel to assist in sounding large areas.
The main difference compared to the soil-sounding vessel is the vehicle size and the actor's perspective. In contrast to the skipper, the drone operator can take two perspectives: the perspective from outside the drone and the drone's perspective via the installed camera. Even though the perspective is slightly different from that of the skipper, the same negative side effects occur when controlling and measuring with the drone as when working with a soil-sounding vessel, since the drone operator is also subject to a constant shift between different views. Furthermore, at long ranges, the distances between the drone and other objects in the water or the shoreline are more difficult for the drone operator to estimate. While the drone must always remain in view for safety and regulatory reasons, it can nonetheless be hundreds of meters to several kilometers away from the drone operator in broad harbor basins. Even if the drone operator additionally gets the drone's camera image streamed to a display, this is no substitute for the outside view of the drone, since the image can be disturbed by splash water, for example, and it only shows a limited camera image rather than a complete 360-degree view at once. Thus, the drone operator also has parallel tasks – first, the safe and precise control of the drone using two different perspectives, and second, the control of the measurement data in collaboration with the measurement engineer. Even in cases where the drone is operating autonomously, the drone operator must maintain full attention, as he or she must always be able to intervene in an emergency. For simplicity, the analysis of the use case uses the common term "vessel" for the soil-sounding vessel and the soil-sounding drone when both are intended.

4.1 Augmentability of the Use Case

Various attempts have already been made to solve the problem of the exhausting and safety-critical shift of perspective between the window, needed to see shipping traffic, and the data on the monitor. For example, placing the monitor in the windshield at the height of the actor's eyes, so that he no longer has to look down, caused the monitor to obscure important objects such as other vessels on the water. A further idea, reducing the size of the monitor in the windshield to avoid overlaying real objects, in turn resulted in the displayed information and data being too small. Since it is important to display information in the actor's field of view, but a monitor in the windshield overlays important real objects, the application of AR seems to be a promising way to solve the problem. With the help of AR, information can be displayed in the actor's field of view without completely overlaying real objects. In order to investigate the applicability of AR for the use case, we analyzed the use case with regard to the four characteristics that – according to the theory of process augmentability (Yeo 2017) – must be present in a process for augmentation of the process to be appropriate.
Authenticity. The authenticity characteristic is present in the soil-sounding service. The actor requires an authentic experience to be supported in navigating his vessel and measuring data during the actual soil-sounding process. This process cannot be virtualized or simulated due to its high complexity and dependency on the actions and decisions of the actor. Even if each soil-sounding procedure follows a similar pattern – (1) navigation to the measurement site, (2) navigation of the vessel through the measurement area and execution of the measurement while permanently controlling the quality of the data, (3) completion of the measurement as soon as all data are available – nevertheless each procedure is different. The process of measurement is influenced by the current environmental conditions, e.g., tide, current or wind, but also by other vessels, possible construction sites and obstacles in the water, which are potential sources of danger. In addition, regarding the water depths measured in real-time, the skipper must react quickly and adjust the route according to the current circumstances. Accordingly, the process can only be performed in reality, and the information and support required by the actor must correspond to this reality and extend it adequately.
3D visualization. The 3D visualization characteristic is present in the soil-sounding process. The actor requires 3D visualization for an authentic experience because a 2D visualization of obstacles and other vessels is too imprecise, making it difficult to estimate sizes and distances, which in turn can affect the quality of the measurement data and traffic safety. In addition, a 3D visualization of the real-time measurement results is needed. Analogous to the current map view on the skipper’s monitor, certain areas on the water surface could be color-coded in three-dimensional space.
Spatial association. The spatial association characteristic is present in the soil-sounding process. It is very important for the actor that information, such as water depths and currents, as well as obstacles in the water and other vessels, is displayed with geographical accuracy. The actor also requires the possibility to obtain further information about displayed objects, e.g., through interaction with them. If, for example, an obstacle or the water depth is not displayed in the geographically correct position and the actor does not have access to this critical information, the vessel may collide with the obstacle or run aground.
Synchronization. The synchronization characteristic is present in the soil-sounding process. The 3D objects to be augmented and the collected measurement data, as well as information about boundary conditions such as currents or the tide, must be continuously synchronized and updated, enabling the actor to use this information to navigate his vessel safely and detect possible obstacles and measurement gaps or errors.
In summary, the soil-sounding service is amenable to being supported by AR based on the analyzed characteristics and the identified needs of the actor involved in the process. By applying AR, we aim to improve the safety of vessel traffic, both for the soil-sounding vessel and for other vessels and involved actors. The safety-critical process of soil sounding holds great potential for using AR to overlay specific information into the skipper's field of view, thus increasing situational awareness, for example, by displaying water depths (i.e., how much space is left under the vessel before it runs aground), safety distances, underwater objects, or other vessels. This applies, in particular, to traffic participants that do not have AIS. As harbor areas have to become more efficient in order to fulfill the increasing demands of global supply chains, traffic in the harbor is becoming denser and timed more closely. Given the increased size of vessels and the amount of cargo loaded, even slight delays in handling vessels have a huge economic impact. An extreme example of the consequences of such delays was the vessel Ever Given, which got stuck in the Suez Canal, leading to severe delays on a global scale with an estimated loss of 10 billion USD in trade value per day (Yee and Glanz 2021).
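As an illustration of the kind of safety information such an overlay could present, the following sketch computes a simple under-keel clearance from the measured depth, the tide level, and the vessel's draft. The formula is a common simplification, and all values and thresholds are hypothetical rather than taken from the harbor operator's systems.

```python
# Simplified sketch (assumed formula and hypothetical values): under-keel clearance,
# i.e., how much water remains beneath the vessel before it runs aground.
def under_keel_clearance(charted_depth_m: float, tide_above_chart_datum_m: float,
                         draft_m: float, squat_allowance_m: float = 0.0) -> float:
    """Available water depth minus the vessel's draft (and an optional squat allowance)."""
    available_depth = charted_depth_m + tide_above_chart_datum_m
    return available_depth - draft_m - squat_allowance_m

clearance = under_keel_clearance(charted_depth_m=3.2, tide_above_chart_datum_m=1.1, draft_m=1.8)
warning = "OK" if clearance >= 1.0 else "WARNING: shallow water"  # the 1.0 m threshold is an assumption
print(f"Under-keel clearance: {clearance:.1f} m ({warning})")
```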
Furthermore, we aim to relieve the strain on the skipper, who is physically stressed by constant changes of view between monitor and windshield. Using AR, we also want to reduce cognitive stress, since the process requires a great deal of concentration and attention due to the parallel execution of several tasks. Additionally, the use case is interesting from a technological perspective, as new tracking requirements for AR hardware arise due to the additional dimension of motion caused by the water surface. Since the process requires spatial anchoring of objects in the environment and 3D visualizations are needed, accurate tracking is essential in this use case.

4.2 Requirements for AR Solutions

Based on the challenges and problems we observed in the think-aloud sessions and subsequently analyzed, we identified five generalized requirements for designing AR solutions as a cognitive assistant for service processes such as soil sounding. We specified the requirements together with the experts and actors of the harbor operator. Table 2 summarizes the results of the think-aloud sessions. We gradually derived the contextual requirements from the observed problems and then abstracted them, resulting in the five generalized requirements. For each contextual requirement, we specify from which think-aloud session, e.g., soil sounding 1 (SoSo1), we derived the requirement. The five requirements are (1) real-time overlay, (2) variety in displaying information, (3) multi-dimensional tracking, (4) collaboration, and (5) interaction.
Table 2 Requirements derived incrementally from the problems and challenges identified in the think-aloud sessions

Problem/challenge | Contextual requirement | Generalized requirement
Some real-time information is not visible and can only be captured by sensors, e.g., actual water depths or measurement quality | Visualization/overlay of real-time sensor data (SoSo1, SoSo2) | Real-time overlay
Accuracy and actuality of the visualized data are critical to the safety of the skippers | Ensure reliability, synchronization, and actuality of visualized data (SoSo13) | Real-time overlay
On-board AIS information is not always reliable, i.e., not all vessels are shown, positions deviate, the orientation of non-moving vessels is not always correct, and software bugs exist in the IT system; the skipper must lean forward to view the AIS system on board (small display behind the monitor); anchored vessels might hinder order planning, resulting in delays, and departure times can only be looked up manually in a sailing list that is available solely on paper | Visualization/overlay of real-time AIS data, including correct position and orientation (SoSo1, SoSo2, SoSo3, SoSo4, SoSo5, SoSo6, SoSo7, SoSo9, SoSo12, SoSo13, SoSo14) | Real-time overlay
External dependencies, such as tides, currents, wind direction, and wind speed, affect the feasibility of sounding orders and are significant considerations when arranging the order sequence; the data is not always available digitally and must be read from printed calendars, which is time-consuming | Visualization/overlay of the real-time tide, current, wind direction, and wind speed (SoSo1, SoSo2, SoSo3, SoSo4, SoSo12, SoSo13) | Real-time overlay
Real-time water depths are critical for safe vessel navigation as well as for assuring measurement quality | Visualization/overlay of real-time water depths (SoSo1, SoSo2, SoSo14) | Real-time overlay
Real-time safety distances are critical for the vessel's safe navigation; distances to other vessels are difficult to estimate without further assistance | Visualization/overlay of real-time safety distances to shore, obstacles, and other vessels (SoSo1) | Real-time overlay
GPS is not always available, and signal strength is a crucial indicator of both accuracy and safety | Visualization/overlay of the GPS signal, as well as a warning if the GPS signal is insufficient (SoSo1, SoSo4) | Real-time overlay
The bearing line is critical for proper vessel navigation; however, it is only visible on the monitor | Visualization/overlay of the bearing line and real-time deviations (SoSo2, SoSo6, SoSo8, SoSo14) | Real-time overlay
Information about the soil-sounding task is usually available but not always accessible in the IT system; because of the multiple measuring ranges, there are frequently overlaps in the measurement data, which is time-consuming and inefficient | Visualization of information regarding the soil-sounding field and the task (SoSo3, SoSo4, SoSo13, SoSo14) | Real-time overlay
The quality of the measurement data is critical in determining where to measure again or whether the route must be corrected | Visualization/overlay of measurement data quality (SoSo1, SoSo2, SoSo4) | Real-time overlay
Knowing the soil-sounding vessel's speed is essential for estimating maneuvers; for effective navigation, the skipper must be aware of whether the measurement is currently enabled or deactivated | Visualization of the vessel's speed and of the activation/deactivation of the echo sounder system (SoSo12, SoSo13) | Real-time overlay
Underwater obstacles can rapidly become dangerous; keeping the existing maps up to date is challenging due to constantly changing conditions, and a newly discovered obstacle may take years to appear on the map | Visualization of obstacles and the ability to add new obstacles to the IT system (SoSo1, SoSo3, SoSo4, SoSo5, SoSo11, SoSo12, SoSo14) | Real-time overlay
The skipper must look down to observe critical data on the monitor; because the monitor must not cover the windshield, it is small and situated far down; the skipper needs to frequently switch views between the screens and reality | Overlay the skipper's field of view directly to enhance ergonomic posture and avoid view shifts/media breaks (SoSo1, SoSo3, SoSo4, SoSo6, SoSo7, SoSo9, SoSo10, SoSo11, SoSo12, SoSo13, SoSo14) | Real-time overlay
Multiple sensor data from the same area are presented on the screen in separate windows or even on different displays; the available display space is not used properly, and the same area is presented redundantly | Simultaneous visualization of several pieces of context-dependent information without redundancy (SoSo1, SoSo3, SoSo8, SoSo13, SoSo14) | Variety in displaying information
Because the process necessitates a high level of concentration, too much information might quickly distract attention from possible dangers | Avoid visual clutter so as not to disrupt the skipper's focus during a safety-critical operation (SoSo1, SoSo14) | Variety in displaying information
During the measurement process, the position of the vessel is important for navigation; however, the skipper moves around on board, which might make standard tracking systems difficult to use; thus, sailing on the water introduces a new dimension of motion for tracking | Track the position of the vessel with respect to the environment while the skipper is moving on board the vessel at the same time (SoSo12, SoSo13, SoSo14) | Multi-dimensional tracking
Because the setting takes place on a ship on the water, disturbances in position and rotation, e.g., caused by waves, are unavoidable | Tracking must be resistant to wave motion (SoSo1, SoSo2, SoSo9, SoSo14) | Multi-dimensional tracking
Because the skipper is inside the ship but perceives the outside world through glass windows, direct sunlight and reflections cannot be prevented; because the measurements are taken in the harbor area, there may be distracting noise in the surroundings, such as from construction activities | Tracking must be resistant to external sources of interference, such as sunlight, various weather conditions, and noise (SoSo9, SoSo10, SoSo13) | Multi-dimensional tracking
The skipper and the measurement engineer must collaborate to determine which area to measure in which direction; this can be accomplished through direct dialogue or by the measurement engineer drawing a line or polygon that the skipper must stay on/within; relying solely on remote communication, such as phone calls, would not work because it is not easy to understand what the other person means when working remotely | Visual support for communication between the skipper and the measurement engineer so that they gain a shared understanding by viewing the maps from the same position; if they are working remotely, support continuous auditory communication (SoSo1, SoSo2, SoSo3, SoSo5, SoSo6, SoSo7, SoSo12, SoSo13, SoSo14) | Collaboration
When prioritizing soil-sounding requests, influencing factors such as tide, wind, and current must be considered; some orders are internally prioritized, so the skipper and measurement engineer must always examine the sequence in which the orders are processed | Support order planning between the skipper and the measurement engineer by facilitating order discussion and documenting a prioritization (SoSo1, SoSo2) | Collaboration
Additional communication with other vessels is required during the process so that all vessels involved can maneuver securely; communication breakdowns and misunderstandings can occur | Support and ensure communication with other vessels (SoSo2, SoSo3, SoSo4, SoSo7, SoSo12, SoSo13, SoSo14) | Collaboration
The skipper needs both hands when he must control the vessel precisely and switch between joysticks quickly; when the measurement engineer is working remotely and connected by phone, or when the skipper needs to use the radio, he must have his hands free | Support for hands-free communication and hands-free working (SoSo1, SoSo7, SoSo12) | Collaboration
The skipper cannot change the information on the display; in other words, he cannot interact with the system and can only consume information; if the displayed content needs to be modified, this must be done by the measurement engineer | Support the skipper's interaction with the IT system, such as adjusting chart display views (SoSo1, SoSo3, SoSo4, SoSo13, SoSo14) | Interaction
Real-time overlay requirement. In all the think-aloud sessions we carried out, we observed that during soil sounding the actor must be aware of several factors that may affect both the navigation and the measurement task. These include tides, currents, wind direction and speed, and safety distances. Possible dangers, such as obstacles in the water or general shipping traffic, are also involved, as well as data such as the actual water level in the soil-sounding area and the quality of the measurement data. Further information that affects the process of soil sounding includes the GPS signal strength, the speed of the vessel, information about the sounding job, and indications of the sounding area and the optimal route across it. Since some information can only be acquired with the help of sensors and rapidly changing conditions prevail, the visualization of real-time information is an essential requirement to ensure that the actor can navigate his vessel safely and, for example, is not in danger of running aground or hitting an obstacle. Other information, such as the time of high tide or which ships are departing, can currently only be read from paper lists.
The reliability, synchronicity and actuality of the displayed data are furthermore a prerequisite for safe navigation: “I have to be able to rely on this monitor! You have to have trust in the equipment” (SoSo1).
During soil sounding on a shore, the skipper explained: "If you like, I'll probably look 80% here on the monitor and maybe a little out the window" (SoSo1). Another skipper (SoSo12) even stated that he looks at the monitor 95% of the time but always tries to keep an eye on his surroundings. Therefore, a further challenge, which can impair traffic safety and even lead to health problems, is the constant shift of the actor's attention between the window, to keep an eye on traffic, and his monitors, to ensure the quality of the measured data (see Fig. 6). The elimination of media breaks by overlaying information directly into the user's field of view is required to meet this challenge. For example, overlaying real-time information about other vessels, such as their position or direction of navigation, which is received via the Automatic Identification System (AIS) and currently displayed on different monitors, could help to ensure safety. One skipper mentioned that difficulties occur when not all vessels are detectable in AIS. For example, he needs the vessel's name to contact it directly by radio (SoSo13). If he does not know the name, he has to address the vessel by its approximate position, which is not always clearly defined.
The real-time overlay requirement, therefore, arises from the need to display information in real-time and to overlay this display in the actor’s field of view to improve vessel traffic safety and prevent health issues. Especially the permanent alternation between the task of navigating the vessel and the task of measuring the water depths is reflected in constant view shifts, which can be avoided by providing an AR real-time overlay. Additionally, such an overlay can improve safety by enhancing situational awareness, especially during harsh weather conditions that reduce sight while dependence on digital information increases.
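To illustrate what fulfilling this requirement could involve technically, the sketch below converts another vessel's reported position into a distance and bearing relative to the own vessel – the minimal information needed to anchor an AIS label in the skipper's field of view – and flags stale reports. The message fields, freshness threshold, and rendering step are hypothetical assumptions and not part of the observed on-board systems.

```python
# Hypothetical sketch: place an AIS target in the skipper's field of view as a
# bearing/distance label and warn if the report is stale. Not an actual AIS decoder.
import math, time

def bearing_and_distance(own_lat, own_lon, tgt_lat, tgt_lon):
    """Great-circle distance (m) and initial bearing (deg) from own vessel to target."""
    earth_radius_m = 6_371_000.0
    p1, p2 = math.radians(own_lat), math.radians(tgt_lat)
    dlat = math.radians(tgt_lat - own_lat)
    dlon = math.radians(tgt_lon - own_lon)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    distance = 2 * earth_radius_m * math.asin(math.sqrt(a))
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return distance, bearing

# Example AIS-like report (all values invented for illustration).
report = {"name": "MV Example", "lat": 53.5421, "lon": 9.9310, "timestamp": time.time() - 12}
own_position = (53.5400, 9.9280)

distance, bearing = bearing_and_distance(*own_position, report["lat"], report["lon"])
stale = (time.time() - report["timestamp"]) > 10  # assumed freshness threshold in seconds
label = f"{report['name']}: {distance:.0f} m at {bearing:.0f} deg" + (" [STALE]" if stale else "")
print(label)  # in an AR client, this label would be anchored at the computed bearing
```

The staleness flag addresses the reliability and actuality aspect quoted above: an overlay the skipper has to trust should make visible when its data is outdated.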
Variety in displaying information requirement. In addition to the frequent shift of view between the vessel's window and the monitors, a variety of sensor information and data about the soil-sounding area is displayed on different monitors, resulting in an additional constant shift between the monitors to ensure the quality of the measurement. As a result, the available monitor space is not used optimally, and sometimes even redundant information is displayed (see Fig. 7). However, in order to improve usability and thus enable the actor to execute the measurement efficiently, the actor requires multiple pieces of sensor information simultaneously, such as different layers or perspectives of the soil-sounding area, without this information and its representation being displayed redundantly.
Another difficulty that also leads to the current use of multiple monitors is that information about the water level in general and information about the measured areas, including the quality and density of the data, cannot be overlaid in the current IT system, although both are required to ensure traffic safety and measurement quality, i.e., the performance of the two parallel tasks. Therefore, a further essential requirement regarding the presentation of information is that different views and representation options of the information should be distinguishable. In this context, we found during the various think-aloud sessions that it is useful, for example, to have different zoom levels for the maps, since the actor requires more detail to navigate his vessel precisely, for instance when measuring within narrow shore areas, than in wider water areas where he needs a greater overview. In several sessions, we also observed that the actors displayed the required information differently. In some measurement situations, for example, displaying the water depths in the measurement area using different color scales was more helpful than displaying this information as exact numerical values, and vice versa. Because the skippers described navigating the vessel as exhausting and requiring concentration, especially in narrow areas, another requirement is that visual clutter should be avoided despite the combination of multiple views.
The variety in displaying information requirement, therefore, arises from the need to be able to choose different representation options of the information to display on demand in order to improve usability and traffic safety as well as ensure the quality of the measurement at the same time.
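The observation that depths are sometimes easier to grasp as colors and sometimes as exact numbers can be illustrated by a simple display-mode switch; the color classes and thresholds below are invented for illustration only.

```python
# Illustrative sketch (invented classes/thresholds): render a measured depth either as a
# coarse color class or as a numeric label, depending on the selected display mode.
from enum import Enum

class DepthDisplayMode(Enum):
    COLOR_SCALE = "color"
    NUMERIC = "numeric"

def render_depth(depth_m: float, mode: DepthDisplayMode) -> str:
    if mode is DepthDisplayMode.NUMERIC:
        return f"{depth_m:.1f} m"
    # Coarse, hypothetical color classes for a quick visual overview.
    if depth_m < 2.0:
        return "red (shallow)"
    if depth_m < 5.0:
        return "yellow (caution)"
    return "green (sufficient)"

for depth in (1.4, 3.7, 8.2):
    print(render_depth(depth, DepthDisplayMode.COLOR_SCALE), "|",
          render_depth(depth, DepthDisplayMode.NUMERIC))
```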
Multi-dimensional tracking requirement. A vessel has an additional dimension of motion compared to other means of transport, such as bicycles or cars, making it even more difficult to track the vessel's exact position. This means that the tracking system must capture two different types of motion. The first is the movement of the ship on the water's surface, which can be made to sway by the waves. The other is the movement of the skipper on the ship itself, who can move around independently. It is important for the skipper to be able to change his position during the day. One skipper mentioned (SoSo12, SoSo13) that he has to vary his position to avoid physical discomfort. In doing so, he changes to the steering wheel when driving long distances, for example. For short distances, where he must be more concentrated, he prefers to navigate precisely with the joystick.
However, tracking the vessel's exact position in relation to the environment is an essential requirement for navigating the vessel during soil sounding. On the one hand, an inaccurate position determination can lead to measurement gaps and, accordingly, to reduced measurement quality; on the other hand, traffic safety can be impaired by incorrect positioning since, for example, distances to obstacles or other vessels can no longer be displayed correctly. Although GPS data are available, they are not sufficient in this case, since the vessel's position and orientation also change due to the rotational movements of the water surface. The tracking must therefore be very resistant to wave-induced movements. For example, in soil sounding 1, soil sounding 2, and soil sounding 9, other vessels came very close to the sounding vessel and caused waves, and we saw that the vessel moved so much that the skipper had to hold on.
In the case of the soil-sounding vessel, tracking is also made more difficult because the skipper is inside the ship and perceives the outside world through glass windows, making it difficult to avoid direct sunlight and reflections (see Fig. 8). Although the soil-sounding vessels are equipped with sun protection roller shutters, which can attenuate solar radiation, it is impossible to completely block the sun’s rays without restricting the view outside too much by darkening it. Accordingly, in the case of the soil-sounding vessel, the skipper requires tracking that is resistant to sunlight and reflections.
In the drone use case, there are also two essential tracking instances. First, there is the tracking of the soil-sounding drone itself, which is complicated analogously to the soil-sounding vessel due to the water movement. Second, the tracking of the drone operator’s position is relevant. If the drone operator is not on land but on board a vessel, the same tracking problems apply to the operator’s position on the vessel as those faced by the skippers on their soil-sounding vessel.
The multi-dimensional tracking requirement, therefore, arises from the fact that the actor needs an exact positioning of the vessel to ensure traffic safety and the quality of the measurement data.
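A minimal sketch of how the two tracking instances described above could be combined: the vessel's pose in a world frame and the skipper's head pose in the vessel frame are chained so that AR overlays stay anchored to the environment while both the vessel and the skipper move. The 4x4 homogeneous transforms, the assumed sensor sources named in the comments, and the example values are illustrative assumptions, not a description of an existing system.

```python
import numpy as np


def pose(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T


# Assumed inputs: vessel pose in the world frame (e.g., from GNSS/IMU fusion)
# and skipper head pose in the vessel frame (e.g., from inside-out HMD tracking).
T_world_vessel = pose(np.eye(3), np.array([120.0, 45.0, 0.0]))
T_vessel_skipper = pose(np.eye(3), np.array([1.5, -0.5, 1.7]))

# Chained transform: where the skipper's viewpoint is in the world frame.
T_world_skipper = T_world_vessel @ T_vessel_skipper

# A world-anchored point (e.g., a buoy or a measurement-gap marker) expressed
# in the skipper's frame so it can be rendered at the correct spot in the HMD.
buoy_world = np.array([130.0, 40.0, 0.0, 1.0])
buoy_in_skipper_frame = np.linalg.inv(T_world_skipper) @ buoy_world
print(buoy_in_skipper_frame[:3])
```

The design point is that neither transform alone is sufficient: only the composition keeps overlays stable when waves move the vessel and the skipper walks between the steering wheel and the joystick.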
Collaboration requirement. During soil sounding, the actor must communicate extensively with other people to ensure traffic safety and measurement quality. With regard to both, the actor has to communicate with other vessels by radio in some situations. Since the soil-sounding vessel cannot always adhere to all traffic regulations, depending on the measurement area, it is often impossible for other vessels to estimate which route the skipper will take. For this reason, it is important to clarify by radio on which side the two vessels will pass each other. However, communication by radio is not always trivial: misunderstandings can occur, radio messages cannot be repeated, and some agreements are difficult to understand acoustically. One skipper explained: "On the water, miscommunication is a huge problem, and it can quickly become a serious problem" (SoSo13). Such issues can lead to severe problems, as, for example, other ships have a long stopping distance, which can threaten the safety of both parties involved.
Concerning the task of measuring the water depths, the collaboration with the measurement engineer, who is responsible for the technical implementation of the depth measurement, such as the configuration of the echo sounders, is most important: "A good marriage also only works if you talk about everything" (SoSo12). Before and during the measurement, the actor and the measurement engineer must continuously coordinate which areas are to be measured, how and when, and when the measuring devices must be activated or deactivated. They have to do this under different boundary conditions, e.g., the current water level: if the water level is too low, there is a risk of running aground, and if it is too high, it is, for example, no longer possible to pass every bridge. Constant communication about the live measured values and the flexible adaptation of the course to them is also of great importance, since areas need to be measured again if the data quality is insufficient. Especially in unknown or very narrow areas, permanent communication is required. Furthermore, skipper and engineer have to plan carefully in advance so that work is not wasted and the assignment does not have to be repeated. A statement made by a skipper summarizes the importance of this collaboration during soil sounding very well: "If I don't synchronize with him [the measurement engineer], nothing will work" (SoSo7).
To support collaboration, the actor and the measurement engineer require the same visualization of the measuring areas, with additional highlighting of areas being useful because, especially when the measurement engineer works from home, it is not trivial to understand what the other person is talking about. It would also be useful to visualize whether the measurement device is activated and in which area it is currently collecting data. When the measurement engineer is connected remotely, as during our observations, communication needs even more support so that communication barriers can be reduced. Furthermore, the skipper must pick up a telephone receiver in order to use the radio. In soil sounding 7, we observed how the skipper held the handset of the radio with one hand and used the other hand to communicate via cell phone with the remotely connected measurement engineer (see Fig. 9). Thus, hands-free communication, i.e., without having to pick up a handset, is required both with other vessels and with the measurement engineer, enabling the actor to keep his hands free for navigating the vessel and thus ensure safety.
The collaboration requirement, therefore, arises from the fact that the actor has to communicate with other vessels and, in particular, has to collaborate a lot with the measurement engineer in order to ensure both traffic safety and the quality of the measurement data.
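To illustrate how a shared visual base between the skipper and a possibly remote measurement engineer could be kept in sync, the sketch below defines a simple, hypothetical message format for highlighting a measurement area and signaling the echo-sounder state. The field names, the polygon representation, and the JSON transport are our assumptions for illustration, not the interfaces of the studied system.

```python
import json
from dataclasses import dataclass, asdict
from typing import List, Tuple


@dataclass
class AreaHighlight:
    """A polygon (lat/lon vertices) highlighted by either party in the shared view."""
    area_id: str
    polygon: List[Tuple[float, float]]
    author: str                      # "skipper" or "engineer"
    note: str = ""


@dataclass
class SounderState:
    """Whether the echo sounder is currently recording and in which area."""
    active: bool
    current_area_id: str = ""


def encode_update(highlight: AreaHighlight, state: SounderState) -> str:
    """Serialize one synchronization update for transmission to the other party."""
    return json.dumps({"highlight": asdict(highlight), "sounder": asdict(state)})


# Example: the remotely connected engineer marks an area with insufficient data
# quality; the same highlight would appear in the skipper's AR view.
update = encode_update(
    AreaHighlight("gap_07", [(53.54, 9.97), (53.54, 9.98), (53.55, 9.98)],
                  author="engineer", note="re-measure, low point density"),
    SounderState(active=True, current_area_id="gap_07"),
)
print(update)
```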
Interaction requirement. The interaction requirement goes along with the above-mentioned variety in displaying information requirement. Currently, the actor cannot manipulate the information on the monitor; in other words, he cannot interact with the system and can only consume information. During several think-aloud sessions, we observed that the actor wanted to change the views on the monitor. Since he could not interact with the system himself, he had to explain the necessary changes to the measurement engineer, who then made the adjustments. It is, therefore, not possible for the actor, for example, to show or hide information or to zoom into a map to obtain a detailed view when he requires it, e.g., to avoid obstacles or to close measurement gaps safely. In order to carry out navigation and measurement more safely and efficiently, the actor requires an appropriate way to interact with the system.
The interaction requirement, therefore, arises from the fact that the actor needs to show or hide important information or select detailed views to ensure traffic safety and the completeness of the measurement data. Furthermore, the mode of interaction should be chosen so that the actor is not distracted from navigating the vessel.
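The interaction requirement can be illustrated with a minimal command dispatcher that maps hands-free inputs, such as recognized voice keywords or gaze-dwell events, to view actions like showing or hiding layers and zooming. The command vocabulary, the clamping limits, and the controller class are hypothetical assumptions for illustration, not a proposed interface of the studied system.

```python
from typing import Callable, Dict


class HandsFreeController:
    """Maps recognized hands-free commands (e.g., voice or gaze) to view actions.

    Hypothetical sketch: command names and actions are illustrative assumptions.
    """

    def __init__(self) -> None:
        self.visible_layers = {"ais_traffic": True, "measurement_coverage": True}
        self.zoom_level = 1.0
        self._commands: Dict[str, Callable[[], None]] = {
            "zoom in": lambda: self._set_zoom(self.zoom_level * 2),
            "zoom out": lambda: self._set_zoom(self.zoom_level / 2),
            "hide traffic": lambda: self._set_layer("ais_traffic", False),
            "show traffic": lambda: self._set_layer("ais_traffic", True),
        }

    def _set_zoom(self, level: float) -> None:
        # Clamp so a misrecognized command cannot produce an unusable view.
        self.zoom_level = min(max(level, 0.5), 8.0)

    def _set_layer(self, name: str, visible: bool) -> None:
        self.visible_layers[name] = visible

    def handle(self, command: str) -> None:
        """Dispatch a recognized command; unknown commands are ignored silently
        so the skipper is not distracted from navigation by error dialogs."""
        action = self._commands.get(command.lower().strip())
        if action:
            action()


# Example: voice commands recognized while both hands stay on the controls.
controller = HandsFreeController()
controller.handle("zoom in")
controller.handle("hide traffic")
print(controller.zoom_level, controller.visible_layers)
```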

5 Discussion

5.1 Contribution to Research

Requirements for cognitive assistants are one of the core frontiers of digital service innovation (Peters et al. 2016). So far, however, little research has been done on requirements for AR solutions as cognitive assistants in safety-critical services that demand multitasking and collaboration. Especially in maritime navigation, research has lagged behind the technological possibilities, so we can contribute to the scientific discourse in several areas.
Since the chosen use case of soil sounding is mentally demanding, a high degree of situational awareness must be achieved to ensure traffic safety. Soil sounding requires actors to switch views between computer monitors and the real world, which immediately makes AR an excellent candidate technology to support this use case. Coleman and Thirtyacre (2021) have shown that AR can assist pilots of aerial drones by avoiding view shifts and overlaying important information into the drone operator's field of view. Our empirical results show that the constant view shifts and context changes are enormously exhausting, endanger vessel traffic safety, and cause mental and physical strain on the skipper. The view shifts occur because the skipper has to perform two tasks in parallel: the service process of soil sounding – i.e., measuring water depths – to maintain the harbor infrastructure and, thus, traffic safety for all traffic participants in the harbor, and the navigation of the vessel. Since the safety criticality of the process can also be found in other areas, our results can be applied to other use cases. These can be ports in general, e.g., use cases in the context of control centers, but also other domains such as air traffic control, where a high level of concentration and situational awareness is required. In addition, similar requirements may apply in large factories, e.g., where forklifts or work at heights create safety-critical conditions.
Moreover, soil sounding demands a high degree of collaboration between the skipper and the measurement engineer, which AR has the potential to support, e.g., through a shared visual base. Studies have shown that performing two tasks in parallel is much more demanding and results in a significantly higher mental workload (Lee et al. 2016; Illing et al. 2021). Our results underline these findings, as skippers described navigating in narrow areas as very challenging and requiring much more concentration. We propose the real-time overlay requirement to eliminate these view shifts and support situational awareness. The contextual requirements related to real-time overlay provide a guideline for the data that must be overlaid in the field of view of an AR application. One important aspect mentioned by the skippers is the display of AIS data, which include the exact position of and additional information about other vessels. A study by von Lukas et al. (2014) shows initial attempts to display AIS data in an AR application. However, that application was limited to displaying the AIS data and did not contain any additional data essential for our case. For example, in addition to the AIS data, skippers need further information about the measurement quality and task, as well as about rapidly changing conditions such as currents, tides, or wind direction. The diversity and amount of data the skipper needs to perform the soil sounding lead us to propose the requirement of variety in displaying information. In addition, the skipper must be able to interact with the AR application; the interaction should not interfere with the control of the vessel and should allow for hands-free operation (Johnson et al. 2015; Niemöller et al. 2019). Thus, our results are applicable to contexts where multiple tasks are performed simultaneously. In our case, the skipper must navigate the vessel while simultaneously capturing water depth measurement data on the monitor and collaborating with the measurement engineer and other vessels via radio. Especially in navigation settings such as cycling or driving, several parallel tasks quickly arise, e.g., reading information on the navigation device and making a phone call while participating in traffic.
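As an example of the spatial anchoring such a real-time overlay implies, the sketch below converts an AIS-reported position (latitude/longitude) into the soil-sounding vessel's local east/north frame using a simple equirectangular approximation, which is generally adequate over the short ranges relevant in a harbor. The function name and the coordinates are our illustrative assumptions; von Lukas et al. (2014) describe actual AIS overlay attempts.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius in meters


def ais_to_local_xy(own_lat: float, own_lon: float,
                    target_lat: float, target_lon: float) -> tuple:
    """
    Project an AIS target position into a local east/north frame centered on
    the own vessel (equirectangular approximation; sufficient for the short
    distances that matter for harbor traffic overlays).
    """
    d_lat = math.radians(target_lat - own_lat)
    d_lon = math.radians(target_lon - own_lon)
    east = EARTH_RADIUS_M * d_lon * math.cos(math.radians(own_lat))
    north = EARTH_RADIUS_M * d_lat
    return east, north


# Example with assumed coordinates for two nearby positions in a harbor basin:
east, north = ais_to_local_xy(53.5415, 9.9710, 53.5420, 9.9725)
distance = math.hypot(east, north)
print(f"target offset: {east:.1f} m east, {north:.1f} m north, {distance:.1f} m away")
```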
Compared to the requirements for smart glasses-based AR systems for cycling training (Berkemeier et al. 2018) and to the use of AR in driving situations (Heymann and Degani 2016), the commonality is that in all these cases an AR solution is used in a mobile environment and the users act in a traffic situation. A mistake in road traffic and a mistake in navigating a vessel can both have serious consequences. However, unlike cycling or driving, our use case takes place in an environment with more degrees of freedom of movement, which increases the technological requirements for precise tracking because multiple instances of tracking need to be combined. The multi-dimensional tracking requirement is, therefore, more demanding on AR hardware resources than in the case of road traffic. The skipper must be tracked within the vessel, while the displayed information must be positioned relative to both the vessel's movement and the skipper's perspective. Accurate tracking is essential, especially for soil soundings where the measurement must be made in a narrow area or in heavy traffic. This applies not only to soil-sounding vessels and drones but to all other types of vessels as well, whereby the failure tolerance decreases as vessels approach narrow or restricted fairways and higher traffic density (Gardenier 1981). It is also conceivable that the findings apply to navigation in airspace, where movements in 6 DOF are likewise possible, even if they do not appear at first glance as arbitrary as the movements caused by the water surface. Our use case can deliberately be considered an edge case that imposes high requirements; a transfer to use cases with less demanding requirements is therefore possible. The presented edge case aims to extend our understanding of AR applications from a more general perspective. Consequently, aspects of this use case can be applied in other contexts, even if not all specific features remain constant.
Moreover, our approach contributes to research methodology. By underpinning the think-aloud method with video material from various perspectives, we were able to gain the best possible understanding of the spatial implications and requirements for AR. The enriched think-aloud material thereby helps to gain further insights into the user-centered requirements. Additionally, our approach is beneficial in environments that do not meet the conditions of typical scientific interviews because they are noisy, weather-dependent, and dirty.

5.2 Contribution to Practice

The identified requirements for the mobile use case of water depth measurement provide anchors that AR solution developers can use as guidance to develop an application that best supports users in performing their service process in mobile, multitasking settings. In particular, the detailed analysis of the problems and challenges encountered in the soil-sounding service is very practice-oriented, and the contextual requirements derived from it serve as a guide for design decisions in the development of an AR solution. Depending on the use case setting and the tasks to be performed, the requirements have to be weighted and prioritized differently in the implementation. As the process of soil sounding is a difficult case from practice in which laboratory conditions are not given, implementing an AR solution is a major challenge, but one that should be tackled to support practice. Once such a complex use case has been solved, it is comparatively simple to adapt and scale the principle to other, less complex applications. Specific to our use case, successfully implementing all the requirements in an AR solution for practical use promises to increase safety in shipping, ensure the quality of measurement data, support collaboration, and prevent health problems.

5.3 Limitations and Future Research

However, our approach is not entirely free of limitations. For safety and legal reasons, we could only conduct the think-aloud sessions with experienced skippers who have accumulated much tacit knowledge over their many years in the soil-sounding process. Between our three skippers, who have seven, 13, and 15 years of experience in the process of soil sounding, there is already initial evidence that the perception of the requirements may vary with experience and tacit knowledge. If the same study were conducted with less experienced skippers, the requirements for an AR solution would likely differ in intensity; it could be that less experienced skippers are even more reliant on AR support to compensate for the missing tacit knowledge. Furthermore, our results are contextual, since we have focused on only one specific use case, that of soil sounding in water depth management. A first small transfer has already taken place within our use case itself, namely between the soil-sounding vessel and the soil-sounding drone: all five requirements derived for the vessel were also determined for the drone use case. Accordingly, we assume transferability to other use cases. In this regard, further use cases in the maritime logistics environment could be considered. One possible use case is the dredging industry, which is responsible for adjusting and dredging the water depths. Additionally, use cases in the field of harbor pilotage could be considered. Finally, the transferability to other use cases in general industry, as well as to logistics scenarios, remains to be investigated.
Future challenges will be to determine whether supporting the process with AR is beneficial for the skipper from a user perspective and, if so, in what form AR as a cognitive assistant can be used to achieve the greatest possible advantage. For this purpose, a prototypical implementation and evaluation will be initiated to explore the subjective usefulness of AR in the soil-sounding context. In this context, the requirements should also be evaluated in more detail with experts from the maritime industry as well as AR solution developers regarding their technical feasibility, with a focus on interfaces, data types, and data quality.

6 Conclusion

Before investigating what requirements an AR cognitive assistant must meet to support the service process of soil sounding, we examined the augmentability of the process using the theory of process augmentability (Yeo 2017). To answer the first research question (RQ1), we determined that all four characteristics which, according to the theory, must be present in a process are present in the process of soil sounding, so that augmenting the process is sensible and could help to facilitate and improve it.
Knowing the augmentation potential, we derived five generalized user-centered requirements for the soil-sounding process, using the results of the think-aloud sessions as a foundation: (1) real-time overlay, (2) variety in displaying information, (3) multi-dimensional tracking, (4) collaboration, and (5) interaction. This answers our second research question (RQ2). The requirements for an AR solution as a cognitive assistant that we determined with regard to the actors' navigation task correspond to the results of previous research on AR applications in road traffic. However, prior research has not investigated such a complex process in the maritime industry, which is where we contribute. On the one hand, the moving vessel, in combination with the moving user inside the vessel, poses a great challenge in terms of multi-dimensional tracking. On the other hand, soil sounding is a safety-critical process that requires multitasking and situational awareness, i.e., a constant shift between navigating the vessel and measuring the water depth, as well as collaboration with the measurement engineer. With these five requirements, we provide practitioners and scholars with a foundation for developing AR applications in such environments.

Acknowledgements

This work has been partly funded by the Federal Ministry of Education and Research of Germany (BMBF) under grant no. 02K18D180 and 02K18D181 (“WizARd – Wissensvernetzung und Kollaboration durch Anwendung erweiterter Realität in produktionsnahen Dienstleistungen”). An earlier version of the manuscript has been presented at the Wirtschaftsinformatik Conference (Osterbrink et al. 2021).
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


References
Azuma RT (1997) A survey of augmented reality. Presence Teleoper Virtual Environ 6:355–385
Balci B, Rosenkranz C (2014) "Virtual or material, what do you prefer?" A study of process virtualization theory. In: Proceedings of the 22nd European conference on information systems
Berkemeier L, Menzel L, Remark F, Thomas O (2018) Acceptance by design: towards an acceptable smart glasses-based information system based on the example of cycling training. In: Proceedings of the Multikonferenz Wirtschaftsinformatik
Bräker J, Hertel J, Semmann M (2022a) Conceptualizing interactions of augmented reality solutions. In: Proceedings of the 55th Hawaii international conference on system sciences
Collier DA (1994) The service/quality solution: using service management to gain competitive advantage. ASQC Quality Press, New York
Elder S, Vakaloudis A (2015) Towards uniformity for smart glasses devices: an assessment of function as the driver for standardisation. In: Proceedings of the IEEE international symposium on technology and society
Engelbart DC (1962) Augmenting human intellect: a conceptual framework. Stanford Research Institute, Menlo Park
Etzold J, Grimm P, Schweitzer J, Dörner R (2014) kARbon: a collaborative MR web application for communication support in construction scenarios. In: Proceedings of the companion publication of the 17th ACM conference on computer supported cooperative work and social computing, pp 9–12
Gardenier JS (1981) Ship navigational failure detection and diagnosis. In: Human detection and diagnosis of system failures. Springer, Boston, pp 49–74
Hertel J, Karaosmanoglu S, Schmidt S, Bräker J, Semmann M, Steinicke F (2021) A taxonomy of interaction techniques for immersive augmented reality based on an iterative literature review. In: Proceedings of the IEEE international symposium on mixed and augmented reality, pp 431–440
Heymann M, Degani A (2016) Classification and organization of information. In: Design of multimodal mobile interfaces. De Gruyter, pp 195–217
Illing J, Klinke P, Pfingsthorn M, Heuten W (2021) Less is more! Support of parallel and time-critical assembly tasks with augmented reality. In: Mensch und Computer 2021, pp 215–226
Jeffers PI, Muhanna WA, Nault BR (2008) Information technology and process performance: an empirical investigation of the interaction between IT and non-IT resources. Decis Sci 39:703–735
Johnson S, Gibson M, Mutlu B (2015) Handheld or handsfree? Remote collaboration via lightweight head-mounted displays and handheld devices. In: Proceedings of the 18th ACM conference on computer supported cooperative work and social computing, pp 1825–1836
Jones B, Tang A, Neustaedter C, Antle AN, McLaren E-S (2018) Designing a tangible interface for manager awareness in wilderness search and rescue. In: Companion of the 2018 ACM conference on computer supported cooperative work and social computing, pp 161–164
Kaufmann H, Schmalstieg D (2002) Mathematics and geometry education with collaborative augmented reality. In: Proceedings of the ACM SIGGRAPH 2002 conference abstracts and applications, pp 37–41
Klinker K, Fries V, Wiesche M, Krcmar H (2017) CatCare: designing a serious game to foster hand hygiene compliance in health care facilities. In: Proceedings of the 12th international conference on design science research in information systems and technology, pp 20–28
Klinker K, Wiesche M, Krcmar H (2020) Smart glasses in health care: a patient trust perspective. In: Proceedings of the 53rd Hawaii international conference on system sciences, pp 3548–3557
Knight JC (2002) Safety critical systems: challenges and directions. In: Proceedings of the 24th international conference on software engineering, pp 547–550
Lee JY, Gibson MC, Lee JD (2016) Error recovery in multitasking while driving. In: Proceedings of the 2016 CHI conference on human factors in computing systems, pp 5104–5113
Lusch RF, Nambisan S (2015) Service innovation: a service-dominant logic perspective. MIS Q 39:155–176
Matijacic M, Fellmann M, Özcan D, Kammler F, Nüttgens M, Thomas O (2013) Elicitation and consolidation of requirements for mobile technical customer services support systems – a multi-method approach. In: Proceedings of the 34th international conference on information systems
Niemöller C, Niemöller D, Zobel B, Berkemeier L, Thomas O (2019) Mobile service support based on smart glasses. J Inf Technol Theor Appl 20:77–108
Osterbrink A, Bräker J, Semmann M, Wiesche M (2021) Requirements for augmented reality solutions for safety-critical services – the case of water depth management in a maritime logistics hub. In: Ahlemann F, Schütte R, Stieglitz S (eds) Innovation through information systems. WI 2021. Lecture Notes in Information Systems and Organisation, vol 46. Springer, Cham. https://doi.org/10.1007/978-3-030-86790-4_16
Overby E (2008) Process virtualization theory and the impact of information technology. Organ Sci 19:277–291
Träskbäck M, Haller M (2004) Mixed reality training application for an oil refinery: user requirements. In: Proceedings of the ACM SIGGRAPH international conference on virtual reality continuum and its applications in industry, pp 324–327
Van Someren MW, Barnard YF, Sandberg JAC (1994) The think aloud method: a practical approach to modelling cognitive processes. Academic Press, London
von Lukas U, Vahl M, Mesing B (2014) Maritime applications of augmented reality – experiences and challenges. In: Shumaker R, Lackey S (eds) Virtual, augmented and mixed reality. Applications of virtual and augmented reality. Springer, Cham, pp 465–475
Yeo J (2017) The theory of process augmentability. In: Proceedings of the 38th international conference on information systems
Yin RK (2003) Case study research design and methods, 3rd edn. Sage, London