
Open Access 2023 | OriginalPaper | Chapter

5. Metrics Representation and Dashboards

Authors : Artem Kruglov, Giancarlo Succi, Idel Ishbaev

Published in: Developing Sustainable and Energy-Efficient Software Systems

Publisher: Springer International Publishing


Abstract

This section is dedicated to reviewing dashboards for various roles, such as developers and managers, to provide metrics for software development in the most appropriate way. After reviewing various papers, different types, objectives, and visualization features of the dashboards were found. Based on these findings, tailored visualization was designed for the distinct roles of users. Furthermore, the dashboard was implemented within the Innometrics project boundaries with easily maintainable and extensible architecture on the front-end.
Software development is unquestionably a difficult task. There are a variety of metrics that may be used to evaluate the development process and developers in order to improve this process. However, the usefulness of these indicators is determined by how they are visualized and presented. For these goals, information dashboards are handy. A dashboard is a “visual display of the most important information needed to achieve one or more objectives; consolidated and arranged on a single screen so the information can be monitored at a glance” (Few 2004).
Many studies have been conducted on dashboards, their purposes, visualization, and how to design one. They have been studied as Key Performance Indicators (KPI), scorecards (Kaplan et al. 1997), and Executive Information Systems (EIS) (Thierauf 1991) since the twentieth century, and they can all be treated in the same way (Few 2006). Furthermore, whereas dashboards were once studied in broad terms (Few 2004, 2006; Few et al. 2007; Pauwels et al. 2009), they are now explored in more specific areas, such as their usage in medicine (Boscardin et al. 2018), Continuous Software Engineering (Johanssen et al. 2017), and software development (López et al. 2021; Rahman et al. 2017).
Gaining a deeper understanding of these topics and analyzing them would help to improve development processes by providing valuable feedback (Hattie et al. 2007) and by creating better dashboards.
This chapter aims to better understand the dashboard’s purposes and visualization approaches considering the dashboard’s audience (developers and managers). In addition, it aims to create a faceted dashboard (Few et al. 2007), which means the same data can be represented in multiple ways for various people. This type of dashboard is also known as a tailored dashboard (Johanssen et al. 2019).

5.1 Literature Review

A literature review is a crucial step in determining the study topic. It provides necessary background information, a clear picture of what has been developed and investigated, and its relevance and gaps in existing research. As a result, the following subjects were reviewed: dashboards in general, their types and goals, and visualization approaches.

5.1.1 Review of Literature on Dashboards

There are four levels of information, according to Savoie (2012): data, information, knowledge, and wisdom. Each level makes the previous one richer and more informative. The first level is made up of facts with no meaning and is just input data. As stated by Few (2006), collecting, processing, and storing data is well explored, but there is “little progress in using that information effectively.” As a result, to present data, it has to be turned into information, the second level of processing. To turn data into information, it must be organized and structured with a shared meaning. For example, we can organize the data into rows and columns, assigning titles and descriptions to them, and the data becomes meaningful as information. Furthermore, dashboards are the most effective approach to convey information (Savoie 2012).
EIS (Thierauf 1991), scorecards, and KPIs (Kaplan et al. 1997) have been studied since the 1990s. They are all used to support decision-making, measure performance, and monitor execution of activities, and according to Few (2006) dashboards appear to be synonymous or simply a new name for them. However, nowadays, the definition of dashboards by Few (2004) is the most used and common:
Visual display of the most important information needed to achieve one or more objectives, which fits entirely on a single computer screen so it can be monitored at a glance.
It is evident from the definition that the dashboard’s visual appearance, as well as its objectives, is very significant. It should deliver appropriate and reliable content in a straightforward and understandable manner to the end-user. According to Few (2006), most dashboards fail to communicate efficiently and effectively due to poorly designed implementation rather than a lack of technology.
Visualization of dashboards is essential not because of its beauty but because it can communicate with end-users with greater efficiency and richer meaning than plain text (Few 2006). As a result, visualization becomes more science than art because a successful dashboard is informed design, not just cute gauges, meters, and traffic lights (Few 2006).
Furthermore, while considering dashboard objectives, it is evident that different end-users would have varied objectives, requiring the development of different dashboards for each group of users. As a result, tailored dashboards are becoming increasingly useful (Few 2006; Johanssen et al. 2019), customized to meet individual needs to improve communication and suit the needs of users. In particular, faceted analytical displays give distinct views of the same data to different users for the purpose of analysis (Few et al. 2007); in other words, the same data is displayed in different ways.

5.1.2 Types of Dashboards

According to Eckerson (2010), there are three types of dashboards: operational, tactical, and strategic. Data complexity grows from one type to the next: operational dashboards use simple data, while strategic dashboards use more complex data for purposes of analysis.
Each type differs in its objectives, functionality, and end-users, which is the reason for the different data abstraction levels. But even with their different usages, they overlap to some extent, which means that the same dashboard can be used for various purposes and by different end-users. For example, front-line workers can use both the tactical and operational types of dashboards, and managers can use all three of them. Thus, each dashboard type has its own features, but there is no clear boundary between them (Table 5.1).
Table 5.1
Dashboard types according to Eckerson (2010)
https://static-content.springer.com/image/chp%3A10.1007%2F978-3-031-11658-2_5/MediaObjects/525714_1_En_5_Tab1_HTML.png
Operational level is mainly used by front-line workers to track operational processes, such as those involving people, tasks, events, and activities, as they occur.
Furthermore, there are two subtypes of the operational level: detect-and-respond and incent-and-motivate. The first subtype relates to monitoring activities or optimizing processes. The second subtype is a dashboard to increase workers’ productivity by presenting the workers’ or team’s performance metrics.
Most operational level dashboards display metrics of low-level processes; they contain detailed data or sometimes are slightly summarized. Furthermore, the dashboards often offer only one level of data with no drill-down functionality and provide the most up-to-date information. The dashboards look like automobile dashboards with alerts, dials, and gauges.
Tactical level is used to optimize business processes and to analyze performance against goals. These dashboards emphasize analysis, though monitoring is also available. In most cases, the dashboard looks like an analytical or functional dashboard containing tables and charts describing what happened in the past. Also, it is worth mentioning that tactical-level dashboards usually present different data to different users depending on their role.
Interaction with charts and tables is a regular feature at the tactical level. Drill-down, filters, sorting, changing views, pop-ups (Few 2006), and other features make it easier to interact with the dashboard and improve the user experience.
The dashboards used by mid-level managers and their appearance are somewhere between operational and strategic levels, allowing users to keep track of different processes and data in one spot.
Strategic level is mostly used by executives to review the progress toward strategic goals once a month or more rarely. In summary, the first two levels measure processes to understand what is happening now or in the short term, but the strategic level deals with long-term strategies.
Furthermore, some research includes a fourth type: analytical. It contains a large amount of complex data and what-if scenarios to assist executives in their analysis and planning (Nadj et al. 2020).
In addition, dashboards can also be classified as static (read-only) or interactive. However, Few (2006) states that merely providing varieties of data without interaction is of little use, and nowadays static dashboards are no longer relevant. Interactive dashboards can help users handle chunks of information effectively, considering the growing complexity and amount of data (Nadj et al. 2020).

5.1.3 Purposes/Objectives of Dashboards

There are four main purposes of dashboards according to Pauwels et al. (2009): consistency, monitoring, planning, and communication.
  • Consistency—ensures consistent measures and measurement procedures across the organization.
  • Monitoring—monitor and evaluate performance. Such dashboards serve as early indicators of performance and let companies detect problems and react.
  • Planning—strategic planning.
  • Communication—share information about performance, organizational values, etc.
However, Rahman et al. (2017) propose a fifth purpose of dashboards—analysis, which is used to analyze personal and companies’ performance and is similar to the monitoring purpose of Pauwels et al. (2009) but more precise.

5.1.4 Visualization Methods of Dashboards

In the study by Yigitbasioglu et al. (2012), the authors underline the importance of interaction with a dashboard. Their research on dashboard graphical user interfaces distinguished two categories of design features: visual and functional.
Visual features refer to how data is displayed to the user; they influence how effectively and efficiently data is presented, and they directly affect the time needed to perceive information. Inappropriate use of visual effects, such as varying surface styles, expanding the number and ranges of objects, overwhelming 3D objects, and non-value-adding frames, can make the understanding process more difficult. Removing gridlines inside charts, a high data-ink ratio, and the elimination of non-data-ink components in charts are all examples of improvements.
“Data-ink ratio is a parameter that defines the relationship of ink used to illustrate data to the overall ink leveraged to represent the chart” (Nadj et al. 2020).
An example of poor visual design is inappropriate use of color or a low data-ink ratio, which distracts or confuses users. For example, using yellow and red too often may attract attention everywhere and make it impossible to highlight errors or critical information (Nadj et al. 2020).
Green, yellow, and red should represent acceptable, satisfactory, and poor performance/alarms, respectively (Few 2006).
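As a minimal sketch of this color guideline, a helper might map a numeric performance score to Few's three colors; the score scale and the cutoff values are assumptions for illustration, not taken from the text.

```javascript
// Map a performance score (assumed 0-100) to a traffic-light color,
// following Few's guideline: green = acceptable, yellow = satisfactory,
// red = poor performance / alarm. Thresholds here are illustrative.
const statusColor = (score) => {
  if (score >= 80) return "green";  // acceptable performance
  if (score >= 50) return "yellow"; // satisfactory, needs attention
  return "red";                     // poor performance, alarm
};

statusColor(90); // "green"
statusColor(60); // "yellow"
statusColor(20); // "red"
```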
Functional features describe what the dashboard can do. Such features include point-and-click interactivity, which enables drilling down and rolling up, filtering, sorting, brushing, showing more data, and displaying additional information on pointing.
Such features are a step toward interactive dashboards, in which the user is actively participating in the data analysis process. However, the user’s effort to interact can lengthen the analysis process or have a negative impact on decision-making (Nadj et al. 2020).
An example of a poor functional design is a dashboard that fails to provide needed functionality, so that the user cannot gain enough information from it (e.g., to analyze or plan).
Few (2006) also describes common practices in his study, spanning both visual and functional features:
  • Using charts, tables, speedometer widgets for graphical representation
  • Dividing the full set of data to individual views
  • “Digital cockpit”—summary of performance by color-coded light indicator
  • Using gauges with traffic-light colors
In addition, in the study by López et al. (2021), the authors provide different views and different kinds of charts to analyze performance, distinguishing different levels of abstraction for visualizing the same data. According to the study, high-level strategic indicators are effective and improve decision-making processes.
  • Process performance strategic indicators (SI)—gauge chart showing the SI assessment value
  • Detailed SI—radar chart, quality factors (QF) for the SI assessment
  • Factors—radar chart, metrics for the QF assessment
  • Historical data view for metrics—line chart, metrics assessments

5.2 Methodology and Implementation

In this chapter, knowledge gathered from various studies is assessed and applied to the Innometrics project considering its objectives and constraints. In order to develop a visualization of the dashboard, the following steps were proposed.
  • Define audience, end-users
  • Define objectives, constraints
  • Define purposes, Fig. 5.1
  • Define type, Fig. 5.1
  • Determine visual and functional features (Yigitbasioglu et al. 2012) for visualization, Table 5.2
    Table 5.2
    Visual and functional features (Eckerson 2010; Few 2004, 2006; Nadj et al. 2020; Rahman et al. 2017)
    https://static-content.springer.com/image/chp%3A10.1007%2F978-3-031-11658-2_5/MediaObjects/525714_1_En_5_Tab2_HTML.png
  • Develop architecture

5.2.1 Innometrics Project

Innometrics is a project that tracks, evaluates, and analyzes the software development process. This system provides a visualization of users’ working activity statistics.
The project collects, analyzes, and visualizes users’ activity data and developers’ code quality metrics:
  • Top applications per person daily
  • Accumulated activities
  • Accumulated total time spent
  • Category of activities
    • Development
    • Education
    • Communication
    • Utilities
    • Management
    • Entertainment
This project aims to keep track of, assess, and evaluate developers and the development process in general. A robust dashboard, which is “such a powerful management tool” (Pauwels et al. 2009), must be designed to visualize data as metrics and communicate with end-users effectively and efficiently. As noted in the review, we present data in dashboards since this is the most efficient and effective approach to convey data as information and communicate with users. However, to design such a dashboard, we first examined its goals and its target audience.
There are two types of end-users in the project, each with a different purpose, managerial level, and need for different types of information from dashboards. The project’s primary audience consists of developers and managers (Fig. 5.2).
As a result, the challenge was to deliver information to both types of users in an effective and efficient manner. We created two unique dashboard visualizations for these purposes, each with a different level of data abstraction (López et al. 2021) and visualization elements (Yigitbasioglu et al. 2012). To put it another way, it will be a faceted dashboard. It offers a distinct view for developers that will satisfy front-line workers’ requirements. For managers, it has another visualization with different objectives, and it delivers another piece of information using the same data. Moreover, considering that the project is growing and will have more metrics and graphics in the future, extensible architecture was developed.
To decide how to visualize data properly, we first examined the objectives of both types of users, and then selected a type of dashboard and appropriate visual and functional features.

5.2.2 Visualization for Developers

Developers are the first group of users. They are front-line employees who require the most up-to-date performance metrics for monitoring rather than historical metrics for analysis. As a result, the dashboard’s purposes are analysis and monitoring, and its type is operational, with the subtype incent-and-motivate. Considering that, the dashboard contains less historical data and more process performance strategic indicators (gauges) (López et al. 2021). Furthermore, it is more static than interactive, because unnecessary interactive features can distract users (Nadj et al. 2020); yet, it does provide some interactivity options to improve the user experience.
The purpose, type, and features of the developers’ dashboard may be found in Table 5.3.
Table 5.3
Dashboard for developers
https://static-content.springer.com/image/chp%3A10.1007%2F978-3-031-11658-2_5/MediaObjects/525714_1_En_5_Tab3_HTML.png

5.2.3 Visualization for Managers

The second group of users is mid-level managers—decision-makers who monitor and analyze employees’ performance and development processes. Thus, the dashboard has monitor and communication purposes and tactical type. It is also fully interactive, making it more analytical and assisting users in better comprehending complex data. In Table 5.4 type, purposes, and features for managers’ dashboards can be observed.
Table 5.4
Dashboard for managers
https://static-content.springer.com/image/chp%3A10.1007%2F978-3-031-11658-2_5/MediaObjects/525714_1_En_5_Tab4_HTML.png

5.2.4 Common Visual and Functional Features

Since operational and tactical types that we determined for both dashboards sometimes overlap, some features are shared (Eckerson 2010), especially visual features, which affect how effectively and efficiently data is presented. In Table 5.5 these features can be observed.
Table 5.5
Common features for developers’ and managers’ dashboards
https://static-content.springer.com/image/chp%3A10.1007%2F978-3-031-11658-2_5/MediaObjects/525714_1_En_5_Tab5_HTML.png

5.2.5 Architecture, Implementation

Another problem was to develop a dashboard that was extensible and maintainable as the project progressed and more metrics and graphics were added. The goal was to provide front-end extensibility that allowed developers to delete, add, and change widgets, graphics, and other visualized data easily, without relying on other layers. Several design patterns and principles can be used to create a dashboard on the front-end layer that is easily expandable and maintainable. The Mediator Design Pattern can be utilized in object-oriented programming (OOP) (Gamma et al. 1995).
The ReactJS framework was used to build the user interface. Thus, we rely on functional programming (FP), since modern React applications are built from functional components (Banks et al. 2017), and it was not possible to use the Mediator Design Pattern in its pure form.
Functional components were used, which means that components were defined as functions (not as classes). Functions have different design patterns and principles than OOP. We defined low-level functions and then applied composition (Banks et al. 2017). With the help of this, simple standalone functions are consolidated into one big dashboard system.
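As an illustration of this composition idea, the sketch below consolidates small standalone functions into a single widget-data pipeline; the function names and data shapes are illustrative assumptions, not taken from the Innometrics codebase.

```javascript
// Generic right-to-left function composition: the last function in the
// list is applied first, and each result feeds the next function.
const compose = (...fns) => (input) =>
  fns.reduceRight((acc, fn) => fn(acc), input);

// Low-level standalone functions, each doing one small job.
const toSeconds = (activities) =>
  activities.map((a) => ({ ...a, seconds: a.minutes * 60 }));
const totalTime = (activities) =>
  activities.reduce((sum, a) => sum + a.seconds, 0);
const asLabel = (seconds) => `${Math.round(seconds / 60)} min total`;

// Consolidated into one widget-data function via composition.
const totalTimeLabel = compose(asLabel, totalTime, toSeconds);

totalTimeLabel([
  { app: "IDE", minutes: 90 },
  { app: "Browser", minutes: 30 },
]);
// "120 min total"
```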
Moreover, we took the idea from the mediator pattern and created a mediator component that comprises the low-level function components (Wieruch 2022). The mediator component takes care of preprocessing data (turning data into metrics) and setting configuration data for the low-level functions. The configuration component then creates cards with the following fields for each widget:
  • id
  • name
  • type (used to distribute to exact widget, for example, line graph)
  • API (function to fetch data)
  • *some other configuration information
The mediator component distributes data to low-level components based on the type field. As a result, low-level graphical representation components deal only with ready input data and can be reused (see Fig. 5.3). Thus, only configuration components can be updated to maintain the dashboard.
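The configuration-and-mediator flow described above can be sketched in plain JavaScript, stripped of React specifics; the widget names, card contents, and renderer signatures are assumptions for illustration only.

```javascript
// Low-level widget renderers keyed by type; each deals only with
// ready input data, so it can be reused across dashboards.
const widgets = {
  lineGraph: (data) => `line graph of ${data.length} points`,
  gauge: (data) => `gauge at ${data[0]}%`,
};

// Configuration cards: id, name, type, and an API function to fetch data.
const cards = [
  { id: 1, name: "Total time", type: "lineGraph", api: () => [10, 20, 30] },
  { id: 2, name: "Code quality", type: "gauge", api: () => [87] },
];

// The mediator fetches data via each card's API function and distributes
// it to the low-level widget selected by the card's type field.
const renderDashboard = (cards) =>
  cards.map((card) => ({ id: card.id, view: widgets[card.type](card.api()) }));

renderDashboard(cards);
// [{ id: 1, view: "line graph of 3 points" },
//  { id: 2, view: "gauge at 87%" }]
```

Note how the mediator is the only place that knows the mapping from card types to widgets; adding a widget means touching the configuration, not the rendering components.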
In Fig. 5.3 the structure of the dashboard can be observed:
  • Dashboard—the highest abstraction level, which comprises lower-level components. It deals with the creation of configuration cards, which are then passed on to the mediator.
  • Configuration—a component that generates cards for each widget, each with fields such as name, type, and API data.
  • Mediator—component that receives configuration cards and distributes them to the widgets based on their types.
  • WidgetA …WidgetN—the lowest-level components that receive preprocessed data and present it in a certain way based on the user’s role.
As a result, updating existing widgets or adding new graphics can be accomplished in a few simple steps, which aligns with our goal of creating a dashboard that is easy to extend and manage.
Steps to edit or add new widgets:
1. Edit the configuration component (edit or add a new card):
  • Specify the type that the mediator will use to pass it to the exact widget.
  • Update id, name, etc.
  • Set the default location for it to appear.
2. *If necessary, create a preprocessing function (to transform data into a form that can be used by a widget component).
3. *Create a new widget (new representation) if needed.
 

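Under a hypothetical configuration structure of the kind described in this section, those steps might look as follows; the widget name, preprocessing function, and card contents are illustrative, not from the project.

```javascript
// Step 3 (optional): register a new widget renderer for a new type.
const widgets = {
  barChart: (data) => `bar chart with ${data.length} bars`,
};

// Step 2 (optional): a preprocessing function turning raw activity
// records into the form the widget expects (totals per category).
const byCategory = (records) =>
  Object.values(
    records.reduce((acc, r) => {
      acc[r.category] = (acc[r.category] || 0) + r.minutes;
      return acc;
    }, {})
  );

// Step 1: edit the configuration - add a new card whose type the
// mediator will use to dispatch to the new widget.
const newCard = {
  id: 3,
  name: "Time by category",
  type: "barChart",
  api: () =>
    byCategory([
      { category: "Development", minutes: 120 },
      { category: "Education", minutes: 45 },
      { category: "Development", minutes: 30 },
    ]),
};

widgets[newCard.type](newCard.api());
// "bar chart with 2 bars"
```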
5.2.6 Conclusion

As the software development process and its evaluation become more complex, understanding the purposes, types, and visualization approaches of tailored dashboards for managers and developers becomes an essential topic. Hence, it will improve the development of metrics presentations that provide end-users with more relevant data. After reviewing various works, it became evident that the dashboard requires specific representation for managers and developers, taking into account their respective goals and requirements. As a result, types, objectives, and aesthetic aspects were examined in order to design a dashboard that was suited to each type of end-user. On top of that, the dashboard’s architecture was designed to be easily maintainable and extensible since it was one of the research’s aims. A deep understanding of dashboard types, purposes, visuals, and architecture allows assessing the development process and presenting metrics to end-users more accurately and in a practical manner.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://​creativecommons.​org/​licenses/​by/​4.​0/​), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
Literature

Banks, Alex et al. 2017. Learning React: Functional web development with React and Redux. Sebastopol: O’Reilly Media.
Boscardin, Christy et al. 2018. Twelve tips to promote successful development of a learner performance dashboard within a medical education program. Medical Teacher 40(8): 855–861.
Eckerson, Wayne W. 2010. Performance dashboards: Measuring, monitoring, and managing your business, 101–122. New York: Wiley.
Few, Stephen. 2006. Information dashboard design: The effective visual communication of data. Vol. 2. Sebastopol: O’Reilly.
Few, Stephen et al. 2007. Dashboard confusion revisited. In Perceptual Edge, 1–6.
Gamma, Erich et al. 1995. Design patterns: Elements of reusable object-oriented software. Pearson Deutschland GmbH.
Johanssen, Jan Ole et al. 2017. Towards the visualization of usage and decision knowledge in continuous software engineering. In 2017 IEEE Working Conference on Software Visualization (VISSOFT). Piscataway: IEEE.
Johanssen, Jan Ole et al. 2019. Tailored information dashboards: A systematic mapping of the literature. In Proceedings of the XX International Conference on Human Computer Interaction, 1–8.
Kaplan, Robert S. et al. 1997. Balanced scorecard. Schäffer-Poeschel.
López, L. et al. 2021. QaSD: A quality-aware strategic dashboard for supporting decision makers in agile software development. Science of Computer Programming 202: 102568.
Nadj, Mario et al. 2020. The effect of interactive analytical dashboard features on situation awareness and task performance. Decision Support Systems 135: 113322.
Pauwels, Koen et al. 2009. Dashboards as a service: Why, what, how, and what research is needed? Journal of Service Research 12(2): 175–189.
Rahman, Azizah Abdul et al. 2017. Review on dashboard application from managerial perspective. In 2017 International Conference on Research and Innovation in Information Systems (ICRIIS). Piscataway: IEEE.
Savoie, Michael. 2012. Building successful information systems: Five best practices to ensure organizational effectiveness and profitability. Business Expert Press.
Thierauf, Robert J. 1991. Executive information systems: A guide for senior management and MIS professionals. Quorum Books.
Yigitbasioglu, Ogan M. et al. 2012. A review of dashboards in performance management: Implications for design and research. International Journal of Accounting Information Systems 13(1): 41–59.
DOI
https://doi.org/10.1007/978-3-031-11658-2_5