1 Introduction
- An understanding of the test capabilities of a CI environment: study results show a correlation between CI components and NFR test capabilities.
- A view of the current status of how companies handle NFR evaluation: common NFR types and NFR metrics were identified in this study.
- Practices for adding or upgrading components and tools in a CI environment to enable automated verification of particular NFRs (e.g., automated security scans) and to support NFR evaluation through fast test execution and continuous analysis and monitoring of test outputs.
- A guideline for collecting metric data through CI components to measure NFRs.
- Challenges to be considered when performing NFR testing in practice.
2 Related Work
3 Research Methodology
3.1 Research Questions
- RQ1: What types of NFRs are verified through automated tests?
- RQ2: What metrics are used for automated NFR testing in industrial practice?
- RQ3: How are CI environments contributing to NFR test capabilities?
- RQ4: What challenges are associated with automated NFR testing?
Context facet | Context element | Project A | Project B | Project C | Project D | Project E |
---|---|---|---|---|---|---|
Project | Business domain | Finance technology | Telecommunications | Finance technology | Digital communication | Health and social affairs |
 | Project size (Sas and Avgeriou 2020) (number of engineers) | Medium (101 to 200) | Large (>1500) | Medium (51 to 100) | Small (21 to 50) | Large (>1000) |
 | Product type | Finance services | Common platform for telecommunications | Banking platform solutions | Digital business support solutions | Common welfare solutions |
 | Maturity of product | Mature product | Long-lived mature product | Mature product | Mature product | Long-lived mature product |
Organization | Size (number of engineers) | \(\sim \)300 | >2500 | \(\sim \)100 | \(\sim \)50 | >1000 |
 | Agile adoption | Yes (global) | Yes (global) | Yes (nationally) | Yes (global) | No |
 | Teams involved | Development & test teams from one location | Development & test teams from one location | Development teams from one location | Development & test teams from one location | Development & test teams from one location |
 | Distributed development | Yes | Yes | No | Yes | Yes |
Market | Type of customer | Global organizations | Global organizations | National individuals | Global organizations | National citizens |
NFRs | Testing process | Plan-driven and agile-based continuous development | Plan-driven and agile-based continuous and incremental development | Incremental development | Mixture of Scrum and Kanban for incremental development | Mixture of Scrum and Kanban for incremental development |
 | Testing demands | Need more | Need more | Need more | Need more | Need improvements |
CI environment | CI maturity | Mature | Mature | Fast growing | Fast expansion | Mature |
3.2 Case Study Design
3.3 Case Companies
3.3.1 Prepare Interview Questions
3.3.2 Select Participants
Participant ID | Participant role | Work experience (years) | Project name |
---|---|---|---|
T1 | Lead developer | 10 - 15 | Project A |
T2 | Software architect | 5 - 10 | |
T3 | Senior developer | 10 - 15 | |
T4 | Developer | 3 - 5 | |
T5 | Software architect | 5 - 10 | |
T6 | Tester | 3 - 5 | |
T7 | Tester | 5 - 10 | |
T8 | Senior tester | 10 - 15 | |
T9 | Developer | 3 - 5 | Project B |
T10 | Developer | 5 - 10 | |
T11 | Product owner | 10 - 15 | |
T12 | Tester | 5 - 10 | |
T13 | Principal developer | 15 - 20 | Project C |
T14 | Senior architect | 10 - 15 | |
T15 | Software architect | 5 - 10 | |
T16 | Tester | 5 - 10 | |
T17 | Developer | 10 - 15 | Project D |
T18 | Tester | 5 - 10 | |
T19 | Tester | 5 - 10 | |
T20 | Developer | 5 - 10 | Project E |
T21 | Software architect | 5 - 10 | |
T22 | Tester | 5 - 10 |
3.3.3 Pilot Testing
3.4 Data Collection
3.5 Data Analysis
Codes | Interview record ID | Timestamp | Quote | RQ | Interviewee IDs
---|---|---|---|---|---
e.g., Code review | e.g., A1 | 09:59 - 12:42 | Add reviewer manually | RQ2 | e.g., T5
 | | | Jenkins verification job gives \(+\)1 if success | RQ3 |
 | | | Product owners give \(+\)2 when a commit is approved | RQ2, RQ3 |
 | | | Two \(+\)2 votes are set to be mandatory for each Git commit, to ensure it works and encourage people to review carefully | RQ2 |
ID | Codes | Description | Interviewee IDs |
---|---|---|---|
e.g., 10 | e.g., Code review | Statements about source code reviews. | T1,T2,T3,T4,T5,T6,T7,T8,T9, T14,T15,T17,T18,T20,T21,T22 |
4 Results
4.1 RQ1: What Types of NFRs are Verified Through Automated Tests?
NFR | Attributes | Internal or external attribute | Project name
---|---|---|---
Maintainability | Testability | Internal | Project B
 | Changeability | |
 | Modifiability | |
Security | Vulnerability | External | Project A, B, C, and D
 | Confidentiality | |
 | Authentication | |
 | Access control | |
Performance | Response time | External | Project A, B, C, D, and E
 | Accuracy | |
 | Resource utilisation | |
Stability | Fault tolerance | Internal | Project B
 | Recoverability | |
Scalability | High availability | External | Project A and B
4.2 RQ2: What Metrics are Used for Automated NFR Testing in Industrial Practice?
NFR type | NFR metrics | Data in the studied projects | Related CI components | Project name
---|---|---|---|---
Performance | Mean response time (MRT): MRT = (A1 \(+\) A2 \(+\) ... \(+\) An)/n, where Ai is the time that a service takes to respond to a request, and n is the number of measured responses. | Response time of payment transactions between accounts in a financial service | Source code management, version control system, CI server, test automation | Project A
 | | Response time of managing user subscriptions in a mobile network service | Source code management, version control system, CI server, test automation | Project B
 | | Response time of requests to create users and accounts in a banking system | Source code management, version control system, CI server, test automation | Project C
 | Processor usage (PU): PU = A/B, where A is the processor time required to execute a set of tasks, and B is the operation time required to perform the tasks. | Processor time of managing mobile devices and data in a web service | CI server, test automation, cloud platform | Project D
 | Accuracy of API requests (AOAR): AOAR = A/B, where A is the number of failed API requests, and B is the total number of API requests processed by a service. | The accuracy rate of user requests in a welfare system | Source code management, version control system, CI server, test automation | Project E
Security | Data encryption (DEC): DEC = A/B, where A is the number of user data items encrypted correctly, and B is the total number of data items requiring encryption. | User privacy data in a finance system | Source code management, version control system, CI server, test automation, issue tracking | Project A
 | User access audit (UAA): UAA = A/B, where A is the number of accesses recorded in security logs, and B is the number of executed accesses to data. | User access data in a mobile network service | CI server, test automation, issue tracking | Project B
 | Penetration test | Security risks for artifacts in a web application | CI server, artifacts management, cloud platform | Project C
 | Vulnerability assessment (VA) | Vulnerabilities in file systems | CI server, artifacts management, issue tracking | Project A, B, C, and D
Scalability | Mean recovery time (MRET): MRET = (A1 \(+\) A2 \(+\) ... \(+\) An)/n, where Ai is the total time to recover the failed user requests, and n is the number of failed requests. | Recovery time of accessing mobile data from failures in a mobile network service | Source code management, version control system, CI server, test automation, artifacts management, cloud platform | Project B
Maintainability | N/A | N/A | Source code management, CI server, static code analysis | Project A and B
Stability | Maximum number of requests processed in unit time (MRP): MRP = (A1 \(+\) ... \(+\) An)/(B1 \(+\) ... \(+\) Bn), where Ai is the total number of user accesses that a service can handle within a time period, and Bi is the total time to process those accesses. | The capability of a system to handle user accesses per second in a mobile network service | Source code management, version control system, CI server, test automation, cloud platform | Project B
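The metrics in the table above are simple ratios and averages, so collecting the raw measurements through CI components is the hard part; the arithmetic itself is trivial. As a minimal sketch (in Python, with hypothetical sample values, not data from the studied projects), the formulas could be computed as:

```python
def mean_response_time(response_times):
    """MRT = (A1 + A2 + ... + An) / n, where Ai is the time a service
    takes to respond to a request and n is the number of responses."""
    return sum(response_times) / len(response_times)

def processor_usage(processor_time, operation_time):
    """PU = A / B: processor time spent on tasks over total operation time."""
    return processor_time / operation_time

def api_failure_ratio(failed_requests, total_requests):
    """AOAR = A / B: failed API requests over total API requests processed."""
    return failed_requests / total_requests

def data_encryption_ratio(encrypted_ok, requiring_encryption):
    """DEC = A / B: correctly encrypted data items over items requiring encryption."""
    return encrypted_ok / requiring_encryption

# Hypothetical sample measurements, for illustration only.
print(mean_response_time([120.0, 95.0, 130.0]))  # 115.0 ms
print(api_failure_ratio(3, 200))                 # 0.015
```

In a CI setting, the inputs would come from test-automation outputs (e.g., load-test logs) rather than hard-coded lists, and a pipeline step could compare the results against agreed thresholds.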
4.3 RQ3: How are CI Environments Contributing to NFR Test Capabilities?
4.3.1 Software Tools that Enable Automated NFR Testing
Types of NFRs | Tools enabling NFR testing | Tools supporting NFR testing
---|---|---
Performance | Postman, JMeter, JUnit | Bitbucket, VMware, Jenkins, JFrog Artifactory
Security | ZAP, Anchore, SonarQube | Jenkins, JFrog Artifactory
Scalability | JMeter | Jenkins, VMware
Maintainability | JMeter | Jenkins, VMware
Stability | JUnit | Jenkins, VMware
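To illustrate how a CI server such as Jenkins can combine an enabling tool (JMeter) with a supporting gate, the sketch below parses a JMeter CSV results (JTL) export, where `elapsed` and `success` are standard result columns, computes the mean response time over successful samples, and returns a CI exit code. The script name, file name, and threshold are assumptions for illustration, not values from the studied projects.

```python
import csv
import io
import sys

def mrt_from_jtl(jtl_text):
    """Mean response time (ms) over successful samples in a
    JMeter CSV (JTL) results export."""
    reader = csv.DictReader(io.StringIO(jtl_text))
    elapsed = [int(row["elapsed"]) for row in reader
               if row.get("success", "true").lower() == "true"]
    if not elapsed:
        raise ValueError("no successful samples in JTL data")
    return sum(elapsed) / len(elapsed)

def gate(jtl_text, threshold_ms):
    """Return a CI exit code: 0 if MRT is within the threshold, else 1."""
    mrt = mrt_from_jtl(jtl_text)
    print(f"MRT = {mrt:.1f} ms (threshold {threshold_ms} ms)")
    return 0 if mrt <= threshold_ms else 1

if __name__ == "__main__":
    # Hypothetical invocation from a Jenkins pipeline step:
    #   python check_mrt.py results.jtl 500
    with open(sys.argv[1]) as f:
        sys.exit(gate(f.read(), float(sys.argv[2])))
```

A non-zero exit code fails the pipeline stage, which is one concrete way a CI server "supports" NFR testing by acting on the outputs of the tool that "enables" it.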
4.3.2 Individual CI Components Used for Automated NFR Testing
4.3.3 Sets of CI Components for Automated NFR Testing
4.4 RQ4: What Challenges are Associated with Automated NFR Testing?
5 Validity Threats
6 Discussion
6.1 Implications for Practitioners
6.2 Implications for Researchers
- The study provides insights into the implementation of NFR verification in CI environments, offering a foundation for further research in this area. Researchers can build on these findings to investigate further connections between CI components, NFR metrics, and challenges beyond those identified in this study.
- Further investigation is needed to explain why certain beneficial NFR metrics, which have the potential to improve software quality in companies, are not used. This opens opportunities for future research into the barriers behind the underutilization of such metrics, providing a deeper understanding of the factors influencing their adoption.
- The observation that the ISO/IEC 25023 standard was used to define specific NFR metrics in one of the studied projects highlights the importance of exploring the role of international standards in NFR evaluation. Future research could examine the benefits and challenges of adopting international standards for such evaluation.