1 Introduction
2 An Evolutionary Approach to Writing Use Cases
2.1 Overview of Use Cases
2.2 Process of Use-Case Elicitation
2.3 The Problem
3 HAZOP Based Use-Case Review
3.1 HAZOP in a Nutshell
- NO: the design intent is not achieved at all;
- MORE: an attribute or characteristic exceeds the design intent;
- LESS: an attribute or characteristic is below the design intent;
- AS WELL AS: the correct design intent occurs, but additional elements are present;
- PART OF: only part of the design intent is achieved;
- REVERSE: the opposite of the design intent occurs;
- OTHER THAN: some elements are present, but they do not fulfill the design intent;
- EARLY: the design intent happens earlier than expected;
- LATE: the design intent happens later than expected;
- BEFORE: the design intent happens earlier in a sequence than expected;
- AFTER: the design intent happens later in a sequence than expected.
3.2 H4U: HAZOP for Use Cases
- design intent is the main scenario of a use case;
- deviation is an event that can appear when the scenario is executed and results in an alternative sequence of steps.
- Input & output data—relates to objects that are exchanged between external actors and the system;
- System data—relates to objects already stored in the system;
- Time—relates to the “moment in time” when a step is being executed (e.g., dependencies between actions, pre- and post-conditions of actions); it does not relate to non-functional aspects such as performance (e.g., response time) or security (e.g., user session timeout).
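The mechanics of pairing each HAZOP guide word with each H4U interpretation category can be sketched as follows. This is a minimal illustration only: the prompt wording and the exhaustive keyword × category pairing are our assumptions, not a procedure prescribed by the method.

```python
from itertools import product

# HAZOP guide words listed in Sect. 3.1.
GUIDE_WORDS = [
    "NO", "MORE", "LESS", "AS WELL AS", "PART OF", "REVERSE",
    "OTHER THAN", "EARLY", "LATE", "BEFORE", "AFTER",
]

# The three interpretation categories used by H4U (Sect. 3.2).
CATEGORIES = ["input & output data", "system data", "time"]

def review_prompts(step: str) -> list[str]:
    """Generate one deviation question per (guide word, category)
    pair for a single main-scenario step (the design intent)."""
    return [
        f'Step "{step}": can "{word}" apply to its {category}?'
        for word, category in product(GUIDE_WORDS, CATEGORIES)
    ]

prompts = review_prompts("Candidate provides personal data")
print(len(prompts))  # 11 guide words x 3 categories = 33 prompts
```

A reviewer would answer each prompt and record any plausible deviation as a candidate event for an alternative flow.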
3.3 Example
4 Experimental Evaluation of Individual Reviews
4.1 Empirical Approximation of Maximal Set of Events
- A1: include the events defined by the authors of the benchmark specification;
- A2: the authors of the paper brainstorm to identify their own reference set of possible events;
- A3: construct the approximate maximal set empirically, based on all distinct events identified by the participants in the experiments (after reviewing them).
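Approach A3 amounts to taking the union of all distinct events reported by all participants. A sketch of that union is below; note that in the study duplicates were handled by manual review (cf. Sect. 4.1.2), whereas this simplification only collapses trivially different phrasings.

```python
def normalize(event: str) -> str:
    """Crude normalization so that trivially duplicated phrasings
    (case, surrounding whitespace, trailing period) collapse."""
    return event.strip().rstrip(".").lower()

def approximate_maximal_set(reports: list[list[str]]) -> set[str]:
    """A3: union of all distinct events identified by all
    participants; `reports` holds one list per participant."""
    return {normalize(e) for report in reports for e in report}

reports = [
    ["Candidate did not provide personal data.", "There are no majors available"],
    ["candidate did not provide personal data", "Credit card has expired."],
]
events = approximate_maximal_set(reports)
print(len(events))  # 3 distinct events after de-duplication
```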
4.1.1 Rejecting Undetectable Events
4.1.2 Handling Duplicated Events
4.1.3 Excluding Non-Use-Case Events
Abstract class of event | Explanation | Example | Number |
---|---|---|---|
Authorization Problem | Events concerning authentication and authorization problems. They should not be included in alternative flows of a user-level use case, but handled separately as NFRs. | User does not have permission to add a new thread. | 42 |
Concurrency of Operations | Events related to concurrency problems at the level of internal system operations (e.g., concurrent modification of the same object). This kind of requirement relates to NFRs (transactional operations). There are also concurrent operations that relate to business rules (e.g., while one is applying to a major, the admission is closed); such events belong to another abstract class and were accepted. | Candidate data was changed by the administrator before the candidate submitted modifications. | 41 |
Connection Problem | Events describing problems with the connection between a user and the system. Use cases should not include technical details; this should be described as an NFR. The system is not able to respond to such an event. | The data could not be sent to the server. | 57 |
Internal System Problem | Events describing failures inside the system. These events should not be included in user-level use cases. | System was not able to store the data in the database. | 339 |
Linked Data | Problems related to the consistency of data stored in the system. These should rather be described in the data model, business rules, or NFRs. | Administrator selects an item to be removed that is connected to other items in the system. | 7 |
Other Action | Events expressing an actor’s will to perform a different action than the one described in a use-case step. Any participant could come up with an enormous number of such events, and we would not be able to assess whether they are reasonable and required by the customer. | User wants to view the thread instead of adding a new one. | 226 |
Timeout | Events referring to exceeding the time given to complete a request (e.g., user session timeout). Should be described as an NFR. | It takes the candidate too long to fill in the form. | 68 |
Withdraw | Events describing cases when a user wants to resign from completing a running operation. These events could be assigned to almost every step of a use case, which would make the use case cluttered. | User does not want to provide the data, so he/she quits. | 180 |
4.1.4 Events Included into the Maximal Set of Events
Abstract class of event | Explanation | Example | Number |
---|---|---|---|
After Deadline | The operation is not allowed because it is performed after the deadline. | The admission for the chosen major is closed. | 12 |
Algorithm Runtime Error | Problems with the execution of an admission algorithm, which is defined by a user. | Unable to execute the algorithm. | 9 |
Alternative Response | Events describing the alternative response of an actor (usually, a negative version of the step in the main scenario). | The committee rejects the application. | 89 |
Connection Between Actors | Problems with communication between the system and external actors, i.e., actors representing other systems or devices. The system is able to respond to such an event. | Payment system does not respond; the system can propose a different method of payment. | 65 |
Lack of Confirmation | A user does not confirm his/her action. | Administrator does not confirm removing the post. | 12 |
Missing Data | Events concerning missing input data. | Candidate did not provide personal data. | 481 |
No Data Defined | The system does not have the requested data. | There are no majors available. | 259 |
Payment Events | Events identified in the use case “Assign an application fee to a major.” They relate either to credit-card payment problems (expired card or insufficient funds) or to the choice of an alternative payment method. | Credit card has expired. | 18 |
Redo Forbidden | An actor would like to redo an operation that can be performed only once (e.g., registering for a second time). | Candidate tries to register for the second time. | 103 |
Wrong Data or Incorrect Data | Events related to wrong (in the sense of content) or incorrect (in the sense of syntactic format) input data. | Provided data is incorrect. | 502 |
4.2 Experiment 1: Student Reviewers
4.2.1 Experiment Design
- p is a participant in an experiment;
- #steps(p) is the number of steps reviewed by participant p;
- T is the total time spent on the review (constant: 60 min).
- p is a participant in an experiment;
- distance(p) is the furthest step in the specification that was analyzed by participant p;
- #events(p, s) is the number of distinct events identified by participant p based on step s;
- #Events(s) is the number of distinct events identified by all participants in both experiments based on step s.
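Under the natural reading of these definitions, speed(p) = #steps(p) / T, and accuracy relates the events found by p to all distinct events known for the steps p actually analyzed. The sketch below follows that reading; the exact accuracy formula is an assumption derived from the variable definitions above, not quoted from the paper.

```python
def speed(n_steps: int, total_minutes: float = 60.0) -> float:
    """speed(p) = #steps(p) / T, in steps per minute (T = 60 min)."""
    return n_steps / total_minutes

def accuracy(events_by_step: dict[int, int],
             all_events_by_step: dict[int, int],
             distance: int) -> float:
    """Assumed reading: events identified by p over steps 1..distance(p),
    divided by all distinct events known for those steps (#Events(s))."""
    found = sum(events_by_step.get(s, 0) for s in range(1, distance + 1))
    known = sum(all_events_by_step[s] for s in range(1, distance + 1))
    return found / known

# Illustrative numbers only (not taken from the experiments):
print(round(speed(135), 2))  # 2.25 steps/min
print(round(accuracy({1: 2, 2: 1}, {1: 5, 2: 5}, distance=2), 2))  # 0.3
```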
4.2.2 Operation of the Experiment
- A presentation of around 30 slides describing the basics of use cases. The G1 group was presented with information about the ad hoc approach to event identification, while information about the H4U method was presented only to the G2 group. At the end of the presentation, both groups were informed about the details of the experiment.
- A benchmark specification, the object of the study.
- An editable form in which the participants filled in the identified events.
- A questionnaire, prepared to identify possible factors that might influence the results of the experiment.
4.2.3 Analysis and Interpretation
Participant | Speed G1 [steps/min] | Speed G2 [steps/min] | Accuracy G1 | Accuracy G2 |
---|---|---|---|---|
#1 | 2.63 | 0.35 | 0.25 | 0.36 |
#2 | 1.83 | 0.12 | 0.22 | 0.44 |
#3 | 2.58 | 0.40 | 0.18 | 0.26 |
#4 | 2.65 | 0.50 | 0.09 | 0.22 |
#5 | 1.40 | 0.45 | 0.13 | 0.26 |
#6 | 1.95 | 0.33 | 0.09 | 0.34 |
#7 | 2.62 | 1.35 | 0.25 | 0.31 |
#8 | 2.60 | 1.17 | 0.14 | 0.05 |
#9 | 1.85 | 1.07 | 0.19 | 0.12 |
N | 9 | 9 | 9 | 9 |
Min | 1.40 | 0.12 | 0.09 | 0.05 |
1st Qu. | 1.85 | 0.35 | 0.13 | 0.22 |
Median | 2.58 | 0.45 | 0.18 | 0.26 |
Mean | 2.23 | 0.64 | 0.17 | 0.26 |
3rd Qu. | 2.62 | 1.07 | 0.22 | 0.34 |
Max | 2.65 | 1.35 | 0.25 | 0.44 |
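The descriptive statistics in the table can be reproduced from the per-participant values. A quick check for the G1 speed column (assuming the inclusive quartile convention):

```python
import statistics as st

# Speed [steps/min] of group G1, taken from the Experiment 1 table.
g1_speed = [2.63, 1.83, 2.58, 2.65, 1.40, 1.95, 2.62, 2.60, 1.85]

# Inclusive quartiles treat min and max as the 0th and 100th percentiles.
q1, median, q3 = st.quantiles(g1_speed, n=4, method="inclusive")
summary = {
    "Min": min(g1_speed),
    "1st Qu.": q1,
    "Median": median,
    "Mean": round(st.mean(g1_speed), 2),
    "3rd Qu.": q3,
    "Max": max(g1_speed),
}
print(summary)
# {'Min': 1.4, '1st Qu.': 1.85, 'Median': 2.58, 'Mean': 2.23, '3rd Qu.': 2.62, 'Max': 2.65}
```

All six values agree with the summary rows of the table.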
4.2.4 Threats to Validity
4.3 Experiment 2: Experienced Reviewers
4.3.1 Experiment Design
4.3.2 Operation of the Experiment
4.3.3 Analysis and Interpretation
Participant | Speed G3 [steps/min] | Speed G4 [steps/min] | Accuracy G3 | Accuracy G4 |
---|---|---|---|---|
#1 | 2.18 | 2.32 | 0.19 | 0.13 |
#2 | 2.07 | 0.85 | 0.04 | 0.36 |
#3 | 2.20 | 0.38 | 0.13 | 0.45 |
#4 | 2.12 | 2.02 | 0.13 | 0.05 |
#5 | 0.68 | 1.97 | 0.14 | 0.03 |
#6 | 2.17 | 0.78 | 0.16 | 0.23 |
#7 | 1.90 | 0.68 | 0.11 | 0.09 |
#8 | 0.87 | 0.38 | 0.21 | 0.08 |
#9 | 2.23 | 1.07 | 0.09 | 0.10 |
#10 | 2.30 | 1.35 | 0.24 | 0.28 |
#11 | 2.07 | 2.07 | 0.08 | 0.11 |
#12 | 0.68 | 0.48 | 0.22 | 0.23 |
#13 | 2.22 | 1.07 | 0.09 | 0.07 |
#14 | 2.07 | 0.48 | 0.15 | 0.33 |
#15 | 2.02 | 0.97 | 0.25 | 0.19 |
#16 | 2.20 | 0.38 | 0.28 | 0.14 |
#17 | 2.18 | 0.42 | 0.07 | 0.23 |
#18 | 2.02 | 0.48 | 0.14 | 0.09 |
#19 | 2.17 | 0.68 | 0.22 | 0.35 |
#20 | 1.90 | 0.55 | 0.15 | 0.18 |
#21 | 1.07 | 0.78 | 0.03 | 0.26 |
#22 | 2.02 | 0.78 | 0.07 | 0.17 |
#23 | 2.18 | 0.78 | 0.21 | 0.36 |
#24 | 1.97 | 1.90 | 0.18 | 0.14 |
#25 | 1.60 | 1.48 | 0.12 | 0.08 |
#26 | 1.43 | 1.83 | 0.11 | 0.07 |
#27 | 2.02 | 1.17 | 0.05 | 0.31 |
#28 | 0.42 | 0.48 | 0.15 | 0.26 |
#29 | 0.55 | 1.07 | 0.20 | 0.14 |
#30 | 0.87 | 0.68 | 0.26 | 0.15 |
#31 | 2.08 | 2.22 | 0.11 | 0.23 |
#32 | 2.17 | 0.85 | 0.12 | 0.30 |
N | 32 | 32 | 32 | 32 |
Min | 0.42 | 0.38 | 0.03 | 0.03 |
1st Qu. | 1.56 | 0.53 | 0.10 | 0.10 |
Median | 2.05 | 0.82 | 0.14 | 0.17 |
Mean | 1.77 | 1.04 | 0.15 | 0.19 |
3rd Qu. | 2.17 | 1.38 | 0.20 | 0.26 |
Max | 2.30 | 2.32 | 0.28 | 0.45 |
4.3.4 Threats to Validity
5 Usage of HAZOP Keywords
Keyword | Detectable use-case events [%] | Detectable non-use-case events [%] | Undetectable events [%] |
---|---|---|---|
NO | 47.5 | 44.4 | 38.5 |
MORE | 5.2 | 4.2 | 10.0 |
LESS | 12.1 | 3.0 | 12.0 |
AS WELL AS | 0.4 | 0.8 | 1.7 |
PART OF | 7.7 | 3.4 | 10.7 |
REVERSE | 6.0 | 11.0 | 3.8 |
OTHER THAN | 5.8 | 5.3 | 10.7 |
EARLY | 2.7 | 4.2 | 2.3 |
LATE | 1.9 | 7.2 | 4.0 |
BEFORE | 8.5 | 7.8 | 4.6 |
AFTER | 2.0 | 9.1 | 1.8 |
6 Related Work
7 Conclusions
- The maximum average accuracy of event identification in both experiments was 0.26. This shows that event identification is not an easy task and that many events may be omitted during analysis; hence, new methods and tools are needed to mitigate this problem.
- The proposed HAZOP-based method can help achieve higher accuracy of event identification. This holds in two distinct environments: IT professionals (0.19 accuracy with H4U vs. 0.15 with the ad hoc approach) and students (0.26 accuracy with H4U vs. 0.17 with the ad hoc approach).
- H4U enables higher accuracy of event identification; however, it requires more time to perform a review. In both groups, IT professionals and students, the ad hoc approach was more efficient in terms of the number of steps analyzed. On average, the ad hoc review speed ranged from 1.77 steps analyzed per minute for professionals to 2.23 for junior analysts, while the review speed with the H4U method was 1.04 and 0.64 steps per minute, respectively.
- Students versus professionals. There is a common doubt as to whether an experiment with student participants can provide results that remain valid for professionals. One of the most debated questions concerns the difference in effectiveness between (typically inexperienced) students and more experienced professionals. Some of the research performed so far indicates that for less complex tasks the difference between students and professionals is minor (Höst et al. 2000). In the experiments described in this paper, we observed that master’s-level SE students can even outperform professionals in some cases.