
2018 | Book

Data and Decision Sciences in Action

Proceedings of the Australian Society for Operations Research Conference 2016

Editors: Prof. Ruhul Sarker, Prof. Hussein A. Abbass, Dr. Simon Dunstall, Dr. Philip Kilby, Dr. Richard Davis, Leon Young

Publisher: Springer International Publishing

Book Series: Lecture Notes in Management and Industrial Engineering


About this book

Offering a concise and multidisciplinary reference guide to the state of the art in Australian operations research, this book will be of great value to academics working in many disciplines associated with operations research, as well as industrial practitioners engaged in planning, scheduling and logistics.

Over 60 papers, with topics ranging from academic research techniques and case studies to industrial and administrative best practices in operations research, address aspects such as:

• optimization, combinatorial optimization, decision analysis, supply-chain management, queuing and routing, and project management; and

• logistics, government, cyber security, health-care systems, mining and material processing, ergonomics and human factors, space applications, telecommunications and transportation, among many others.

This book presents the Proceedings of the National Conference of the Australian Society for Operations Research, the premier professional organization for Australian academics and practitioners working in optimization and other disciplines related to operations research. The conference was held in Canberra in November 2016.

Table of Contents

What Latin Hypercube Is Not
Simulation methods play a key role in the modelling of theoretical or actual physical systems. Such models can approximate the behaviour of the real system and provide insights into its operation. Well-determined input parameters are of prime importance for obtaining reliable simulations because of their impact on the performance of the simulation design. Among the various strategies for producing input parameter samples is the Latin hypercube design (LHD). LHDs are generated by Latin hypercube sampling (LHS), a type of stratified sampling that can be applied to multiple variables. LHS has proven to be an efficient and popular method; however, it overlooks some important elements. While LHS focuses on the parameter space, this paper highlights five further aspects that may greatly affect the efficiency of sampling. We do not provide solutions but rather raise open questions that are easily missed when planning a simulation study.
Oleg Mazonka, Charalambos Konstantinou
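The stratified sampling scheme this abstract refers to is easy to sketch. The following minimal Latin hypercube sampler (a plain-Python sketch, not the authors' code) splits each dimension into `n_samples` equal strata and hits every stratum exactly once:

```python
import random

def latin_hypercube(n_samples, n_dims, rng=None):
    """Draw a Latin hypercube sample on [0, 1)^n_dims.

    Each dimension is split into n_samples equal strata; each stratum
    is hit exactly once, with a uniform jitter inside it.
    """
    rng = rng or random.Random()
    columns = []
    for _ in range(n_dims):
        # One random point inside each of the n_samples strata ...
        column = [(i + rng.random()) / n_samples for i in range(n_samples)]
        # ... paired with the other dimensions in random order.
        rng.shuffle(column)
        columns.append(column)
    # Transpose: one tuple per sample point.
    return list(zip(*columns))
```

The random shuffling is what pairs strata across dimensions; each one-dimensional projection of the sample is uniformly stratified regardless of how the pairing falls out.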
A BDI-Based Methodology for Eliciting Tactical Decision-Making Expertise
There is an ongoing need to computationally model human tactical decision-making, for example in military simulation, where the tactics of human combatants are modelled for the purposes of training and wargaming. These efforts have been dominated by AI-based approaches, such as production systems and the BDI (Beliefs, Desires, Intentions) paradigm. Typically, the tactics are elicited from human domain experts, but due to the pre-conscious nature of much of human expertise, this is a non-trivial exercise. Knowledge elicitation methods developed for expert systems and ontologies have drawbacks when it comes to tactics modelling. Our objective has been to develop a new methodology that addresses these shortcomings, resulting in an approach that supports the efficient elicitation of tactical decision-making expertise and its mapping to a modelling representation that is intuitive to domain experts. Rather than treating knowledge elicitation as a process of extracting knowledge from an expert, our approach views it as a collaborative modelling exercise, with the expert involved in critiquing the models as they are constructed. To foster this collaborative process, we have employed an intuitive, diagrammatic representation for tactics. This paper describes TEM (Tactics Elicitation Methodology), a novel synthesis of knowledge elicitation with a BDI-based tactics modelling methodology, and outlines three case studies that provide initial support for our contention that it is an effective means of eliciting tactical decision-making knowledge in a form that can be readily understood by domain experts.
Rick Evertsz, John Thangarajah, Thanh Ly
Analysis of Demand and Operations of Inter-modal Terminals
Inter-modal terminals (IMTs) reduce road congestion and exploit economies of scale by pooling demand from surrounding areas and using rail to transport containers to and from ports. The alternative to using IMTs is using trucks to transport containers directly to and from the port. Trucks increase road congestion, but rail requires additional handling of containers (lift-on and lift-off). The attractiveness of truck versus rail depends on a number of variables such as costs, total travel time, frequency of services, risk and material resources. To date, the use of open data to analyse and model freight movements has been minimal, primarily because of the shortage of open data focusing on freight movements across cities, regions or countries. In this paper, we leverage open government data for the Port Botany rail network and use it to develop flexible and dynamic simulation and optimisation tools that enable various stakeholders, including IMT operators, port authorities and government policy makers, to make more informed decisions not only about pricing, but also about operation scheduling and internal operations.
Rodolfo García-Flores, Soumya Banerjee, George Mathews, Blandine Vacher, Brian Thorne, Nazanin Borhan, Claudio Aracena, Yuriy Tyshetskiy
Efficient Models, Formulations and Algorithms for Some Variants of Fixed Interval Scheduling Problems
The fixed interval scheduling problem, also known as the personnel task scheduling problem, optimizes the allocation of available resources (workers, machines, or shifts) to execute a given set of jobs or tasks. We introduce a new approach to solve this problem by decomposing it into separate subproblems. We establish the mathematical basis for the optimality of such a decomposition and thereafter develop several new techniques (exact and heuristic) to solve the resulting subproblems. An extensive computational analysis demonstrates the efficacy of these approaches when compared to other established techniques in the literature. Specifically, a hybrid integer programming formulation presented in this paper solves several larger problem instances that were previously not amenable to exact techniques. In addition, a constructive heuristic approach (based on quantification metrics for tasks and resources) produces solutions equal to the optimal ones. We demonstrate that our decomposition approach is applicable to several important variants of fixed interval scheduling, including the tactical and the operational fixed interval scheduling problems.
D. Niraj Ramesh, Mohan Krishnamoorthy, Andreas T. Ernst
The Value of Flexible Road Designs Through Ecologically Sensitive Areas
Mining haul road traffic can have significant impacts on nearby animal populations that can threaten an entity’s licence to operate. As operators have the ability to control heavy vehicle traffic flow through mining road networks, opportunities exist to reroute traffic away from more damaging roads in response to uncertain animal population dynamics. The presence of this flexibility in turn has a positive effect on the future value of proposed road designs. In this paper, we present an approach for evaluating the flexibility of controlling traffic flow in proposed road designs between two locations separated by an intervening species habitat. We do this by treating the design problem as a Real Options Valuation (Stochastic Optimal Control) problem solved using Least-Squares Monte Carlo. Here, the different control actions are the discrete traffic flow rates, and the uncertain-state variables are the animal populations at each location in the region of interest. Because the control chosen has a direct impact on the path of the uncertain variables, we use the technique of control randomisation in generating the Monte Carlo paths and computing the costs-to-go in the Real Options Valuation. In addition, we use a state-reduction parameter called Animals at Risk to reduce the dimensionality of the problem to improve tractability. In an example scenario, the addition of routing flexibility resulted in an increase in project value over the case without flexibility while maintaining the animal population above a critical threshold.
Nicholas Davey, Simon Dunstall, Saman Halgamuge
Local Cuts for 0–1 Multidimensional Knapsack Problems
This paper investigates the local cuts approach for the multidimensional knapsack problem (MKP), which has more than one knapsack constraint. The implementation of the local cuts-based cutting plane algorithm is an extension of the exact knapsack separation scheme of Vasilyev et al. (J Glob Optim 1–24, [13]). Comparisons are made with the global lifted cover inequalities (GLCI) proposed in our recent paper in Computers & Operations Research (Gu, Comput Oper Res 71:82–89, [7]). Preliminary results show that the local cuts approach may be powerful for the MKP.
Hanyu Gu
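As a rough illustration of the kind of valid inequality that cutting-plane approaches such as local cuts and GLCI separate, the sketch below (illustrative numbers, not an instance from the paper) verifies a classic cover inequality for a single knapsack constraint by enumeration:

```python
from itertools import product

# Toy 0-1 knapsack constraint (illustrative numbers, not from the paper):
#   5*x1 + 4*x2 + 3*x3 + 2*x4 <= 8
weights, capacity = [5, 4, 3, 2], 8

def feasible(x):
    return sum(w * xi for w, xi in zip(weights, x)) <= capacity

# Items {1, 2, 3} form a cover: 5 + 4 + 3 > 8, so any feasible 0-1 point
# can select at most |cover| - 1 = 2 of them (a "cover inequality").
cover = [0, 1, 2]

# Validity: x1 + x2 + x3 <= 2 holds at every feasible 0-1 point.
for x in product([0, 1], repeat=len(weights)):
    if feasible(x):
        assert sum(x[i] for i in cover) <= len(cover) - 1

# Usefulness: the cut excludes a fractional point that the knapsack
# constraint alone allows (weight 0.2*5 + 4 + 3 = 8; cover sum 2.2 > 2).
frac = (0.2, 1.0, 1.0, 0.0)
assert feasible(frac)
assert sum(frac[i] for i in cover) > len(cover) - 1
```

Lifting strengthens such cuts by adding items outside the cover with nonzero coefficients; the MKP case is harder because a cut must be chosen against several knapsack rows at once.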
An Exact Algorithm for the Heterogeneous Fleet Vehicle Routing Problem with Time Windows and Three-Dimensional Loading Constraints
One of our industry partners distributes a multitude of orders of fibre boards all over each of the Australian capital cities, a problem that has been formalised as the Heterogeneous Fleet Vehicle Routing Problem with Time Windows and Three-Dimensional Loading Constraints (3L-HFVRPTW). The fleet consists of two types of trucks with flat loading surfaces and slots for spacers that allow a subdivision into stacks of different sizes. A customer’s delivery can be positioned on more than one partition, but the deliveries have to be loaded in a strict LIFO order. Optimising the truck loads beyond the 75% that can be achieved manually provides value to the company because the deliveries are generally last-minute orders and the customers depend on the deliveries for their contract work on refurbishments. This paper presents an exact integer linear programming model that serves two purposes: (1) providing exact solutions for problems of a modest size as a basis for comparing the quality of heuristic solution methodologies, and (2) enabling further exploration of various relaxations, stack generation, and decomposition strategies based on the ILP model. We solved a few real-life instances to exact optimality using CPLEX 12.61, whereas previously, the problem was solved manually by staff members of the furniture company.
Vicky Mak-Hau, I. Moser, Aldeida Aleti
Automated Techniques for Generating Behavioural Models for Constructive Combat Simulations
Constructive combat simulation is widely used across Defence Science and Technology Group, typically using behavioural models written by software developers in a scripting or programming language for a specific simulation. This approach is time-consuming, can lead to inconsistencies between the same behaviour in different simulations, and is difficult to engage military subject matter experts in the elicitation and verification of behaviours. Therefore, a representation is required that is both comprehensible to non-programmers and is translatable to different simulation execution formats. This paper presents such a representation, the Hierarchical Behaviour Model and Notation (HBMN), which incorporates aspects of existing business process and behaviour representations to provide a hierarchical schema allowing an incremental approach to developing and refining behaviour models from abstract partial models to concrete executable models. The HBMN representation is combined with automated processes for translating written military doctrine texts to HBMN and from HBMN to executable simulation behaviours, providing a cohesive solution to modelling combat behaviours across all stages of development.
Matt Selway, Kerryn R. Owen, Richard M. Dexter, Georg Grossmann, Wolfgang Mayer, Markus Stumptner
Analytic and Probabilistic Techniques for the Determination of Surface Spray Patterns from Air Bursting Munitions
We present a methodological framework for the evaluation and comparison of surface spray fragmentation patterns from a range of medium-calibre air bursting munitions. The methodology is underpinned by both analytic and probabilistic modelling techniques. In particular, we present a fly-out model for the calculation of the terminal speed of a fragment on the impact plane. The fly-out model constitutes a trade-off between the computational efficiency of analytic models and the accuracy of detailed numerical methods extant in the literature. The methodology has been developed with the ability to readily adapt to modifications in the gun and ammunition parameters, with applications to controlled and naturally fragmenting munitions, as well as shrapnel-based warheads. This is demonstrated by comparing the surface spray patterns of two different ammunition types.
Paul A. Chircop
Reformulations and Computational Results for the Uncapacitated Single Allocation Hub Covering Problem
We study the single allocation hub covering problem, which is a special case of the general hub location problem and an important extension of traditional covering problems. Hubs are located at some nodes in the network and are used to facilitate (consolidate, transfer, distribute) flows. An important feature of hub location is that the transfer cost between hub nodes is discounted. The hub covering problem is to locate a minimum number of hubs such that the travel cost between each o–d pair in the network does not exceed a given threshold. We improve the best existing integer programming formulation for this problem by lifting constraints to produce facet-defining inequalities. We also develop a new formulation for the problem. The numerical results show that our new formulation performs better than existing formulations when lifted constraints are used and special attention is paid to the number of non-zero coefficients.
Andreas T. Ernst, Houyuan Jiang, Mohan Krishnamoorthy, Davaatseren Baatar
Search Strategies for Problems with Detectable Boundaries and Restricted Level Sets
This chapter is concerned with a class of discrete optimisation problems that includes a number of classical NP-hard scheduling problems. The solution method considered is an iterative procedure that, at each iteration, computes a lower bound on the optimal objective value and searches for a feasible solution attaining this bound. The chapter describes three search methods: descending search, ascending search, and their combination. These methods are illustrated by considering their implementations for the unit-execution-time, unit-communication-delay maximum lateness problem with parallel identical machines. The resulting algorithms are compared by means of computational experiments.
Hanyu Gu, Julia Memar, Yakov Zinder
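The lower-bound-then-search pattern described in this abstract can be illustrated on a far simpler problem than the one treated in the chapter. The toy below (an assumption: it uses makespan minimisation on identical parallel machines, not the authors' communication-delay model) ascends from a lower bound until a feasible schedule attains it:

```python
import math

def packs(jobs, loads, cap):
    """DFS: can all jobs be placed on the machines without any
    machine load exceeding cap?"""
    if not jobs:
        return True
    job, rest = jobs[0], jobs[1:]
    tried = set()
    for m in range(len(loads)):
        # Skip machines whose current load we already tried (symmetry).
        if loads[m] + job <= cap and loads[m] not in tried:
            tried.add(loads[m])
            loads[m] += job
            if packs(rest, loads, cap):
                return True
            loads[m] -= job
    return False

def ascending_search(jobs, machines):
    """Start from an easy lower bound on the makespan and raise it
    until a schedule attaining the bound exists."""
    jobs = sorted(jobs, reverse=True)
    bound = max(max(jobs), math.ceil(sum(jobs) / machines))
    while not packs(jobs, [0] * machines, bound):
        bound += 1
    return bound
```

When the initial lower bound is tight, the very first feasibility check succeeds and the search terminates with a proof of optimality, which is the appeal of this class of methods.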
Alternative Passenger Cars for the Australian Market: A Cost–Benefit Analysis
Petrol- or diesel-powered cars (henceforth referred to as conventional vehicles (CVs)) have long been in existence, and currently 75% of cars in Australia belong to this category. Hybrid electric vehicles (HEVs), i.e. vehicles with an electric drive system and an internal combustion engine running on either petrol or diesel, have gained significant market share in recent years. While hybrids are regularly presented as “greener alternatives”, their competitive edge is largely dependent on existing market conditions (gasoline prices, electricity prices, purchase price and maintenance costs) and usage (commuting distances). A new breed of cars, fully electric vehicles (EVs), is becoming increasingly popular with city commuters and is expected to feature prominently in the “Smart Cities” of the future. This study evaluates the performance of EVs, HEVs and CVs for a typical Australian scenario based on three major considerations: average gasoline consumption per day, average GHG emissions in kg-CO2 equivalent (kg-CO2-eq) per day, and equivalent annualized cost (EAC). Each of the three objectives is assessed across a range of gasoline prices, electricity tariffs and commuting distances. Four vehicles are considered in this study, with models developed from available open-source data under simplifications and assumptions: two EVs (Nissan LEAF and BMW i3), one HEV (Toyota Prius) and one CV (Hyundai i30). For a typical city commute of 50 km/day, average gasoline consumption varies between 0 and 2.30 litres per day, average GHG emissions vary between 8.25 and 12.57 kg-CO2-eq per day, and EAC varies between 9.90 and 38.79 AUD per day. With such variation, the choice of one vehicle over another is largely dependent on user preferences. The chapter also presents an approach to customizing EVs, i.e. effectively developing EVs or HEVs that are attractive to a particular market segment.
Jason Milowski, Kalyan Shankar Bhattacharjee, Hemant Kumar Singh, Tapabrata Ray
A Quick Practical Guide to Polyhedral Analysis in Integer Programming
Polyhedral analysis is one of the most interesting elements of integer programming, yet it has often been overlooked. It plays an important role in finding exact solutions to an integer program. In this paper, we discuss what polyhedral analysis is, and how some constraints of an integer programming model are “ideal” in the sense that if the model contains all of these “ideal” constraints, the integer optimal solution can be obtained by simply solving the linear programming relaxation of the integer program. This paper serves as a quick guide for young researchers and PhD students.
Vicky Mak-Hau
Towards a Feasible Design Space for Proximity Alerts Between Two Aircraft in the Conflict Plane
In studying airspace it is necessary to have a model of how and when two aircraft come into proximity. There are two components: a kinematic description of the aircraft in flight, and a set of rules for deciding whether proximity has occurred. The focus in this paper is on the rules, with the kinematics kept as simple as possible by utilizing the crossing track model. The aircraft are assumed to be flying straight-line paths at constant speed in a common plane towards a common waypoint A where the paths intersect. The kinematics is completely determined once the initial positions and the velocity vectors are known. A set of proximity rules is described, which then permits partitioning of a constrained mathematical model for proximity. The partitioning is a function of parameter values derived from the proximity rules. It follows that the region where all constraints are satisfied defines a Feasible Design Space (FDS), and a different set of design variables is identified for describing the kinematics than is commonly used in current practice. The constructs of the Compromise Decision Support methodology permit a formal analysis of how different designs for airspace management and communication might affect the occurrence of proximity between aircraft. The existence of proximity can be determined analytically once the kinematic variables are known. The new FDS is applicable hierarchically, from strategic planning through to the real-time adaptation required in actual in-flight operations, and it is important as a visualization tool for simulation studies and decision support analyses.
Mark Westcott, Neale Fulton, Warren F. Smith
Constructing a Feasible Design Space for Multiple Cluster Conflict and Taskload Assessment
Appropriate management of proximity is required if aircraft are to be operated safely. Flight trajectories ranging from continental to very localized operations may become proximate causing multi-aircraft clusters to form within a flow. Clusters may emerge, dissipate in time, coalesce or be sustained (e.g. swarms). A collective characterization of multiple clusters within an air traffic flow, using a Number Theoretic approach, is presented in this paper. This complements earlier research that characterized the structure of individual clusters through Graph Theoretic techniques. A Feasible Design Space is constructed based on the number of aircraft generating the flow, the total number of clusters emergent in a flow, and the aggregated number of proximity-pairs across clusters. The system taskload demand is defined by the aggregated number of proximity-pairs. The engineering task of designing avionics, ergonometric interfaces, communication channels or ground-based facilities to manage proximity requires knowledge of the peak taskloads imposed on the various subsystems involved. Variation (inclusive of non-monotonic behaviour) in taskload during the growth and dissipation of the clusters is evident in the analysis presented. The mathematical basis underpinning the emergent proximity Feasible Design Space would enable these operational points to be identified a priori together with a suitable management strategy to avoid system saturation and to reduce the possibility of collision due to communication failure or task overload.
Neale L. Fulton, Mark Westcott, Warren F. Smith
Open-Pit Mine Production Planning and Scheduling: A Research Agenda
Mining is a complex, expensive, yet lucrative business. Today’s open-pit mines in Australia are huge projects. To keep these projects profitable, planners and schedulers are under constant pressure to make mine plans that are as accurate as possible and to optimize production at all stages, from mine to market. In general, two different systems are available for extracting material in the mining industry: the traditional truck and shovel (T&S) system and the modern in-pit crushing and conveying (IPCC) system. While T&S has been extensively studied by the operations research (OR) community, there are almost no studies on optimizing the operations of IPCC systems. Despite the great advantages of IPCC systems, mining companies are often reluctant to use them, owing to the lack of an optimal strategic plan, which makes it difficult, if not impossible, to estimate the costs of IPCC systems. In most cases, industry still relies on the judgement or best estimates of experienced personnel in strategic decision making, without any guarantee of optimality and with plans refined manually through multiple time-consuming iterations. This chapter introduces IPCC to the OR community and points out the need for OR research. Subsequently, we develop a research agenda that provides an apt ground for studying this system.
Mehran Samavati, Daryl L. Essam, Micah Nehring, Ruhul Sarker
A Comparative Study of Different Integer Linear Programming Approaches for Resource-Constrained Project Scheduling Problems
Over the last few decades, the resource-constrained project scheduling problem (RCPSP) has been considered a challenging research topic in operations research and computer science. In this paper, we have primarily proposed two different integer linear programming (ILP) models for RCPSPs. As the computational effort required for solving such models depends on the number of variables and constraints, our proposed mathematical models were carried out while attempting to reduce the required number of variables and constraints. For better demonstration, four other ILP models for RCPSPs were also considered so that they could be compared with our proposed models. That comparative study was conducted by solving standard benchmark instances while using a common objective function. The study provides interesting insights about the problem characteristics, model sizes, solution quality and computational efforts required for solving those ILP models.
Ripon K. Chakrabortty, Ruhul Sarker, Daryl L. Essam
A Recovery Model for Sudden Supply Delay with Demand Uncertainty and Safety Stock
In this paper, a recovery model is developed for managing sudden supply delays that affect retailers’ Economic Order Quantity (EOQ) model. A mathematical model is developed that considers demand uncertainty and safety stock, and generates a recovery plan for a finite future period immediately after a sudden supply delay. Solving recovery problems involves high commercial software costs, and their solutions are complex. Therefore, an efficient heuristic is developed that generates the recovery plan after a sudden supply delay. An experiment is conducted to test the proposed approach. To assess the quality and consistency of solutions, the performance of the proposed heuristic is compared with that of the Generalized Reduced Gradient (GRG) method, which is widely applied in constrained mathematical programming. Several numerical examples are presented and a sensitivity analysis is performed to demonstrate the effects of various parameters on the performance of the heuristic method. The results show that safety stock plays an important role in recovery from sudden supply delays, and that there is a trade-off between backorder and lost sales costs in the recovery plan.
Sanjoy Kumar Paul, Shams Rahman
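For readers unfamiliar with the baseline model, the textbook EOQ and safety-stock formulas that a recovery model of this kind builds on can be sketched as follows (the parameters below are illustrative, not from the paper, and the paper's recovery plan itself is considerably richer):

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    """Classic economic order quantity: sqrt(2*D*K / h)."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

def safety_stock(z, sigma_daily, lead_time_days):
    """Safety stock for normally distributed daily demand:
    z * sigma * sqrt(lead time)."""
    return z * sigma_daily * math.sqrt(lead_time_days)

# Illustrative parameters (hypothetical, not the paper's data):
q = eoq(annual_demand=1000, order_cost=50, holding_cost=2)
ss = safety_stock(z=1.65, sigma_daily=4, lead_time_days=9)
# Reorder point = expected lead-time demand + safety stock.
reorder_point = (1000 / 365) * 9 + ss
```

A sudden supply delay pushes the next receipt past the reorder point's coverage; the recovery question is how to re-plan the following order cycles so that backorder and lost-sales costs are balanced, which is what the paper's heuristic addresses.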
Applying Action Research to Strategic Thinking Modelling
The versatility of the word strategy has created significant headaches for those looking to develop the area. The unfortunate position of the word strategy is that it has “acquired a universality which has robbed it of meaning”. Not only is strategy poorly understood and poorly used, but the subservient capabilities that allow for the development of strategy, specifically strategic thinking, are equally mired in confusion and debate. Yet, despite this historical confusion, strategy and strategic thinking are often cited as the reason for organisational success or failure. Due to the inherent uncertainty within the subject, this paper uses soft systems methodology to understand the problem. The problem of strategic thinking, specifically the development of a capability, is one of understanding rather than one of optimising. This chapter provides a broad review of a large research project on computational strategic thinking modelling. Importantly, we examine the research methodology through the lens of action research. We outline the uncertain framework and summarise the findings that have notably increased our understanding of strategic thinking. This work confirms action research as a valid methodology for tackling uncertain situations and understanding what the problem is.
Leon Young
Regression Models for Project Expenditures
This report discusses regression models to analyse planned and actual expenditure data from projects in an Australian Defence capital investment program. Variations in expenditure from that planned have the potential to create cash flow problems in Defence. Therefore, an understanding of the relationship between planned and actual expenditure will improve project planning and portfolio financial management. It is also useful to understand if families of projects behave differently to each other because differences may indicate varying management practices across project domains or differences in the nature of project families. The regression model accommodates project time and military domain effects, heteroscedasticity, repeated measures and nonlinear relations between planned and actual annual project expenditure. The nonlinear model is linearized into an additive lognormal model and fitted via a linear mixed models methodology that facilitates modelling the within project covariance structure. Generally, projects underspend against the plan early in their life and overspend in later years. For each year, there are significant differences in expenditure rates between domains, but the expenditure rate does not depend on planned expenditure. The model may be used by project and portfolio planners to test likely spending variations from plan and to then adjust expenditure plans accordingly; it, therefore, provides a tool to measure project and financial risk.
Terence Weir
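The linearization step mentioned in this abstract, taking logarithms of a multiplicative relation so it can be fitted by least squares, can be sketched as follows (a plain OLS toy, not the paper's mixed-model fit with a within-project covariance structure):

```python
import math

def fit_loglinear(planned, actual):
    """Fit actual = a * planned**b by ordinary least squares on the
    linearized model log(actual) = log(a) + b * log(planned)."""
    xs = [math.log(p) for p in planned]
    ys = [math.log(a) for a in actual]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Closed-form simple linear regression on the log scale.
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b
```

On the log scale the multiplicative error becomes additive, which is also what makes a lognormal error model natural for expenditure data that cannot go negative.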
SimR: Automating Combat Simulation Database Generation
Land simulation, experimentation and wargaming (LSEW) has long used land combat simulation as a central tool to support its analysis of army modernisation studies. Combat simulations provide great utility through their ability to analyse the impact of equipment, structure and tactics within a force-on-force context. However, their extensive input data requirements create significant challenges in terms of data generation, verification and validation. Since 2012, LSEW has been developing the simulation repository (SimR), a database designed for the express purpose of managing and storing combat simulation input data. SimR was designed with a number of critical features in mind: a minimalist storage methodology, the concept of data provenance and the use of deterministic data generation algorithms to produce input data on demand. More recently, LSEW has introduced the feature of automated database generation. Focusing on the COMBATXXI [1] combat simulation as a target, SimR can now produce a working COMBATXXI land component performance database on demand with minimal human intervention. This is a significant advance over previous methods, which relied heavily on analysts manually piecing together databases from disparate sources, a time-consuming and error-prone approach. As a result, the latest COMBATXXI study uses a SimR-generated database.
Lance Holden, Richard M. Dexter, Denis R. Shine
Battlespace Mobile/Ad Hoc Communication Networks: Performance, Vulnerability and Resilience
Dynamic self-forming/self-healing communication networks that exchange IP traffic are known as mobile ad hoc networks (MANETs). The performance and vulnerabilities of such networks, and their dependence on continuously changing network topologies under a range of conditions, are not fully understood. In this work, we investigate the relationship between network topologies and the performance of a 128-node packet-based network composed of four 32-node communities, by simulating packet exchange between network nodes. To a first approximation, the proposed model may represent a company of soldiers consisting of four platoons, where each soldier is equipped with a MANET-participating radio. In this model, every network node is a source of network traffic, a potential destination for network packets, and also performs routing of network packets destined for other nodes. We used the Girvan-Newman benchmark to generate random networks with certain community structures. The interaction strength between the communities was expressed in terms of the relative number of network links. The average packet travel time was used as the proxy for network performance. To simulate a network attack, selected subsets of connections between nodes were disabled, and the performance of the network was observed. As expected, the simulations show that the average packet travel time between communities of users (i.e. between platoons) is more strongly affected by the degree of mixing than the average packet travel time within a community of users (i.e. within an individual platoon). While the conditions presented here simulate a relatively mild attack or interference, the simulation results indicate significant effects on the average packet travel time between communities.
Vladimir Likic, Kamran Shafi
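A greatly simplified version of this experiment can be sketched in plain Python: plant communities in a random graph and compare average BFS hop counts within and between communities as a crude stand-in for packet travel time. The community sizes and link probabilities below are illustrative, and the sketch ignores routing protocols, traffic load and radio effects:

```python
import random
from collections import deque

def community_graph(n_comms, size, p_in, p_out, rng):
    """Random graph with planted communities: node pairs in the same
    community are linked with probability p_in, other pairs with p_out."""
    n = n_comms * size
    comm = [v // size for v in range(n)]
    adj = [set() for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < (p_in if comm[u] == comm[v] else p_out):
                adj[u].add(v)
                adj[v].add(u)
    return adj, comm

def hops(adj, src):
    """BFS hop counts from src (a crude proxy for packet travel time)."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def avg_hops(adj, comm, within):
    """Average hop count over reachable pairs, intra- or inter-community."""
    total = count = 0
    for u in range(len(adj)):
        for v, d in hops(adj, u).items():
            if v != u and (comm[u] == comm[v]) == within:
                total += d
                count += 1
    return total / count

# Four 16-node "platoons": dense inside, sparse across (illustrative).
adj, comm = community_graph(4, 16, 0.4, 0.02, random.Random(1))
```

Lowering `p_out` (weaker mixing) lengthens inter-community paths while leaving intra-community paths largely unchanged, which mirrors the asymmetry reported in the abstract.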
Using Multi-agent Simulation to Assess the Future Sustainability of Capability
The ability to make sound decisions in an area with many complex interactions, such as the sustainability of future capability, is limited by the tools available to emulate the system under study. Methods used to forecast maintenance capability and capacity to support future systems are typically static and deterministic in nature and hence cannot incorporate the true stochastic nature of maintenance events and the capability changes associated with the “growth” of personnel through their technical mastery journey. By comparison, discrete event simulations provide a dynamic platform within which we can emulate the randomness inherent in complex systems, and the extension to multi-agent simulations allows us to capture the effects of changes attributed to personnel. Using a simulation created to address the question of maintenance sustainability for future capability (Air Traffic Management System) for 44WG as a basis for analysis, this chapter compares the results of a discrete event simulation with no agent-based functionality against models containing successively greater multi-agent functionality. A consistent set of fictitious data was used in the analyses presented to run eight individual scenarios to allow fair comparison. From the analyses, we find the discrete event simulation provides overly optimistic results which would lead to understaffing of the maintenance team. In comparison, the multi-agent simulation results were closer to reality and therefore better suited to inform decision making.
A. Gore, M. Harvey
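The gap between a plain discrete event simulation and its agent-based extension can be illustrated with a toy queueing model. The sketch below is hypothetical and far simpler than the 44WG study: maintenance jobs arrive at random and wait for a fixed pool of identical technicians, and all rates are invented. A multi-agent version would replace the identical technicians with agents whose repair times change as their technical mastery grows.

```python
import heapq
import random

def simulate(n_techs, arrival_rate, repair_mean, horizon, seed=0):
    """Mean waiting time for maintenance jobs served by a pool of
    n_techs identical technicians (a small event-driven queue)."""
    rng = random.Random(seed)
    events = []  # min-heap of (time, kind)
    # Pre-generate Poisson job arrivals over the horizon.
    t = rng.expovariate(arrival_rate)
    while t < horizon:
        heapq.heappush(events, (t, "arrival"))
        t += rng.expovariate(arrival_rate)
    free, queue, waits = n_techs, [], []
    while events:
        now, kind = heapq.heappop(events)
        if kind == "arrival":
            queue.append(now)
        else:                    # a repair finished
            free += 1
        while free and queue:    # start jobs while technicians are idle
            waits.append(now - queue.pop(0))
            free -= 1
            done = now + rng.expovariate(1.0 / repair_mean)
            heapq.heappush(events, (done, "done"))
    return sum(waits) / len(waits)

print(f"mean wait, 2 techs: {simulate(2, 0.5, 3.0, 200.0):.2f}")
print(f"mean wait, 4 techs: {simulate(4, 0.5, 3.0, 200.0):.2f}")
```

Comparing runs with different pool sizes already exposes the understaffing risk the chapter discusses; the multi-agent refinement would make `repair_mean` a per-agent attribute that improves with experience rather than a global constant.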
Application of Field Anomaly Relaxation to Battlefield Casualties and Treatment: A Formal Approach to Consolidating Large Morphological Spaces
Field anomaly relaxation (FAR) is a qualitative strategic planning technique useful for generating a range of alternative future scenarios in a complex, multi-dimensional problem space. It utilises the principles of general morphological analysis, in that a set of possible future scenarios is constructed by first defining the set of dimensions (‘sectors’) and their possible values (‘factors’), then filtering out scenarios with combinations of parameters that are inconsistent with each other. This is an exercise that is inherently based on human judgement. As such, limits on the dimensionality of the problem space are generally required to keep the problem within the bounds of human cognition. For example, the number of sectors and the number of factors within each sector are generally restricted to no more than seven and five, respectively. Further, clustering of plausible scenarios that are ‘similar’ is generally performed to reduce the resulting morphological space still further. This paper defines a mathematical framework for computing two heuristics, degree of overlap and degree of divergence, to assist humans in dealing with higher-dimensional, more complex problem spaces by helping to identify candidate sets of scenarios for clustering. We take a didactic approach to the concepts and mathematical formulation of these heuristics using a simple example, before briefly presenting the preliminary results of a case study on defining the scenario space for tactical battlefield casualty treatment. Ultimately, we wish to establish a comprehensive set of scenarios for high-level assessment of potential technologies for delivering enhanced battlefield casualty care and treatment.
Guy E. Gallasch, Jon Jordans, Ksenia Ivanova
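The flavour of the two heuristics can be conveyed with invented definitions, since the abstract does not reproduce the paper's formulas. In the sketch below, a scenario is a tuple of factor values, one per sector; ‘overlap’ is taken as the fraction of sectors on which two scenarios agree, and ‘divergence’ as its complement. All scenario names and factor values are hypothetical.

```python
from itertools import combinations

def overlap(a, b):
    """Fraction of sectors on which two scenarios share a factor."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def divergence(a, b):
    """Complement of overlap: fraction of sectors that differ."""
    return 1.0 - overlap(a, b)

# Hypothetical scenarios over three sectors (terrain, threat, time).
scenarios = {
    "S1": ("urban", "high", "day"),
    "S2": ("urban", "high", "night"),
    "S3": ("rural", "low", "night"),
}

# Rank scenario pairs by overlap: high-overlap pairs are candidates
# for clustering, shrinking the morphological space.
pairs = sorted(combinations(scenarios, 2),
               key=lambda p: -overlap(scenarios[p[0]], scenarios[p[1]]))
print("most similar pair:", pairs[0])
```

Ranking all pairs this way automates the first pass of the clustering step that the paper aims to support, leaving the human analyst to judge whether the top-ranked pairs are genuinely ‘similar’.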
Network Analysis of Decision Loops in Operational Command and Control Arrangements
In 2014, Commander Joint Task Force (JTF) 633 asked deployed operations analysts to examine information flow between deployed units of the Australian Defence Force (ADF) in the Middle East Region (MER) and to consider the contribution to decision-making of flows within and between the various deployed headquarters, Headquarters Joint Operations Command (HQJOC) and other strategic and coalition nodes. Data were collected across the theatre, using attributes such as frequency, means, network and type. Data were also collected on the information function of the interaction, which relates to the role of the communication within Boyd’s Observe–Orient–Decide–Act (OODA) loop. Insights from these data were communicated to Commanders in theatre in late 2014 and early 2015. Further analysis used a representation of the C2 arrangements to generate a network diagram showing both information function and frequency. The network may now be visualised as a full socio-technical system, with nodes representing both organisational entities and information artefacts or systems, or as a pure social network (human-to-human). This approach extends the Situation Awareness Weighted Network (SAWN) framework, creating an OODA Weighted Network model presented here as OODAWN. This paper will discuss the context for the work, the data collection techniques and the resulting network diagrams. We demonstrate the utility of the model by discussing insights regarding the C2 arrangements, both within theatre and from theatre back to Australia.
Alexander Kalloniatis, Cayt Rowe, Phuong La, Andrew Holder, Jamahl Bennier, Brice Mitchell
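The OODAWN representation can be prototyped as a simple attributed edge list. The sketch below is hypothetical: node names, frequencies and OODA labels are invented rather than drawn from the MER data, and the SAWN/OODAWN weighting is reduced to summing interaction frequency per OODA function.

```python
from collections import defaultdict

# Each edge: (source, target, interactions per day, OODA function).
# All values are illustrative, not data from the study.
edges = [
    ("JTF633", "HQJOC",  14, "Orient"),
    ("JTF633", "Unit-A",  6, "Decide"),
    ("Unit-A", "JTF633", 20, "Observe"),
    ("HQJOC",  "JTF633",  4, "Act"),
    ("Unit-B", "JTF633", 11, "Observe"),
]

def weight_by_function(edges):
    """Total interaction frequency carried by each OODA function."""
    totals = defaultdict(int)
    for _, _, freq, func in edges:
        totals[func] += freq
    return dict(totals)

print(weight_by_function(edges))
```

Even this reduction surfaces the kind of insight the model targets: a headquarters whose traffic is dominated by Observe-type flows, say, is collecting far more than it is deciding or acting upon.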
Impact of Initial Level and Growth Rate in Multiplicative HW Model on Bullwhip Effect in a Supply Chain
The bullwhip effect (BWE) in a supply chain, characterised as the amplification of demand variance along its route of propagation, compels a manufacturer to bear additional costs in the form of non-optimal resource usage. An accurate forecasting approach for demand prediction can be instrumental in mitigating the BWE. Numerous researchers have attempted to assess the impact on BWE of forecasting approaches such as Moving Average; Single, Double and Triple Exponential Smoothing; ARIMA; and AI-based methods. However, the Multiplicative Holt-Winters approach to mitigating the BWE has not been widely exploited, particularly with respect to the influence of the initial values of its level and growth rate. Hence, in this research endeavour, an attempt is made to study the impact of these parameters of the Multiplicative Holt-Winters model on the bullwhip effect in a two-echelon supply chain. Accordingly, a simulation is performed in MS Excel, along with ANOVA, to reveal the significance of the parametric values. The preliminary results demonstrate that the initial values of the level have a significant impact on the bullwhip effect, whereas the initial values of the growth rate exhibit a U-shaped relationship with it. Thus, scope is revealed for further study to improve the widely adopted Multiplicative Holt-Winters forecasting approach for tackling the BWE through exploration of optimal conditions.
H. M. Emrul Kays, A. N. M. Karim, M. Hasan, R. A. Sarker
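The experimental setup can be reproduced outside MS Excel with a short script. The following is a sketch under invented data and smoothing constants, not the chapter's experiment: demand is a noisy, trending seasonal series, the retailer simply orders the standard one-step-ahead multiplicative Holt-Winters forecast, and the BWE is measured as the variance ratio Var(orders)/Var(demand). Varying `l0` and `b0` while holding everything else fixed mirrors the factors studied in the chapter's ANOVA.

```python
import random
import statistics

def holt_winters_mult(y, m, alpha, beta, gamma, l0, b0, seasonals):
    """One-step-ahead forecasts from the multiplicative Holt-Winters
    model, with explicit initial level l0 and growth rate b0."""
    l, b = l0, b0
    s = list(seasonals)                        # length m
    fcast = []
    for t, yt in enumerate(y):
        fcast.append((l + b) * s[t % m])       # forecast before updating
        l_prev = l
        l = alpha * (yt / s[t % m]) + (1 - alpha) * (l + b)
        b = beta * (l - l_prev) + (1 - beta) * b
        s[t % m] = gamma * (yt / l) + (1 - gamma) * s[t % m]
    return fcast

def bullwhip(demand, orders):
    """Variance-ratio measure of the bullwhip effect."""
    return statistics.pvariance(orders) / statistics.pvariance(demand)

# Hypothetical trending, seasonal demand with noise.
rng = random.Random(0)
m = 4
season = [0.8, 1.1, 1.3, 0.8]
demand = [(100 + 0.5 * t) * season[t % m] + rng.gauss(0, 3)
          for t in range(48)]

# Retailer orders the one-step-ahead forecast (naive order policy).
orders = holt_winters_mult(demand, m, 0.3, 0.1, 0.2,
                           l0=100.0, b0=0.5, seasonals=season)
print(f"bullwhip ratio: {bullwhip(demand, orders):.3f}")
```

A ratio above 1 indicates amplification of demand variance at the upstream echelon; sweeping `l0` and `b0` over a grid and tabulating this ratio is the simulation analogue of the chapter's parametric study.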
The p-Median Problem and Health Facilities: Cost Saving and Improvement in Healthcare Delivery Through Facility Location
The importance of health to economic growth and development is an undisputed fact. Modern advancement in technology and healthcare has contributed to improved health and productivity, but many people still cannot access healthcare in a timely fashion. Factors affecting delays in accessing healthcare include inadequate supply, poor location, and lack of healthcare facilities, all of which can be exacerbated by increasing healthcare costs and scarcity of resources. In this study, we develop a simple two-stage method based on the p-median problem to investigate the location of, and access to, emergency healthcare facilities in urban areas. We compare the results of our new method with those of similar existing methods using 26-node, 42-node, and 55-node data. We also demonstrate the efficiency of our method against exact methods using 150-node random data. Our method compares favourably with both the optimal and the existing methods.
Michael Dzator, Janet Dzator
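For intuition, the p-median objective itself fits in a few lines. The brute-force enumeration below (over an invented five-node distance matrix) finds the exact optimum but scales combinatorially, which is precisely why heuristic methods such as the chapter's two-stage approach are needed for 150-node instances.

```python
from itertools import combinations

def p_median(dist, p):
    """Exhaustive p-median: choose p facility sites minimising the
    total distance from every demand node to its nearest open
    facility. Viable only for small instances."""
    n = len(dist)
    best_cost, best_sites = float("inf"), None
    for sites in combinations(range(n), p):
        cost = sum(min(dist[i][j] for j in sites) for i in range(n))
        if cost < best_cost:
            best_cost, best_sites = cost, sites
    return best_sites, best_cost

# Toy symmetric distance matrix (illustrative data only).
d = [
    [0, 2, 9, 4, 7],
    [2, 0, 6, 3, 8],
    [9, 6, 0, 5, 1],
    [4, 3, 5, 0, 6],
    [7, 8, 1, 6, 0],
]
sites, cost = p_median(d, 2)
print(f"open facilities at {sites}, total distance {cost}")
```

In the emergency-facility setting, each matrix entry would be a travel time from a demand zone to a candidate site, so the optimal subset directly trades facility count against timely access.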
A Bi-level Mixed Integer Programming Model to Solve the Multi-Servicing Facility Location Problem, Minimising Negative Impacts Due to an Existing Semi-Obnoxious Facility
We propose a bi-level multi-objective model to solve the multi-facility location problem with traffic equilibrium constraints. The main facility location problem within our proposed model consists of locating a set of buildings with varying sensitivity thresholds to the negative impacts propagating from an existing semi-obnoxious facility. The traffic routing problem is modelled as a user equilibrium, which is embedded via its Karush-Kuhn-Tucker optimality conditions. We use a convex scalarisation approach to deal with the multiple objectives. Two solution methods are then contrasted: in the first, we solve our linearised model using an off-the-shelf Mixed Integer Programming solver; in the second, we use a Benders decomposition algorithm to improve computational tractability. Numerical results highlight the superiority of the decomposition approach when solving a realistic-sized instance.
Ahmed W.A. Hammad, David Rey, Ali Akbarnezhad
Can Three Pronouns Discriminate Identity in Writing?
In a study of three female and two male contemporary authors, five thousand words from each were obtained by accessing 30 freely available news articles, Web articles, personal blog posts, book extracts, and oration transcripts on the Internet. The data were anonymised to remove identity. All 25,000 words were aggregated across the 30 articles by word frequency, and 29 personal pronouns were extracted and normalised by sample size. Using logistic regression, each sample was tested to determine whether it was possible to identify the author's gender using a subset of personal pronouns. The study found that it is possible to identify gender with 90% accuracy using the three pronouns ‘my’, ‘her’, and ‘its’. The technique was tested against six independent samples with 84% accuracy and could support the identification of adversaries on the Internet or in a theatre of war.
David Kernot
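The feature-extraction step behind such a study is straightforward to sketch. The snippet below counts the three discriminating pronouns named in the abstract (‘my’, ‘her’, ‘its’) and normalises by sample size; the sample text is an invented stand-in, not data from the study, and the downstream logistic regression is not reproduced here.

```python
import re
from collections import Counter

PRONOUNS = ("my", "her", "its")

def pronoun_rates(text):
    """Frequency of each target pronoun, normalised by word count."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w in PRONOUNS)
    return {p: counts[p] / len(words) for p in PRONOUNS}

sample = ("Her article argued that my earlier claim about the "
          "network and its topology missed her central point.")
print(pronoun_rates(sample))
```

In the study, normalised rates like these, computed across all 29 personal pronouns, were the inputs to the logistic regression; the three pronouns above were the subset found to discriminate gender.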