Regular Paper
Using animal instincts to design efficient biomedical studies via particle swarm optimization
Introduction
Optimal experimental designs have been gaining attention in the last two decades [1]. A main reason is the rising cost of conducting experiments and an increasing realization in more applied fields that optimal design ideas can cut costs substantially without sacrificing statistical efficiency. Some examples are given in [2], [3], [4], [5], [6], where the problems range from designing reaction kinetics studies to medical studies with a time-to-event outcome. Berger and Wong [7] describe a collection of concrete applications of optimal designs to real problems across the biomedical and social sciences, including a design problem to identify optimal locations to monitor groundwater wells in the Los Angeles basin.
Nonlinear models are frequently used to study outcomes or responses in biomedical experiments. This means that we assume a known nonlinear functional relationship between the mean response and the independent variables. This function has unknown parameters that determine the shape and properties of the mean response, and one common goal of the study is to estimate the parameters in the mean function. In addition, the model has an unobservable error term with mean zero and constant variance. Given a study objective, the design problem is to select how many combinations of levels of the independent variables to observe the outcome at, and what those combinations are. The design optimality criterion for nonlinear models depends on the values of the model parameters, so nominal values (best guesses for these parameters) are required before the optimal design can be implemented. Because the optimal designs depend on the nominal values, they are termed locally optimal. Such optimal designs usually represent the first step in finding an optimal design strategy and are the simplest to construct and study.
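As a concrete illustration (our own sketch, not an example from the paper), consider the two-parameter Michaelis–Menten model E[y] = Vx/(K + x), which appears in the references below. The normalized information matrix of a design is the weighted sum of outer products of the gradient of the mean function with respect to the parameters, evaluated at nominal values; the nominal values and design below are arbitrary illustrative choices:

```python
import numpy as np

# Michaelis-Menten mean function: E[y] = V*x / (K + x),
# a standard two-parameter nonlinear model in enzyme kinetics.
def grad_mean(x, V, K):
    """Gradient of the mean response with respect to (V, K)."""
    return np.array([x / (K + x), -V * x / (K + x) ** 2])

def info_matrix(points, weights, V, K):
    """Normalized Fisher information M(xi, theta) = sum_i w_i f(x_i) f(x_i)^T
    for a design xi with support points x_i and weights w_i."""
    M = np.zeros((2, 2))
    for x, w in zip(points, weights):
        f = grad_mean(x, V, K)
        M += w * np.outer(f, f)
    return M

# The locally D-optimal design maximizes log det M at the nominal values;
# V0, K0 and the two-point design here are purely illustrative.
V0, K0 = 1.0, 0.5
M = info_matrix([0.2, 2.0], [0.5, 0.5], V0, K0)
print(np.log(np.linalg.det(M)))
```

Different nominal values (V0, K0) yield different information matrices, which is exactly why the resulting designs are only locally optimal.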
The analytical description of the locally optimal design for a nonlinear model is rarely available unless the model is very simple. When such descriptions do exist, they are usually complicated; see, for example, the analytical description of the locally D-optimal design for estimating the two parameters in the logistic model [8]. Further, the formula or analytical description of the optimal design for a nonlinear model is invariably derived under a set of mathematical assumptions that may or may not apply in practice. For these reasons, it is desirable to have a flexible and effective algorithm that can find a variety of optimal designs quickly and reliably.
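To illustrate the kind of analytical result cited above, a short numerical check (our own sketch, not code from the paper) recovers the classical locally D-optimal design for the two-parameter logistic model with nominal intercept 0 and slope 1: equal weight at two symmetric points, where the response probabilities are roughly 0.176 and 0.824:

```python
import numpy as np

# Two-parameter logistic model: P(y=1 | x) = 1/(1 + exp(-(a + b*x))).
# With nominal values a=0, b=1, the locally D-optimal design is a symmetric
# two-point design at +/- z, and its log D-criterion reduces to
#   2*log(p*(1-p)) + 2*log(z),  with p = 1/(1 + exp(-z)).
def log_det(z):
    p = 1.0 / (1.0 + np.exp(-z))
    return 2.0 * np.log(p * (1.0 - p)) + 2.0 * np.log(z)

zs = np.linspace(0.5, 3.0, 25001)        # fine grid over candidate z values
z_star = zs[np.argmax(log_det(zs))]
p_star = 1.0 / (1.0 + np.exp(-z_star))
print(z_star, p_star)  # about 1.543 and 0.824, the classical logistic result
```

Even in this simplest case the optimal support points are roots of a transcendental equation, which is why closed-form descriptions are the exception rather than the rule.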
There are algorithms for finding optimal designs, but most are based on heuristics or intuition and lack a theoretical basis. Only a few algorithms can be proven to converge to the optimal design; prominent ones include Fedorov's and Wynn's algorithms for generating D- and c-optimal designs [9], [10]. The former designs are useful for estimating all parameters in the mean function, and the latter target estimation of a specific function of the model parameters, by minimizing, respectively, the volume of the confidence ellipsoid and the asymptotic variance of the estimated function of interest. Even for the few algorithms whose convergence can be shown mathematically, problems may remain: (i) they can take too long to converge; (ii) they may fail to converge for more complicated setups for which they were not designed, such as mixed-effects nonlinear models; and (iii) numerical issues arise from rounding or from the intrinsic nature of the sequential process; for example, many algorithms produce clusters of support points as they proceed, and these clusters require periodic and judicious collapsing into the correct distinct but unknown support points.
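A minimal sketch of a Wynn-type vertex-direction algorithm makes the sequential process, and the clustering issue in (iii), concrete. We use quadratic regression on [-1, 1], whose D-optimal design is known to put weight 1/3 at each of -1, 0 and 1; the grid, step-size rule and iteration count are our own illustrative choices, not the paper's:

```python
import numpy as np

def f(x):
    # Regression vector for the quadratic model E[y] = b0 + b1*x + b2*x^2.
    return np.array([1.0, x, x * x])

# Wynn-type vertex-direction algorithm on a candidate grid: at each step,
# move a small amount of design weight to the point maximizing the
# variance function d(x, xi) = f(x)^T M(xi)^{-1} f(x).
grid = np.linspace(-1, 1, 201)
F = np.array([f(x) for x in grid])          # candidate regression vectors
w = np.full(len(grid), 1.0 / len(grid))     # start from the uniform design

for n in range(5000):
    M = F.T @ (w[:, None] * F)              # information matrix of current design
    d = np.einsum("ij,jk,ik->i", F, np.linalg.inv(M), F)  # variance function
    j = int(np.argmax(d))
    alpha = 1.0 / (n + 2)                   # diminishing step size
    w = (1 - alpha) * w
    w[j] += alpha

# Equivalence-theorem check: max_x d(x, xi) decreases toward p = 3
# (the number of parameters) as the design approaches the D-optimum.
M = F.T @ (w[:, None] * F)
d = np.einsum("ij,jk,ik->i", F, np.linalg.inv(M), F)
print(np.max(d))
```

Note how the weight drips onto grid points near the true support one small step at a time; with a finer grid this produces exactly the clusters of near-duplicate support points that must later be collapsed.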
In the next section, we briefly review particle swarm optimization (PSO) methodology and show that it is an exciting, easy and effective algorithm for generating optimal designs for statistical models. The algorithm has been used for almost a dozen years in computer science and engineering circles, and increasingly so in recent years due to its repeated successes in solving an increasingly large class of applied problems. The main reasons for its popularity seem to be its flexibility, ease of implementation, and general applicability to solving (or nearly solving) complex optimization problems without specific assumptions on the objective function. In Section 3 we present the statistical background and demonstrate that PSO can efficiently find a variety of optimal designs for different types of nonlinear models in biomedicine. Section 4 provides a discussion and compares the performance of PSO with that of the differential evolution algorithm, another popular metaheuristic for optimization problems in engineering. Section 5 concludes.
Section snippets
Particle swarm optimization (PSO)
Nature-inspired algorithms have been gaining popularity and dominance in the last decade, in both academic and industrial applications, even after adjusting for different types of biases [11], [12]. One of the most prominent examples of a nature-inspired algorithm is Particle Swarm Optimization (PSO), which is based on swarm intelligence. It is a metaheuristic algorithm that grew out of research on fish schooling and swarm movement behavior. PSO is intriguing in that it always seems to be able to quickly solve
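The snippet above is truncated, but the core PSO mechanics are standard: each particle moves under inertia plus random attraction toward its own best position and the swarm's global best. Below is a minimal global-best PSO sketch of our own; the hyperparameters (inertia w = 0.7, cognitive and social factors c1 = c2 = 1.5, swarm size, iteration count) are common illustrative defaults, not values taken from this paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso(obj, dim, n_particles=30, iters=200, lb=-5.0, ub=5.0,
        w=0.7, c1=1.5, c2=1.5):
    """Minimal global-best PSO minimizing obj over [lb, ub]^dim."""
    x = rng.uniform(lb, ub, (n_particles, dim))   # particle positions
    v = np.zeros((n_particles, dim))              # particle velocities
    pbest = x.copy()                              # personal best positions
    pval = np.apply_along_axis(obj, 1, x)         # personal best values
    g = pbest[np.argmin(pval)]                    # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia + cognitive pull + social pull.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lb, ub)
        val = np.apply_along_axis(obj, 1, x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[np.argmin(pval)]
    return g, pval.min()

# Sanity check on a simple smooth test function with minimum at (1, 1, 1).
best, best_val = pso(lambda z: np.sum((z - 1.0) ** 2), dim=3)
print(best, best_val)
```

Nothing about the objective is assumed beyond the ability to evaluate it pointwise, which is what makes the method attractive for design criteria that are awkward to differentiate.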
Generating optimal designs for biomedical studies using PSO
In this section, we discuss the statistical background and apply PSO to find various types of optimal designs for common models in biomedical studies. These models may appear small in terms of the number of parameters they have, but as noted in Konstantinou et al. [18], finding optimal designs for such models can still be problematic, whether by traditional numerical methods or analytically.
Here and throughout, our focus is approximate designs, which are probability measures defined on the
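To make the approximate-design representation concrete (a sketch under our own illustrative choices, not the paper's examples): an approximate design is a finite set of support points with probability weights, and competing designs are commonly compared through D-efficiency relative to a known optimum. For quadratic regression on [-1, 1], the D-optimal design places weight 1/3 at each of -1, 0 and 1:

```python
import numpy as np

def f(x):
    return np.array([1.0, x, x * x])  # quadratic regression model

def info(points, weights):
    """Information matrix of an approximate design {(x_i, w_i)}."""
    return sum(w * np.outer(f(x), f(x)) for x, w in zip(points, weights))

def d_efficiency(points, weights, opt_points, opt_weights, p=3):
    """D-efficiency of a design relative to a reference (optimal) design:
    (det M(xi) / det M(xi*))^(1/p), p = number of model parameters."""
    return (np.linalg.det(info(points, weights)) /
            np.linalg.det(info(opt_points, opt_weights))) ** (1.0 / p)

# Known D-optimal design for quadratic regression on [-1, 1].
opt = ([-1.0, 0.0, 1.0], [1 / 3, 1 / 3, 1 / 3])
# An intuitive but suboptimal competitor: five equally spaced, equally
# weighted points.
uniform = ([-1.0, -0.5, 0.0, 0.5, 1.0], [0.2] * 5)
print(d_efficiency(*uniform, *opt))
```

A D-efficiency of, say, 0.84 means the competitor needs roughly 1/0.84 times as many observations to match the precision of the optimal design, which is the practical sense in which optimal designs save costs.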
Discussion
We discussed using PSO to find locally D- and c-optimal designs for compartmental models, logistic models, a double exponential model useful for monitoring tumor regrowth, and for estimating parameters in a survival model. Our computational experience with these and many other problems we have examined was similar to what is reported in the literature. First, many parameters in the PSO did not seem to matter much. Following common usage in the literature, see for example [35], [36], we set
Conclusion
Particle swarm optimization is a powerful, flexible and interesting tool for solving optimization problems, but it appears greatly under-utilized in the statistical literature. We have shown here that PSO is an efficient and novel way of finding optimal experimental designs in statistics. Our applications were biostatistical, but the approach clearly extends to general optimization problems in statistics. An advantage of PSO is that it is a versatile algorithm in that its success
Acknowledgments
The research of Chen was partially supported by the National Science Council under Grant NSC 101-2118-M-006-002-MY2 and the Mathematics Division of the National Center for Theoretical Sciences (South) in Taiwan. The research of Wang was partially supported by the National Science Council, the Taida Institute of Mathematical Sciences, and the National Center for Theoretical Sciences (Taipei Office). The idea for this manuscript originated at the Isaac Newton Institute for Mathematical Sciences
References (51)
- et al., Optimal designs for tumor regrowth models, J. Stat. Plan. Inference (2011)
- et al., A note on D-optimal designs for a logistic regression model, J. Stat. Plan. Inference (1997)
- The usefulness of optimum experimental designs, J. R. Stat. Soc. B (1996)
- et al., Robust and efficient designs for the Michaelis–Menten model, J. Am. Stat. Assoc. (2003)
- et al., Optimal designs for goodness of fit of the Michaelis–Menten enzyme kinetic function, J. Am. Stat. Assoc. (2005)
- et al., Designs for generalized linear models with several variables and model uncertainty, Technometrics (2006)
- et al., Optimal designs for Cox regression, Stat. Neerl. (2009)
- et al., Bayesian L-optimal exact design of experiments for biological kinetic models, Appl. Stat. (2011)
- et al., An Introduction to Optimal Designs for Social and Biomedical Research (2009)
- Optimal Design (1980)
- Theory of Optimal Experiments
- Results in the theory and construction of D-optimum experimental designs, J. R. Stat. Soc. B
- Recent trends indicate rapid growth of nature-inspired optimization in academia and industry, Computing
- Survival of the flexible: explaining the recent dominance of nature-inspired optimization within a rapidly evolving world, Computing
- The biological principles of swarm intelligence, Swarm Intell.
- Particle swarm optimization: an overview, Swarm Intell.
- Nature-Inspired Metaheuristic Algorithms
- Optimal Designs for Two-Parameter Nonlinear Models with Application to Survival Models
- Properties of D-optimal sampling schedule for compartmental models, Biocybern. Biomed. Eng.
- Applications of optimal design methodologies in clinical pharmacology experiments, Pharm. Stat.
- Optimal designs for compartmental models with correlated observations, J. Appl. Stat.
- An evaluation of population D-optimal designs via pharmacokinetic simulations, Ann. Biomed. Eng.
- Optimal designs for population pharmacokinetic studies of the partner drugs co-administered with artemisinin derivatives in patients with complicated falciparum malaria, Malar. J.