Simulation-Based Derivative-Free Optimization for Computationally Expensive Functions

2012 AIChE Annual Meeting in Pittsburgh

October 31, 2012

F. Boukouvala and M.G. Ierapetritou

Nowadays, the need to develop predictive models for complex processes, together with advances in computational tools and capabilities, has led to the development of a plethora of detailed, complex mathematical models. However, as the complexity and level of detail increase, so does the computational cost, which prohibits the use of such models in any application that requires multiple simulations, such as optimization, and limits them to simple qualitative assessment tools. For this reason, there has been considerable interest in simulation-based optimization techniques, which aim to minimize the number of function evaluations by not relying on analytic derivatives of the objective function, since these are costly to evaluate and oftentimes noisy [1]. In the literature, many gradient-free optimization algorithms have been proposed; they can be classified as direct-search or model-based, local or global, and deterministic or stochastic, based on their search criteria, search space, and the nature of their search steps, respectively [2]. In addition, one of the major challenges of gradient-free optimization is the handling of complicated constraints. In this work, a gradient-free algorithm is proposed which employs a hybrid global/local search, is deterministic and model-based, and can handle black-box constraints through the concept of black-box feasibility mapping [3]. Since the first significant direct-search gradient-free methods ([4],[5]), a large number of methods have been proposed which aim to optimize models without the calculation of derivatives [6]. The emphasis in gradient-free optimization has shifted from direct search to surrogate-based optimization, which has recently been the most popular approach, based on the hypothesis that surrogate models guide the search more intelligently and thus minimize function calls.
Local model-based methods focus on a small subregion of the domain, within which an appropriate number of samples is collected to fit the chosen response surface; a minimizer of the surface is found within the trust region, and the trust region is then expanded, contracted, or relocated based on a set of rules proven to lead to convergence. This procedure is repeated until the trust-region radius is sufficiently small. On the other hand, many global methods have been proposed which differ in the type of response surface (response surface methodology, kriging, radial basis functions, etc.) or the strategy that identifies the next promising sample (expected improvement, bumpiness, entropy, etc.) [7-10]. The basic idea, however, is the same: collect an initial sample from the entire region, fit a global model, optimize it, select and evaluate the next promising sample based on a search criterion, update the response surface, and re-optimize until the expected improvement is low. Driven by the need to optimize extremely expensive and noisy simulations, this work combines the main advantages of well-established global and local search algorithms with new concepts of feasibility-space mapping, within a framework that aims to optimize such models while minimizing sampling without sacrificing solution quality. The proposed algorithm initially employs a global search, collecting an initial sample from the entire investigated region, from which not only an approximate global model is built, but also a mapping of the feasible space of the problem is performed. This first step serves two purposes: it provides promising starting points for the subsequent local search, and it allows investigation of the feasible region without any assumptions about its nature (i.e., linear, convex, continuous).
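The generic global surrogate-based loop described above (initial design, fit a global model, pick the next promising sample, evaluate, refit) can be illustrated with a minimal sketch. This is not the proposed algorithm: for self-containment it substitutes a crude inverse-distance-weighted response surface for kriging or radial basis functions, a simple distance-based exploration bonus for expected improvement, and a hypothetical one-dimensional test function for the expensive simulation.

```python
import math

def expensive_f(x):
    # hypothetical stand-in for a costly black-box simulation
    return (x - 0.3) ** 2 + 0.1 * math.sin(20 * x)

def surrogate(samples, x):
    # inverse-distance-weighted interpolation: a crude global response
    # surface standing in for kriging or radial basis functions
    num = den = 0.0
    for xi, fi in samples:
        d = abs(x - xi)
        if d < 1e-12:
            return fi              # exact match: return the stored value
        w = 1.0 / d ** 2
        num += w * fi
        den += w
    return num / den

def next_sample(samples, lo, hi, explore=0.05, grid=201):
    # search criterion: surrogate prediction minus a distance-based
    # exploration bonus (a crude stand-in for expected improvement)
    best_x, best_score = lo, float("inf")
    for k in range(grid):
        x = lo + (hi - lo) * k / (grid - 1)
        dmin = min(abs(x - xi) for xi, _ in samples)
        score = surrogate(samples, x) - explore * dmin
        if score < best_score:
            best_x, best_score = x, score
    return best_x

def global_search(f, lo, hi, n_init=5, budget=20):
    # 1) initial space-filling design over the entire region
    xs = [lo + (hi - lo) * i / (n_init - 1) for i in range(n_init)]
    samples = [(x, f(x)) for x in xs]
    # 2) refit the surrogate, pick the next promising point, evaluate it
    while len(samples) < budget:
        x = next_sample(samples, lo, hi)
        samples.append((x, f(x)))
    # 3) return the best sample found within the evaluation budget
    return min(samples, key=lambda s: s[1])

x_best, f_best = global_search(expensive_f, 0.0, 1.0)
```

The design choice that matters here is the explicit evaluation budget: every call to `f` is treated as expensive, so all intelligence is pushed into the (cheap) surrogate and the sampling criterion.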
Next, the algorithm switches to local-search mode, starting from more than one warm-start location in parallel (if applicable) and taking advantage of the fact that surrogate-model approximations are more accurate within smaller trust regions. The algorithm proceeds based on a set of rules for updating the location or size of the trust region, until the trust-region radius is sufficiently small. During this procedure, the global model as well as the feasibility mapping are updated as new samples are collected, while the size of the trust region is continuously adapted based on local information about the feasibility space, in order to avoid sampling in infeasible regions. The performance of the proposed approach is evaluated on a set of benchmark examples and on an actual case study of an expensive simulation from the pharmaceutical industry.

References:
1. Kolda, T.G., R.M. Lewis, and V. Torczon, Optimization by Direct Search: New Perspectives on Some Classical and Modern Methods. SIAM Review, 2003. 45(3): p. 385-482.
2. Rios, L.M. and N.V. Sahinidis, Derivative-free optimization: A review of algorithms and comparison of software implementations. AIChE Annual Conference, Nashville, TN, 2009.
3. Boukouvala, F. and M.G. Ierapetritou, Feasibility analysis of black-box processes using an adaptive sampling Kriging-based method. Computers & Chemical Engineering, 2012. 36(0): p. 358-368.
4. Hooke, R. and T.A. Jeeves, "Direct Search" Solution of Numerical and Statistical Problems. J. ACM, 1961. 8(2): p. 212-229.
5. Nelder, J.A. and R. Mead, A Simplex Method for Function Minimization. The Computer Journal, 1965. 7(4): p. 308-313.
6. Conn, A.R., K. Scheinberg, and L.N. Vicente, Introduction to Derivative-Free Optimization. MPS-SIAM Series on Optimization. 2009, Philadelphia: Society for Industrial and Applied Mathematics/Mathematical Programming Society. xii, 277 p.
7. Huang, D.,
Experimental planning and sequential kriging optimization using variable fidelity data. 2005; Available from:
8. Jakobsson, S., et al., A method for simulation based optimization using radial basis functions. Optimization and Engineering, 2010. 11(4): p. 501-532.
9. Jones, D.R., M. Schonlau, and W.J. Welch, Efficient Global Optimization of Expensive Black-Box Functions. Journal of Global Optimization, 1998. 13(4): p. 455-492.
10. Villemonteix, J., E. Vazquez, and E. Walter, An informational approach to the global optimization of expensive-to-evaluate functions. Journal of Global Optimization, 2009. 44(4): p. 509-534.
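The local trust-region phase sketched in the abstract (build a local model, step to its minimizer within the region, then relocate, expand, or shrink based on how well the model agreed with the true function) can be illustrated with a minimal one-dimensional toy. This is an assumption-laden sketch, not the proposed algorithm: it uses a three-point quadratic interpolation model, classical accept/expand/shrink rules with hypothetical thresholds, and it omits the feasibility mapping entirely.

```python
def trust_region_min(f, x0, r=0.5, r_min=1e-4, eta=0.1, max_iter=100):
    # derivative-free local search with classical trust-region updates:
    # relocate on success, expand on strong agreement, shrink otherwise
    fx = f(x0)
    for _ in range(max_iter):
        if r < r_min:
            break                             # region is sufficiently small
        # fit a quadratic model through x0 - r, x0, x0 + r
        fm, fp = f(x0 - r), f(x0 + r)
        g = (fp - fm) / (2 * r)               # model gradient
        h = (fp - 2 * fx + fm) / r ** 2       # model curvature
        # minimize the model over the trust region [x0 - r, x0 + r]
        if h > 1e-12:
            s = max(-r, min(r, -g / h))       # interior minimizer, clipped
        else:
            s = -r if fm < fp else r          # non-convex model: take an edge
        pred = -(g * s + 0.5 * h * s * s)     # predicted reduction
        if pred <= 0:
            r *= 0.5                          # model sees no descent: shrink
            continue
        f_new = f(x0 + s)
        rho = (fx - f_new) / pred             # actual vs. predicted reduction
        if rho >= eta:
            x0, fx = x0 + s, f_new            # accept step: relocate region
            if rho > 0.75 and abs(s) >= 0.99 * r:
                r *= 2.0                      # strong agreement: expand
        else:
            r *= 0.5                          # poor agreement: shrink
    return x0, fx

x_star, f_star = trust_region_min(lambda x: (x - 2.0) ** 2, 0.0)
```

The termination test on the radius mirrors the abstract's stopping rule: the search ends when the trust region, not the step, has become sufficiently small.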