Adaptive Simulated Annealing Technique

The Adaptive Simulated Annealing (ASA) algorithm is well suited to highly nonlinear problems with short-running analysis codes, when finding the global optimum is more important than quickly improving the design.

To understand how the Adaptive Simulated Annealing algorithm works, it helps to visualize the problems posed by such complex systems as a geographical terrain. For example, consider a mountain range with two “parameters,” say the positions along the North-South and East-West directions, in which we want to find the lowest valley. The ASA algorithm approaches this problem much as a ball that can bounce over mountains from valley to valley. We start at a high “temperature,” where the temperature is an ASA parameter that mimics the effect of a fast-moving particle in a hot object such as molten metal; the ball is permitted to make very high bounces and can bounce over any mountain to reach any valley, given enough bounces. As the temperature is lowered, the ball cannot bounce as high, and it can settle, becoming trapped in relatively smaller ranges of valleys.

We imagine that our mountain range is aptly described by a “cost function” (the Objective and Penalty parameter in Isight). We define probability distributions of the two directional parameters, called generating distributions because they generate the possible valleys, or states, that we explore. We define another distribution, called the acceptance distribution, which depends on the difference between the cost function of the presently generated valley and that of the last saved lowest valley. The acceptance distribution decides probabilistically whether to stay in the new lower valley or to bounce out of it. Both the generating and acceptance distributions depend on temperatures.

In a D-dimensional parameter space with parameters $p_i$ having ranges $[A_i, B_i]$, a new point is generated about the k-th last saved point (e.g., a local optimum), $p_i^k$, using a distribution defined by the product of distributions for each parameter, $g_i(y_i; T_i)$, in terms of random variables $y_i \in [-1, 1]$ and “temperatures” $T_i$:

$$p_i^{k+1} = p_i^k + y_i \,(B_i - A_i),$$

$$g_i(y_i; T_i) = \frac{1}{2\,(|y_i| + T_i)\,\ln(1 + 1/T_i)}.$$
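As an illustration, this generating distribution can be sampled exactly by inverting its cumulative distribution, which yields $y_i = \operatorname{sgn}(u - \tfrac{1}{2})\, T_i \left[(1 + 1/T_i)^{|2u - 1|} - 1\right]$ for $u$ uniform in $[0, 1]$. The Python sketch below uses that formula; the function names are illustrative, and clipping out-of-range points is a simplification (the reference ASA code regenerates them instead):

```python
import numpy as np

def sample_generating(T):
    """Draw y ~ g(y; T) on [-1, 1] by inverse-CDF sampling.

    Inverting the CDF of g(y; T) = 1 / (2 (|y| + T) ln(1 + 1/T))
    gives y = sgn(u - 1/2) * T * ((1 + 1/T)**|2u - 1| - 1).
    """
    u = np.random.uniform(size=np.shape(T))
    return np.sign(u - 0.5) * T * ((1.0 + 1.0 / T) ** np.abs(2.0 * u - 1.0) - 1.0)

def generate_point(p, A, B, T):
    """Propose p^{k+1} = p^k + y * (B - A), kept inside [A, B].

    Clipping is a simplification; the reference ASA code resamples y
    until the new point falls inside the range.
    """
    y = sample_generating(T)
    return np.clip(p + y * (B - A), A, B)
```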

The cost functions, $C(p^{k+1})$ and $C(p^k)$, are compared using a uniform random number, $U \in (0, 1)$, in a “Boltzmann” test: if

$$\exp\!\left[-\bigl(C(p^{k+1}) - C(p^k)\bigr)/T_{\text{cost}}\right] > U,$$

where $T_{\text{cost}}$ is the “temperature” used for this test, the new point is accepted as the saved point for the next iteration. Otherwise, the last saved point is retained.
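A minimal sketch of this test (names are illustrative; the early return for downhill moves is equivalent, since $\exp(-\Delta/T) \ge 1 > U$ when $\Delta \le 0$, and it avoids overflowing the exponential):

```python
import math
import random

def accept(cost_new, cost_old, T_cost):
    """Metropolis/"Boltzmann" acceptance test.

    Downhill moves are always accepted; uphill moves are accepted
    with probability exp(-(cost_new - cost_old) / T_cost).
    """
    delta = cost_new - cost_old
    if delta <= 0.0:
        return True
    return math.exp(-delta / T_cost) > random.random()
```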

The annealing schedule for each parameter temperature, $T_i$, from a starting temperature $T_{0i}$, is as follows:

$$T_i(k_i) = T_{0i} \exp\!\left(-c_i\, k_i^{1/D}\right).$$
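In code this schedule is a one-line helper; a sketch with illustrative names:

```python
import math

def parameter_temperature(T0_i, c_i, k_i, D):
    """Annealing schedule T_i(k_i) = T0_i * exp(-c_i * k_i**(1/D)).

    k_i counts the generated points and c_i is an annealing-rate
    constant.
    """
    return T0_i * math.exp(-c_i * k_i ** (1.0 / D))
```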

The annealing schedule for the cost temperature is developed similarly to that for the parameter temperatures. However, the annealing index for the cost function, $k_{\text{cost}}$, is determined by the number of accepted points rather than by the number of generated points, as is used for the parameters.

$$T_{\text{cost}}(k_{\text{cost}}) = T_{0\,\text{cost}} \exp\!\left(-c_{\text{cost}}\, k_{\text{cost}}^{1/D}\right).$$
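Putting the pieces together, a skeleton ASA loop might look like the following sketch, which reuses the generate_point and accept helpers sketched above. The constants, stopping rule, and omission of the reannealing steps described next are simplifications, not Isight's implementation:

```python
import math
import numpy as np

def asa_minimize(cost, A, B, T0, c, T0_cost, c_cost, n_iter=10000):
    """Skeleton ASA loop: generate, test, anneal.

    cost : callable mapping a parameter vector to a scalar
    A, B : per-parameter lower/upper bounds (numpy arrays)
    T0, c / T0_cost, c_cost : schedule constants for the parameter
    and cost temperatures.
    """
    D = len(A)
    p = np.random.uniform(A, B)              # random start
    p_best, f_best = p, cost(p)
    f_last = f_best
    k = np.zeros(D)                          # generated-point counts
    k_cost = 0                               # accepted-point count
    for _ in range(n_iter):
        T = T0 * np.exp(-c * k ** (1.0 / D))                    # parameter schedule
        T_cost = T0_cost * math.exp(-c_cost * k_cost ** (1.0 / D))
        p_new = generate_point(p, A, B, T)   # sketched earlier
        k += 1
        f_new = cost(p_new)
        if accept(f_new, f_last, T_cost):    # Boltzmann test, sketched earlier
            p, f_last = p_new, f_new
            k_cost += 1                      # cost T anneals on acceptance
            if f_new < f_best:
                p_best, f_best = p_new, f_new
    return p_best, f_best
```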

As determined by the technique options selected, the parameter “temperatures” may be periodically and adaptively reannealed (increased relative to their previous values), using the relative first derivatives of the cost function with respect to each parameter, to guide the search “fairly” among the parameters.
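A sketch of this reannealing step, following Ingber's published ASA scheme: sensitivities $s_i = |\partial C/\partial p_i|$ are estimated by finite differences, the temperatures of insensitive parameters are raised by the ratio $s_{\max}/s_i$, and the annealing index is recovered by inverting the schedule. The step size and names are illustrative assumptions:

```python
import numpy as np

def reanneal_parameters(cost, p_best, A, B, T, T0, c, D, eps=1e-6):
    """Rescale parameter temperatures by cost-function sensitivities.

    Parameters with small |dC/dp_i| get their temperature raised
    relative to the most sensitive one, so the search treats all
    directions "fairly"; the index k_i is then recovered by
    inverting T(k) = T0 * exp(-c * k**(1/D)).
    """
    base = cost(p_best)
    s = np.empty(D)
    for i in range(D):
        dp = np.zeros(D)
        dp[i] = eps * (B[i] - A[i])          # forward difference along p_i
        s[i] = abs(cost(p_best + dp) - base) / dp[i]
    s = np.maximum(s, 1e-12)                 # avoid division by zero
    T_new = np.minimum(T * (s.max() / s), T0)  # raise T for insensitive params
    k_new = (np.log(T0 / T_new) / c) ** D      # invert the schedule
    return T_new, k_new
```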

As determined by the technique options selected, the reannealing of the cost temperature resets the scale of the annealing of the cost acceptance criterion using a new $T_{0\,\text{cost}}$ value. The new $T_{0\,\text{cost}}$ value is taken to be the minimum of the current initial cost temperature and the maximum of the absolute values of the best and last cost functions and their difference. The new $k_{\text{cost}}$ is calculated taking $T_{\text{cost}}$ as the maximum of the current value and the absolute value of the difference between the last and best saved minima of the cost function, constrained not to exceed the current initial cost temperature. This procedure resets the scale of the annealing of the cost temperature within the scale of the current best or last saved minimum.
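Translated into code, with the new $k_{\text{cost}}$ recovered by inverting the cost annealing schedule (a sketch under the assumptions above; names are illustrative):

```python
import math

def reanneal_cost(f_best, f_last, T0_cost, T_cost, c_cost, D):
    """Reset the cost-temperature scale from the best/last minima.

    New T0_cost: min(current T0_cost, max(|best|, |last|, |best - last|)).
    New T_cost:  max(current T_cost, |last - best|), capped at new T0_cost.
    New k_cost:  from inverting T(k) = T0 * exp(-c * k**(1/D)).
    """
    T0_new = min(T0_cost, max(abs(f_best), abs(f_last), abs(f_best - f_last)))
    T_new = min(max(T_cost, abs(f_last - f_best)), T0_new)
    k_new = (math.log(T0_new / T_new) / c_cost) ** D
    return T0_new, T_new, k_new
```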