Pointer Core Algorithms

The core Pointer algorithms were selected based on their individual performance and on the complementary performance they provide when combined in a set of algorithms.


For smooth problems, the best optimizer is the sequential quadratic programming (SQP) algorithm. SQP uses function evaluations close to the starting point to determine the local topology of the problem, and it is widely used in trajectory optimization and structural optimization.

The SQP algorithm used in the Pointer technique is NLPQL, developed by Dr. Klaus Schittkowski, which solves general nonlinear mathematical programming problems with equality and inequality constraints. It is assumed that all problem functions are continuously differentiable. Proceeding from a quadratic approximation of the Lagrangian function and a linearization of the constraints, a quadratic subproblem is formulated and solved by the dual code QL. Subsequently, a line search is performed with respect to two alternative merit functions, and the Hessian approximation is updated with a modified Broyden-Fletcher-Goldfarb-Shanno (BFGS) formula.
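The sketch below is a minimal, self-contained illustration of an SQP iteration of the kind described above, not the NLPQL code itself: a quadratic model of the Lagrangian with a BFGS Hessian approximation is minimized subject to linearized constraints, a backtracking line search on an l1 merit function chooses the step length, and the Hessian approximation is then updated. The example problem and all parameter values are hypothetical, and only equality constraints are handled so that the quadratic subproblem reduces to a single linear system.

import numpy as np

def f(x):      return x[0]**2 + x[1]**2                 # hypothetical objective
def grad_f(x): return np.array([2.0 * x[0], 2.0 * x[1]])
def c(x):      return np.array([x[0] + x[1] - 1.0])     # equality constraint c(x) = 0
def jac_c(x):  return np.array([[1.0, 1.0]])

def sqp(x, iters=20, mu=10.0):
    n = x.size
    B = np.eye(n)                      # BFGS approximation of the Lagrangian Hessian
    for _ in range(iters):
        g, A, cv = grad_f(x), jac_c(x), c(x)
        m = cv.size
        # Quadratic subproblem: min g.T d + 0.5 d.T B d  subject to  A d + c = 0.
        # With equality constraints only, this is one KKT linear system.
        K = np.block([[B, A.T], [A, np.zeros((m, m))]])
        sol = np.linalg.solve(K, np.concatenate([-g, -cv]))
        d, lam = sol[:n], sol[n:]
        # Backtracking line search on the l1 merit function f(x) + mu * ||c(x)||_1
        merit = lambda z: f(z) + mu * np.abs(c(z)).sum()
        slope = g @ d - mu * np.abs(cv).sum()      # directional derivative of the merit
        alpha = 1.0
        while merit(x + alpha * d) > merit(x) + 1e-4 * alpha * slope and alpha > 1e-8:
            alpha *= 0.5
        x_new = x + alpha * d
        # BFGS update of B from the change in the Lagrangian gradient
        s = x_new - x
        y = (grad_f(x_new) - jac_c(x_new).T @ lam) - (g - A.T @ lam)
        if s @ y > 1e-10:
            B = B - np.outer(B @ s, B @ s) / (s @ B @ s) + np.outer(y, y) / (y @ s)
        x = x_new
    return x

print(sqp(np.array([3.0, -1.0])))      # converges toward the solution (0.5, 0.5)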

For nonsmooth continuous problems, the downhill simplex method is unmatched. Downhill simplex is especially popular in the chemical engineering field. Unlike SQP, which starts the computation near the starting point, downhill simplex starts the computations from the edges of the computational domain. The Pointer technique uses a modified version of Nelder and Mead’s implementation. The simplex is a geometrical body with n+1 vertices, represented by a triangle in two dimensions and a tetrahedron in three. The method calculates and compares the objective function at the vertices of a simplex in the variable space, selects the worst one, and moves this point through the opposite face of the simplex to a lower point. If this new vertex is better, the old one is deleted. If there is no improvement after a number of steps, the method “shrinks” the simplex by reducing the length of each side, thereby trapping the optimum solution. Instead of moving the best point directly in the direction of steepest descent (the approach of SQP), the method moves the worst points of the initial set in the direction of the best one. As a result, SQP and downhill simplex have little overlap in their assumptions and in their computational use of objective function data.
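The following sketch (not Pointer’s modified implementation) illustrates the two moves described above: reflecting the worst vertex through the centroid of the opposite face and, when reflection brings no improvement, shrinking the simplex toward the best vertex. The full Nelder-Mead method adds expansion and contraction steps that are omitted here, and the test function, starting point, and iteration count are illustrative.

import numpy as np

def downhill_simplex(f, x0, scale=1.0, iters=200):
    n = len(x0)
    # Initial simplex: the starting point plus one offset vertex per dimension
    simplex = [np.array(x0, dtype=float)]
    for i in range(n):
        v = np.array(x0, dtype=float)
        v[i] += scale
        simplex.append(v)
    for _ in range(iters):
        # Order vertices from best (lowest f) to worst (highest f)
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        # Reflect the worst vertex through the centroid of the opposite face
        centroid = np.mean(simplex[:-1], axis=0)
        reflected = centroid + (centroid - worst)
        if f(reflected) < f(worst):
            simplex[-1] = reflected            # keep the improved vertex
        else:
            # No improvement: shrink every vertex halfway toward the best one
            simplex = [best + 0.5 * (v - best) for v in simplex]
    simplex.sort(key=f)
    return simplex[0]

# Hypothetical test problem: a shifted quadratic bowl with minimum at (1, 2)
print(downhill_simplex(lambda x: (x[0] - 1)**2 + (x[1] - 2)**2, [5.0, 5.0]))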

Another algorithm in the Pointer technique is the so-called evolution or genetic algorithm. SQP and downhill simplex determine analytically where the best answer is, based on previous objective function calls. Genetic algorithms work well because they incorporate randomness into their search, which gives the algorithm the ability to escape the bottlenecks of a purely deterministic search. No objective function continuity is required for this method.

The algorithm in the Pointer technique is based on Professor Schwefel’s evolution strategy, later modified by Dr. Mathias Hadenfeld. The evolution strategy starts out with a large number of points inside the computational domain. Under mutation, each of the points produces new points that are normally distributed around the original point, and the best point of this set is selected. Under recombination, a random number of points exchange parameter values. Instead of the step size (for SQP) or the size of the simplex (for downhill simplex), the volatility of the search is determined by the standard deviation of the mutations.
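A minimal evolution-strategy sketch in the spirit of this description (not Pointer’s implementation) is shown below: a population of points spread over the computational domain produces offspring by normally distributed mutation, pairs of parents recombine by exchanging parameter values, and the best points survive into the next generation. The mutation standard deviation sigma controls the volatility of the search, as described above; the population sizes, decay rate, and test function are illustrative assumptions.

import numpy as np

def evolution_strategy(f, bounds, pop_size=20, offspring=80, sigma=1.0, gens=100, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    # Start with points spread over the whole computational domain
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    for _ in range(gens):
        children = []
        for _ in range(offspring):
            # Recombination: a random mask exchanges parameter values between two parents
            p1, p2 = pop[rng.integers(pop_size, size=2)]
            child = np.where(rng.random(len(lo)) < 0.5, p1, p2)
            # Mutation: normally distributed perturbation around the recombined point
            child = np.clip(child + rng.normal(0.0, sigma, size=len(lo)), lo, hi)
            children.append(child)
        # Selection: keep the best pop_size points out of parents and offspring
        everyone = np.vstack([pop, np.array(children)])
        pop = everyone[np.argsort([f(x) for x in everyone])[:pop_size]]
        sigma *= 0.97          # slowly reduce the volatility of the search
    return pop[0]

# Hypothetical test: the Rosenbrock function on [-5, 5] x [-5, 5]
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
print(evolution_strategy(rosen, [(-5, 5), (-5, 5)]))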

Genetic algorithms almost always work. However, because of their high computational expense, they are often not the best algorithms to use.