About the Adaptive Simulated Annealing Technique

The Adaptive Simulated Annealing (ASA) algorithm is well suited for solving highly nonlinear problems with short-running analysis codes, when finding the global optimum is more important than a quick improvement of the design. For detailed information about ASA, see Adaptive Simulated Annealing Technique.

About the Archive-Based Micro Genetic Algorithm (AMGA)

The Archive-based Micro Genetic Algorithm (AMGA) is an evolutionary optimization algorithm that relies on genetic variation operators to create new solutions. The generation scheme deployed in AMGA can be classified as generational because, during a particular iteration (generation), only solutions created before that iteration take part in the selection process. However, the algorithm generates a very small number of new solutions at every iteration (it can work with just two solutions per iteration), so it can also be classified as an almost steady-state genetic algorithm. The final Pareto set of designs produced at the end of the optimization is saved in a file. For more information, see About the AMGA ParetoFile Format. For detailed information about the AMGA algorithm, see Archive-Based Micro Genetic Technique.

About the Downhill Simplex Technique

The Downhill Simplex technique is a geometrically intuitive algorithm. A simplex is defined as a body in n dimensions consisting of n+1 vertices; specifying the location of each vertex fully defines the simplex. In two dimensions the simplex is a triangle, and in three dimensions it is a tetrahedron. As the algorithm proceeds, the simplex makes its way downward toward the location of the minimum through a series of steps (a minimal sketch appears after the technique descriptions below). For detailed information about the Downhill Simplex technique, see Downhill Simplex Technique.

About the Evolutionary Optimization Algorithm (Evol)

The Evolutionary Optimization Algorithm (Evol) is an evolution strategy that mutates designs by adding a normally distributed random value to each design variable. The mutation strength (the standard deviation of the normal distribution) is self-adaptive and changes during the optimization loop (a sketch of this mechanism appears after the technique descriptions below). The algorithm has been calibrated to efficiently solve design problems with low numbers of variables and with some noise in the design space. For more information about the Evolutionary Optimization Algorithm, see Evolutionary Optimization Algorithm (Evol).

About the Hooke-Jeeves Technique

The Hooke-Jeeves technique begins with a starting guess and searches for a local minimum. It does not require the objective function to be continuous, and because the algorithm does not use derivatives, the function does not need to be differentiable. This technique also has a convergence parameter, rho, that lets you determine the number of function evaluations needed for the greatest probability of convergence (a sketch of the search loop appears after the technique descriptions below).

About the Large Scale Generalized Reduced Gradient (LSGRG) Technique

The Large Scale Generalized Reduced Gradient (LSGRG) technique uses the generalized reduced gradient algorithm to solve constrained nonlinear optimization problems. The algorithm uses a search direction such that any active constraints remain precisely active for some small move in that direction. For detailed information about the LSGRG technique, see Large Scale Generalized Reduced Gradient (LSGRG) Technique.
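The downhill simplex step sequence is easiest to see in code. The sketch below uses SciPy's Nelder-Mead method, a generic implementation of the downhill simplex idea, as a stand-in; it is not Isight's implementation, and the Rosenbrock test function is purely illustrative.

```python
# Downhill simplex illustration via SciPy's Nelder-Mead implementation
# (a generic stand-in for illustration only, not Isight's own code).
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    # Classic curved-valley test function; minimum at (1, 1).
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

# In two dimensions the simplex is a triangle: three vertices that the
# algorithm repeatedly reflects, expands, contracts, and shrinks as it
# moves downhill toward the minimum.
result = minimize(rosenbrock, x0=np.array([-1.2, 1.0]), method="Nelder-Mead")
print(result.x)  # approximately [1, 1]
```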
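The self-adaptive mutation described for Evol can be illustrated with a generic (1+1) evolution strategy. This is a textbook scheme with illustrative parameter values (sigma0, tau), not Isight's calibrated implementation.

```python
import numpy as np

def evol_sketch(f, x0, sigma0=1.0, tau=0.3, iters=500, seed=1):
    """(1+1) evolution strategy with self-adaptive mutation strength.

    A generic textbook scheme, not Isight's calibrated implementation:
    sigma itself mutates log-normally and is inherited only when the
    mutated design (the offspring) is at least as good as its parent.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx, sigma = f(x), sigma0
    for _ in range(iters):
        child_sigma = sigma * np.exp(tau * rng.standard_normal())
        # Mutation: add normally distributed noise to every variable.
        child = x + child_sigma * rng.standard_normal(x.size)
        fc = f(child)
        if fc <= fx:                 # selection keeps the better design
            x, fx, sigma = child, fc, child_sigma
    return x, fx

x_best, f_best = evol_sketch(lambda x: float(np.sum(x**2)), [3.0, -4.0])
print(x_best, f_best)   # near [0, 0]
```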
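The Hooke-Jeeves exploratory-move/pattern-move loop is sketched below in simplified form, with rho as the step-reduction factor. This is a minimal reading of the classic algorithm, not Isight's implementation, and the default values are illustrative.

```python
import numpy as np

def hooke_jeeves(f, x0, step=0.5, rho=0.5, tol=1e-6, max_evals=10000):
    """Simplified Hooke-Jeeves pattern search (a sketch, not Isight's code).

    rho is the factor by which the exploratory step shrinks whenever no
    improvement is found, so it directly influences how many function
    evaluations the search spends before it converges.
    """
    def explore(base, fbase, h):
        # Exploratory move: probe +h and -h along each coordinate axis,
        # keeping any improvement.
        x, fx = base.copy(), fbase
        for i in range(x.size):
            for delta in (h, -h):
                trial = x.copy()
                trial[i] += delta
                ft = f(trial)
                if ft < fx:
                    x, fx = trial, ft
                    break
        return x, fx

    x = np.asarray(x0, dtype=float)
    fx = f(x)
    evals = 1
    while step > tol and evals < max_evals:
        xn, fn = explore(x, fx, step)
        evals += 2 * x.size          # upper bound on probes in this sweep
        if fn < fx:
            # Pattern move: extrapolate along the successful direction.
            xp = xn + (xn - x)
            fp = f(xp)
            evals += 1
            x, fx = (xp, fp) if fp < fn else (xn, fn)
        else:
            step *= rho              # no improvement: shrink the step
    return x, fx

# Works on non-differentiable objectives such as this one.
xmin, fmin = hooke_jeeves(lambda x: (x[0] - 3.0)**2 + abs(x[1] + 1.0), [0.0, 0.0])
print(xmin, fmin)   # near [3, -1]
```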
About the Mixed-Integer Sequential Quadratic Programming (MISQP) Technique

Mixed-Integer Sequential Quadratic Programming (MISQP) is a trust region–based method for solving problems that include integer and other discrete variables. Like other sequential quadratic programming methods, MISQP assumes that the objective function and constraints are continuously differentiable. In addition, MISQP assumes that the objective and constraint functions are smooth with respect to the integer variables. Unlike other mixed-integer methods, MISQP does not relax the integer variables. MISQP uses a branch-and-bound procedure to solve each of the successive mixed-integer quadratic programs (MIQPs); a sketch of the branching step appears after the technique descriptions below. MISQP guarantees convergence for convex problems and produces good results for non-convex problems.

About the Modified Method of Feasible Directions (MMFD) Technique

The Modified Method of Feasible Directions (MMFD) is a direct numerical optimization technique used to solve constrained optimization problems. For detailed information about the MMFD technique, see Modified Method of Feasible Directions (MMFD) Technique.

About the Multifunction Optimization System Tool (MOST) Technique

The Multifunction Optimization System Tool (MOST) technique first solves the given design problem as if it were a purely continuous problem, using sequential quadratic programming to locate an initial peak. If all design variables are real, optimization stops here. Otherwise, for each integer or discrete variable, the technique branches out to the nearest points that satisfy that variable's value limits. For detailed information, see Multifunction Optimization System Tool (MOST) Technique.

About the Multi-Island Genetic Algorithm Technique

In the Multi-Island Genetic Algorithm, as with other genetic algorithms, each design point is perceived as an individual with a certain fitness value based on the value of the objective function and the constraint penalty. An individual with a better objective value and penalty has a higher fitness value. For detailed information about the Multi-Island Genetic Algorithm, see Multi-Island Genetic Algorithm.

About the Multi-Objective Particle Swarm Technique

Particle swarm optimization is a population-based search procedure in which individuals (called particles) continuously change position (called state) within the search area. In other words, these particles “fly” around the design space looking for the best position; a sketch of the position update appears after the technique descriptions below. You can use the Multi-Objective Particle Swarm technique with one objective; however, if you use only one objective, the Pareto file will be empty. For detailed information about the Multi-Objective Particle Swarm algorithm, see Multi-Objective Particle Swarm Technique.

About the Neighborhood Cultivation Genetic Algorithm (NCGA) Technique

In the Neighborhood Cultivation Genetic Algorithm (NCGA) technique, each objective parameter is treated separately. Standard genetic operations of mutation and crossover are performed on the designs. The crossover process is based on the “neighborhood cultivation” mechanism, in which crossover is performed mostly between individuals whose values for one of the objectives are close to each other (a sketch of this pairing rule appears after the technique descriptions below). By the end of the optimization run, a Pareto set is constructed in which each design has the “best” combination of objective values, and improving one objective is impossible without sacrificing one or more of the other objectives. The final Pareto set of designs produced at the end of the optimization is saved in a file. For more information, see About the NCGA ParetoFile Format.
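The branching step used in a branch-and-bound procedure like MISQP's can be sketched as follows. Only the branching logic is shown; solving each mixed-integer quadratic subproblem, as MISQP actually does, is far beyond a short sketch, and the data structures here are hypothetical.

```python
import math

# Branch-and-bound branching step: when a subproblem solution leaves an
# integer variable fractional, two child subproblems are created with
# tightened bounds (floor on one side, ceiling on the other).
def branch(bounds, solution, int_vars):
    for i in int_vars:
        if abs(solution[i] - round(solution[i])) > 1e-6:
            lo, hi = bounds[i]
            down = dict(bounds); down[i] = (lo, math.floor(solution[i]))
            up = dict(bounds);   up[i] = (math.ceil(solution[i]), hi)
            return [down, up]   # two child subproblems to solve next
    return []                   # all integer variables integral: leaf node

children = branch({0: (0, 10), 1: (0, 5)}, {0: 3.4, 1: 2.0}, int_vars=[0, 1])
print(children)  # [{0: (0, 3), 1: (0, 5)}, {0: (4, 10), 1: (0, 5)}]
```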
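The way particles “fly” through the design space is captured by the standard particle swarm update equations, sketched below for a single objective. The coefficient values (inertia w, cognitive c1, social c2) are common textbook choices, not Isight defaults, and the sphere function is only a test problem.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    return float(np.sum(x**2))

# Generic single-objective PSO update loop, shown only to illustrate the
# velocity/position mechanics (not Isight's multi-objective variant).
n_particles, n_dims, iters = 20, 2, 100
w, c1, c2 = 0.7, 1.5, 1.5

pos = rng.uniform(-5, 5, (n_particles, n_dims))
vel = np.zeros((n_particles, n_dims))
pbest = pos.copy()                              # each particle's best position
pbest_f = np.array([sphere(p) for p in pos])    # ... and its objective value
gbest = pbest[np.argmin(pbest_f)].copy()        # swarm-wide best position

for _ in range(iters):
    r1 = rng.random((n_particles, n_dims))
    r2 = rng.random((n_particles, n_dims))
    # Velocity blends inertia, attraction to each particle's own best,
    # and attraction to the swarm's best; then positions move.
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([sphere(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print(gbest)  # near the optimum [0, 0]
```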
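One reading of the “neighborhood cultivation” pairing rule is sketched below: the population is sorted along one objective (rotating the sorting objective each generation), so crossover pairs individuals whose values for that objective are close. Only this pairing rule is NCGA-specific; the blend crossover and the design representation are hypothetical simplifications.

```python
import random

def neighborhood_pairs(population, objectives, generation):
    # Rotate which objective drives the sort from generation to generation.
    k = generation % len(objectives)
    ranked = sorted(population, key=objectives[k])
    # Pair each design with its neighbor in the sorted order, so mates
    # have similar values of the chosen objective.
    return [(ranked[i], ranked[i + 1]) for i in range(0, len(ranked) - 1, 2)]

def blend_crossover(a, b, alpha=0.5):
    return [alpha * x + (1 - alpha) * y for x, y in zip(a, b)]

pop = [[random.uniform(0, 1), random.uniform(0, 1)] for _ in range(8)]
objs = [lambda d: d[0], lambda d: (1 - d[0])**2 + d[1]]  # two toy objectives
for parents in neighborhood_pairs(pop, objs, generation=0):
    print(blend_crossover(*parents))
```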
About the Non-Dominated Sorting Genetic Algorithm (NSGA-II) Technique

In the Non-dominated Sorting Genetic Algorithm (NSGA-II), each objective parameter is treated separately. Standard genetic operations of mutation and crossover are performed on the designs. The selection process is based on two main mechanisms: “non-dominated sorting” and “crowding distance sorting” (a sketch of the crowding-distance measure appears at the end of this section). By the end of the optimization run, a Pareto set is constructed in which each design has the “best” combination of objective values, and improving one objective is impossible without sacrificing one or more of the other objectives. The final Pareto set of designs produced at the end of the optimization is saved in a file. For more information, see About the NSGA-II ParetoFile Format.

About the Pointer Automatic Optimizer Technique

The Pointer technique consists of a complementary set of optimization algorithms: linear simplex, sequential quadratic programming (SQP), downhill simplex, and genetic algorithms. The Pointer technique can efficiently solve a wide range of problems in a fully automatic manner because of a special automatic control algorithm. The goal of the Pointer technique is to make optimization more accessible to non-expert users without sacrificing performance. For detailed information about the Pointer Automatic Optimizer algorithm, see Overview of Pointer Technique.

About the Sequential Quadratic Programming (NLPQLP) Technique

NLPQLP is a special implementation of a Sequential Quadratic Programming (SQP) method. Proceeding from a quadratic approximation of the Lagrangian function and a linearization of the constraints, a quadratic programming subproblem is formulated and solved (the standard form of this subproblem is shown at the end of this section). Depending on the number of compute nodes, the objective and constraint functions can be evaluated simultaneously at predetermined test points along the search direction. The parallel line search is performed with respect to an augmented Lagrangian merit function.

About the Stress Ratio Technique

The Stress Ratio Optimizer is a fully stressed design (FSD) method commonly used in structural optimization (a sketch of the resizing rule appears at the end of this section). For detailed information about the Stress Ratio algorithm, see Stress Ratio Technique.
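NSGA-II's crowding-distance measure can be sketched directly: within one non-dominated front, for each objective the solutions are sorted and each interior solution is credited with the normalized span between its two neighbors; boundary solutions receive infinite distance so they are always preferred. This is the standard textbook computation, not code from Isight.

```python
import numpy as np

def crowding_distance(front):
    """Crowding distance for one non-dominated front.

    front: (n_points, n_objectives) array of objective values.
    """
    front = np.asarray(front, dtype=float)
    n, m = front.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(front[:, j])
        dist[order[0]] = dist[order[-1]] = np.inf   # keep boundary points
        span = front[order[-1], j] - front[order[0], j]
        if span == 0:
            continue
        for k in range(1, n - 1):
            # Normalized gap between this point's two neighbors.
            dist[order[k]] += (front[order[k + 1], j] - front[order[k - 1], j]) / span
    return dist

print(crowding_distance([[0.0, 1.0], [0.5, 0.6], [1.0, 0.0]]))  # [inf, 2, inf]
```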
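For context, the quadratic programming subproblem solved at each SQP iterate x_k typically takes the following standard form (the notation follows common SQP references rather than Isight's documentation):

$$\min_{d \in \mathbb{R}^n}\; \tfrac{1}{2}\, d^{T} B_k\, d + \nabla f(x_k)^{T} d$$

subject to

$$\nabla g_j(x_k)^{T} d + g_j(x_k) = 0, \quad j = 1, \ldots, m_e,$$
$$\nabla g_j(x_k)^{T} d + g_j(x_k) \ge 0, \quad j = m_e + 1, \ldots, m,$$

where $f$ is the objective, $g_j$ are the constraints, and $B_k$ is a positive definite approximation of the Hessian of the Lagrangian. The solution $d_k$ serves as the search direction for the (possibly parallel) line search.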
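The core resizing rule behind a fully stressed design method can be sketched in a few lines: each member's sizing variable is scaled by the ratio of its computed stress to its allowable stress, driving every member toward being fully stressed. Variable names and values here are illustrative, not Isight's.

```python
# Fully stressed design (FSD) resizing rule: scale each sizing variable
# by the stress ratio sigma / sigma_allowable.
def stress_ratio_update(areas, stresses, allowables):
    return [a * s / s_allow for a, s, s_allow in zip(areas, stresses, allowables)]

areas = [2.0, 1.5, 3.0]          # current member cross-sections
stresses = [180.0, 60.0, 250.0]  # stresses computed by the analysis
allowables = [200.0, 200.0, 200.0]
print(stress_ratio_update(areas, stresses, allowables))
# Understressed members shrink; the overstressed member grows.
```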