SRCLID:Simulation-Based Design Optimization

Revision as of 14:04, 6 December 2013

Design optimization can be applied to the models and materials at each of the different length scales, in keeping with the overall ICME approach.


General Design Optimization

The discipline of optimization deals with finding the maxima and minima of functions subject to constraints.

Optimization guidelines

Design variables: A design variable is a specification that is controllable by the designer (e.g., a thickness or a material choice) and is often bounded by maximum and minimum values. Sometimes these bounds can be treated as constraints.

Constraints: A constraint is a condition that must be satisfied for the design to be feasible. Constraints can reflect physical laws, resource limitations, user requirements, or bounds on the validity of the analysis models. They can be handled explicitly by the solution algorithm or incorporated into the objective using Lagrange multipliers.

Objectives: An objective is a numerical value or function that is to be maximized or minimized. For example, a designer may wish to maximize profit or minimize weight. Many solution methods work only with a single objective. When using these methods, the designer normally weights the various objectives and sums them to form a single objective. Other methods allow multi-objective optimization, such as the calculation of a Pareto frontier.

Models: The designer must also choose models to relate the constraints and the objectives to the design variables. They may include finite element analysis, reduced order metamodels, etc.

Reliability: The probability that a component will perform its required function under stated conditions for a specified period of time.

Optimization Methods

Gradient-based methods:

Adjoint equation, Newton’s method, Steepest descent, Conjugate gradient, and Sequential quadratic programming

Gradient-free methods:

Hooke-Jeeves pattern search and Nelder-Mead method

Population-based methods:

Genetic algorithm, Memetic algorithm, Particle swarm optimization, Ant colony, and Harmony search

Other methods:

Random search, Grid search, Simulated annealing, Direct search, and IOSO (Indirect Optimization based on Self-Organization)

Convergence of Pareto Frontier

It is relatively simple to determine an optimal solution for a single-objective method (the solution with the lowest error function). For multiple objectives, however, solutions must be evaluated on a “Pareto frontier.” A solution lies on the Pareto frontier when no objective can be improved any further without at least one other objective suffering as a result. Once a set of solutions has converged to the Pareto frontier, further testing is required in order to determine which candidate force field is optimal for the problems of interest. Be aware that searches with a limited number of parameters might “cram” a lot of important physics into a few parameters.
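The dominance test above can be sketched directly: a candidate stays on the Pareto frontier only if no other candidate is at least as good in every objective and strictly better in one. A minimal Python illustration for minimization (the function name is ours):

```python
def pareto_front(points):
    """Return the subset of points that are Pareto-optimal when
    every objective is being minimized."""
    front = []
    for i, p in enumerate(points):
        # p is dominated if some other point q is <= p in all
        # objectives and strictly < p in at least one.
        dominated = any(
            all(q[k] <= p[k] for k in range(len(p))) and
            any(q[k] < p[k] for k in range(len(p)))
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(p)
    return front
```

For example, among the two-objective candidates (1, 5), (2, 2), (5, 1), and (3, 3), the first three are mutually non-dominated, while (3, 3) is dominated by (2, 2) and is excluded.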


see Metamodeling

Unconstrained Optimization

Zeroth-Order Methods

These methods are referred to as “zeroth-order methods” because they require only evaluation of the function, f(X), at each iterative step. Some examples of zeroth-order methods are the Bracketing Method and the Golden Section Search Method. Some population-based methods can also be categorized as zeroth-order methods [1].

Bracketing Method

The Bracketing method is a zeroth-order method which uses progressively smaller intervals to converge to an optimal solution. The interval is set up such that the x value corresponding to the optimal value of f lies within it. The interval is then divided into a number of sub-intervals, the value of f is calculated at each dividing point, and the sub-interval around the best point is chosen as the next interval. This process iterates until the convergence criterion is met [1].
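A minimal sketch of that iteration, assuming f is unimodal on the starting interval (the function name and the choice of n interior points are illustrative):

```python
def bracket_minimize(f, a, b, n=10, tol=1e-6):
    """Shrink [a, b] around the minimizer of f by repeatedly
    evaluating f at n interior points and keeping the sub-interval
    around the best one. Assumes f is unimodal on [a, b]."""
    while b - a > tol:
        h = (b - a) / (n + 1)
        xs = [a + h * i for i in range(1, n + 1)]
        best = min(xs, key=f)          # interior point with lowest f
        a, b = best - h, best + h      # next, smaller bracket
    return 0.5 * (a + b)
```

Each pass shrinks the bracket by the factor 2/(n+1), so convergence to a tight interval takes only a handful of iterations.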

Golden Section Search
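The golden section search shrinks the interval by the constant ratio 1/φ ≈ 0.618 at each iteration and reuses one of the two interior function evaluations, so only one new evaluation of f is needed per step. A minimal sketch, again assuming a unimodal f:

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Golden section search for the minimum of a unimodal f on [a, b]."""
    inv_phi = (math.sqrt(5) - 1) / 2     # 1/phi, about 0.618
    x1 = b - inv_phi * (b - a)
    x2 = a + inv_phi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                      # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1       # reuse x1 as the new x2
            x1 = b - inv_phi * (b - a)
            f1 = f(x1)
        else:                            # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2       # reuse x2 as the new x1
            x2 = a + inv_phi * (b - a)
            f2 = f(x2)
    return 0.5 * (a + b)
```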

First-Order Methods

In addition to evaluation of f(X), first-order methods require the calculation of the gradient vector ∇f(X) in each iterative step. Some examples of first-order methods are the Steepest Descent or Cauchy Method and the Conjugate Gradient Method.

Steepest Descent (Cauchy) Method

The Steepest Descent method uses a search direction of some magnitude in the negative direction of the gradient. The negative of the gradient gives the direction of maximum decrease, hence the name. The magnitude of the step along the search direction can be determined through zeroth-order methods or by direct calculation, i.e., by setting the derivative with respect to the step size equal to zero and solving for it. This method is guaranteed to converge to a local minimum, but convergence may be slow because previous iterations are not considered in determining the search direction of subsequent iterations. The rate of convergence can be estimated using the condition number of the Hessian matrix; if the condition number is large, convergence will be slow [1].
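The update x ← x − α∇f(x) can be sketched with a simple backtracking choice of α, used here as a crude stand-in for the zeroth-order or direct step-size calculation mentioned above (names and tolerances are illustrative):

```python
def steepest_descent(f, grad, x0, step=1.0, tol=1e-8, max_iter=10000):
    """Steepest descent with a simple backtracking line search.
    grad(x) returns the gradient of f at x as a list of floats."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:   # gradient norm small
            break
        a, fx = step, f(x)
        # Halve the step until f actually decreases.
        while f([xi - a * gi for xi, gi in zip(x, g)]) >= fx and a > 1e-16:
            a *= 0.5
        x = [xi - a * gi for xi, gi in zip(x, g)]
    return x
```

On a quadratic with condition number 10, such as f(x) = (x₀ − 1)² + 10(x₁ + 2)², the iterates zig-zag but still reach the minimum at (1, −2), illustrating the slow convergence the condition number predicts.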

Conjugate Gradient Method

The Conjugate Gradient Method is similar to the Steepest Descent Method except that it takes previous iterations into consideration when choosing search directions. The conjugate direction is determined by adding the steepest descent direction of the previous iteration, scaled by some value, to the steepest descent direction of the current iteration. The constant used to scale the previous search direction can be determined using either the Fletcher-Reeves formula or the Polak-Ribiere formula [1].
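A sketch of the conjugate-direction update with the Fletcher-Reeves scaling β = ‖∇f_new‖² / ‖∇f_old‖². The crude backtracking line search and the restart safeguard are our additions for robustness, not part of the method as stated above:

```python
def conjugate_gradient(f, grad, x0, tol=1e-8, max_iter=1000):
    """Nonlinear conjugate gradient with the Fletcher-Reeves update."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                # first direction: steepest descent
    for _ in range(max_iter):
        gg = sum(gi * gi for gi in g)
        if gg ** 0.5 < tol:
            break
        a, fx = 1.0, f(x)
        while f([xi + a * di for xi, di in zip(x, d)]) >= fx and a > 1e-16:
            a *= 0.5
        if a <= 1e-16:                   # d was not a descent direction:
            d = [-gi for gi in g]        # restart from steepest descent
            continue
        x = [xi + a * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = sum(gi * gi for gi in g_new) / gg   # Fletcher-Reeves formula
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x
```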

Second-Order Methods

Second-order methods take advantage of the Hessian matrix (the second derivative) of the function to improve the search direction and the rate of convergence. Some examples of second-order methods are Newton's Method, the Davidon-Fletcher-Powell (DFP) method, and the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method [1].
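For a one-dimensional function, Newton's Method illustrates the second-order idea most simply: the Hessian reduces to f''(x), and the update is x ← x − f'(x)/f''(x). A minimal sketch (the function name is ours):

```python
def newton_minimize(df, d2f, x0, tol=1e-10, max_iter=100):
    """One-dimensional Newton's method for minimization: drive the
    first derivative df to zero using the second derivative d2f."""
    x = x0
    for _ in range(max_iter):
        g = df(x)
        if abs(g) < tol:
            break
        x -= g / d2f(x)      # Newton step
    return x
```

For f(x) = x − ln(x), with f'(x) = 1 − 1/x and f''(x) = 1/x², the iterates converge quadratically to the minimizer x = 1.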

Population-Based Methods

Population-based methods generate a population of points throughout the design space. Some methods then select a range of the best points and generate a new population, continuing until convergence is reached (e.g., the Monte-Carlo Method). Others generate a population and then "evolve" the points: the weakest members of the new population are eliminated and the remainder evolved again until convergence is reached (e.g., the Genetic Algorithm).

Monte-Carlo Method

see Monte-Carlo

Genetic Algorithm
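A minimal real-coded genetic algorithm in the spirit described above. Truncation selection (keeping the fitter half), arithmetic crossover, and Gaussian mutation are illustrative choices for this sketch, not a specific published variant:

```python
import random

def genetic_minimize(f, bounds, pop_size=40, generations=200, seed=0):
    """Minimize f over box bounds [(lo, hi), ...] with a simple GA."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=f)[: pop_size // 2]   # keep the fitter half
        children = []
        while len(children) < pop_size - len(elite):
            p1, p2 = rng.sample(elite, 2)
            w = rng.random()
            # Arithmetic crossover: a random convex combination of parents.
            child = [w * a + (1 - w) * b for a, b in zip(p1, p2)]
            for k, (lo, hi) in enumerate(bounds):
                if rng.random() < 0.1:                # Gaussian mutation
                    child[k] += rng.gauss(0.0, 0.02 * (hi - lo))
                    child[k] = min(max(child[k], lo), hi)
            children.append(child)
        pop = elite + children
    return min(pop, key=f)
```

Because the elite half always survives, the best design found never gets worse from one generation to the next.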

Structural Scale Optimization

Analytical Model for Axial Crushing of Multi-cell Multi-corner Tubes (Multi-CRUSH)

Contributors: Ali Najafi and Masoud Rais-Rohani

Topology Optimization of Continuum Structures Using Element Exchange Method

Authors: Mohammad Rouhi and Masoud Rais-Rohani

Mohammad Rouhi's MSc Thesis

Element Exchange Method for Topology Optimization

Authors: Mohammad Rouhi, Masoud Rais-Rohani, and Thomas Neil Williams


Optimization algorithms can be used for model calibration. For example, the DMGfit and MSFfit routines employ optimization to fit the plasticity-damage model and the fatigue model, respectively. The constants of interest are selected and a Monte Carlo optimization routine is performed to generate candidate constants. A single element simulation then produces the model stress-strain curve. The curve is compared to the input data for fit comparison, and this process is repeated until a satisfactory fit is achieved or a maximum number of iterations is reached. The resulting optimized constants are then output.
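That calibration loop can be sketched generically. Here `model(constants, x)` stands in for the single-element simulation that produces the model curve, and the sum-of-squares misfit plays the role of the fit comparison; the names and signature are illustrative, not the actual DMGfit/MSFfit interface:

```python
import random

def calibrate(model, data_x, data_y, bounds, n_samples=5000, seed=1):
    """Monte Carlo calibration: draw candidate constants at random
    within bounds, evaluate the model curve against the input data,
    and keep the constants with the smallest sum-of-squares misfit."""
    rng = random.Random(seed)
    best, best_err = None, float("inf")
    for _ in range(n_samples):
        c = [rng.uniform(lo, hi) for lo, hi in bounds]
        err = sum((model(c, x) - y) ** 2
                  for x, y in zip(data_x, data_y))
        if err < best_err:
            best, best_err = c, err
    return best, best_err
```

In practice the loop would stop early once the misfit drops below a satisfactory threshold, matching the "satisfactory fit or maximum iterations" criterion described above.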




The Embedded Atom Method (EAM) and Modified Embedded Atom Method (MEAM) potentials can be optimized based on Electronic Scale calculation results and experimental data.

Electronic Scale

Multilevel Design Optimization

This is an emerging topic at CAVS. The pages describing its progress are currently available only to members of the research team.


  1. Rais-Rohani, Masoud, “Handout #3: Mathematical Programming Methods for Unconstrained Optimization,” Design Optimization Class, Mississippi State University, Spring 2012.