
POLYTECHNIC UNIVERSITY OF THE PHILIPPINES

COLLEGE OF ENGINEERING, NDC COMPOUND, STA. MESA, MANILA

MULTIDIMENSIONAL UNCONSTRAINED OPTIMIZATION


IN PARTIAL FULFILLMENT OF THE REQUIREMENTS IN THE NUMERICAL METHODS SUBJECT FOR THE MASTER OF SCIENCE IN ENGINEERING, S.Y. 2010-2011 (2ND SEMESTER)

TABLE OF CONTENTS:

TECHNIQUES TO FIND MINIMUM AND MAXIMUM OF FUNCTIONS OF SEVERAL VARIABLES
DIRECT METHODS / RANDOM SEARCH
ADVANTAGE AND DISADVANTAGE OF RANDOM SEARCH
UNIVARIATE AND PATTERN SEARCHES
GRADIENT METHODS
THE HESSIANS
THE STEEPEST ASCENT METHOD
DETERMINING THE BEST DIRECTION AND BEST VALUE AS PART OF THE PROBLEM

MULTIDIMENSIONAL - having one or more dimensions
- one dimension (x or y or z axis)
- two dimensions (xy axes)
- three dimensions (xyz axes)
CONSTRAINED - forced, limited, confined
UNCONSTRAINED - not limited or confined; the variables are free of constraints

OPTIMIZATION

An act, process, or methodology of making something as fully perfect, effective, and functional as possible.

OPTIMIZATION

Techniques to find the minimum and maximum of a function of several variables are described.

These techniques are classified as:

OPTIMIZATION
Gradient (descent or ascent) methods

Require derivative evaluation.

Example of a derivative: y = 3 + 2x - x² at (2, 3)
Solution:
dy/dx = d(3)/dx + d(2x)/dx - d(x²)/dx
dy/dx = 0 + 2 - 2x
dy/dx = 2 - 2x; at x = 2: dy/dx = 2 - 2(2) = -2
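As a quick check, the same derivative can be worked out symbolically; a minimal sketch, assuming Python's sympy library is available:

import sympy as sp

x = sp.symbols('x')
y = 3 + 2*x - x**2

dy_dx = sp.diff(y, x)    # symbolic derivative of y with respect to x
print(dy_dx)             # 2 - 2*x
print(dy_dx.subs(x, 2))  # -2, the slope at the point (2, 3)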

Direct or non-gradient methods

Do not require derivative evaluation.

[Figures: contour plots showing lines of constant f]

DIRECT METHODS
RANDOM SEARCH

Based on evaluating the function at randomly selected values of the independent variables. If a sufficient number of samples are taken, the optimum will eventually be located. Example: plotting of points.

DIRECT METHODS
RANDOM SEARCH

Random search: repeatedly evaluating the function at randomly selected values of the independent variables. y = 2x + 3 (y is the dependent and x is the independent variable). Example: x = 1, x = -2, x = 0.001, x = 0.0001
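A minimal sketch of this idea in Python; the sample count, seed, and search bounds are illustrative choices, and the objective shown is the two-variable function f = 2xy + 2x - x² - 2y² used later in these slides:

import random

def random_search(f, x_bounds, y_bounds, n_samples=10000, seed=0):
    """Evaluate f at randomly sampled (x, y) points and keep the best."""
    rng = random.Random(seed)
    best_x = best_y = None
    best_f = float('-inf')
    for _ in range(n_samples):
        x = rng.uniform(*x_bounds)
        y = rng.uniform(*y_bounds)
        fx = f(x, y)
        if fx > best_f:  # searching for a maximum
            best_f, best_x, best_y = fx, x, y
    return best_x, best_y, best_f

f = lambda x, y: 2*x*y + 2*x - x**2 - 2*y**2
print(random_search(f, (-1, 3), (-1, 3)))  # best point approaches (2, 1), where f = 2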

ADVANTAGE OF DIRECT METHODS


Works even for discontinuous and nondifferentiable functions.

DISADVANTAGES OF DIRECT METHODS


As the number of independent variables grows, the task can become onerous (complex). Not efficient: it does not account for (analyze) the behavior of the underlying function.

UNIVARIATE AND PATTERN SEARCHES


More efficient than random search, yet still does not require derivative evaluation. BASIC STRATEGY: change one variable at a time while the other variables are held constant.

UNIVARIATE AND PATTERN SEARCHES


Thus the problem is reduced to a sequence of one-dimensional searches that can be solved by a variety of methods. The search becomes less efficient as you approach the maximum.
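A minimal sketch of a univariate search in Python; the crude grid line search stands in for whichever one-dimensional method one prefers (golden-section, parabolic interpolation, etc.), and the bounds and cycle count are illustrative:

def line_search_1d(g, lo=-4.0, hi=4.0, n=2001):
    """Crude 1-D maximization of g(t) by scanning a grid."""
    best_t, best_g = lo, g(lo)
    for i in range(1, n):
        t = lo + (hi - lo) * i / (n - 1)
        gt = g(t)
        if gt > best_g:
            best_t, best_g = t, gt
    return best_t

def univariate_search(f, x, y, n_cycles=20):
    """Alternately optimize one variable while the other is held constant."""
    for _ in range(n_cycles):
        x = line_search_1d(lambda t: f(t, y))  # vary x, hold y constant
        y = line_search_1d(lambda t: f(x, t))  # vary y, hold x constant
    return x, y

f = lambda x, y: 2*x*y + 2*x - x**2 - 2*y**2
print(univariate_search(f, 0.0, 0.0))  # converges toward the maximum at (2, 1)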

GRADIENT METHODS
GRADIENT AND HESSIAN

Gradient - slope; the direction of steepest inclination
Hessian - the matrix of second partial derivatives
Algorithm - a step-by-step solution procedure

GRADIENT METHODS
GRADIENT AND HESSIAN

If f(x, y) is a two-dimensional function, the gradient vector tells us: (a) What direction is the steepest ascent? (b) How much will we gain by taking that step?

∇f = (∂f/∂x) i + (∂f/∂y) j   (rectangular coordinates)
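A minimal sketch approximating this gradient numerically with central differences (the step size h is an illustrative choice, not from the slides):

def gradient(f, x, y, h=1e-6):
    """Central-difference approximation of (df/dx, df/dy)."""
    dfdx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    dfdy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return dfdx, dfdy

f = lambda x, y: 2*x*y + 2*x - x**2 - 2*y**2
print(gradient(f, -1.0, 1.0))  # approximately (6.0, -6.0)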

THE HESSIANS

For one-dimensional functions, both the first and second derivatives provide valuable information for searching out optima.

THE HESSIANS
The first derivative (a) provides the steepest trajectory of the function and (b) tells us when we have reached the maximum.
Ex: y = 3x; dy/dx = 3; d²y/dx² = 0
The second derivative tells us whether we are at a maximum or a minimum.

THE HESSIANS
Assuming that the partial derivatives are continuous at and near the point being evaluated:

|H| = (∂²f/∂x²)(∂²f/∂y²) - (∂²f/∂x∂y)²

If |H| > 0 and ∂²f/∂x² > 0, then f(x, y) has a local minimum.

THE HESSIANS
If |H| > 0 and ∂²f/∂x² < 0, then f(x, y) has a local maximum.
If |H| < 0, then f(x, y) has a saddle point.
Saddle point - a ridge connecting two higher elevations (a minimum in one direction and a maximum in the other).
|H| stands for the determinant of the Hessian.
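A minimal sketch of this classification rule in Python, with the second partials approximated by central differences (the step size is an illustrative choice):

def classify_point(f, x, y, h=1e-4):
    """Classify a stationary point of f(x, y) using the Hessian determinant."""
    fxx = (f(x + h, y) - 2*f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2*f(x, y) + f(x, y - h)) / h**2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h**2)
    H = fxx * fyy - fxy**2  # determinant of the Hessian
    if H > 0:
        return "local minimum" if fxx > 0 else "local maximum"
    return "saddle point" if H < 0 else "inconclusive"

f = lambda x, y: 2*x*y + 2*x - x**2 - 2*y**2
print(classify_point(f, 2.0, 1.0))  # local maximum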

THE HESSIANS

The quantity |H| is equal to the determinant of a matrix made up of the second derivatives.

THE STEEPEST ASCENT METHOD

Start at an initial point (x₀, y₀) and determine the direction of steepest ascent, that is, the gradient. Then search along the direction of the gradient, h₀, until we find the maximum. The process is then repeated.

THE STEEPEST ASCENT METHOD


The problem has two parts:
1. Determining the best direction.
2. Determining the best value along that search direction.

THE STEEPEST ASCENT METHOD

The steepest ascent method uses the gradient approach as its choice for the best direction.

THE STEEPEST ASCENT METHOD

To transform a function of x and y into a function of h along the gradient direction:

x = x₀ + (∂f/∂x) h
y = y₀ + (∂f/∂y) h   (h is the distance along the h axis)

where: f(x, y) = 2xy + 2x - x² - 2y²
∇f = (2y + 2 - 2x) i + (2x - 4y) j

If x₀ = -1 and y₀ = 1:
∇f = 6i - 6j

x = -1 + 6h
y = 1 - 6h
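Putting the pieces together, a minimal sketch of the steepest ascent iteration in Python; the analytic gradient matches the slides, while the grid line search along h is an illustrative stand-in for an exact one-dimensional maximization:

def f(x, y):
    return 2*x*y + 2*x - x**2 - 2*y**2

def grad(x, y):
    """Analytic gradient of f = 2xy + 2x - x^2 - 2y^2."""
    return 2*y + 2 - 2*x, 2*x - 4*y

def steepest_ascent(x, y, n_iter=10, h_max=1.0, n_grid=1001):
    for _ in range(n_iter):
        gx, gy = grad(x, y)
        # line search: maximize g(h) = f(x + gx*h, y + gy*h) over a grid
        best_h, best_f = 0.0, f(x, y)
        for i in range(1, n_grid):
            h = h_max * i / (n_grid - 1)
            fh = f(x + gx*h, y + gy*h)
            if fh > best_f:
                best_h, best_f = h, fh
        x, y = x + gx*best_h, y + gy*best_h
    return x, y, f(x, y)

print(steepest_ascent(-1.0, 1.0))  # first step reaches (0.2, -0.2), then approaches (2, 1), f = 2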
*** CHAPTER 14 ENDS ***
