In many optimization problems, simple local methods are not sufficient to find the best solution: they locate only a local optimum, usually the one closest to the starting point of the search, which is often supplied by the user. For example, consider the expression
By inspecting the plot, you could estimate the location of the global minimum in the given domain and then compute it easily. However, if you could not form a good estimate, or estimated incorrectly, the usual local optimization techniques would not find the global minimum.
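The worksheet input at this point was not preserved. A sketch of the kind of local call involved, with a placeholder objective and interval (both invented for illustration, not the guide's actual expression), might be:

```maple
with(Optimization):
# placeholder objective -- the guide's actual expression was not preserved
f := x -> x*sin(x) + x*cos(2*x):
# Minimize performs a local search and may stop at the nearest local minimum
Minimize(f(x), x = 0 .. 40);
```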
According to the Minimize command, the minimum is at approximately x = 30. However, you can see in the plot above that this is not the global minimum.
By using the global optimization equivalent of this command, GlobalSolve, you can be confident that you have found the global minimum on the specified interval.
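A minimal sketch of the corresponding GlobalSolve call, reusing the same placeholder objective and interval (both hypothetical):

```maple
with(GlobalOptimization):
# placeholder objective, as before -- not the guide's actual expression
f := x -> x*sin(x) + x*cos(2*x):
# GlobalSolve searches the whole interval, not just near a start point
GlobalSolve(f(x), x = 0 .. 40);
```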
To view the solving methods used to determine the global minimum, raise the infolevel setting and re-execute the command. At infolevel = 3, the command displays the input model size and type, the solver's operational mode and parameters, and detailed runtime information.
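Assuming the standard package-level infolevel mechanism, the setting might look like this (the objective and interval remain placeholders):

```maple
# raise the information level for the GlobalOptimization package
infolevel[GlobalOptimization] := 3:
# re-executing the command then prints the solver's progress log
GlobalSolve(f(x), x = 0 .. 40);
```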
GlobalSolve: calling NLP solver
GlobalSolve: calling global optimization solver
GlobalSolve: number of problem variables 1
GlobalSolve: number of nonlinear inequality constraints 0
GlobalSolve: number of nonlinear equality constraints 0
GlobalSolve: method OptimusDEVOL
GlobalSolve: maximum iterations 80
GlobalSolve: population size 50
GlobalSolve: average stopping stepwidth 0.1e-3
GlobalSolve: time limit 100
GlobalSolve: trying evalhf mode
GlobalSolve: performing local refinement
Typically, the infolevel is set to 0, the default.
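Restoring the default level is a one-line assignment:

```maple
# return to the default (no solver progress messages)
infolevel[GlobalOptimization] := 0:
```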
The following is an example of a situation in which you cannot approximate the global minimum by using local methods; however, the global solver finds the best solution.
Example
Consider a non-linear system of equations.
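The worksheet's actual equations were not preserved. A hypothetical two-variable stand-in (names and equations invented for illustration) could be:

```maple
# hypothetical nonlinear system -- not the guide's original equations
sys := { exp(x - y) - sin(x + y) = 0, x^2*y^2 - cos(x + y) = 0 }:
```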
The induced least-squares error function is highly multi-extremal, which makes minimizing the approximation error difficult for local methods.
To determine the global minimum of the least-squares error, define the objective function and constraints to be optimized.
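Assuming the worksheet's equations are stored in a set named sys (name hypothetical), the least-squares objective and a search region might be built as follows; the bounds here are invented for illustration:

```maple
# residuals of the equations, then the sum-of-squares error function
res := map(e -> lhs(e) - rhs(e), [op(sys)]):
obj := add(r^2, r in res):
# hypothetical variable bounds defining the search region
ranges := x = -2 .. 2, y = -2 .. 2:
```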
First, try to find a local solution by using Maple's built-in Optimization package. The system is sufficiently complex that local solvers cannot find a feasible solution.
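Assuming the error function from the worksheet is stored as obj (name hypothetical), a local attempt might look like:

```maple
# local solver from the Optimization package; on a multi-extremal
# objective this may fail or return only a nearby local minimum
Optimization:-Minimize(obj, x = -2 .. 2, y = -2 .. 2);
```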
However, global optimization techniques can be used to find a global minimum.
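The global counterpart, under the same assumed names and hypothetical bounds:

```maple
# GlobalSolve searches the full box for the least-squares minimum
GlobalOptimization:-GlobalSolve(obj, x = -2 .. 2, y = -2 .. 2);
```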
Substituting the computed minimum into the constraints shows that the least-squares approximation is quite precise: the error is very small at the minimum point.
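The check by substitution can be sketched as follows, assuming (as with Optimization:-Minimize) that GlobalSolve returns a list of the form [error, point], and reusing the hypothetical names obj and res from the worksheet:

```maple
sol := GlobalOptimization:-GlobalSolve(obj, x = -2 .. 2, y = -2 .. 2):
# substitute the minimizing point back into the residuals;
# values near zero indicate an accurate least-squares solution
eval(res, sol[2]);
```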
With the examples demonstrated here, you are now ready to use the Global Optimization Toolbox to solve many complex mathematical problems. See the Maple help system for more information about the commands used in this guide and for other ways in which the Global Optimization Toolbox can help you.