Solves an unconstrained multi-variable mixed integer nonlinear programming optimization problem.
xopt = intfminunc(f,x0)
xopt = intfminunc(f,x0,intcon)
xopt = intfminunc(f,x0,intcon,options)
[xopt,fopt] = intfminunc(.....)
[xopt,fopt,exitflag] = intfminunc(.....)
[xopt,fopt,exitflag,gradient,hessian] = intfminunc(.....)
f : A function, representing the objective function of the problem.
x0 : A vector of doubles, containing the starting values of the variables, of size (1 X n) or (n X 1), where 'n' is the number of variables.
intcon : A vector of integers, representing the variables that are constrained to be integers.
options : A list, containing the options for the user to specify. See below for details.
xopt : A vector of doubles, containing the computed solution of the optimization problem.
fopt : A double, containing the objective function value at xopt.
exitflag : An integer, containing the flag which denotes the reason for termination of the algorithm. See below for details.
gradient : A vector of doubles, containing the gradient of the objective function at the solution.
hessian : A matrix of doubles, containing the Hessian of the Lagrangian at the solution. A short sketch of the full calling sequence follows this list.
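As a quick orientation, the hedged sketch below wires these input and output arguments together for a toy problem; the objective function, starting point, and choice of integer variable are assumptions made only to keep the snippet self-contained.

//A minimal sketch of the full calling sequence (toy objective; all data assumed)
function y=fobj(x)
    y = (x(1) - 2)^2 + (x(2) + 1)^2;   //assumed smooth objective
endfunction
x0 = [0, 0];        //starting values, size 1 X n
intcon = [2];       //x(2) must take an integer value
options = list("MaxIter", [1000]);
[xopt, fopt, exitflag, gradient, hessian] = intfminunc(fobj, x0, intcon, options)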
Searches for the minimum of a multi-variable mixed integer nonlinear programming unconstrained optimization problem, specified by:
Find the minimum of f(x) such that x_i is an integer for every i in intcon.
intfminunc calls Bonmin, an optimization library written in C++, to solve the mixed integer unconstrained optimization problem.
options = list("IntegerTolerance", [---], "MaxNodes", [---], "MaxIter", [---], "AllowableGap", [---], "CpuTime", [---], "gradobj", "off", "hessian", "off");
The default values of the options are:
options = list('integertolerance',1d-06,'maxnodes',2147483647,'cputime',1d10,'allowablegap',0,'maxiter',2147483647,'gradobj',"off",'hessian',"off")
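As a short illustration of the options list, the hedged sketch below overrides a few fields and passes the list to intfminunc; the objective, starting point, and integer index are assumptions chosen only to make the snippet runnable, and any field left out of the list keeps its default value.

//A minimal sketch of passing options (objective and data are illustrative assumptions)
function y=fobj(x)
    y = (x(1) - 3)^2 + (x(2) - 0.5)^2;
endfunction
x0 = [0, 0];          //starting point
intcon = [1];         //constrain x(1) to be an integer
//Override a few option fields; unspecified fields keep their defaults
options = list("IntegerTolerance", 1d-6, "MaxIter", 1000, "CpuTime", 60);
[xopt, fopt, exitflag] = intfminunc(fobj, x0, intcon, options)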
The exitflag allows you to know the status of the optimization, as returned by Ipopt.
For more details on exitflag, see the Bonmin documentation, which can be found at http://www.coin-or.org/Bonmin
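For instance, a caller can branch on the returned flag. The sketch below reuses the objective and data from the sketch above and assumes that a flag value of 0 denotes an optimal solution; this mapping is an assumption and should be checked against the Bonmin documentation linked above.

//A hedged sketch: inspecting exitflag after a call
//(the assumption that 0 means "optimal" should be verified in the Bonmin docs)
[xopt, fopt, exitflag] = intfminunc(fobj, x0, intcon, options);
if exitflag == 0 then
    disp("Optimal solution found:");
    disp(xopt);
else
    mprintf("Solver stopped with exitflag = %d; see the Bonmin documentation.\n", exitflag);
end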
A few examples demonstrating the various functionalities of intfminunc are provided below. You will find a series of problems and the appropriate code snippets to solve them.
We begin with the minimization of a simple non-linear function.
Find x in R^2 that minimizes the objective function.
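A minimal sketch of such a call is shown below; the objective, starting point, and choice of integer variable are illustrative assumptions rather than the exact data of the original example.

//Example 1 (illustrative sketch; the objective below is an assumed simple nonlinear function)
//Objective function to be minimised
function y=f(x)
    y = x(1)^2 + x(2)^2 - 2*x(1) - 4*x(2);
endfunction
//Starting point
x0 = [1, 1];
//Constrain the first variable to be an integer
intcon = [1];
//Calling
[xopt, fopt] = intfminunc(f, x0, intcon)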
We now look at the Rosenbrock function, a non-convex performance test problem for optimization routines. We use this example to illustrate how the functionality of intfminunc can be enhanced by setting input options. We can pre-define the gradient of the objective function and/or the Hessian of the Lagrangian and thereby improve the speed of computation; this is elaborated on in example 2. We also set solver parameters using the options.
//Example 2:
//Objective function to be minimised
function y=f(x)
    y = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
endfunction
//Starting point
x0 = [-1, 2];
intcon = [2];
//Options
options = list("MaxIter", [1500], "CpuTime", [500]);
//Calling
[xopt, fopt, exitflag, gradient, hessian] = intfminunc(f, x0, intcon, options)
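Building on example 2, the hedged sketch below shows how the gradient and Hessian of the Rosenbrock objective could be supplied through the 'gradobj' and 'hessian' options, following the same interface as example 3; the analytic derivatives written here were worked out by hand and are assumptions to be verified before use.

//Example 2 with user-supplied derivatives (a hedged sketch)
function [y, g, h]=f(x)
    y = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
    //Gradient of the objective (hand-derived; verify before use)
    g = [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1)), 200*(x(2) - x(1)^2)];
    //Hessian of the objective (hand-derived; verify before use)
    h = [1200*x(1)^2 - 400*x(2) + 2, -400*x(1); -400*x(1), 200];
endfunction
x0 = [-1, 2];
intcon = [2];
//Tell the solver that the objective also returns its gradient and Hessian
options = list("MaxIter", [1500], "CpuTime", [500], "gradobj", "on", "hessian", "on");
[xopt, fopt, exitflag, gradient, hessian] = intfminunc(f, x0, intcon, options)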
Unbounded Problems: Find x in R^2 such that it minimizes:
f(x) = -x1^2 - x2^2
//Example 3:
//The below problem is an unbounded problem:
//Find x in R^2 such that the below function is minimum
//f = - x1^2 - x2^2
//Objective function to be minimised
function [y, g, h]=f(x)
    y = -x(1)^2 - x(2)^2;
    g = [-2*x(1), -2*x(2)];
    h = [-2, 0; 0, -2];
endfunction
//Starting point
x0 = [2, 1];
intcon = [1];
options = list("gradobj", "ON", "hessian", "on");
[xopt, fopt, exitflag, gradient, hessian] = intfminunc(f, x0, intcon, options)