help minlp

  * * * Solution of MINLP * * *

  Z = min(y,x) F(x,y) = C'y + f(x)
  s.t.         G(x,y) = By + g(x) <= 0
               x in X in R^n
               y in Y = {0,1}^m

  usage: [wo,St,x0,y0] = minlp(NLP,GRAD,n,m,rx,fig,x0,y0,xlb,xub,M,epsR,epsI,epsC,epsZ)

  wo   : best solution found [x,y,u,Z], where u is the maximum constraint violation
  St   : matrix = (number of constraints, number of LPs) for each NLP
  NLP  : M-file describing the NLP: [F,G] = nlp(x,y), with the first line:
           if size(x,2) > n, y = x(n+1:n+m); end
  GRAD : M-file for gradients: [Fx,Gx,C,B] = grad(x,y), with the first line:
           if size(x,2) > n, y = x(n+1:n+m); end
         and last lines:
           if size(x,2) > n, Fx = [Fx C]; Gx = [Gx B]; end
           Gx = Gx';
  rx   : = 1 : relaxed initial NLP. = 0 : unrelaxed initial NLP (default)
  fig  : = 1 : plot constraint surfaces. = 0 : no plots (default)
  x0   : initial guess for x (default = zeros(1,n)). Returns first solution.
  y0   : initial guess for y (default = round(rand(1,m)))
  xlb  : lower bound for x (default = zeros(1,n))
  xub  : upper bound for x (default = inf * ones(1,n))
  M    : > 0 : constraint violation penalization (default = 1e3)
  epsR : real variable tolerance (default = 1e-4)
  epsI : integer variable tolerance (default = 1e-4)
  epsC : constraint violation tolerance (default = 1e-6)
  epsZ : objective function tolerance (default = 1e-4)

type minlp1

function [f_nlp,g_nlp] = nlp(x,y)
if size(x,2) > 3
  y = x(4:6);  % NLP with relaxed integrality
end
% * * * example 1 - Kocis and Grossmann (1987) * * *
f_nlp = -2.9*x(3) - 8.9*log(1+x(1)) - 10.44*log(1+x(2)) ...
        + 1.8*x(1) + 1.8*x(2) + 3.5*y(1) + y(2) + 1.5*y(3);
g_nlp(1) = -y(1) + 0.9*log(1+x(1)) + 1.08*log(1+x(2)) + 0.9*x(3);
g_nlp(2) = -10*y(2) + log(1+x(1));
g_nlp(3) = -10*y(3) + 1.2*log(1+x(2));
g_nlp(4) = y(2) + y(3) - 1;

type gminlp1

function [gf_nlp,gg_nlp,C,B] = grad(x,y)
if size(x,2) > 3
  y = x(4:6);  % NLP with relaxed integrality
end
% * * * example 1 - Kocis and Grossmann (1987) * * *
gf_nlp(1) = -8.9/(1+x(1)) + 1.8;
gf_nlp(2) = -10.44/(1+x(2)) + 1.8;
gf_nlp(3) = -2.9;
gg_nlp(1,1) = 0.9/(1+x(1));
gg_nlp(1,2) = 1.08/(1+x(2));
gg_nlp(1,3) = 0.9;
gg_nlp(2,1) = 1/(1+x(1));
gg_nlp(2,2) = 0;
gg_nlp(2,3) = 0;
gg_nlp(3,1) = 0;
gg_nlp(3,2) = 1.2/(1+x(2));
gg_nlp(3,3) = 0;
gg_nlp(4,1) = 0;
gg_nlp(4,2) = 0;
gg_nlp(4,3) = 0;
if nargout > 2 | size(x,2) > 3
  C = [3.5, 1, 1.5];
  B = [-1, 0, 0; 0, -10, 0; 0, 0, -10; 0, 1, 1];
end
if size(x,2) > 3  % NLP with relaxed integrality
  gf_nlp = [gf_nlp C];
  gg_nlp = [gg_nlp B];
end
gg_nlp = gg_nlp';
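User-supplied gradient files are a common source of silent errors, so it is worth cross-checking gminlp1 against finite differences before calling minlp. A minimal sketch in Python (the functions `f`, `grad_f`, and `fd_grad` below are our restatement of the example, not part of the toolbox):

```python
import math

def f(x, y):
    # Objective of the Kocis & Grossmann (1987) example (minlp1)
    return (-2.9 * x[2] - 8.9 * math.log(1 + x[0]) - 10.44 * math.log(1 + x[1])
            + 1.8 * x[0] + 1.8 * x[1] + 3.5 * y[0] + y[1] + 1.5 * y[2])

def grad_f(x):
    # Analytic gradient w.r.t. the continuous variables x, as coded in gminlp1
    return [-8.9 / (1 + x[0]) + 1.8,
            -10.44 / (1 + x[1]) + 1.8,
            -2.9]

def fd_grad(x, y, h=1e-6):
    # Central finite differences for comparison
    out = []
    for i in range(3):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        out.append((f(xp, y) - f(xm, y)) / (2 * h))
    return out

x0, y0 = [0.5, 1.2, 0.3], [1, 0, 1]
print(grad_f(x0))
print(fd_grad(x0, y0))  # should agree with the analytic gradient to ~1e-6
```

The same pattern extends to the constraint Jacobian gg_nlp, one row per constraint.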
[wo,St] = minlp('minlp1','gminlp1',3,3,0,0,[0 0 1],[0 1 0],[0 0 0])

f-COUNT   FUNCTION    MAX{g}       STEP  Procedures
   1      -1.9        0.9          1
   2       1          3.9968e-016  1     Hessian modified twice
   3       1          3.9968e-016  1     Hessian modified twice
Optimization Converged Successfully
Active Constraints: 1 3

f-COUNT   FUNCTION    MAX{g}       STEP  Procedures
   1      -0.176071   0            1
   2      -1.73758    0            1
   3      -1.92031    0            1
   4      -1.9231     0            1     Hessian modified
   5      -1.9231     0            1     Hessian modified
Optimization Converged Successfully
Active Constraints: 1 2

f-COUNT   FUNCTION    MAX{g}       STEP  Procedures
   1      -0.150498   0            1
   2      -1.52948    0            1
   3      -1.71621    0            1
   4      -1.72097    0            1     Hessian modified
   5      -1.72097    0            1     Hessian modified
Optimization Converged Successfully
Active Constraints: 1 3

wo =
  Columns 1 through 7
         0    1.5242         0    1.0000         0    1.0000         0
  Column 8
   -1.9231

St =
     5     2
     6     3
     7     2

help minlpn

  * * * Solution of Mixed Integer Non-Linear Programming * * *

  Z = min(x,y) F(x,y)
  s.t.         G(x,y) (= or <=) 0, where the first nh equations are equality constraints
               x in X in R^n
               y in Y = {0,1}^m

  usage: [wo,nIter] = minlpn(NLP,GRAD,n,m,nh,rx,fig,x0,y0,xlb,xub,M,epsR,epsI,epsC,epsZ)

  wo    : best solution found [x,y,u,Z], where u is the maximum constraint violation
  nIter : number of iterations performed
  NLP   : M-file describing the NLP: [F,G] = nlp(x,y)
  GRAD  : M-file for gradients: [Fx,Gx,Fy,Gy] = grad(x,y)
  n     : number of continuous variables
  m     : number of binary variables
  nh    : number of equality constraints
  rx    : = 1 : relaxed initial NLP. = 0 : unrelaxed initial NLP (default)
  fig   : = 1 : plot constraint surfaces. = 0 : no plots (default)
  x0    : initial guess for x (default = zeros(1,n)). Returns first solution.
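The optimum reported in wo can be verified independently by re-evaluating the example's objective and constraints at x = (0, 1.5242, 0), y = (1, 0, 1). A standalone Python restatement (`f` and `g` mirror minlp1; the names are ours):

```python
import math

def f(x, y):
    # Objective of the Kocis & Grossmann (1987) example (minlp1)
    return (-2.9 * x[2] - 8.9 * math.log(1 + x[0]) - 10.44 * math.log(1 + x[1])
            + 1.8 * x[0] + 1.8 * x[1] + 3.5 * y[0] + y[1] + 1.5 * y[2])

def g(x, y):
    # Inequality constraints G(x,y) <= 0 (minlp1)
    return [-y[0] + 0.9 * math.log(1 + x[0]) + 1.08 * math.log(1 + x[1]) + 0.9 * x[2],
            -10 * y[1] + math.log(1 + x[0]),
            -10 * y[2] + 1.2 * math.log(1 + x[1]),
            y[1] + y[2] - 1]

x_opt, y_opt = [0.0, 1.5242, 0.0], [1, 0, 1]
print(round(f(x_opt, y_opt), 4))  # objective, should match Z = -1.9231 above
print(max(g(x_opt, y_opt)))       # maximum constraint value, u = 0 (g1 and g4 active)
```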
  y0    : initial guess for y (default = round(rand(1,m)))
  xlb   : lower bound for x (default = zeros(1,n))
  xub   : upper bound for x (default = inf * ones(1,n))
  M     : > 0 : constraint violation penalization (default = 1e3)
  epsR  : real variable tolerance (default = 1e-4)
  epsI  : integer variable tolerance (default = 1e-4)
  epsC  : constraint violation tolerance (default = 1e-6)
  epsZ  : objective function tolerance (default = 1e-4)

[wo,St] = minlpn('minlp1','gminlp1',3,3,0,0,[0 0 1],[0 1 0],[0 0 0])

MINLPN message: sounding nodes ***.
MINLPN message: sounding nodes **1.
****MINLPN****: solution found.

wo =
  Columns 1 through 7
    0.0000    1.5242    0.0000    1.0000         0    1.0000    0.0000
  Column 8
   -1.9231

St =
     2
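For m = 3 binaries the tree that minlpn sounds has only 2^3 = 8 leaves, so the branch-and-bound result can be cross-checked by brute force: enumerate every y, solve each continuous subproblem approximately, and keep the best value. The sketch below uses a simple quadratic-penalty compass search in place of the toolbox's SQP solver (all names are ours; the penalized minimizer is slightly infeasible, so its value only approximates Z):

```python
import itertools
import math

def f(x, y):
    # Objective of the Kocis & Grossmann (1987) example (minlp1)
    return (-2.9 * x[2] - 8.9 * math.log(1 + x[0]) - 10.44 * math.log(1 + x[1])
            + 1.8 * x[0] + 1.8 * x[1] + 3.5 * y[0] + y[1] + 1.5 * y[2])

def g(x, y):
    # Inequality constraints G(x,y) <= 0 (minlp1)
    return [-y[0] + 0.9 * math.log(1 + x[0]) + 1.08 * math.log(1 + x[1]) + 0.9 * x[2],
            -10 * y[1] + math.log(1 + x[0]),
            -10 * y[2] + 1.2 * math.log(1 + x[1]),
            y[1] + y[2] - 1]

def compass_min(h, x0, lo, hi, step=0.5, tol=1e-5):
    # Crude derivative-free coordinate (compass) search within box bounds
    x, hx = list(x0), h(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                xt = list(x)
                xt[i] = min(max(xt[i] + d, lo), hi)
                ht = h(xt)
                if ht < hx - 1e-12:
                    x, hx, improved = xt, ht, True
        if not improved:
            step /= 2
    return x, hx

M = 1e3  # constraint violation penalization, as in the solver's default
best = None
for y in itertools.product((0, 1), repeat=3):
    def h(x, y=y):
        # Quadratic penalty on violated constraints for this fixed y
        return f(x, y) + M * sum(max(0.0, gi) ** 2 for gi in g(x, y))
    xb, hb = compass_min(h, [0.0, 0.0, 0.0], 0.0, 10.0)
    if best is None or hb < best[2]:
        best = (y, xb, hb)

print(best)  # expect y = (1, 0, 1), x near (0, 1.52, 0), value near Z = -1.9231
```

Exhaustive enumeration is only viable for small m; the point of the branch-and-bound in minlpn is to sound and prune nodes so that most of the 2^m leaves are never solved.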