cd v:\cursos\pos\otimiza\aulas

A=rand(4)

A =
    0.9501    0.8913    0.8214    0.9218
    0.2311    0.7621    0.4447    0.7382
    0.6068    0.4565    0.6154    0.1763
    0.4860    0.0185    0.7919    0.4057

x=rand(4,1)

x =
    0.9355
    0.9169
    0.4103
    0.8936

y=rand(4,1)

y =
    0.0579
    0.3529
    0.8132
    0.0099

rank(A)

ans =
     4

rank(x*y')

ans =
     1

x*y'

ans =
    0.0542    0.3301    0.7607    0.0092
    0.0531    0.3235    0.7456    0.0090
    0.0238    0.1448    0.3336    0.0040
    0.0517    0.3153    0.7267    0.0088

rank(x*y'+y*x')

ans =
     2
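The two ranks above are no accident: every column of x*y' is a scalar multiple of x, so an outer product of nonzero vectors always has rank 1, and x*y'+y*x' is a sum of two rank-1 matrices, so its rank is at most 2 (exactly 2 whenever x and y are linearly independent). A minimal check in base MATLAB, with illustrative variable names:

u = rand(4,1); v = rand(4,1);   % generic nonzero vectors
B = u*v';                       % column j equals v(j)*u, so every column is a multiple of u
rank(B)                         % 1
rank(B + B')                    % u*v' + v*u': at most 2; equals 2 for generic u, v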
what

M-files in the current directory v:\cursos\pos\otimiza\aulas

CATALIS      ex_qp2       hkjeeves      pareto20     test13
EXTRATOR     ex_qp3       htest1        pareto21     test14
LUCRO        ex_swarm     htest2        powell       test15
MINQUA       extrat       interior      qpsub        test16
MODELO       fmincon1     karmarkar     refino       test17
OPT_RES      fminunc1     lmarqua       restr        test18
PLANOS       fminusub     lp_nlp        restr1       test19
READ2        fun          milp1         restr14      test1m
SEMIDEF      gmilp1       minlp         restr15      test2
SMODELO      gminlp1      minlp1        restr16      test20
aurea        gminlp2      minlp2        restr17      test21
bandem1      gminlp3      minlp3        restr20      test22
bfgs         gminlp4      minlp4        restr21      test3
bracket      gminlp5      minlp5        rosembr      test4
buscarnd     gminlp6      minlp6        set1         test5
cgrad        gmodelagem   minlpn        setoptim     test6
checkbounds  gmurray      modelagem     sol_extrat   test7
coggins      grad         naturais      sqp          test8
compdir      grg          newton        steepdes     test9
complex      gtest1       newton_h      swarm        univar
dfp          gtest10      newtont       test0        varmetr
dual         gtest12      nlconst       test1        visual
ex2_karm     gtest13      nlp_internal  test10       writearq
ex_karma     gtest2       pareto18      test11       xplot
ex_qp1       gtest9       pareto19      test12

help fminunc

 FMINUNC Finds the minimum of a function of several variables.
    X=FMINUNC(FUN,X0) starts at the point X0 and finds a minimum X of the
    function described in FUN. X0 can be a scalar, vector or matrix. The
    function FUN (usually an M-file or inline object) should return a
    scalar function value F evaluated at X when called with feval:
    F=feval(FUN,X). See the examples below for more about FUN.

    X=FMINUNC(FUN,X0,OPTIONS) minimizes with the default optimization
    parameters replaced by values in the structure OPTIONS, an argument
    created with the OPTIMSET function. See OPTIMSET for details. Used
    options are Display, TolX, TolFun, DerivativeCheck, Diagnostics,
    GradObj, HessPattern, LineSearchType, Hessian, HessUpdate,
    MaxFunEvals, MaxIter, DiffMinChange and DiffMaxChange, LargeScale,
    MaxPCGIter, PrecondBandWidth, TolPCG, TypicalX. Use the GradObj
    option to specify that FUN can be called with two output arguments
    where the second, G, is the partial derivatives of the function
    df/dX, at the point X: [F,G] = feval(FUN,X). Use Hessian to specify
    that FUN can be called with three output arguments where the second,
    G, is the partial derivatives of the function df/dX, and the third H
    is the 2nd partial derivatives of the function (the Hessian) at the
    point X: [F,G,H] = feval(FUN,X). The Hessian is only used by the
    large-scale method, not the line-search method.

    X=FMINUNC(FUN,X0,OPTIONS,P1,P2,...) passes the problem-dependent
    parameters P1,P2,... directly to the function FUN, e.g. FUN would be
    called using feval as in: feval(FUN,X,P1,P2,...). Pass an empty
    matrix for OPTIONS to use the default values.

    [X,FVAL]=FMINUNC(FUN,X0,...) returns the value of the objective
    function FUN at the solution X.

    [X,FVAL,EXITFLAG]=FMINUNC(FUN,X0,...) returns a string EXITFLAG that
    describes the exit condition of FMINUNC. If EXITFLAG is:
       > 0 then FMINUNC converged to a solution X.
         0 then the maximum number of function evaluations was reached.
       < 0 then FMINUNC did not converge to a solution.

    [X,FVAL,EXITFLAG,OUTPUT]=FMINUNC(FUN,X0,...) returns a structure
    OUTPUT with the number of iterations taken in OUTPUT.iterations, the
    number of function evaluations in OUTPUT.funcCount, the algorithm
    used in OUTPUT.algorithm, the number of CG iterations (if used) in
    OUTPUT.cgiterations, and the first-order optimality (if used) in
    OUTPUT.firstorderopt.

    [X,FVAL,EXITFLAG,OUTPUT,GRAD]=FMINUNC(FUN,X0,...) returns the value
    of the gradient of FUN at the solution X.

    [X,FVAL,EXITFLAG,OUTPUT,GRAD,HESSIAN]=FMINUNC(FUN,X0,...) returns
    the value of the Hessian of the objective function FUN at the
    solution X.

    Examples
     Minimize the one dimensional function f(x) = sin(x) + 3:

      To use an M-file, i.e. FUN = 'myfun', create a file myfun.m:
         function f = myfun(x)
          f = sin(x)+3;
      Then call FMINUNC to find a minimum of FUN near 2:
         x = fminunc('myfun',2)

      To minimize this function with the gradient provided, modify the
      m-file myfun.m so the gradient is the second output argument:
         function [f,g] = myfun(x)
          f = sin(x) + 3;
          g = cos(x);
      and indicate the gradient value is available by creating an
      options structure with OPTIONS.GradObj set to 'on' (using
      OPTIMSET):
         options = optimset('GradObj','on');
         x = fminunc('myfun',2,options);

      To minimize the function f(x) = sin(x) + 3 using an inline object:
         f = inline('sin(x)+3');
         x = fminunc(f,2);

      To use inline objects for the function and gradient, FUN is a cell
      array of two inline objects where the first is the objective and
      the second is the gradient of the objective:
         options = optimset('GradObj','on');
         x = fminunc({inline('sin(x)+3'), inline('cos(x)')},2,options);

help dfp

 Unconstrained optimization using DFP.

   [xo,Ot,nS]=dfp(S,x0,ip,G,method,Lb,Ub,problem,tol,mxit)

   S: objective function
   x0: initial point
   ip: (0) no plot (default), (>0) plot figure ip with pause, (<0) plot figure ip
   G: gradient vector function
   method: line-search method: (0) quadratic+cubic (default), (1) cubic
   Lb, Ub: lower and upper bound vectors to plot (default = x0*(1+/-2))
   problem: (-1): minimum (default), (1): maximum
   tol: tolerance (default = 1e-4)
   mxit: maximum number of iterations (default = 50*(1+4*~(ip>0)))
   xo: optimal point
   Ot: optimal value of S
   nS: number of objective function evaluations

type test10

function S=test(x)
% Edgar & Himmelblau, 1988
% x0 = [1, 2]'
% xo = [0, 0]'
% S(xo) = 0

S=4*x(1).^2-2.*x(1).*x(2)+x(2).^2;

[xo,Ot,nS]=dfp('test10',[1 2],1,'gtest10')

Pause: hit any key to continue...

                                           Directional
 Iteration  Func-count      f(x)        Step-size    derivative
    1      0      4             0.4           -20
Pause: hit any key to continue...
    2      4      2.07692       0.192308      4.44e-015
Warning: Divide by zero.
> In C:\Apps\Matlab\toolbox\optim\cubici2.m at line 12
  In C:\Apps\Matlab\toolbox\optim\searchq.m at line 65
  In v:\cursos\pos\otimiza\aulas\fminusub.m at line 283
  In v:\cursos\pos\otimiza\aulas\fminunc1.m at line 220
  In v:\cursos\pos\otimiza\aulas\varmetr.m at line 115
  In v:\cursos\pos\otimiza\aulas\dfp.m at line 51
Warning: Divide by zero.
> In C:\Apps\Matlab\toolbox\optim\cubici2.m at line 16
  In C:\Apps\Matlab\toolbox\optim\searchq.m at line 65
  In v:\cursos\pos\otimiza\aulas\fminusub.m at line 283
  In v:\cursos\pos\otimiza\aulas\fminunc1.m at line 220
  In v:\cursos\pos\otimiza\aulas\varmetr.m at line 115
  In v:\cursos\pos\otimiza\aulas\dfp.m at line 51
Pause: hit any key to continue...
    3      9      1.47911e-031  0.641026      4.8e-016

Optimization terminated successfully:
 Search direction less than 2*options.TolX

Pause: hit any key to continue...

xo =
  1.0e-016 *
   -0.7971
   -0.7971

Ot =
  1.9060e-032

nS =
    10
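DFP converges here in three iterations because test10 is a positive-definite quadratic, and quasi-Newton methods with accurate line searches terminate finitely on quadratics. A quick hedged check of the constant Hessian in base MATLAB:

H = [8 -2; -2 2];   % Hessian of S = 4*x1^2 - 2*x1*x2 + x2^2
eig(H)              % both eigenvalues positive (about 1.39 and 8.61),
                    % so S is strictly convex with unique minimizer [0 0]'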
[xo,Ot,nS]=newton('test10',[1 2],1,'gtest10')

Pause: hit any key to continue...
Pause: hit any key to continue...

xo =
  1.0e-030 *
    0.0740
    0.1972

Ot =
  3.1601e-062

nS =
     3

[xo,Ot,nS]=cgrad('test10',[1 2],1,'gtest10')

Pause: hit any key to continue...
Pause: hit any key to continue...
Pause: hit any key to continue...
Pause: hit any key to continue...

xo =
  1.0e-017 *
    0.1966
    0.6447

Ot =
  3.1678e-035

nS =
    37

[xo,Ot,nS]=bfgs('test10',[1 2],1,'gtest10')

Pause: hit any key to continue...

                                           Directional
 Iteration  Func-count      f(x)        Step-size    derivative
    1      0      4             0.4           -20
Pause: hit any key to continue...
    2      4      2.07692       0.192308      4.44e-015
Pause: hit any key to continue...
    3      9      0.0572417     0.505273      1.59
Pause: hit any key to continue...
    4     14      0.00228967    1.2           0.0229
Pause: hit any key to continue...
    5     18      1.46539e-007  1.008         3.66e-005

Optimization terminated successfully:
 Current search direction is a descent direction, and magnitude of
 directional derivative in search direction less than 2*options.TolFun

Pause: hit any key to continue...

xo =
     0
     0

Ot =
     0

nS =
    19

[xo,Ot,nS]=gmurray('test10',[1 2],1,'gtest10')

Pause: hit any key to continue...

                                           Directional
 Iteration  Func-count      f(x)        Step-size    derivative
    1      0      4             0.4           -20
Pause: hit any key to continue...
    2      4      2.07692       0.192308      4.44e-015
Pause: hit any key to continue...
    3      9      0.0824801     0.519688      1.91
Pause: hit any key to continue...
    4     14      0.0032992     1.2           0.033
Pause: hit any key to continue...
    5     18      1.20371e-035  1             9.57e-020

Optimization terminated successfully:
 Search direction less than 2*options.TolX

Pause: hit any key to continue...

xo =
  1.0e-032 *
   -0.3081
   -0.2311

Ot =
  2.9080e-065

nS =
    19
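The newton run above needs only 3 function evaluations: on a quadratic, the Newton step x0 - H\g(x0) lands exactly on the minimizer from any starting point, so the first iteration is already optimal. A hedged check from the same start point:

x0 = [1; 2];
g  = [8*x0(1)-2*x0(2); -2*x0(1)+2*x0(2)];   % gradient of test10 at x0
x0 - ([8 -2; -2 2]\g)                       % returns [0 0]', the minimizer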
[xo,Ot,nS]=steepdes('test10',[1 2],1,'gtest10')

Pause: hit any key to continue...

                                           Directional
 Iteration  Func-count      f(x)        Step-size    derivative
    1      0      4             0.4           -20
Pause: hit any key to continue...
    2      4      2.07692       0.192308      4.44e-015
Pause: hit any key to continue...
    3      8      1.08431       0.192308     -0.737
Pause: hit any key to continue...
    4     12      0.556838      0.220204     -0.14
Pause: hit any key to continue...
    5     16      0.288789      0.194617      0.102
Pause: hit any key to continue...
    6     20      0.150205      0.194617     -0.0254
Pause: hit any key to continue...
    7     24      0.0780309     0.202758     -0.0228
Pause: hit any key to continue...
    8     28      0.0405624     0.203896     -2.78e-016
Warning: Divide by zero.
> In C:\Apps\Matlab\toolbox\optim\cubici2.m at line 12
  In C:\Apps\Matlab\toolbox\optim\searchq.m at line 71
  In v:\cursos\pos\otimiza\aulas\fminusub.m at line 283
  In v:\cursos\pos\otimiza\aulas\fminunc1.m at line 220
  In v:\cursos\pos\otimiza\aulas\varmetr.m at line 115
  In v:\cursos\pos\otimiza\aulas\steepdes.m at line 51
Warning: Divide by zero.
> In C:\Apps\Matlab\toolbox\optim\cubici2.m at line 16
  In C:\Apps\Matlab\toolbox\optim\searchq.m at line 71
  In v:\cursos\pos\otimiza\aulas\fminusub.m at line 283
  In v:\cursos\pos\otimiza\aulas\fminunc1.m at line 220
  In v:\cursos\pos\otimiza\aulas\varmetr.m at line 115
  In v:\cursos\pos\otimiza\aulas\steepdes.m at line 51
Pause: hit any key to continue...
    9     32      0.0210853     0.19625       2.08e-017
Pause: hit any key to continue...
   10     36      0.0109608     0.204604      0.000345
Pause: hit any key to continue...
   11     40      0.00570899    0.204604      0.00267
Pause: hit any key to continue...
   12     44      0.00297994    0.204604      0.00281
Pause: hit any key to continue...
   13     48      0.0015307     0.181871      0.000224
Warning: Divide by zero.
> In C:\Apps\Matlab\toolbox\optim\cubici2.m at line 12
  In C:\Apps\Matlab\toolbox\optim\searchq.m at line 65
  In v:\cursos\pos\otimiza\aulas\fminusub.m at line 283
  In v:\cursos\pos\otimiza\aulas\fminunc1.m at line 220
  In v:\cursos\pos\otimiza\aulas\varmetr.m at line 115
  In v:\cursos\pos\otimiza\aulas\steepdes.m at line 51
Warning: Divide by zero.
> In C:\Apps\Matlab\toolbox\optim\cubici2.m at line 16
  In C:\Apps\Matlab\toolbox\optim\searchq.m at line 65
  In v:\cursos\pos\otimiza\aulas\fminusub.m at line 283
  In v:\cursos\pos\otimiza\aulas\fminunc1.m at line 220
  In v:\cursos\pos\otimiza\aulas\varmetr.m at line 115
  In v:\cursos\pos\otimiza\aulas\steepdes.m at line 51
Pause: hit any key to continue...
   14     52      0.000790702   0.218418      2.17e-018
Pause: hit any key to continue...
   15     56      0.000408693   0.179767     -0.000105
Pause: hit any key to continue...
   16     60      0.000208574   0.232673     -3.55e-016
Pause: hit any key to continue...
   17     64      0.000106445   0.175373     -1.59e-016
Pause: hit any key to continue...
   18     68      5.43234e-005  0.232673     -3.73e-016
Pause: hit any key to continue...
   19     72      2.77236e-005  0.175373     -1.6e-016

Optimization terminated successfully:
 Current search direction is a descent direction, and magnitude of
 directional derivative in search direction less than 2*options.TolFun

Pause: hit any key to continue...

xo =
    0.0000
    0.0038

Ot =
  1.4637e-005

nS =
    73
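Even on this well-conditioned quadratic, steepest descent needs 73 evaluations and visibly zigzags (note the alternating step sizes above). The iteration is simply x <- x - t*g(x) with t from a line search; a minimal sketch with a backtracking Armijo rule, which stands in for the quadratic/cubic search that steepdes.m actually uses:

f = @(x) 4*x(1)^2 - 2*x(1)*x(2) + x(2)^2;   % test10's objective
g = @(x) [8*x(1)-2*x(2); -2*x(1)+2*x(2)];   % its gradient
x = [1; 2];
for k = 1:200                               % iteration cap for safety
    d = -g(x);                              % steepest-descent direction
    if norm(d) < 1e-6, break, end
    t = 1;                                  % halve t until sufficient decrease
    while f(x + t*d) > f(x) - 1e-4*t*(d'*d)
        t = t/2;
    end
    x = x + t*d;
end
x                                           % close to [0 0]'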
[xo,Ot,nS]=steepdes('test1',[-1.2 1],-1,'gtest1')

                                           Directional
 Iteration  Func-count      f(x)        Step-size    derivative
    1      0      24.2          0.001        -5.42e+004
    2      4      4.12814       0.000786792  -68.5
    3     10      2.51901       0.474636     -3.52
    4     11      1334.32       1.81761       2.64e+003
    5     12      14.6767       0.454632      86.6
    6     13      2.39681       0.0873426     1.38
    7     18      2.18924       0.00259764   -0.0586
    8     23      2.05632       0.0506328     0.025
    9     27      1.9435        0.00324456    0.0115
   10     32      1.85496       0.0318887    -0.0072
   11     36      1.77328       0.0038722    -0.0112
   12     41      1.70345       0.0235929    -0.00453
   13     45      1.63651       0.00453985   -0.0145
   14     50      1.57699       0.018776     -0.00249
   15     54      1.51869       0.0052898    -0.0106
   16     59      1.46567       0.0155433    -0.00115
   17     63      1.41297       0.00616508   -0.0074
   18     68      1.36427       0.0131905    -0.000179
   19     72      1.3153        0.00722181   -0.0051
   20     76      1.26948       0.0113785    -0.000206
   21     80      1.22289       0.00855135    0.00296
   22     84      1.17882       0.00990883   -0.000124
   23     88      1.13349       0.0102947    -0.00223
   24     92      1.09014       0.00869839    6.03e-005
   25     97      1.04491       0.0127083    -0.00259
   26    101      1.00112       0.00765536   -0.001
   27    106      0.954494      0.0163651    -0.00364
   28    110      0.90861       0.00672118   -0.00179
   29    115      0.857986      0.0227281    -0.0158
   30    119      0.806848      0.00584408   -0.0038
   31    124      0.745361      0.0376699    -0.0624
   32    128      0.67918       0.0049208    -0.0142
   33    134      0.37907       0.195478     -2.39
   34    138      0.30518       0.00280425   -0.989
   35    142      0.255314      0.108733     -0.122
   36    146      0.237502      0.00248565   -0.000378
   37    151      0.223127      0.0560455    -0.000576
   38    155      0.211859      0.00238209   -0.000367
   39    160      0.202105      0.0455429    -0.000579
   40    164      0.193909      0.00230943   -0.000175
   41    169      0.186563      0.0391765    -0.000364
   42    173      0.180147      0.00225356   -9.63e-005
   43    178      0.174274      0.0348427    -0.000233
   44    182      0.169019      0.0022082    -5.89e-005
   45    187      0.164141      0.0316664    -0.000156
   46    191      0.159702      0.00217008   -3.87e-005
   47    196      0.155541      0.0292183    -0.000109
   48    200      0.151708      0.00213722   -2.69e-005
   49    205      0.148087      0.0272614    -7.96e-005
   50    209      0.144721      0.00210838   -1.95e-005
   51    214      0.141522      0.0256536    -5.97e-005
   52    218      0.138528      0.00208269   -1.46e-005
   53    223      0.135668      0.0243039    -4.59e-005
   54    227      0.132975      0.00205956   -1.12e-005
   55    232      0.130394      0.023151     -3.61e-005
   56    236      0.127951      0.00203854   -8.8e-006
   57    241      0.125603      0.0221522    -2.88e-005
   58    245      0.12337       0.00201928   -7.04e-006
   59    250      0.121218      0.0212766    -2.34e-005
   60    254      0.119166      0.00200151   -5.72e-006
   61    259      0.117183      0.0205012    -1.93e-005
   62    263      0.115286      0.00198505   -4.71e-006
   63    268      0.113449      0.0198085    -1.6e-005
   64    272      0.111687      0.0019697    -3.93e-006
   65    277      0.109978      0.0191852    -1.35e-005
   66    281      0.108336      0.00195534   -3.31e-006
   67    286      0.10674       0.0186205    -1.15e-005
   68    290      0.105203      0.00194185   -2.81e-006
   69    295      0.103707      0.018106     -9.8e-006
   70    299      0.102264      0.00192914   -2.41e-006
   71    304      0.100858      0.0176348    -8.45e-006
   72    308      0.0994988     0.00191713   -2.08e-006
   73    313      0.0981734     0.0172012    -7.33e-006
   74    317      0.0968906     0.00190574   -1.81e-006
   75    322      0.095638      0.0168006    -6.4e-006
   76    326      0.094424      0.00189493   -1.58e-006
   77    331      0.0932376     0.0164291    -5.62e-006
   78    335      0.0920864     0.00188463   -1.39e-006
   79    340      0.0909603     0.0160833    -4.96e-006
   80    344      0.0898663     0.0018748    -1.23e-006
   81    349      0.0887954     0.0157606    -4.4e-006
   82    353      0.087754      0.00186541   -1.09e-006
   83    358      0.0867338     0.0154584    -3.91e-006
   84    362      0.0857407     0.00185641   -9.72e-007
   85    367      0.0847672     0.0151748    -3.5e-006
   86    371      0.0838187     0.00184779   -8.7e-007
   87    376      0.0828883     0.014908     -3.14e-006
   88    380      0.0819812     0.0018395    -7.81e-007
   89    385      0.0810908     0.0146563    -2.82e-006
   90    389      0.080222      0.00183154   -7.04e-007
   91    394      0.0793688     0.0144185    -2.55e-006
   92    398      0.0785356     0.00182387   -6.37e-007
   93    403      0.077717      0.0141932    -2.31e-006
   94    407      0.0769171     0.00181647   -5.77e-007
   95    412      0.0761309     0.0139796    -2.1e-006
   96    416      0.075362      0.00180933   -5.25e-007
   97    421      0.074606      0.0137766    -1.91e-006
   98    425      0.0738663     0.00180244   -4.79e-007
   99    430      0.0731385     0.0135834    -1.74e-006
  100    434      0.0724261     0.00179577   -4.38e-007
  101    439      0.071725      0.0133993    -1.6e-006
  102    443      0.0710382     0.00178931   -4.01e-007
  103    448      0.0703621     0.0132235    -1.46e-006
  104    452      0.0696995     0.00178306   -3.68e-007
  105    457      0.0690469     0.0130555    -1.34e-006
  106    461      0.0684071     0.001777     -3.39e-007
  107    466      0.0677768     0.0128947    -1.24e-006
  108    470      0.0671585     0.00177111   -3.12e-007
  109    475      0.0665492     0.0127407    -1.14e-006
  110    479      0.0659512     0.0017654    -2.88e-007
  111    484      0.0653618     0.0125929    -1.06e-006
  112    488      0.0647831     0.00175985   -2.67e-007
  113    493      0.0642125     0.012451     -9.77e-007
  114    497      0.0636521     0.00175445   -2.47e-007
  115    502      0.0630994     0.0123147    -9.06e-007
  116    506      0.0625564     0.00174919   -2.29e-007
  117    511      0.0620206     0.0121835    -8.41e-007
  118    515      0.0614941     0.00174408   -2.13e-007
  119    520      0.0609744     0.0120571    -7.82e-007
  120    524      0.0604636     0.0017391    -1.99e-007
  121    529      0.0599593     0.0119354    -7.28e-007
  122    533      0.0594634     0.00173424   -1.85e-007
  123    538      0.0589738     0.0118179    -6.79e-007
  124    542      0.0584922     0.0017295    -1.73e-007
  125    547      0.0580165     0.0117045    -6.34e-007
  126    551      0.0575485     0.00172488   -1.61e-007
  127    556      0.0570862     0.0115949    -5.93e-007
  128    560      0.0566311     0.00172037   -1.51e-007
  129    565      0.0561815     0.011489     -5.55e-007
  130    569      0.0557389     0.00171596   -1.42e-007
  131    574      0.0553015     0.0113865    -5.2e-007
  132    578      0.0548708     0.00171166   -1.33e-007
  133    583      0.0544451     0.0112873    -4.88e-007
  134    587      0.0540257     0.00170745   -1.25e-007
  135    592      0.0536112     0.0111912    -4.58e-007
  136    596      0.0532028     0.00170333   -1.17e-007
  137    601      0.052799      0.011098     -4.31e-007
  138    605      0.052401      0.00169931   -1.1e-007
  139    610      0.0520075     0.0110076    -4.05e-007
  140    614      0.0516196     0.00169537   -1.04e-007
  141    619      0.0512359     0.0109199    -3.82e-007
  142    623      0.0508577     0.00169151   -9.8e-008
  143    628      0.0504835     0.0108347    -3.6e-007
  144    632      0.0501145     0.00168774   -9.25e-008
  145    637      0.0497495     0.0107519    -3.4e-007
  146    641      0.0493894     0.00168404   -8.73e-008
  147    646      0.0490332     0.0106715    -3.21e-007
  148    650      0.0486817     0.00168041   -8.25e-008
  149    655      0.0483339     0.0105932    -3.03e-007
  150    659      0.0479907     0.00167685   -7.81e-008
  151    664      0.0476511     0.0105171    -2.87e-007
  152    668      0.0473158     0.00167337   -7.39e-008
  153    673      0.046984      0.010443     -2.71e-007
  154    677      0.0466565     0.00166995   -7.01e-008
  155    682      0.0463322     0.0103708    -2.57e-007
  156    686      0.0460121     0.00166659   -6.64e-008
  157    691      0.0456952     0.0103005    -2.44e-007
  158    695      0.0453822     0.0016633    -6.3e-008
  159    700      0.0450723     0.010232     -2.31e-007
  160    704      0.0447663     0.00166007   -5.98e-008
  161    709      0.0444632     0.0101651    -2.19e-007
  162    713      0.0441638     0.00165689   -5.69e-008
  163    718      0.0438673     0.0100999    -2.08e-007
  164    722      0.0435744     0.00165377   -5.41e-008
  165    727      0.0432842     0.0100363    -1.98e-007
  166    731      0.0429975     0.00165071   -5.14e-008
  167    736      0.0427135     0.00997422   -1.88e-007
  168    740      0.0424329     0.0016477    -4.9e-008
  169    745      0.0421548     0.00991357   -1.79e-007
  170    749      0.0418801     0.00164474   -4.66e-008
  171    754      0.0416078     0.00985432   -1.71e-007
  172    758      0.0413386     0.00164183   -4.44e-008
  173    763      0.0410719     0.00979642   -1.63e-007
  174    767      0.0408083     0.00163897   -4.24e-008
  175    772      0.040547      0.00973983   -1.55e-007
  176    776      0.0402887     0.00163616   -4.04e-008
  177    781      0.0400326     0.00968449   -1.48e-007
  178    785      0.0397795     0.00163339   -3.86e-008
  179    790      0.0395285     0.00963035   -1.41e-007
  180    794      0.0392803     0.00163066   -3.69e-008
  181    799      0.0390343     0.00957739   -1.35e-007
  182    803      0.0387909     0.00162798   -3.52e-008
  183    808      0.0385497     0.00952555   -1.29e-007
  184    812      0.0383111     0.00162534   -3.37e-008
  185    817      0.0380745     0.0094748    -1.23e-007
  186    821      0.0378404     0.00162275   -3.22e-008
  187    826      0.0376083     0.00942511   -1.18e-007
  188    830      0.0373787     0.00162019   -3.08e-008
  189    835      0.037151      0.00937643   -1.13e-007
  190    839      0.0369257     0.00161767   -2.95e-008
  191    844      0.0367023     0.00932873   -1.08e-007
  192    848      0.0364811     0.00161518   -2.83e-008
  193    853      0.0362618     0.00928199   -1.03e-007
  194    857      0.0360448     0.00161274   -2.71e-008
  195    862      0.0358295     0.00923617   -9.89e-008
  196    866      0.0356165     0.00161033   -2.6e-008
  197    871      0.0354051     0.00919125   -9.48e-008
  198    875      0.0351959     0.00160795   -2.49e-008
  199    880      0.0349883     0.00914719   -9.09e-008
  200    884      0.0347828     0.00160561   -2.39e-008
  201    889      0.034579      0.00910397   -8.72e-008
  202    893      0.0343772     0.0016033    -2.29e-008
  203    898      0.034177      0.00906156   -8.36e-008
  204    902      0.0339787     0.00160103   -2.2e-008
  205    907      0.033782      0.00901994   -8.03e-008
  206    911      0.0335872     0.00159878   -2.12e-008
  207    916      0.0333939     0.00897909   -7.71e-008
  208    920      0.0332025     0.00159657   -2.03e-008
  209    925      0.0330125     0.00893898   -7.4e-008
  210    929      0.0328244     0.00159439   -1.96e-008
  211    934      0.0326377     0.00889959   -7.11e-008
  212    938      0.0324527     0.00159223   -1.88e-008
  213    943      0.0322692     0.0088609    -6.84e-008
  214    947      0.0320874     0.00159011   -1.81e-008
  215    952      0.0319069     0.0088229    -6.58e-008
  216    956      0.0317282     0.00158801   -1.74e-008
  217    961      0.0315507     0.00878555   -6.33e-008
  218    965      0.031375      0.00158594   -1.68e-008
  219    970      0.0312005     0.00874885   -6.09e-008
  220    974      0.0310276     0.0015839    -1.61e-008
  221    979      0.030856      0.00871278   -5.86e-008
  222    983      0.0306859     0.00158188   -1.55e-008
  223    988      0.0305171     0.00867731   -5.64e-008
  224    992      0.0303498     0.00157989   -1.5e-008
  225    997      0.0301837     0.00864244   -5.43e-008
  226   1001      0.0300191     0.00157792   -1.44e-008
  227   1006      0.0298557     0.00860814   -5.23e-008
  228   1010      0.0296937     0.00157598   -1.39e-008
  229   1015      0.0295329     0.00857441   -5.04e-008
  230   1019      0.0293735     0.00157406   -1.34e-008
  231   1024      0.0292152     0.00854122   -4.86e-008
  232   1028      0.0290583     0.00157217   -1.29e-008
  233   1033      0.0289026     0.00850856   -4.69e-008
  234   1037      0.0287481     0.00157029   -1.25e-008
  235   1042      0.0285948     0.00847643   -4.52e-008
  236   1046      0.0284428     0.00156844   -1.21e-008
  237   1051      0.0282918     0.0084448    -4.36e-008
  238   1055      0.0281421     0.00156662   -1.16e-008
  239   1060      0.0279935     0.00841366   -4.21e-008
  240   1064      0.0278461     0.00156481   -1.12e-008
  241   1069      0.0276997     0.00838301   -4.06e-008
  242   1073      0.0275546     0.00156303   -1.09e-008
  243   1078      0.0274104     0.00835282   -3.93e-008
  244   1082      0.0272674     0.00156126   -1.05e-008
  245   1087      0.0271255     0.00832309   -3.79e-008
  246   1091      0.0269847     0.00155952   -1.01e-008
  247   1096      0.0268448     0.00829381   -3.66e-008
  248   1100      0.0267061     0.0015578    -9.8e-009
  249   1105      0.0265683     0.00826497   -3.54e-008
  250   1109      0.0264317     0.00155609   -9.47e-009
  251   1114      0.0262959     0.00823655   -3.42e-008

Maximum number of iterations exceeded; increase options.MaxIter
Warning steepdesc: reached maximum number of iterations!

xo =
    0.8387
    0.7018

Ot =
    0.0263

nS =
        1115
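The start point [-1.2, 1], the minimizer [1, 1], and the initial value f = 24.2 in the table above all match the classic Rosenbrock banana function, so test1/gtest1 presumably implement it (a hedged sketch; the actual M-files are not shown in this diary):

f = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;    % presumed test1
g = @(x) [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
           200*(x(2) - x(1)^2)];                  % presumed gtest1
f([-1.2; 1])                                      % = 24.2, as printed above

Its narrow curved valley is what defeats steepest descent here: after 250 iterations and 1115 function evaluations the iterate is still at (0.84, 0.70), while the curvature-aware runs below reach (1, 1).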
[xo,Ot,nS]=newton('test1',[-1.2 1],-1,'gtest1')

xo =
    1.0000
    1.0000

Ot =
  4.5475e-024

nS =
    35

[xo,Ot,nS]=newtont('test1',[-1.2 1],-1,'gtest1')

??? Error using ==> figure
Argument must be a valid figure handle.

Error in ==> v:\cursos\pos\otimiza\aulas\newtont.m
On line 58  ==>   figure(abs(ip));

help newtont

 Unconstrained optimization using Newton with trust region.

   [xo,Ot,nS]=newtont(S,x0,G,ip,H,Lb,Ub,problem,tol,mxit)

   S: objective function
   x0: initial point
   G: gradient vector function
   ip: (0) no plot (default), (>0) plot figure ip with pause, (<0) plot figure ip
   H: Hessian matrix function (default = sparse finite-differences)
   Lb, Ub: lower and upper bound vectors to plot (default = x0*(1+/-2))
   problem: (-1): minimum (default), (1): maximum
   tol: tolerance (default = 1e-4)
   mxit: maximum number of iterations (default = 50*(1+4*~(ip>0)))
   xo: optimal point
   Ot: optimal value of S
   nS: number of objective function evaluations

[xo,Ot,nS]=newtont('test1',[-1.2 1],'gtest1',-1)

xo =
    0.9998
    0.9996

Ot =
  5.5868e-008

nS =
    27

[xo,Ot,nS]=bfgs('test1',[-1.2 1],-1,'gtest1')

                                           Directional
 Iteration  Func-count      f(x)        Step-size    derivative
    1      0      24.2          0.001        -5.42e+004
    2      4      4.12814       0.000786792  -68.5
    3      8      3.85717       0.140499     -0.23
    4     12      3.59745       1.42893      -0.0302
    5     17      2.53212       3.26342      -0.03
    6     21      2.30913       0.322518     -1.07e-006
    7     25      2.00527       0.849632     -0.296
    8     29      1.84151       0.173995     -0.5
    9     33      1.67682       0.878415     -0.0201
   10     39      0.868237      5.07515      -0.0428
   11     43      0.82685       0.24347      -1e-005
   12     47      0.668453      0.93938      -0.129
   13     51      0.589839      0.284767     -0.0882
   14     56      0.463144      1.65056      -0.00483
   15     60      0.31572       1.12036      -9.57e-005
   16     64      0.221455      0.759373     -0.017
   17     68      0.197201      0.842636     -0.000904
   18     74      0.0420617     4.35001      -0.00941
   19     78      0.0380113     0.338559     -3.48e-008
   20     82      0.0181601     1.28044      -0.00661
   21     86      0.0114946     0.893086     -0.000327
   22     91      0.00268658    1.43795      -2.44e-005
   23     95      0.000603063   0.788611     -8.17e-005
   24    100      6.06076e-005  2.27117      -6.87e-006

Optimization terminated successfully:
 Current search direction is a descent direction, and magnitude of
 directional derivative in search direction less than 2*options.TolFun

xo =
    1.0004
    1.0000

Ot =
  6.0608e-005

nS =
   101
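BFGS and DFP differ only in the rank-two formula that updates the curvature approximation from each step s and gradient change y; BFGS tolerates inexact line searches much better, which is consistent with DFP stalling short of the optimum in the next run. A hedged sketch of one BFGS update of an approximate Hessian B, with illustrative numbers:

B  = eye(2);                   % current Hessian approximation
s  = [0.1; 0.2];               % step x_new - x_old
y  = [0.4; 0.1];               % gradient change g_new - g_old (here y'*s > 0)
Bs = B*s;
B  = B - (Bs*Bs')/(s'*Bs) + (y*y')/(y'*s)   % standard BFGS update formula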
[xo,Ot,nS]=dfp('test1',[-1.2 1],-1,'gtest1')

                                           Directional
 Iteration  Func-count      f(x)        Step-size    derivative
    1      0      24.2          0.001        -5.42e+004
    2      4      4.12814       0.000786792  -68.5
    3      8      3.85717       0.140507     -0.23
    4     12      3.60311       1.53         -0.027
    5     17      3.02074       5.35075      -0.0153
    6     23      1.737         17.1805      -0.0156
    7     27      1.52148       2.78028       0.000801
    8     31      1.3912        1            -0.127
    9     35      1.26262       0.0277262    -3.8
   10     39      1.19888       0.387583     -0.00342
   11     44      1.17421       33.4478      -3.7e-006
   12     49      0.92106       330.109      -0.000135
   13     53      0.918667      0.54583      -0.00241
   14     58      0.915796      2.49044      -1.34e-006
   15     65      0.274341      1472.81      -0.000336
Warning: Matrix is close to singular or badly scaled.
         Results may be inaccurate. RCOND = 1.016308e-019.
> In C:\Apps\Matlab\toolbox\optim\cubici2.m at line 10
  In C:\Apps\Matlab\toolbox\optim\searchq.m at line 71
  In v:\cursos\pos\otimiza\aulas\fminusub.m at line 283
  In v:\cursos\pos\otimiza\aulas\fminunc1.m at line 220
  In v:\cursos\pos\otimiza\aulas\varmetr.m at line 115
  In v:\cursos\pos\otimiza\aulas\dfp.m at line 51
   16     69      0.266194      0.0538338    -0.15
   17     73      0.189802      0.155923     -0.231
   18     78      0.184744      0.471964     -2.61e-006

Optimization terminated successfully:
 Current search direction is a descent direction, and magnitude of
 directional derivative in search direction less than 2*options.TolFun

xo =
    0.5864
    0.3322

Ot =
    0.1846

nS =
    79

[xo,Ot,nS]=gmurray('test1',[-1.2 1],-1,'gtest1')

                                           Directional
 Iteration  Func-count      f(x)        Step-size    derivative
    1      0      24.2          0.001        -5.42e+004
    2      4      4.12814       0.000786792  -68.5
    3      8      3.85717       0.140499     -0.23
    4     12      3.59745       1.42893      -0.0302
    5     17      2.53212       3.26342      -0.03
    6     21      2.30913       0.322518     -1.07e-006
    7     25      2.00527       0.849632     -0.296
    8     29      1.84151       0.173995     -0.5
    9     33      1.67682       0.878415     -0.0201
   10     39      0.868237      5.07515      -0.0428
   11     43      0.82685       0.24347      -1e-005
   12     47      0.668453      0.93938      -0.129
   13     51      0.589839      0.284767     -0.0882
   14     56      0.463144      1.65056      -0.00483
   15     60      0.31572       1.12036      -9.57e-005
   16     64      0.221455      0.759373     -0.017
   17     68      0.197201      0.842636     -0.000904
   18     74      0.0420617     4.35001      -0.00941
   19     78      0.0380113     0.338559     -3.48e-008
   20     82      0.0181601     1.28044      -0.00661
   21     86      0.0114946     0.893086     -0.000327
   22     91      0.00268658    1.43795      -2.44e-005
   23     95      0.000603063   0.788611     -8.17e-005
   24    100      6.06076e-005  2.27117      -6.87e-006

Optimization terminated successfully:
 Current search direction is a descent direction, and magnitude of
 directional derivative in search direction less than 2*options.TolFun

xo =
    1.0004
    1.0000

Ot =
  6.0608e-005

nS =
   101

[xo,Ot,nS]=cgrad('test1',[-1.2 1],-1,'gtest1')

xo =
    1.0000
    1.0000

Ot =
  3.1054e-010

nS =
   507

H=[0 1;0 1]

H =
     0     1
     0     1

H=[0 1;1 0]

H =
     0     1
     1     0

eig(H)

ans =
   -1.0000
    1.0000

diary off