help bfgs

 Unconstrained optimization using BFGS.
 [xo,Ot,nS]=bfgs(S,x0,ip,G,method,Lb,Ub,problem,tol,mxit)

 S:       objective function
 x0:      initial point
 ip:      (0) no plot (default), (>0) plot figure ip with pause, (<0) plot figure ip
 G:       gradient vector function
 method:  line-search method: (0) quadratic+cubic (default), (1) cubic
 Lb, Ub:  lower and upper bound vectors to plot (default = x0*(1+/-2))
 problem: (-1): minimum (default), (1): maximum
 tol:     tolerance (default = 1e-4)
 mxit:    maximum number of iterations (default = 50*(1+4*~(ip>0)))
 xo:      optimal point
 Ot:      optimal value of S
 nS:      number of objective function evaluations

[xo,Ot,nS]=bfgs('test1',[-1.2 1],1,'gtest1',0)

Pause: hit any key to continue...

                                                     Directional
 Iteration  Func-count         f(x)       Step-size   derivative
     1           0             24.2          0.001    -5.42e+004
     2           4          4.12814    0.000786792         -68.5
     3           8          3.85717       0.140499         -0.23
     4          12          3.59745        1.42893       -0.0302
     5          17          2.53212        3.26342         -0.03
     6          21          2.30913       0.322518    -1.07e-006
     7          25          2.00527       0.849632        -0.296
     8          29          1.84151       0.173995          -0.5
     9          33          1.67682       0.878415       -0.0201
    10          39         0.868237        5.07515       -0.0428
    11          43          0.82685        0.24347       -1e-005
    12          47         0.668453        0.93938        -0.129
    13          51         0.589839       0.284767       -0.0882
    14          56         0.463144        1.65056      -0.00483
    15          60          0.31572        1.12036    -9.57e-005
    16          64         0.221455       0.759373        -0.017
    17          68         0.197201       0.842636     -0.000904
    18          74        0.0420617        4.35001      -0.00941
    19          78        0.0380113       0.338559    -3.48e-008
    20          82        0.0181601        1.28044      -0.00661
    21          86        0.0114946       0.893086     -0.000327
    22          91       0.00268658        1.43795    -2.44e-005
    23          95      0.000603063       0.788611    -8.17e-005
    24         100     6.06076e-005        2.27117    -6.87e-006

Optimization terminated successfully:
 Current search direction is a descent direction, and magnitude of
 directional derivative in search direction less than 2*options.TolFun

xo =
    1.0004    1.0000

Ot =
  6.0608e-005

nS =
   101

[xo,Ot,nS]=bfgs('test1',[-1.2 1],1,'gtest1',1)

                                                     Directional
 Iteration  Func-count         f(x)       Step-size   derivative
     1           0             24.2          0.001    -5.42e+004
     2           1          5.35291          0.001     1.13e+004
     3           2          5.32771          0.001         -25.2
     4           3          3.29646       0.141724          14.3
     5           4          3.21191       0.141724        -0.462
     6           5          3.11385       0.502766        -0.163
     7           6          2.86325        1.19523       -0.0918
     8           7          2.49846        1.58118        -0.231
     9           8     9.58521e+007        1.81467     2.11e+008
    10           9     1.18328e+006       0.602782     7.81e+006
    11          10          14596.1       0.198829     2.89e+005
    12          11          179.331      0.0642512     1.07e+004
    13          12          2.16877      0.0100348          19.5
    14          13          2.16689      0.0100348        -0.187
    15          14          1.57032        1.18654        -0.176
    16          15          1.71001        1.90065         0.613
    17          16          1.36574       0.806933      8.2e-005
    18          17          1.08679       0.806933       -0.0848
    19          18          1.09516        1.06007         0.325
    20          19          1.02385       0.571952      -0.00123
    21          20          0.88253       0.689483        -0.174
    22          21         0.624889        1.22037        -0.133
    23          22         0.465288         1.5683       -0.0296
    24          23          463.038        1.92081           931
    25          24          5.27063        0.59427          33.3
    26          25         0.311303       0.207996         0.827
    27          26          0.29764       0.207996       -0.0535
    28          27         0.242475       0.852337       -0.0476
    29          28         0.161885        1.07807       -0.0595
    30          29         0.255937        1.17457         0.483
    31          30         0.126783       0.530701       0.00391
    32          31         0.103763       0.530701       -0.0381
    33          32        0.0541126        1.04402       -0.0326
    34          33        0.0331203        1.11462      -0.00316
    35          34         0.018051        1.14837        0.0183
    36          35        0.0521269        0.63614         0.364
    37          36     4.37505e-006       0.299816       0.00105

Optimization terminated successfully:
 Current search direction is a descent direction, and magnitude of
 directional derivative in search direction less than 2*options.TolFun

xo =
    0.9984    0.9968

Ot =
  3.5611e-006

nS =
    37

[xo,Ot,nS]=dfp('test1',[-1.2 1],1,'gtest1',0)

                                                     Directional
 Iteration  Func-count         f(x)       Step-size   derivative
     1           0             24.2          0.001    -5.42e+004
     2           4          4.12814    0.000786792         -68.5
     3           8          3.85717       0.140507         -0.23
     4          12          3.60311           1.53        -0.027
     5          17          3.02074        5.35075       -0.0153
     6          23            1.737        17.1805       -0.0156
     7          27          1.52148        2.78028      0.000801
     8          31           1.3912              1        -0.127
     9          35          1.26262      0.0277262          -3.8
    10          39          1.19888       0.387583      -0.00342
    11          44          1.17421        33.4478      -3.7e-006
    12          49          0.92106        330.109     -0.000135
    13          53         0.918667        0.54583      -0.00241
    14          58         0.915796        2.49044    -1.34e-006
    15          65         0.274341        1472.81     -0.000336

Warning: Matrix is close to singular or badly scaled.
         Results may be inaccurate. RCOND = 1.016308e-019.
> In C:\Apps\Matlab\toolbox\optim\cubici2.m at line 10
  In C:\Apps\Matlab\toolbox\optim\searchq.m at line 71
  In v:\cursos\pos\otimiza\aulas\fminusub.m at line 283
  In v:\cursos\pos\otimiza\aulas\fminunc1.m at line 220
  In v:\cursos\pos\otimiza\aulas\varmetr.m at line 115
  In v:\cursos\pos\otimiza\aulas\dfp.m at line 51

    16          69         0.266194      0.0538338         -0.15
    17          73         0.189802       0.155923        -0.231
    18          78         0.184744       0.471964    -2.61e-006

Optimization terminated successfully:
 Current search direction is a descent direction, and magnitude of
 directional derivative in search direction less than 2*options.TolFun

xo =
    0.5864    0.3322

Ot =
    0.1846

nS =
    79

[xo,Ot,nS]=dfp('test1',[-1.2 1],1,'gtest1',1)

                                                     Directional
 Iteration  Func-count         f(x)       Step-size   derivative
     1           0             24.2          0.001    -5.42e+004
     2           1          5.35291          0.001     1.13e+004
     3           2          5.32772          0.001         -25.2
     4           3          3.29644        0.14178          14.2
     5           4          3.21189        0.14178        -0.462
     6           5           3.1212       0.502803         -0.15
     7           6          2.94901        1.18056       -0.0359
     8           7          2.84428        1.47809       -0.0686
     9           8          12.5183        1.55478          33.9
    10           9          2.04884       0.721974         0.464
    11          10          2.06792       0.721974         0.125

??? Error using ==> inline/feval
Error in inline expression ==> [feval('gtest1',x);xplot(x,1)*[]]
??? Attempt to reference field of non-structure array 'ud'.

Error in ==> v:\cursos\pos\otimiza\aulas\fminusub.m
On line 309  ==>       GRAD(:) = feval(funfcn{4},x,varargin{:});

Error in ==> v:\cursos\pos\otimiza\aulas\fminunc1.m
On line 220  ==>    [x,FVAL,GRAD,HESSIAN,EXITFLAG,OUTPUT] = fminusub(funfcn,x,verbosity,options,f,GRAD,HESS,varargin{:});

Error in ==> v:\cursos\pos\otimiza\aulas\varmetr.m
On line 115  ==>    [xo,yo,exitflag,out] = fminunc1(fun,x0,op);

Error in ==> v:\cursos\pos\otimiza\aulas\dfp.m
On line 51  ==>  [xo,Ot,nS]=varmetr('dfp',S,x0,ip,G,method,Lb,Ub,problem,tol,mxit);

[xo,Ot,nS]=dfp('test1',[-1.2 1],1,'gtest1',1)

                                                     Directional
 Iteration  Func-count         f(x)       Step-size   derivative
     1           0             24.2          0.001    -5.42e+004
     2           1          5.35291          0.001     1.13e+004
     3           2          5.32772          0.001         -25.2
     4           3          3.29644        0.14178          14.2
     5           4          3.21189        0.14178        -0.462
     6           5           3.1212       0.502803         -0.15
     7           6          2.94901        1.18056       -0.0359
     8           7          2.84428        1.47809       -0.0686
     9           8          12.5183        1.55478          33.9
    10           9          2.04884       0.721974         0.464
    11          10          2.06792       0.721974         0.125
    12          11          2.03719       0.260082      0.000174
    13          12          2.03495       0.260082      -0.00861
    14          13          1.56862        1.00862        -0.165
    15          14          1.28289         1.5026        -0.145
    16          15         0.861735        1.80188        -0.141
    17          16         0.490806        2.33911        -0.112
    18          17          35.5283        2.77431          42.4
    19          18          0.63703       0.511043          1.07
    20          19         0.470347       0.158879       0.00573
    21          20         0.449176       0.158879        -0.127
    22          21         0.299421        1.12921      -0.00149
    23          22         0.218529        1.36127       -0.0241
    24          23         0.141688        1.48423       -0.0354
    25          24         0.622197        1.55972          1.17
    26          25         0.121707       0.476424        0.0276
    27          26         0.110692       0.476424        -0.021
    28          27        0.0697679        1.02302       -0.0188
    29          28        0.0483445        1.08875       -0.0181
    30          29         0.559014        1.11074          1.92
    31          30        0.0282031       0.396686        0.0519
    32          31        0.0244297       0.396686      -0.00697
    33          32        0.0179639        1.00902      -0.00393
    34          33        0.0133689        1.01796       -0.0031
    35          34        0.0083809         1.0241      -0.00413
    36          35       0.00051408        1.03004       0.00251
    37          36      0.000474518       0.942261      0.000382
    38          37      0.000379604       0.504729     -0.000152
    39          38       0.00025264        1.00019    -2.55e-005

Optimization terminated successfully:
 Current search direction is a descent direction, and magnitude of
 directional derivative in search direction less than 2*options.TolFun

xo =
    0.9910    0.9810

Ot =
  2.0704e-004

nS =
    39

[xo,Ot,nS]=gmurray('test1',[-1.2 1],1,'gtest1',1)

                                                     Directional
 Iteration  Func-count         f(x)       Step-size   derivative
     1           0             24.2          0.001    -5.42e+004
     2           1          5.35291          0.001     1.13e+004
     3           2          5.32771          0.001         -25.2
     4           3          3.29646       0.141724          14.3
     5           4          3.21191       0.141724        -0.462
     6           5          3.11385       0.502766        -0.163
     7           6          2.86325        1.19523       -0.0918
     8           7          2.49846        1.58118        -0.231
     9           8     9.58526e+007        1.81467     2.11e+008
    10           9     1.18328e+006       0.602782     7.81e+006
    11          10          14596.1       0.198829     2.89e+005
    12          11          179.332      0.0642512     1.07e+004
    13          12          3.82696      0.0200696           375
    14          13          2.15786      0.0090659          3.39
    15          14          2.15637      0.0090659        -0.165
    16          15          1.55632        1.16488        -0.373
    17          16          165.046              2           296
    18          17          2.77552       0.517206          9.66
    19          18          1.42371       0.177818         0.127
    20          19          1.39109       0.177818        -0.177
    21          20         0.957415        1.17951        -0.222
    22          21         0.640296         1.8099        -0.136
    23          22          61.4703        2.16058          96.8
    24          23         0.988373       0.451133          2.75
    25          24         0.609903         0.1288        0.0228
    26          25         0.579598         0.1288        -0.222
    27          26         0.374339        1.22572       -0.0366
    28          27         0.341694        1.56322         0.145
    29          28         0.210899       0.839353         0.614
    30          29         0.181562       0.839353          0.14
    31          30         0.145518       0.839353       -0.0361
    32          31         0.580276        1.04551          1.64
    33          32          0.11761       0.342427        0.0317
    34          33         0.112775       0.342427       -0.0135
    35          34        0.0552936        1.01393        -0.035
    36          35         0.031029        1.10117       -0.0165
    37          36         0.016528         1.1316       0.00228
    38          37        0.0101117              1        0.0225
    39          38     4.59945e-005       0.562411      -0.00259

Optimization terminated successfully:
 Current search direction is a descent direction, and magnitude of
 directional derivative in search direction less than 2*options.TolFun

xo =
    1.0015    1.0027

Ot =
  9.0766e-006

nS =
    39

[xo,Ot,nS]=gmurray('test1',[-1.2 1],1,'gtest1',0)

                                                     Directional
 Iteration  Func-count         f(x)       Step-size   derivative
     1           0             24.2          0.001    -5.42e+004
     2           4          4.12814    0.000786792         -68.5
     3           8          3.85717       0.140499         -0.23
     4          12          3.59745        1.42893       -0.0302
     5          17          2.53212        3.26342         -0.03
     6          21          2.30913       0.322518    -1.07e-006
     7          25          2.00527       0.849632        -0.296
     8          29          1.84151       0.173995          -0.5
     9          33          1.67682       0.878415       -0.0201
    10          39         0.868237        5.07515       -0.0428
    11          43          0.82685        0.24347       -1e-005
    12          47         0.668453        0.93938        -0.129
    13          51         0.589839       0.284767       -0.0882
    14          56         0.463144        1.65056      -0.00483
    15          60          0.31572        1.12036    -9.57e-005
    16          64         0.221455       0.759373        -0.017
    17          68         0.197201       0.842636     -0.000904
    18          74        0.0420617        4.35001      -0.00941
    19          78        0.0380113       0.338559    -3.48e-008
    20          82        0.0181601        1.28044      -0.00661
    21          86        0.0114946       0.893086     -0.000327
    22          91       0.00268658        1.43795    -2.44e-005
    23          95      0.000603063       0.788611    -8.17e-005
    24         100     6.06076e-005        2.27117    -6.87e-006

Optimization terminated successfully:
 Current search direction is a descent direction, and magnitude of
 directional derivative in search direction less than 2*options.TolFun

xo =
    1.0004    1.0000

Ot =
  6.0608e-005

nS =
   101

op=optimiset('fminunc')
??? Undefined function or variable 'optimiset'.

op=optimset('fminunc')

op = 
    ActiveConstrTol: []
    DerivativeCheck: 'off'
    Diagnostics: 'off'
    DiffMaxChange: 0.1000
    DiffMinChange: 1.0000e-008
    Display: 'final'
    GoalsExactAchieve: []
    GradConstr: []
    GradObj: 'off'
    Hessian: 'off'
    HessMult: []
    HessPattern: 'sparse(ones(numberOfVariables))'
    HessUpdate: 'bfgs'
    Jacobian: []
    JacobMult: []
    JacobPattern: []
    LargeScale: 'on'
    LevenbergMarquardt: []
    LineSearchType: 'quadcubic'
    MaxFunEvals: '100*numberOfVariables'
    MaxIter: 400
    MaxPCGIter: 'max(1,floor(numberOfVariables/2))'
    MeritFunction: []
    MinAbsMax: []
    Preconditioner: []
    PrecondBandWidth: 0
    ShowStatusWindow: []
    TolCon: []
    TolFun: 1.0000e-006
    TolPCG: 0.1000
    TolX: 1.0000e-006
    TypicalX: 'ones(numberOfVariables,1)'

op=optimset('fminsearch')

op = 
    ActiveConstrTol: []
    DerivativeCheck: []
    Diagnostics: []
    DiffMaxChange: []
    DiffMinChange: []
    Display: 'final'
    GoalsExactAchieve: []
    GradConstr: []
    GradObj: []
    Hessian: []
    HessMult: []
    HessPattern: []
    HessUpdate: []
    Jacobian: []
    JacobMult: []
    JacobPattern: []
    LargeScale: []
    LevenbergMarquardt: []
    LineSearchType: []
    MaxFunEvals: '200*numberOfVariables'
    MaxIter: '200*numberOfVariables'
    MaxPCGIter: []
    MeritFunction: []
    MinAbsMax: []
    Preconditioner: []
    PrecondBandWidth: []
    ShowStatusWindow: []
    TolCon: []
    TolFun: 1.0000e-004
    TolPCG: []
    TolX: 1.0000e-004
    TypicalX: []

op=optimset('fsolve')

op = 
    ActiveConstrTol: []
    DerivativeCheck: 'off'
    Diagnostics: 'off'
    DiffMaxChange: 0.1000
    DiffMinChange: 1.0000e-008
    Display: 'final'
    GoalsExactAchieve: []
    GradConstr: []
    GradObj: []
    Hessian: []
    HessMult: []
    HessPattern: []
    HessUpdate: []
    Jacobian: 'off'
    JacobMult: []
    JacobPattern: []
    LargeScale: 'on'
    LevenbergMarquardt: 'off'
    LineSearchType: 'quadcubic'
    MaxFunEvals: '100*numberOfVariables'
    MaxIter: 400
    MaxPCGIter: 'max(1,floor(numberOfVariables/2))'
    MeritFunction: []
    MinAbsMax: []
    Preconditioner: []
    PrecondBandWidth: 0
    ShowStatusWindow: []
    TolCon: []
    TolFun: 1.0000e-006
    TolPCG: 0.1000
    TolX: 1.0000e-006
    TypicalX: 'ones(numberOfVariables,1)'

op=optimset('fminunc')

op = 
    ActiveConstrTol: []
    DerivativeCheck: 'off'
    Diagnostics: 'off'
    DiffMaxChange: 0.1000
    DiffMinChange: 1.0000e-008
    Display: 'final'
    GoalsExactAchieve: []
    GradConstr: []
    GradObj: 'off'
    Hessian: 'off'
    HessMult: []
    HessPattern: 'sparse(ones(numberOfVariables))'
    HessUpdate: 'bfgs'
    Jacobian: []
    JacobMult: []
    JacobPattern: []
    LargeScale: 'on'
    LevenbergMarquardt: []
    LineSearchType: 'quadcubic'
    MaxFunEvals: '100*numberOfVariables'
    MaxIter: 400
    MaxPCGIter: 'max(1,floor(numberOfVariables/2))'
    MeritFunction: []
    MinAbsMax: []
    Preconditioner: []
    PrecondBandWidth: 0
    ShowStatusWindow: []
    TolCon: []
    TolFun: 1.0000e-006
    TolPCG: 0.1000
    TolX: 1.0000e-006
    TypicalX: 'ones(numberOfVariables,1)'

op=optimset(op, 'MaxIter',1000)

op = 
    ActiveConstrTol: []
    DerivativeCheck: 'off'
    Diagnostics: 'off'
    DiffMaxChange: 0.1000
    DiffMinChange: 1.0000e-008
    Display: 'final'
    GoalsExactAchieve: []
    GradConstr: []
    GradObj: 'off'
    Hessian: 'off'
    HessMult: []
    HessPattern: 'sparse(ones(numberOfVariables))'
    HessUpdate: 'bfgs'
    Jacobian: []
    JacobMult: []
    JacobPattern: []
    LargeScale: 'on'
    LevenbergMarquardt: []
    LineSearchType: 'quadcubic'
    MaxFunEvals: '100*numberOfVariables'
    MaxIter: 1000
    MaxPCGIter: 'max(1,floor(numberOfVariables/2))'
    MeritFunction: []
    MinAbsMax: []
    Preconditioner: []
    PrecondBandWidth: 0
    ShowStatusWindow: []
    TolCon: []
    TolFun: 1.0000e-006
    TolPCG: 0.1000
    TolX: 1.0000e-006
    TypicalX: 'ones(numberOfVariables,1)'

help fminunc
 FMINUNC Finds the minimum of a function of several variables.

    X=FMINUNC(FUN,X0) starts at the point X0 and finds a minimum X of the
    function described in FUN. X0 can be a scalar, vector or matrix. The
    function FUN (usually an M-file or inline object) should return a
    scalar function value F evaluated at X when called with feval:
    F=feval(FUN,X). See the examples below for more about FUN.

    X=FMINUNC(FUN,X0,OPTIONS) minimizes with the default optimization
    parameters replaced by values in the structure OPTIONS, an argument
    created with the OPTIMSET function. See OPTIMSET for details. Used
    options are Display, TolX, TolFun, DerivativeCheck, Diagnostics,
    GradObj, HessPattern, LineSearchType, Hessian, HessUpdate,
    MaxFunEvals, MaxIter, DiffMinChange and DiffMaxChange, LargeScale,
    MaxPCGIter, PrecondBandWidth, TolPCG, TypicalX. Use the GradObj
    option to specify that FUN can be called with two output arguments
    where the second, G, is the partial derivatives of the function
    df/dX, at the point X: [F,G] = feval(FUN,X). Use Hessian to specify
    that FUN can be called with three output arguments where the second,
    G, is the partial derivatives of the function df/dX, and the third H
    is the 2nd partial derivatives of the function (the Hessian) at the
    point X: [F,G,H] = feval(FUN,X). The Hessian is only used by the
    large-scale method, not the line-search method.

    X=FMINUNC(FUN,X0,OPTIONS,P1,P2,...) passes the problem-dependent
    parameters P1,P2,... directly to the function FUN, e.g. FUN would be
    called using feval as in: feval(FUN,X,P1,P2,...). Pass an empty
    matrix for OPTIONS to use the default values.

    [X,FVAL]=FMINUNC(FUN,X0,...) returns the value of the objective
    function FUN at the solution X.

    [X,FVAL,EXITFLAG]=FMINUNC(FUN,X0,...) returns a string EXITFLAG that
    describes the exit condition of FMINUNC. If EXITFLAG is:
      > 0 then FMINUNC converged to a solution X.
        0 then the maximum number of function evaluations was reached.
      < 0 then FMINUNC did not converge to a solution.
    [X,FVAL,EXITFLAG,OUTPUT]=FMINUNC(FUN,X0,...) returns a structure
    OUTPUT with the number of iterations taken in OUTPUT.iterations, the
    number of function evaluations in OUTPUT.funcCount, the algorithm
    used in OUTPUT.algorithm, the number of CG iterations (if used) in
    OUTPUT.cgiterations, and the first-order optimality (if used) in
    OUTPUT.firstorderopt.

    [X,FVAL,EXITFLAG,OUTPUT,GRAD]=FMINUNC(FUN,X0,...) returns the value
    of the gradient of FUN at the solution X.

    [X,FVAL,EXITFLAG,OUTPUT,GRAD,HESSIAN]=FMINUNC(FUN,X0,...) returns
    the value of the Hessian of the objective function FUN at the
    solution X.

    Examples

    Minimize the one dimensional function f(x) = sin(x) + 3:

    To use an M-file, i.e. FUN = 'myfun', create a file myfun.m:

        function f = myfun(x)
        f = sin(x)+3;

    Then call FMINUNC to find a minimum of FUN near 2:

        x = fminunc('myfun',2)

    To minimize this function with the gradient provided, modify the
    m-file myfun.m so the gradient is the second output argument:

        function [f,g] = myfun(x)
        f = sin(x) + 3;
        g = cos(x);

    and indicate the gradient value is available by creating an options
    structure with OPTIONS.GradObj set to 'on' (using OPTIMSET):

        options = optimset('GradObj','on');
        x = fminunc('myfun',2,options);

    To minimize the function f(x) = sin(x) + 3 using an inline object:

        f = inline('sin(x)+3');
        x = fminunc(f,2);

    To use inline objects for the function and gradient, FUN is a cell
    array of two inline objects where the first is the objective and the
    second is the gradient of the objective:

        options = optimset('GradObj','on');
        x = fminunc({ inline('sin(x)+3'), inline('cos(x)') },2,options);

what

M-files in the current directory v:\cursos\pos\otimiza\aulas

CATALIS      ex_qp1       gtest2       nlp_internal  test15
EXTRATOR     ex_qp2       gtest9       powell        test16
LUCRO        ex_qp3       hkjeeves     qpsub         test17
MINQUA       ex_swarm     htest1       refino        test18
MODELO       fmincon1     htest2       restr         test19
OPT_RES      fminunc1     interior     restr1        test2
PLANOS       fminusub     karmarkar    restr14       test20
READ2        fun          lmarqua      restr15       test3
SEMIDEF      gmilp1       lp_nlp       restr16       test4
aurea        gminlp1      milp1        restr17       test5
bandem1      gminlp2      minlp        restr20       test6
bfgs         gminlp3      minlp1       rosembr       test7
bracket      gminlp4      minlp2       set1          test8
buscarnd     gminlp5      minlp3       setoptim      test9
cgrad        gminlp6      minlp4       sqp           univar
checkbounds  gmodelagem   minlp5       steepdes      varmetr
coggins      gmurray      minlp6       swarm         writearq
compdir      grad         minlpn       test1         xplot
complex      grg          modelagem    test10
dfp          gtest1       newton       test11
dual         gtest10      newton_h     test12
ex2_karm     gtest12      newtont      test13
ex_karma     gtest13      nlconst      test14

setoptim

ans = 
    DerivativeCheck: 'off'
    Diagnostics: 'off'
    Display: 'final'
    GradObj: 'off'
    Hessian: 'off'
    HessUpdate: 'dfp'
    LargeScale: 'on'
    LineSearchType: 'quadcubic'
    DiffMaxChange: 0.1000
    DiffMinChange: 1.0000e-008
    MaxIter: 400
    PrecondBandWidth: 0
    TolFun: 1.0000e-006
    TolPCG: 0.1000
    TolX: 1.0000e-006

ans = 
    DerivativeCheck: 'off'
    Diagnostics: 'off'
    Display: 'final'
    GradObj: 'off'
    Hessian: 'off'
    HessUpdate: 'dfp'
    LargeScale: 'on'
    LineSearchType: 'quadcubic'
    DiffMaxChange: 0.1000
    DiffMinChange: 1.0000e-008
    MaxIter: 400
    PrecondBandWidth: 0
    TolFun: 1.0000e-006
    TolPCG: 0.1000
    TolX: 1.0000e-006

what

M-files in the current directory v:\cursos\pos\otimiza\aulas

CATALIS      ex_qp1       gtest2       nlp_internal  test15
EXTRATOR     ex_qp2       gtest9       powell        test16
LUCRO        ex_qp3       hkjeeves     qpsub         test17
MINQUA       ex_swarm     htest1       refino        test18
MODELO       fmincon1     htest2       restr         test19
OPT_RES      fminunc1     interior     restr1        test2
PLANOS       fminusub     karmarkar    restr14       test20
READ2        fun          lmarqua      restr15       test3
SEMIDEF      gmilp1       lp_nlp       restr16       test4
aurea        gminlp1      milp1        restr17       test5
bandem1      gminlp2      minlp        restr20       test6
bfgs         gminlp3      minlp1       rosembr       test7
bracket      gminlp4      minlp2       set1          test8
buscarnd     gminlp5      minlp3       setoptim      test9
cgrad        gminlp6      minlp4       sqp           univar
checkbounds  gmodelagem   minlp5       steepdes      varmetr
coggins      gmurray      minlp6       swarm         writearq
compdir      grad         minlpn       test1         xplot
complex      grg          modelagem    test10
dfp          gtest1       newton       test11
dual         gtest10      newton_h     test12
ex2_karm     gtest12      newtont      test13
ex_karma     gtest13      nlconst      test14

who

Your variables are:

Ot    ans   nS    op    xo

ans
ans = 
    DerivativeCheck: 'off'
    Diagnostics: 'off'
    Display: 'final'
    GradObj: 'off'
    Hessian: 'off'
    HessUpdate: 'dfp'
    LargeScale: 'on'
    LineSearchType: 'quadcubic'
    DiffMaxChange: 0.1000
    DiffMinChange: 1.0000e-008
    MaxIter: 400
    PrecondBandWidth: 0
    TolFun: 1.0000e-006
    TolPCG: 0.1000
    TolX: 1.0000e-006

setoptim

ans

ans = 
    DerivativeCheck: 'off'
    Diagnostics: 'off'
    Display: 'final'
    GradObj: 'off'
    Hessian: 'off'
    HessUpdate: 'dfp'
    LargeScale: 'on'
    LineSearchType: 'quadcubic'
    DiffMaxChange: 0.1000
    DiffMinChange: 1.0000e-008
    MaxIter: 400
    PrecondBandWidth: 0
    TolFun: 1.0000e-006
    TolPCG: 0.1000
    TolX: 1.0000e-006
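For readers without the course M-files, the bfgs runs above can be reproduced with standard tools. The printed values are consistent with `test1` being the Rosenbrock function (f([-1.2, 1]) = 24.2, minimum 0 at (1, 1), first directional derivative -5.42e+004, and `rosembr` appears in the directory listing), so that is assumed here. A minimal sketch in Python with SciPy; SciPy's line search differs from the quadcubic/cubic searches above, so iteration counts will not match the tables:

```python
# Hedged sketch: BFGS on Rosenbrock from x0 = [-1.2, 1], mirroring
# bfgs('test1',[-1.2 1],1,'gtest1',0). Assumes SciPy is available and
# that test1/gtest1 are Rosenbrock and its gradient.
import numpy as np
from scipy.optimize import minimize

def rosen(x):
    # Rosenbrock: f([-1.2, 1]) = 24.2, minimum f = 0 at (1, 1)
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosen_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

res = minimize(rosen, [-1.2, 1.0], jac=rosen_grad, method='BFGS',
               options={'gtol': 1e-4})   # ~ the course default tol = 1e-4
```

As in the transcript, `res.x` lands near (1, 1) with an objective value close to zero.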
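The dfp runs above differ from bfgs only in the inverse-Hessian update (the setoptim output shows HessUpdate 'dfp'). The DFP update can be sketched by hand; this is a simplified illustration with an Armijo backtracking line search, not the course's varmetr/dfp code, whose line search is quadcubic:

```python
# Hedged sketch of the DFP inverse-Hessian update:
#   H+ = H + s s'/(s'y) - (H y)(H y)'/(y' H y)
# Illustrative only; simple Armijo backtracking instead of quadcubic.
import numpy as np

def rosen(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosen_grad(x):
    return np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                     200.0 * (x[1] - x[0]**2)])

def dfp_minimize(f, grad, x0, tol=1e-4, max_iter=2000):
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))                      # inverse-Hessian estimate
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                          # quasi-Newton direction
        t, fx, slope = 1.0, f(x), g @ d
        for _ in range(60):                 # Armijo backtracking
            if f(x + t * d) <= fx + 1e-4 * t * slope:
                break
            t *= 0.5
        s = t * d                           # accepted step
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                       # gradient change
        if s @ y > 1e-12:                   # keep H positive definite
            Hy = H @ y
            H += np.outer(s, s) / (s @ y) - np.outer(Hy, Hy) / (y @ Hy)
        x, g = x_new, g_new
    return x, f(x)

x_dfp, f_dfp = dfp_minimize(rosen, rosen_grad, [-1.2, 1.0])
```

Skipping the update when s'y is not safely positive is one way to keep H positive definite with an inexact line search; the ill-conditioning warning in the dfp run above shows what can happen when the update degenerates.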
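The optimset calls above build an options structure and override single fields, e.g. optimset(op,'MaxIter',1000). SciPy's rough analogue is a per-call options dict; a hedged sketch (option names differ from optimset's, and scipy.optimize ships its own Rosenbrock as `rosen`/`rosen_der`):

```python
# Analogue of optimset(op,'MaxIter',1000): a per-solver options dict.
from scipy.optimize import minimize, rosen, rosen_der  # built-in Rosenbrock

opts = {'maxiter': 1000, 'gtol': 1e-6}   # ~ MaxIter and a gradient tolerance
res = minimize(rosen, [-1.2, 1.0], jac=rosen_der, method='BFGS',
               options=opts)
```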
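The sin(x) + 3 example from the fminunc help translates almost line for line; GradObj 'on' corresponds to passing the gradient as `jac`. A hedged SciPy sketch (assuming SciPy is available):

```python
# fminunc('myfun',2,options) with GradObj 'on', rendered in SciPy:
# minimize sin(x) + 3 near x0 = 2, supplying the gradient cos(x).
import numpy as np
from scipy.optimize import minimize

def myfun(x):
    return np.sin(x[0]) + 3.0

def myfun_grad(x):
    return np.array([np.cos(x[0])])

res = minimize(myfun, [2.0], jac=myfun_grad, method='BFGS')
# the nearest local minimum to x0 = 2 is x = 3*pi/2, where f = 2
```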