I want to optimize only two parameters, alpha and beta, of a function, and I am using scipy.optimize.minimize with the TNC algorithm. The objective function is the mean squared error (MSE) between observed and predicted values. Both alpha and beta are allowed to vary between 0.1 and 0.99 during optimization. I observe that scipy.optimize.minimize explores only a limited part of the sample space:

[figure: (alpha, beta) points sampled during optimization]
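The actual model.objective_function is not shown here; for context, a minimal sketch of what such an MSE objective could look like, with a toy model standing in for the real one:

import numpy as np

# Hypothetical stand-in for the real model; the actual model is not shown.
def toy_model(alpha, beta, x):
    return alpha * x + beta * np.sin(x)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
observed = toy_model(0.5, 0.3, x) + rng.normal(0.0, 0.1, x.size)

def objective_function(params):
    # mean squared error of observed versus predicted
    alpha, beta = params
    predicted = toy_model(alpha, beta, x)
    return np.mean((observed - predicted) ** 2)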

The error function behaved as follows during optimization:

[figure: MSE versus iteration]

Does this mean that the algorithm is stuck in a local minimum?

The optimization parameters are as follows:

import numpy as np
from scipy.optimize import minimize

anfangwerte = np.array([0.9, 0.6])  # initial values of alpha and beta
grenze = (
    (0.1, 0.99),  # bounds for alpha
    (0.1, 0.99)   # bounds for beta
)

res = minimize(model.objective_function,  # returns the mean squared error
               x0=anfangwerte,
               method='TNC',
               tol=0.002,
               bounds=grenze,
               options={
                   'eps': 1e-2,       # finite-difference step size (default 1e-8)
                   'scale': None,
                   'stepmx': 15,      # maximum line-search step; TNC's option is 'stepmx', not 'stepmax'
                   'disp': True,
                   'maxiter': 100,
                   'accuracy': 0.00004,
                   'gtol': 0.00002,
                   'ftol': 0.00002
               })

Why does the algorithm explore only a limited sample space? Which parameter should I change to widen it?

By changing the initial values and adjusting the optimization parameters slightly, the model does explore much more of the sample space. Using 0.1 and 0.99 as the initial values of the two parameters, the sample space was explored as follows:

[figure: (alpha, beta) points sampled with the new initial values]
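A sketch of that multi-start idea, restarting the same TNC call from several points of the bounded box and keeping the best result (the exact restarts I used may differ):

from itertools import product

# Restart TNC from the corners and centre of the bounded box, keep the best.
starts = [np.array(s) for s in product([0.1, 0.5, 0.99], repeat=2)]
runs = [minimize(model.objective_function, x0=s, method='TNC', bounds=grenze)
        for s in starts]
best = min(runs, key=lambda r: r.fun)
print(best.x, best.fun)  # best (alpha, beta) and its MSE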

Interestingly, the model still converged to the same MSE, i.e. 167:

[figure: MSE versus iteration for the second run]

I still see a lot of the sample space being left unexplored. How can we be sure that the function has found the global minimum?
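Since there are only two bounded parameters, one check I could run (a sketch, not something shown above) is an exhaustive grid evaluation with scipy.optimize.brute over the full box:

from scipy.optimize import brute

# Evaluate the objective on a 50 x 50 grid over the full bounded box;
# finish=None returns the raw grid optimum without a local polish step.
ranges = ((0.1, 0.99), (0.1, 0.99))
best_grid = brute(model.objective_function, ranges, Ns=50, finish=None)
print(best_grid)  # grid point with the lowest MSE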
