Here is the well-known Rosenbrock test case: 2 variables, unconstrained problem, single (global) minimum.
Keep in mind that:
- several evaluations/iterations are necessary; in your case, 1 function evaluation = 1 FEA run
- the gradient/Hessian are estimated numerically (each estimation costs additional simulations)
- the choice of cost function is essential for the optimization to succeed
- of course, you can add bounds/constraints to bracket the variables
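As a sketch of that last point: `minimize` accepts a `bounds` argument for SLSQP (and several other methods), so each variable can be bracketed in a box. The limits below are purely illustrative, not values from your problem:

```python
from scipy.optimize import minimize, Bounds

def rosenbrock(x):
    return (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2

# Bracket both variables in [-5, 5] (illustrative limits)
bounds = Bounds([-5., -5.], [5., 5.])

res = minimize(rosenbrock, [2., 3.], method='SLSQP',
               jac='3-point', bounds=bounds)
print(res.x)   # should approach the minimum at (1, 1)
```

General (in)equality constraints can be passed the same way through the `constraints` argument.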
Hope it helps.
from scipy.optimize import minimize
# https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html

def Rosenbrock(x):
    # f(x, y) = (1 - x)**2 + 100*(y - x**2)**2
    # minimum in (x, y) = (1, 1)
    return (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2

# Method = 'NM'    # "Nelder-Mead"
# Method = 'BFGS'  # "Broyden-Fletcher-Goldfarb-Shanno"
# Method = 'CG'    # "Conjugate-gradient"
Method = 'SLSQP'   # "Sequential Least SQuares Programming"

# Starting point
x0 = [2., 3.]

if Method == 'NM':
    # Order 0 method (Nelder-Mead): derivative-free, x0 = starting point
    Results = minimize(Rosenbrock, x0, method='nelder-mead',
                       options={'xatol': 1e-8, 'disp': False})
else:
    # Order 1 gradient-based methods (BFGS / CG / SLSQP)
    # Gradient (and Hessian) are estimated numerically
    # => several evaluations are needed per iteration
    Results = minimize(Rosenbrock, x0, method=Method, jac='3-point',
                       options={'disp': False})

# Optimization results:
print(f"Optimized value: {Results.x}")
print(f"Cost function value after optimization: {Results.fun}")
print(f"Number of iterations: {Results.nit}")
print(f"Number of function evaluations: {Results.nfev}")
print(f"Status: {Results.message}")
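Since every call to the cost function corresponds to one simulation in your setup, it can be worth counting calls yourself to see the real cost of the finite-difference gradient. A minimal sketch (the counter is my own addition, not part of scipy):

```python
from scipy.optimize import minimize

n_calls = 0  # one increment per simulated "FEA run"

def rosenbrock(x):
    global n_calls
    n_calls += 1  # each call here would be one full simulation
    return (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2

res = minimize(rosenbrock, [2., 3.], method='SLSQP', jac='3-point')
# With jac='3-point' each gradient estimate costs extra evaluations,
# so the call count is much larger than the iteration count res.nit
print(n_calls, res.nit, res.nfev)
```

This is usually the deciding factor for expensive models: if one FEA run takes hours, a derivative-free method or a surrogate model may be cheaper overall.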