python - Ineq and eq constraints with scipy.optimize.minimize()


I am attempting to understand the behavior of constraints in scipy.optimize.minimize:

First, I create 4 assets and 100 scenarios of returns. The average returns of the funds, in order from best to worst, are d > b > a > c.

import numpy as np
import pandas as pd
from scipy import optimize

# seed first
np.random.seed(1)
df_returns = pd.DataFrame(np.random.rand(100, 4) - 0.25, columns=list('abcd'))
df_returns.head()

          a         b         c         d
0  0.167022  0.470324 -0.249886  0.052333
1 -0.103244 -0.157661 -0.063740  0.095561
2  0.146767  0.288817  0.169195  0.435220
3 -0.045548  0.628117 -0.222612  0.420468
4  0.167305  0.308690 -0.109613 -0.051899

and a set of weights:

weights = pd.Series([0.25, 0.25, 0.25, 0.25], index=list('abcd'))

a    0.25
b    0.25
c    0.25
d    0.25

We create an objective function:

def returns_objective_function(weights, df_returns):
    result = -1. * (df_returns * weights).mean().sum()
    return result
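As a quick sanity check (my own addition, not part of the original question), the objective can be evaluated directly at the starting weights; because of the -1. factor, a lower (more negative) value corresponds to a higher average portfolio return:

# minimizing the negative mean return is the same as maximizing the mean return
print(returns_objective_function(weights, df_returns))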

and constraints and bounds:

cons = ({'type': 'eq', 'fun': lambda weights: np.sum(weights) - 1})
bnds = ((0.01, .8), (0.01, .8), (0.01, .8), (0.01, .75))
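A small side note of my own: with only one constraint, the surrounding parentheses do not actually create a tuple, so cons here is just a dict; scipy.optimize.minimize accepts either a single dict or a sequence of dicts, so this still works. The starting weights already satisfy the equality constraint:

print(type(cons))            # <class 'dict'> -- no trailing comma, so no tuple
print(cons['fun'](weights))  # 0.0 -- the four equal weights sum to exactly 1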

Let's optimize:

optimize.minimize(returns_objective_function, weights, (df_returns),
                  bounds=bnds, constraints=cons, method='SLSQP')

And, success.

 status: 0
success: True
   njev: 8
   nfev: 48
    fun: -0.2885398923185326
      x: array([ 0.01,  0.23,  0.01,  0.75])
message: 'Optimization terminated successfully.'
    jac: array([-0.24384782, -0.2789166 , -0.21977262, -0.29300382,  0.        ])
    nit: 8
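As a quick verification (my own addition; res is just a name I chose for the result object), the returned solution respects both the bounds and the equality constraint:

res = optimize.minimize(returns_objective_function, weights, args=(df_returns,),
                        bounds=bnds, constraints=cons, method='SLSQP')
print(res.x)        # array([ 0.01,  0.23,  0.01,  0.75]) -- every weight within its bounds
print(res.x.sum())  # ~1.0, so np.sum(weights) - 1 == 0 holds at the solution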

Now I wish to add constraints, starting with a basic inequality:

The scipy.optimize.minimize documentation states:

Equality constraint means that the constraint function result is to be zero whereas inequality means that it is to be non-negative.

cons = (
    {'type': 'eq', 'fun': lambda weights: np.sum(weights) - 1},
    {'type': 'ineq', 'fun': lambda weights: np.sum(weights) + x}
)

Depending on x, I get unexpected behavior.

x = -100 

Based on the bounds, the weights can sum to a maximum of 3.15 and, of course, must sum to 1 because of the first equality constraint np.sum(weights) - 1. But, as a result, np.sum(weights) + x is negative. I believe no solution should be found, yet scipy.optimize.minimize returns success.
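To make the violation concrete (my own check; res2 is just a name I chose), evaluate the inequality constraint at whatever point the optimizer returns. Since the bounds cap the weights at a sum of 3.15, np.sum(res2.x) + x is far below zero, yet success is still reported:

x = -100
res2 = optimize.minimize(returns_objective_function, weights, args=(df_returns,),
                         bounds=bnds, constraints=cons, method='SLSQP')
print(np.sum(res2.x) + x)  # roughly -99: the "non-negative" requirement cannot be met
print(res2.success)        # still True, which is the surprising part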

With a simpler model I get the same behavior:

x = [1,2]

optimize.minimize(
    lambda x: x[0]**2 + x[1]**2,
    x,
    constraints=(
        {'type': 'eq', 'fun': lambda x: x[0] + x[1] - 1},
        {'type': 'ineq', 'fun': lambda x: x[0] - 2}
    ),
    bounds=((0, None), (0, None)),
    method='SLSQP')

With the results:

   nfev: 8
    fun: 2.77777777777712
    nit: 6
    jac: array([  3.33333334e+00,   2.98023224e-08,   0.00000000e+00])
      x: array([  1.66666667e+00,   1.39888101e-14])
success: True
message: 'Optimization terminated successfully.'
 status: 0
   njev: 2
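Plugging the reported x back into the two constraint functions (my own check, using the values printed above) shows that neither constraint is actually satisfied at the returned point:

x_opt = [1.66666667e+00, 1.39888101e-14]   # the solution reported above
print(x_opt[0] + x_opt[1] - 1)  # eq constraint: ~0.667 instead of 0
print(x_opt[0] - 2)             # ineq constraint: ~ -0.333, negative instead of non-negative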

There should be some sort of flag marking this as an infeasible solution.
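One workaround for now (a minimal sketch of my own; is_feasible is a hypothetical helper, not something scipy provides) is to re-check the constraints on the returned result before trusting success:

def is_feasible(res, constraints, tol=1e-8):
    # re-evaluate every constraint at the point returned by minimize
    for c in constraints:
        val = c['fun'](res.x)
        if c['type'] == 'eq' and abs(val) > tol:
            return False
        if c['type'] == 'ineq' and val < -tol:
            return False
    return True

Applied to the result of the simpler model above, this would return False, since the reported x violates both constraints.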

SLSQP is also available in R:

> slsqp(c(1,2),
+       function(x) {x[1]^2+x[2]^2},
+       heq=function(x){x[1]+x[2]-1},
+       hin=function(x){x[1]-2},
+       lower=c(0,0))
$par
[1] 1.666667e+00 4.773719e-11

$value
[1] 2.777778

$iter
[1] 105

$convergence
[1] -4

$message
[1] "NLOPT_ROUNDOFF_LIMITED: Roundoff errors led to a breakdown of the optimization algorithm. In this case, the returned minimum may still be useful. (e.g. this error occurs in NEWUOA if one tries to achieve a tolerance close to machine precision.)"

At least I see some warning signals here.

