Constrained optimization #355
The acquisition function already provides gradients (https://github.com/scikit-optimize/scikit-optimize/blob/master/skopt/acquisition.py#L264), which are used by L-BFGS-B with bounds set on the parameters. Or is it something else that you are referring to?
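A minimal sketch of what that comment describes: optimizing a function with an analytical gradient under box bounds via `scipy.optimize.minimize` with `method="L-BFGS-B"` and `jac=True`. The quadratic here is a hypothetical stand-in for the acquisition function, not skopt's actual implementation.

```python
import numpy as np
from scipy.optimize import minimize

def acq_with_grad(x):
    # Toy stand-in for an acquisition function that returns
    # both its value and its analytical gradient.
    value = np.sum((x - 0.3) ** 2)
    grad = 2.0 * (x - 0.3)
    return value, grad

# Box bounds on each parameter -- this is what L-BFGS-B handles natively.
bounds = [(0.0, 1.0), (0.0, 1.0)]

res = minimize(acq_with_grad, x0=np.array([0.5, 0.5]),
               method="L-BFGS-B", jac=True, bounds=bounds)
print(res.x)  # minimizer inside the box, near [0.3, 0.3]
```

Note that `bounds` constrains each parameter independently; it cannot express a constraint that couples parameters, which is what the rest of the thread is about.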
I think "constrained" here refers to "find the maximum of f(x, y) subject to x + y < 2" -- so not bounds on the parameters directly, but on combinations of them. Some previous thoughts/attempts on handling "invalid" parameter combos (I think this is related to constrained optimization, but please correct me if wrong) are in #199 and #249.
Hey, I started using skopt recently and I really dig it. However, to be able to use it for all of my optimization problems, I'd need some kind of linear constraints.
I have the same need. For the moment, whenever a point is selected that is infeasible, i.e., it does not satisfy the constraints, I set the return value of the function evaluation to a "big number". But this method is not robust, as the particular value of this "big number" influences the performance of Bayesian optimization. I think development is needed here.
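The workaround described above can be sketched as a penalty wrapper around the objective. The constraint `x[0] + x[1] < 2` and the penalty magnitude are illustrative choices, not part of skopt; as the comment notes, the surrogate model's fit depends on the arbitrary penalty value.

```python
PENALTY = 1e6  # the "big number"; its magnitude distorts the surrogate model

def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 0.5) ** 2

def penalized(x):
    # Reject points that violate the (example) linear constraint x0 + x1 < 2.
    if x[0] + x[1] >= 2.0:
        return PENALTY
    return objective(x)

print(penalized([0.5, 0.5]))  # feasible point: true objective, 0.25
print(penalized([1.5, 1.5]))  # infeasible point: the penalty, 1000000.0
```

Any skopt optimizer can consume `penalized` in place of `objective`, but the surrogate then wastes effort modeling the artificial cliff at the constraint boundary.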
This feature would be very much appreciated indeed! Any updates on constrained optimization? |
+1 Maybe @holgern has time for this in the future? |
Any update on this? I saw that there is an open pull request. Anything we can do to help? |
Since EI has an analytical formula, argmax(EI) should easily incorporate constraints. Examples are weights that must satisfy sum(x_i) = 1, or points inside a circle, x^2 + y^2 < 1. I've seen papers that use MATLAB's DIRECT algorithm.
As for Python, scipy's COBYLA is a nonlinearly constrained optimization algorithm. I'll try to see how it works when I have the time.
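A sketch of the COBYLA suggestion, using the circle constraint from the previous comment. The objective is a hypothetical stand-in for negative EI; `scipy.optimize.minimize` with `method="COBYLA"` takes inequality constraints as functions that must be non-negative at feasible points.

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Toy stand-in for -EI: pulled toward (1, 1), outside the feasible disk.
    return (x[0] - 1.0) ** 2 + (x[1] - 1.0) ** 2

# COBYLA expects inequality constraints in the form g(x) >= 0;
# here g(x) = 1 - x0^2 - x1^2 encodes x^2 + y^2 <= 1.
constraints = [{"type": "ineq",
                "fun": lambda x: 1.0 - x[0] ** 2 - x[1] ** 2}]

res = minimize(objective, x0=np.zeros(2), method="COBYLA",
               constraints=constraints)
print(res.x)  # near (1/sqrt(2), 1/sqrt(2)) on the circle boundary
```

One caveat: COBYLA only supports inequality constraints, so an equality like sum(x) = 1 would need either a pair of opposing inequalities or a method such as SLSQP that accepts `"type": "eq"`.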