Is there a way to restrict the parameter space differently at each iteration? #730
You can always create a new instance of the `Optimizer` with a modified search space.
First off: @iaroslav-ai, you really rock tonight! I've never seen as many updates in one night. But, regarding this question: wouldn't creating a new instance of the `Optimizer` with one of the previous dimensions left out (to mimic that the parameter in that dimension is fixed at a set-point) remodel everything, and "forget" potential cross-interactions between the remaining dimensions?
Ah, I see. I understood that some dimensions were to be dropped entirely; but I guess you just want to fix some of them and compute the acquisition function over the rest of the parameters? I think I have a code snippet that does something similar; I'll take a look. But I remember that it was not straightforward, at least.
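A minimal sketch of the "fix some parameters" idea, assuming a hypothetical three-parameter objective (`objective`, its bounds, and the fixed value are all illustrative, not from the thread): wrap the full objective so the held dimension is pinned to its set-point, and hand the wrapper a reduced search space. This is the workaround discussed above, with the caveat already raised that a fresh model over the reduced space forgets cross-interactions learned in the full space.

```python
def objective(x0, x1, x2):
    # Hypothetical 3-parameter objective used only for illustration.
    return (x0 - 1.0) ** 2 + (x1 + 2.0) ** 2 + x2 ** 2


def fix_x1(params, set_point=-2.0):
    # The optimizer only sees the free parameters (x0, x2);
    # x1 is pinned to the chosen set-point.
    return objective(params[0], set_point, params[1])


# A new Optimizer would then be built over the reduced space, e.g.:
# opt = Optimizer([(-5.0, 5.0), (-5.0, 5.0)])  # only x0 and x2
# opt.tell(x, fix_x1(x)) / opt.ask() as usual
```

The wrapper keeps the original objective untouched, so switching the set-point (or which dimension is fixed) only requires a new wrapper and a new reduced-space optimizer.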
Thanks for a positive comment btw :) |
Hm, the best I can come up with is #325 and the corresponding snippet. What this does, though, is allow a fixed "context" vector, which is held constant during optimization of the acquisition function. This is done in the context of transfer learning. Maybe something for you to look at, but it does not fix arbitrary dimensions of the acquisition function / surrogate, only the values in the context vector. If I were in your position, I would look at the latest surrogate model. Beyond that, I do not have anything better at the moment :/
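To make the context-vector idea concrete, here is a small sketch (not the actual #325 snippet, whose contents are not shown here): candidate points from the free subspace are padded with a constant context vector before being passed to a surrogate that was trained on the full space. `with_context` is a hypothetical helper name.

```python
import numpy as np


def with_context(points, context):
    """Append a fixed context vector to each candidate point, so a
    surrogate trained on the full space can score points drawn from
    the free subspace only."""
    points = np.atleast_2d(points)
    ctx = np.tile(np.asarray(context, dtype=float), (points.shape[0], 1))
    return np.hstack([points, ctx])


# Two candidates over 2 free dims, one fixed context dim -> 3-dim inputs
X_full = with_context([[0.1, 0.2], [0.3, 0.4]], [5.0])
# surrogate.predict(X_full) would then score the padded points
```

Note this only handles the case where the fixed dimensions sit at the end of the input vector; fixing arbitrary dimensions would need an index-aware variant.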
Would a simple API update be to allow the caller to constrain the search space of `expected_minimum_random_sampling` to a subspace on a subsequent call?
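The proposed API could look roughly like the following sketch: a random-sampling minimum estimate where some dimensions are held at fixed values and only the remaining ones are sampled. Everything here is hypothetical (the function name, `bounds`, and `fixed` mapping are assumptions), and a toy quadratic stands in for the real surrogate model:

```python
import numpy as np


def expected_min_in_subspace(surrogate, bounds, fixed, n_samples=1000, seed=0):
    """Random-sampling approximation of the surrogate minimum, restricted
    to a subspace: dimensions listed in `fixed` (index -> value) are held
    constant, the rest are sampled uniformly within `bounds`."""
    rng = np.random.default_rng(seed)
    X = np.empty((n_samples, len(bounds)))
    for i, (lo, hi) in enumerate(bounds):
        if i in fixed:
            X[:, i] = fixed[i]  # pinned dimension
        else:
            X[:, i] = rng.uniform(lo, hi, n_samples)
    y = surrogate(X)
    best = int(np.argmin(y))
    return X[best], y[best]


# Toy surrogate standing in for a trained model's predictions.
toy_surrogate = lambda X: np.sum(X ** 2, axis=1)
x_best, y_best = expected_min_in_subspace(
    toy_surrogate, bounds=[(-1, 1)] * 3, fixed={1: 0.5}
)
# x_best[1] stays at 0.5, so y_best cannot drop below 0.25
```

With dimension 1 pinned at 0.5, the estimated minimum respects the constraint while the free dimensions are driven toward zero.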
I'd like to compute acquisition function values only for a subset of the parameters within the total parameter space. Is there an easy way to do this? In other words, can I restrict the parameter space considered to some given set of parameters?