8000 [MRG] ENH avoid deepcopy when a parameter is declared immutable by glemaitre · Pull Request #16185 · scikit-learn/scikit-learn · GitHub

Closed
glemaitre wants to merge 17 commits

Conversation

glemaitre
Member

closes #5563
closes #9696
closes #16137

Using estimator tags to avoid a deep copy of parameters when cloning.

@glemaitre glemaitre changed the title [WIP] ENH avoid deepcopy when a parameter is declared mutable [WIP] ENH avoid deepcopy when a parameter is declared immutable Jan 23, 2020
@glemaitre glemaitre changed the title [WIP] ENH avoid deepcopy when a parameter is declared immutable [MRG] ENH avoid deepcopy when a parameter is declared immutable Jan 23, 2020
@glemaitre
Member Author

@ogrisel proposed to use estimator tags to safely handle cases where one would like not to trigger a copy of a parameter (at one's own risk). We could add an immutable_params tag, which could be a list of parameters that are only assigned, instead of copied, when calling clone.
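To make the proposal concrete, here is a minimal sketch of how such an immutable_params tag could interact with clone. This is an illustration of the idea only, not scikit-learn's actual implementation; the Estimator class, the _get_tags dictionary layout, and the simplified clone below are all assumptions for the example.

```python
import copy

class Estimator:
    """Toy estimator (illustrative, not a real sklearn estimator)."""

    def __init__(self, weights=None, alpha=1.0):
        self.weights = weights  # potentially large object we want to share
        self.alpha = alpha

    def get_params(self):
        return {"weights": self.weights, "alpha": self.alpha}

    def _get_tags(self):
        # Parameters listed here would be assigned, not deep-copied, on clone.
        return {"immutable_params": ["weights"]}


def clone(estimator):
    """Simplified clone that honours the hypothetical immutable_params tag."""
    immutable = estimator._get_tags().get("immutable_params", [])
    params = {}
    for name, value in estimator.get_params().items():
        # Assign declared-immutable parameters directly; deepcopy the rest.
        params[name] = value if name in immutable else copy.deepcopy(value)
    return type(estimator)(**params)


big = [0.0] * 1_000_000
est = Estimator(weights=big, alpha=0.5)
cloned = clone(est)
assert cloned.weights is est.weights  # shared by assignment, no deepcopy
assert cloned.alpha == est.alpha      # regular params still copied as usual
```

The key point is the identity check: the cloned estimator holds the very same weights object, so no memory or time is spent copying it.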

ping @amueller @jnothman @GaelVaroquaux

Member
@jnothman jnothman left a comment


This needs documentation, at least, ideally including a good example of when you'd use the tag.

@glemaitre
Member Author

I added some documentation. @jnothman Do you think I should add an extra example with a hand-crafted estimator? If so, I am not sure which section such an example should go in.

@amueller
Member

This also implements estimator freezing, right? #8370
And could more directly solve #6451 by not cloning any prefit estimators?

@glemaitre
Member Author

This also implements estimator freezing, right? #8370

It will not solve those problems. It only skips copying unfitted parameters that are declared immutable via the tag.

And could more directly solve #6451 by not cloning any prefit estimators?

Still no, since it works only at the level of the unfitted parameters.

For this particular case, would it make sense to make a deepcopy of the full estimator, keeping the fitted parameters? If we had a mechanism to add a tag at runtime, would it be possible to change the behaviour of clone?

@amueller
Member
amueller commented Feb 24, 2020

@glemaitre I think I understand the freezing part: it's not enough to not clone because we'll call fit later, and it's a bit unclear how to change tags during runtime.

I don't understand the prefit part. The nested estimator will never be fit when using the meta-estimator, and we can derive the tag within the class from the prefit attribute.

It basically achieves what polymorphic clone does: #5080. This is not enough for freezing, but I think this had been one of the sticking points. For freezing we'd also have to do something about fit -- assuming we can modify tags at runtime, which is not as obvious as I first thought.
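The prefit idea discussed above can be sketched as deriving the tag at runtime from a constructor parameter. This is a hypothetical illustration, not sklearn code: the CalibratedWrapper class, its prefit parameter, and the _get_tags layout are assumptions made for the example.

```python
class CalibratedWrapper:
    """Toy meta-estimator (illustrative only) wrapping a base estimator.

    When ``prefit=True``, the nested estimator is declared immutable so a
    tag-aware ``clone`` would share it by assignment instead of copying it.
    """

    def __init__(self, base_estimator=None, prefit=False):
        self.base_estimator = base_estimator
        self.prefit = prefit

    def _get_tags(self):
        # Derive the tag at runtime from the ``prefit`` constructor parameter.
        tags = {"immutable_params": []}
        if self.prefit:
            tags["immutable_params"].append("base_estimator")
        return tags


# A prefit wrapper marks its nested estimator as immutable; a non-prefit
# wrapper leaves the list empty so cloning copies it as usual.
assert CalibratedWrapper(prefit=True)._get_tags()["immutable_params"] == ["base_estimator"]
assert CalibratedWrapper(prefit=False)._get_tags()["immutable_params"] == []
```

This shows the runtime-tag question raised in the thread: the tag value here depends on instance state (prefit), not only on the class, which is exactly what makes freezing via tags less obvious.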

@glemaitre
Member Author

I don't understand the prefit part. The nested estimator will never be fit when using the meta-estimator so I think it will fix that.

Oh, yes, it will fix this case, since base_estimator is a parameter of the meta-estimator. Sorry, I didn't get this in the first place.

Base automatically changed from master to main January 22, 2021 10:51
@haiatn
Contributor
haiatn commented Oct 4, 2022

What is missing for this to be merged?

@adrinjalali
Member

We can probably close this one in favor of scikit-learn/enhancement_proposals#67

@adrinjalali
Member

Closing for the same reason as #5563 (comment).
