Add an easy way to run pytest -Werror::DeprecationWarning -Werror::FutureWarning #13396
Comments
If our CI runs with warnings as errors, why is it overkill to do this in setup.cfg? Is it because there are platforms and dependency setups where we expect this to fail?
Maybe I just wanted to be conservative? Actually I agree with you: we should always run it.
Because our CI runs with either fixed versions or the latest versions of dependencies. Developers and people packaging scikit-learn for Linux distributions can run the tests with arbitrary versions of dependencies, where we cannot guarantee that some warning won't occur (particularly for not-yet-released versions of numpy/scipy). In those conditions, this option will lead to unwanted test failures. See for instance #12043. So I don't think having it as the default is a good idea; however, +1 on adding a new (non-default) entry to the makefile if you find it useful.
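For reference, the kind of configuration being discussed would look roughly like this. This is a sketch, not the project's actual setup.cfg; the `ignore` entry is illustrative of how a known, harmless upstream warning could be exempted:

```ini
# setup.cfg (sketch): turn deprecation/future warnings into errors
# for every pytest run, with targeted exceptions for upstream noise.
[tool:pytest]
filterwarnings =
    error::DeprecationWarning
    error::FutureWarning
    # hypothetical exemption for a harmless upstream warning:
    # ignore:some upstream message:DeprecationWarning
```

Entries are matched last-to-first, so specific `ignore` lines placed after the `error` lines take precedence for the warnings they match.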
A makefile entry would make passing pytest parameters like … more cumbersome. Is there a way to easily bypass the flags in setup.cfg?
@rth lol ok yes, I was opposed to this change ;) but clearly didn't see the ping, so that's on me. And I agree your reasoning is sound, but the change makes running the tests locally too hard. Could we, by default, raise on deprecation warnings only from within sklearn? I think it's weird to have something pass locally and then fail on the CI with the same versions.
Yeah, that's the reason I'm not using the makefile for testing :)
Last time I checked it was non-trivial, at least much harder than providing a CLI parameter to pytest. My point is that one shouldn't have to bypass the flags in setup.cfg. For instance, I would consider it a bug if tests pass for some scikit-learn release, then start failing a few months later because of a new scipy release that's perfectly backward compatible but adds a new harmless deprecation warning. An option could be to suggest setting the …
@rth my concern is more about deprecation warnings from within sklearn, which can't produce these spurious failures. If I deprecate something in sklearn, I should catch or change all occurrences. Right now the tests don't force me to do that locally, but they do on the CI. I think warnings caused by deprecations within sklearn are much more common than those from scipy or numpy. If we can't filter these, then I guess another option would be to not use DeprecationWarning, but only error on our own warning that inherits from DeprecationWarning?
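The custom-warning idea above can be sketched with the standard `warnings` module. The class name here is illustrative, not an actual scikit-learn API; the point is that a filter on the project-specific subclass errors only on the project's own deprecations, while third-party DeprecationWarnings stay non-fatal:

```python
import warnings

# Hypothetical project-specific deprecation class (illustrative name):
# tests can error on this subclass while leaving generic
# DeprecationWarning (e.g. from numpy/scipy) as a plain warning.
class OurDeprecationWarning(DeprecationWarning):
    pass

# Keep generic DeprecationWarning visible but non-fatal,
# then error on our own subclass (inserted in front, so it wins).
warnings.simplefilter("default", DeprecationWarning)
warnings.filterwarnings("error", category=OurDeprecationWarning)

# A third-party-style deprecation is only recorded, tests keep passing.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always", DeprecationWarning)
    warnings.warn("upstream deprecation", DeprecationWarning)
assert len(caught) == 1

# Our own deprecation becomes an error, so it must be addressed.
try:
    warnings.warn("our deprecation", OurDeprecationWarning)
    raised = False
except OurDeprecationWarning:
    raised = True
print(raised)  # True
```

The same distinction can be expressed in pytest's `filterwarnings` configuration by naming the subclass with its full import path.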
Yeah, but even a single deprecation warning from numpy / scipy / pandas breaking tests post-release (for the sake of development convenience) is not OK IMO. The latest example was the deprecation of scipy.sparse matrices, not that long ago.
We could if you think it's worth the effort.
Wouldn't setting the … ? To summarize, I still have concerns about changing …
@rth I agree they shouldn't lead to errors post-release.
The documentation has been updated in #13540, and the discussion about forcing erroring on warnings locally doesn't seem to reach consensus. Might this one be closed?
CI is erroring on uncaught deprecation and future warnings, but that's hard to reproduce locally unless you really know what you're doing.
I feel like we should make this easier, or maybe even make it the default when running tests locally?
One way would be to do this in the makefile, which would only help if using make, but that already would be better, I think.
We could also add it to the configuration, not sure if that's overkill?
It's a bit annoying to have tests pass locally but then not on CI because it uses different flags [yes yes, I added these flags, I know].
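The makefile route mentioned above could be as small as this. A sketch only: the target name is illustrative, and the flags mirror the ones in the issue title (pytest accepts `-W` with the same syntax as Python's `-W` option):

```make
# Makefile sketch: run the test suite with deprecation/future warnings
# as errors, mirroring what CI does (target name is illustrative).
test-strict-warnings:
	pytest -W error::DeprecationWarning -W error::FutureWarning sklearn
```

Anyone not using make could run the same `pytest -W …` command directly.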