Aim to raise warnings as errors in test suite · Issue #5685 · scikit-learn/scikit-learn
Aim to raise warnings as errors in test suite #5685

Open

amueller opened this issue Nov 2, 2015 · 24 comments
Comments

@amueller
Member
amueller commented Nov 2, 2015

Going through the fun of seeing all the deprecated usages of numpy in master and 0.17, I think it would be nice to let nose break on warnings.
I don't think there should be any warnings when running the tests, and breaking on them would shift the effort onto the contributors instead of the maintainers.

@giorgiop
Contributor
giorgiop commented Nov 2, 2015

+1

@amueller
Member Author
amueller commented Nov 4, 2015

It would also be nice to check warnings in the examples. We use deprecated things from matplotlib, and we should fix them when they are deprecated, not when they break.

@vighneshbirodkar
Contributor

@amueller What about convergence warnings?

@amueller
Member Author

If we expect them, we should catch them. But I feel like there shouldn't be convergence warnings on toy datasets. If we decreased the number of iterations to speed up tests, it's fine to catch them.
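For illustration, here is a minimal sketch of what catching an expected convergence warning in a test could look like, using pytest.warns; the estimator, dataset, and max_iter value are placeholders, not taken from the actual test suite:

```python
import pytest
from sklearn.datasets import make_classification
from sklearn.exceptions import ConvergenceWarning
from sklearn.linear_model import LogisticRegression


def test_expected_convergence_warning():
    # max_iter is deliberately too small to converge, so the warning is
    # expected and asserted here instead of leaking into the test output.
    X, y = make_classification(n_samples=100, random_state=0)
    with pytest.warns(ConvergenceWarning):
        LogisticRegression(max_iter=1).fit(X, y)
```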

@TomDLT
Member
TomDLT commented Dec 12, 2015

👍

@thomasjpfan
Member

This should be resolved given we run pytest with -Werror::DeprecationWarning -Werror::FutureWarning in our CI.

@amueller
Member Author

@thomasjpfan the point was to raise no warnings, not just no deprecation and future warnings.

@thomasjpfan
Member
thomasjpfan commented Jul 12, 2019

@amueller pytest's -Werror::DeprecationWarning -Werror::FutureWarning converts those warnings into errors. When scipy or numpy deprecates something, our CI raises an error. This happens from time to time in our nightly CI, which uses the master versions of numpy and scipy.

Edit: Converting warnings to errors forces us to make changes.

@amueller
Member Author

@thomasjpfan the point of the issue was to make the necessary changes and either catch warnings if they are expected or eliminate the cause.

@thomasjpfan
Member

@amueller Since we are already catching DeprecationWarning and FutureWarning, is this issue focused on all our custom warnings defined in sklearn/exceptions.py, such as ConvergenceWarning?

@thomasjpfan
Member

We can configure pytest to error on all warnings, which will force us to catch everything.
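For reference, a minimal sketch of such a configuration, assuming the pytest settings live in setup.cfg; any warning the test suite deliberately tolerates would then need its own explicit ignore entry after the blanket error filter:

```ini
# setup.cfg (sketch): turn every warning raised during the tests into an error.
[tool:pytest]
filterwarnings =
    error
```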

@amueller
Member Author

Yes, the point was to error on all warnings as the title of the issue says ;)

There are many other warnings, like numpy ones, user warnings, scipy warnings, etc.
Also see #10158

@cmarmo
Contributor
cmarmo commented Mar 6, 2020

As this issue is quite old and nobody has stepped in yet, may I take it?
@amueller I have added a CHECK_ALL_WARNINGS variable to the Azure build scripts and the result is here. Some tests are indeed affected by warnings. Should we open issues about them before changing the build behavior? I can take care of that if there are no objections.

@rth
Member
rth commented Mar 6, 2020

As mentioned in #12043 I am -1 on raising all warnings as errors, and part of this was reverted in #12048. I think checking deprecation warnings as errors in one job with the latest dependencies is enough. We are indeed not very good at addressing other warnings, but I'm not sure that erroring on them is a solution.

The issue is that we have no control over warnings raised by dependencies, so each time a new version of a dependency raises a warning (whether it's relevant or not) our test suite would fail. That is not sustainable. Besides, one can get slightly different numerics depending on the architecture, so if I run the tests on, say, aarch64 (or with a different BLAS) and they fail with a convergence warning because max_iter is tuned for amd64, that is very unpleasant.

@rth
Member
rth commented Mar 6, 2020

I have added a CHECK_ALL_WARNINGS variable to the Azure build scripts and the result is here.

To be more specific, adding an env variable to Azure could be OK, as long as it doesn't change the pytest config files. Anyway, we wouldn't be able to enable it until all warnings are addressed, which is a lot of work (and sometimes non-trivial, see #11536) :)

@cmarmo
Contributor
cmarmo commented Mar 6, 2020

@rth, thanks for your answer. I wasn't aware of the previous discussions, sorry. Anyway, adding the variable didn't take much time, and my PR is just a fake one on my fork.
Is it worth keeping this issue open and labelled 'help wanted', then?

@rth rth removed the help wanted label Mar 6, 2020
@rth
Member
rth commented Mar 6, 2020

I removed the 'help wanted' label. Addressing existing warnings in the test suite would be very useful though.

@cmarmo
Contributor
cmarmo commented Mar 6, 2020

Addressing existing warnings in the test suite would be very useful though.

I will try to update their list then. It is worth noting that they are platform dependent...

@cmarmo
Contributor
cmarmo commented Mar 6, 2020

First list (the easier one) at #16651.
I'm wondering whether the check on warnings could be added for the pipelines that don't fail right now: that would at least prevent new warnings from being introduced for them.
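One possible way to do that without touching the shared pytest configuration would be an opt-in hook in conftest.py keyed on an environment variable; a rough sketch, reusing the CHECK_ALL_WARNINGS name from the experiment above purely as an illustration:

```python
# conftest.py (sketch): pipelines that are already warning-free can export
# CHECK_ALL_WARNINGS=1 to promote every warning to an error, while the
# other pipelines keep the current behaviour.
import os


def pytest_configure(config):
    if os.environ.get("CHECK_ALL_WARNINGS"):
        config.addinivalue_line("filterwarnings", "error")
```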

@amueller
Member Author

@cmarmo I think the main concern that @rth (and others?) raised was that a warning can be introduced by a change in an upstream library like numpy and making the tests fail based on upstream changes can be confusing. I'm not entirely sure what the best way forward is, though addressing existing warnings is certainly useful (I've found many bugs that way over the years).

@cmarmo
Contributor
cmarmo commented Mar 13, 2020

Thanks @amueller, this is clear to me. I'm trying to identify warnings in the builds and understand if they are solvable or not. I'm not planning to solve all the warnings, and this is very useful for me to learn about pytest. Anyway, there is at least one warning linked to pytest marks that we can fix (#16652)... if you want to take a look and review it... :)

@jnothman
Member
jnothman commented Mar 16, 2020 via email

@cmarmo
Contributor
cmarmo commented Sep 7, 2020

There are a bunch of numpy DeprecationWarnings in the test suite (see for example the build of this fake PR):

```
E       numpy.VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray
```

Are those warnings worth fixing? If yes, I will make a checklist and open a new issue for them. @rth? Thanks.
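For context, that warning is numpy's way of flagging an array built from sequences of unequal length; a minimal sketch of the fix it asks for, with made-up data just to show the shape of the change:

```python
import numpy as np

ragged = [[1, 2, 3], [4, 5]]

# Deprecated: numpy has to guess that this should become an object array.
# a = np.array(ragged)

# Explicit dtype keeps the old behaviour and silences the warning.
a = np.array(ragged, dtype=object)
```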

@ogrisel
Member
ogrisel commented Sep 8, 2020

Yes, it would be great to fix those warnings, but I have not looked into the details.
