Aim to raise warnings as errors in test suite #5685
+1
It would also be nice to check warnings in the examples. We use deprecated things from matplotlib, and we should fix them when they are deprecated, not when they break.
@amueller What about convergence warnings?
If we expect them, we should catch them. But I feel like there shouldn't be convergence warnings on toy datasets. If we decreased the number of iterations to speed up tests, it's fine to catch them.
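To make the two options above concrete, here is a minimal sketch (not scikit-learn's actual tests), assuming a toy dataset where `max_iter=1` is too few iterations for the solver to converge:

```python
import warnings

import pytest
from sklearn.datasets import make_classification
from sklearn.exceptions import ConvergenceWarning
from sklearn.linear_model import LogisticRegression


def test_convergence_warning_is_expected():
    # The warning is part of the test's contract, so we assert it is raised.
    X, y = make_classification(random_state=0)
    with pytest.warns(ConvergenceWarning):
        LogisticRegression(max_iter=1).fit(X, y)


def test_convergence_warning_is_silenced():
    # Here the small max_iter only exists to keep the test fast, so the
    # warning is caught locally instead of leaking into the test output.
    X, y = make_classification(random_state=0)
    with warnings.catch_warnings():
        warnings.simplefilter("ignore", category=ConvergenceWarning)
        LogisticRegression(max_iter=1).fit(X, y)
```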
👍
This should be resolved given we run
@thomasjpfan the point was to raise no warnings at all, not just no deprecation and future warnings.
@amueller pytest's Edit: Converting warnings to errors forces us to make changes.
@thomasjpfan the point of the issue was to make the necessary changes and either catch warnings if they are expected or eliminate the cause.
@amueller Since we are already catching
We can configure pytest to error on all warnings, which will force us to catch everything.
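As a rough illustration of that configuration (a sketch only; the project's real pytest settings are not quoted in this thread), the same effect can be achieved from a `conftest.py` hook:

```python
# conftest.py -- sketch: promote every warning raised during the test
# run to an error, so any unhandled warning fails the corresponding test.
def pytest_configure(config):
    config.addinivalue_line("filterwarnings", "error")
```

The equivalent is usually written as a `filterwarnings = error` entry in the pytest configuration file.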
Yes, the point was to error on all warnings, as the title of the issue says ;) There are many other warnings, like numpy ones, user warnings, scipy warnings, etc.
As this issue is quite old and nobody has stepped in yet, may I take it?
As mentioned in #12043 I am -1 on raising all warnings as errors, and part of this was reverted in #12048. I think checking deprecation warnings as errors in one job with the latest dependencies is enough. We are indeed not very good at addressing other warnings, but I'm not sure that erroring on them is a solution. The issue is that we have no control over warnings raised by dependencies, so each time a new version of a dependency raises a warning (whether it's relevant or not) our test suite would fail. That is not sustainable. Besides, one can get slightly different numerical results depending on the architecture, and if I run the tests on, say, aarch64 (or with a different BLAS) and they fail with a convergence warning because max_iter was tuned to work on amd64, that is very unpleasant.
To be more specific, adding an env variable to Azure could be OK, as long as it doesn't change the pytest config files. In any case we wouldn't be able to enable it until all warnings are addressed, which is a lot of work (and sometimes non-trivial, #11536) :)
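One hedged sketch of that opt-in idea: the strict filters could also be injected purely through `PYTEST_ADDOPTS` without touching any file; the variant below instead guards a `conftest.py` hook with a made-up environment variable name:

```python
# conftest.py -- hypothetical: only a CI job that sets
# SKLEARN_WARNINGS_AS_ERRORS (an invented variable name) gets the strict
# filters, so other platforms and local runs are unaffected.
import os


def pytest_configure(config):
    if os.environ.get("SKLEARN_WARNINGS_AS_ERRORS"):
        config.addinivalue_line("filterwarnings", "error::DeprecationWarning")
        config.addinivalue_line("filterwarnings", "error::FutureWarning")
```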
@rth, thanks for your answer. I wasn't aware of the previous discussions, sorry. Anyway, adding the variable didn't take much time and my PR is just a fake one on my fork.
I removed the 'help wanted' label. Addressing existing warnings in the test suite would be very useful though.
I will try to update the list of existing warnings then. It is worth noting that they are platform dependent...
First list (the easier one) at #16651.
@cmarmo I think the main concern that @rth (and others?) raised was that a warning can be introduced by a change in an upstream library like numpy, and making the tests fail based on upstream changes can be confusing. I'm not entirely sure what the best way forward is, though addressing existing warnings is certainly useful (I've found many bugs that way over the years).
Thanks @amueller, this is clear to me. I'm trying to identify warnings in the builds and understand if they are solvable or not. I'm not planning to solve all the warnings, and this is very useful for me to learn about pytest. Anyway, there is at least one warning linked to pytest marks that we can fix (#16652)... if you want to take a look and review it... :)
It's pretty straightforward to handle warnings only if they are from a particular package, no?
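One way to scope a filter like that with the standard `warnings` machinery is the `module` field, keeping in mind it matches the module the warning is attributed to (which depends on the `stacklevel` used when it was raised), so this is only a sketch of the idea:

```python
import warnings

# Escalate DeprecationWarnings attributed to numpy modules to errors,
# while leaving warnings from every other package untouched.
warnings.filterwarnings(
    "error",
    category=DeprecationWarning,
    module=r"numpy(\.|$)",
)
```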
There are a bunch of numpy DeprecationWarnings in the test suite (see for example the build of this fake PR).
Are those warnings worth fixing? If yes, I will make a checklist and open a new issue for them. @rth? Thanks.
Yes, it would be great to fix those warnings, but I have not looked at them in detail.
Going through the fun of seeing all the deprecated usages of numpy in master and 0.17, I think it would be nice to let nose break on warnings.
I don't think there should be any warnings when running the tests, and that would put the effort much more on the contributors, instead of the maintainers.
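In `warnings`-module terms, the proposal boils down to something like the following, run once at the start of the test session (a sketch, not the filter set the project eventually adopted):

```python
import warnings

# Turn every warning raised in the test process into an error, so
# deprecated numpy/matplotlib usage fails loudly instead of scrolling by.
warnings.simplefilter("error")
```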