Review DeprecationWarnings and FutureWarnings raised in tests for the 0.20 release #11252
We should finish and merge #9570 prior to conducting this audit.
This can be done progressively, with several PRs, one test module at a time.
Related: #11431 aims to ensure that there are no warnings at module import time.
ping @janvanrijn
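For reference, here is a minimal sketch of what the import-time check mentioned above (#11431) could look like; the helper name and the reload-based approach are assumptions for illustration, not the actual implementation in that PR.

```python
import importlib
import sys
import warnings


def assert_no_import_warnings(module_name):
    """Import (or reload) `module_name` and fail if any warning is emitted."""
    with warnings.catch_warnings(record=True) as records:
        # Record everything, including categories that are ignored by default.
        warnings.simplefilter("always")
        if module_name in sys.modules:
            # Reload to re-execute the module body even if it was imported already.
            importlib.reload(sys.modules[module_name])
        else:
            importlib.import_module(module_name)
    messages = [f"{r.category.__name__}: {r.message}" for r in records]
    assert not messages, f"warnings raised at import time: {messages}"
```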
amueller added a commit that referenced this issue on Jul 17, 2018:
Towards #11252. In the end we'd like to turn these warnings into errors so we can keep this cleaner in the future.
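One way to eventually turn these warnings into hard errors is an autouse fixture in `conftest.py` (or, equivalently, pytest's `filterwarnings` ini option). The snippet below is a sketch of such an assumed setup, not necessarily how scikit-learn ended up configuring it.

```python
# conftest.py (sketch): escalate deprecation-related warnings to errors so
# that any new occurrence immediately fails the test suite.
import warnings

import pytest


@pytest.fixture(autouse=True)
def deprecation_warnings_as_errors():
    with warnings.catch_warnings():
        warnings.simplefilter("error", DeprecationWarning)
        warnings.simplefilter("error", FutureWarning)
        yield  # run the test under the stricter filters, then restore them
```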
fixed in #11570 I would argue ;)
Prior to the release, we need to conduct an audit specifically on `DeprecationWarning`s and `FutureWarning`s raised in the tests (including doctests from the docs). Then, on a case-by-case basis, we need to decide for each occurrence how to handle it (e.g. update the test so the warning is no longer triggered, or explicitly catch the expected warning).

In `test_common.py` it's OK to blindly ignore the warnings, because the automated estimator introspection lookup will trigger them. In any case, we should never blindly ignore a warning: in the very few cases where there is a legitimate reason to catch one (as in `test_common`), we need to at least write why it's safe to ignore that specific warning occurrence, in an inline comment in the source code of the test.
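To illustrate the policy above, here is a minimal sketch of the two acceptable patterns; `fit_with_old_api` and its warning message are made-up stand-ins rather than real scikit-learn code.

```python
import warnings

import pytest


def fit_with_old_api():
    """Stand-in for test code that still exercises a deprecated code path."""
    warnings.warn("'old_param' is deprecated", DeprecationWarning)


def test_old_api_still_warns():
    # Preferred: assert the expected warning explicitly, so the test fails
    # if the warning disappears or its message changes unexpectedly.
    with pytest.warns(DeprecationWarning, match="old_param"):
        fit_with_old_api()


@pytest.mark.filterwarnings("ignore::DeprecationWarning")
def test_common_style_check():
    # Ignoring is only acceptable with an inline justification, e.g.: the
    # automated estimator introspection exercised here instantiates
    # estimators with deprecated defaults, which inevitably warns.
    fit_with_old_api()
```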