catch more warnings in common tests · Issue #11109 · scikit-learn/scikit-learn · GitHub

catch more warnings in common tests #11109


Closed
amueller opened this issue May 19, 2018 · 30 comments · Fixed by #11151
Labels: Easy (well-defined and straightforward way to resolve), good first issue (easy with clear instructions to resolve), help wanted

Comments

@amueller (Member)

Right now the output of Travis is too large to render, partially because of a lot of deprecation warnings and partially because of warnings from the common tests. The common tests should catch all the deprecation warnings, and probably also convergence warnings and numerical warnings (we decrease the number of iterations for faster testing).
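For illustration, silencing those expected categories around a fit might look like this (a minimal sketch, not the actual common-test code; the estimator and dataset here are just illustrative choices):

import warnings

from sklearn.datasets import make_classification
from sklearn.exceptions import ConvergenceWarning
from sklearn.linear_model import LogisticRegression

X, y = make_classification(random_state=0)
with warnings.catch_warnings():
    # We cap max_iter for fast testing, so a ConvergenceWarning is expected;
    # ignore it (and deprecations) only inside this block.
    warnings.simplefilter("ignore", ConvergenceWarning)
    warnings.simplefilter("ignore", DeprecationWarning)
    LogisticRegression(max_iter=1).fit(X, y)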

@amueller added the Easy, good first issue, and help wanted labels on May 19, 2018
@ghost commented May 19, 2018

Do you mean deprecation warnings like the one below? If you can guide me a little, I am happy to work on this issue.
[screenshot: NumPy deprecation warning emitted during the build]

@amueller (Member, Author)

I meant deprecation warnings raised in scikit-learn, not in NumPy. The one you mention is raised during compilation; I meant warnings raised during testing.

@TomDLT (Member) commented May 22, 2018

On your machine, you can run the tests with pytest -r a sklearn. At the end of the run, pytest prints a "warnings summary". It would be great to silence the expected warnings, such as convergence warnings.

Example: pytest -r a sklearn/linear_model/tests/test_coordinate_descent.py:

sklearn/linear_model/tests/test_coordinate_descent.py ...................................

=========================================================== warnings summary ============================================================
sklearn/linear_model/tests/test_coordinate_descent.py::test_lasso_cv_with_some_model_selection
  /cal/homes/tdupre/work/src/scikit-learn/sklearn/model_selection/_split.py:605: Warning: The least populated class in y has only 1 members, which is too few. The minimum number of members in any class cannot be less than n_splits=5.
    % (min_groups, self.n_splits)), Warning)

sklearn/linear_model/tests/test_coordinate_descent.py::test_multitask_enet_and_lasso_cv
  /cal/homes/tdupre/work/src/scikit-learn/sklearn/linear_model/coordinate_descent.py:1783: ConvergenceWarning: Objective did not converge, you might want to increase the number of iterations
    ConvergenceWarning)
  /cal/homes/tdupre/work/src/scikit-learn/sklearn/linear_model/coordinate_descent.py:1783: ConvergenceWarning: Objective did not converge, you might want to increase the number of iterations
    ConvergenceWarning)

sklearn/linear_model/tests/test_coordinate_descent.py::test_check_input_false
  /cal/homes/tdupre/work/src/scikit-learn/sklearn/linear_model/coordinate_descent.py:491: ConvergenceWarning: Objective did not converge. You might want to increase the number of iterations. Fitting data with very small alpha may cause precision problems.
    ConvergenceWarning)

To silence warnings, see sklearn.utils.testing.ignore_warnings:

def ignore_warnings(obj=None, category=Warning):
    """Context manager and decorator to ignore warnings.

    Note: Using this (in both variants) will clear all warnings
    from all python modules loaded. In case you need to test
    cross-module-warning-logging, this is not your tool of choice.

    Parameters
    ----------
    category : warning class, defaults to Warning.
        The category to filter. If Warning, all categories will be muted.

    Examples
    --------
    >>> with ignore_warnings():
    ...     warnings.warn('buhuhuhu')

    >>> def nasty_warn():
    ...     warnings.warn('buhuhuhu')
    ...     print(42)

    >>> ignore_warnings(nasty_warn)()
    42
    """

@ShangwuYao (Contributor)

Hi, I am looking into this issue. I figure the "common tests" means test_common.py under the tests directory? I tried running the tests but no warnings were generated; could you tell me what warning is expected? Thanks a lot.

============================================= test session starts =============================================
platform darwin -- Python 3.6.1, pytest-3.0.7, py-1.4.33, pluggy-0.4.0
rootdir: /Users/sw/programming/opensourceproject/scikit-learn, inifile: setup.cfg
plugins: pep8-1.0.6
collected 4822 items

test_common.py ..............................s..................................s.......... [several thousand passing-test dots and a handful of 's' skips elided] ....................
=========================================== short test summary info ===========================================
SKIP [1] /anaconda/lib/python3.6/site-packages/_pytest/nose.py:23: score_samples of BernoulliRBM is not invariant when applied to a subset.
SKIP [3] /anaconda/lib/python3.6/site-packages/_pytest/nose.py:23: Skipping check_estimators_data_not_an_array for cross decomposition module as estimators are not deterministic.
SKIP [1] /anaconda/lib/python3.6/site-packages/_pytest/nose.py:23: transform of MiniBatchSparsePCA is not invariant when applied to a subset.
SKIP [1] /anaconda/lib/python3.6/site-packages/_pytest/nose.py:23: Not testing NuSVC class weight as it is ignored.
SKIP [1] /anaconda/lib/python3.6/site-packages/_pytest/nose.py:23: decision_function of SVC is not invariant when applied to a subset.
SKIP [1] /anaconda/lib/python3.6/site-packages/_pytest/nose.py:23: transform of SparsePCA is not invariant when applied to a subset.

=================================== 4814 passed, 8 skipped in 72.05 seconds ===================================

@rth (Member) commented May 27, 2018

figure the "common tests" means the test_common.py under the tests directory?

Yes

I tried running the test but no warning is generated

You need to add the -r a pytest CLI option as indicated in #11109 (comment)

@ShangwuYao (Contributor)

Huh, that's weird. I did have -r a, and the command was pytest -r a sklearn/tests/test_common.py. But as shown above, no warnings were generated. I tried pytest -r a sklearn/tests/test_common.py --strict and found one DeprecationWarning and a bunch of errors.

@ShangwuYao (Contributor)

Figured it out: I was using an older pytest version.

@lesteve (Member) commented May 28, 2018

Last time I looked at this, I realised that one of the problems (there may be more) was that assert_warns_* resets warnings.filters, which overrides ignore_warnings when it is used as a decorator. That does not seem like a great idea, and the reason for it is slightly unclear. Below is a snippet to show the problem:

import warnings

import pytest

from sklearn.utils.testing import ignore_warnings
from sklearn.utils.testing import assert_warns_message


def warns():
    warnings.warn("some warning")
    return 1


@ignore_warnings()
def test_1():
    print('before:', warnings.filters)
    assert_warns_message(UserWarning, 'some warning', warns)
    print('after:', warnings.filters)
    # This warning is visible because assert_warns_message resets
    # warnings.filters.
    warns()


def test_12():
    print('test12:', warnings.filters)


ignore_common_warnings = pytest.mark.filterwarnings('ignore::UserWarning')


@ignore_common_warnings
def test_2():
    warns()


@ignore_common_warnings
def test_3():
    assert_warns_message(UserWarning, 'some warning', warns)
    # This warning is visible
    warns()
❯ pytest /tmp/test-warnings.py -s 
======================================================== test session starts ========================================================
platform linux -- Python 3.6.5, pytest-3.5.1, py-1.5.3, pluggy-0.6.0
rootdir: /tmp, inifile:
plugins: timeout-1.2.1, flake8-0.9.1, cov-2.5.1, hypothesis-3.56.0
collected 3 items                                                                                                                   

../../../../../tmp/test-warnings.py before: [('ignore', None, <class 'Warning'>, None, 0)]
after: []
...

========================================================= warnings summary ==========================================================
test-warnings.py::test_1
  /tmp/test-warnings.py:10: UserWarning: some warning
    warnings.warn("some warning")

test-warnings.py::test_3
  /tmp/test-warnings.py:10: UserWarning: some warning
    warnings.warn("some warning")

-- Docs: http://doc.pytest.org/en/latest/warnings.html
=============================================== 3 passed, 2 warnings in 0.18 seconds ================================================

@rth (Member) commented May 28, 2018

Interesting. I would be in favor of replacing assert_warns_message(UserWarning, 'some warning', warn) with

with pytest.warns(UserWarning, match='some warning'):
    warn()

to avoid dealing with it.

@lesteve (Member) commented May 28, 2018

If someone has a semi-automated way of replacing the assert_warns*, why not, but it could well be the case that some downstream packages are using them.

Maybe implementing assert_warns_* in terms of pytest.warns is a less tedious and less intrusive change, i.e. something like this (not tested):

import re

import pytest


def assert_warns_regex(warning_cls, warning_message, func, *args, **kwargs):
    with pytest.warns(warning_cls, match=re.escape(warning_message)):
        func(*args, **kwargs)
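Callers would then keep the same signature as the current helper; a hypothetical usage sketch (the function and message are made up):

import warnings

def deprecated_func():
    warnings.warn("behaviour changed in 0.20", DeprecationWarning)

# Same call style as the existing assert_warns_message helper:
assert_warns_regex(DeprecationWarning, "behaviour changed in 0.20",
                   deprecated_func)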

@jnothman (Member) commented May 28, 2018 via email

@rth (Member) commented May 28, 2018

but it could well be the case that some downstream packages are using them.

We could still keep assert_warns_* for a deprecation cycle in utils/testing.py in any case

Maybe implementing assert_warns_* in terms of pytest_warns is a less tedious and less intrusive change to implement

True, that would be faster, but then there aren't that many lines to change,

% git grep assert_warns_message | wc -l                 [master] 
162
% git grep assert_warns | wc -l                         [master] 
319

to warrant keeping a wrapper, I think. Directly using the pytest warning capture in the code base without a wrapper would IMO be cleaner for contributors reading the code. That could be part of cleaning up the assert_* functions in general, see #10728 (comment).

@ShangwuYao (Contributor)

That makes sense, thanks a lot. But when I tried replacing the assert_warns_* (including assert_warns) with pytest.warns as @lesteve suggested, I still got all those warnings.

And there is something that wasn't clear to me: when using ignore_warnings as a decorator, the code is

    def __call__(self, fn):
        """Decorator to catch and hide warnings without visual nesting."""
        @wraps(fn)
        def wrapper(*args, **kwargs):
            # very important to avoid uncontrolled state propagation
            clean_warning_registry()
            with warnings.catch_warnings():
                warnings.simplefilter("ignore", self.category)
                return fn(*args, **kwargs)

        return wrapper

Doesn't the clean_warning_registry() also reset the filters?

@lesteve (Member) commented May 28, 2018

Indeed, clean_warning_registry is what resets warnings.filters, so ignore_warnings has the same problem (there may be others).

Not sure what you have tried, but if I were you I would look at a single test that shows the problem. For example, I get a ConvergenceWarning with this test:

❯ pytest sklearn/tests/test_common.py -r a -k 'test_non_meta_estimators and TheilSen and subset_invariance'
======================================================== test session starts ========================================================
platform linux -- Python 3.6.5, pytest-3.5.1, py-1.5.3, pluggy-0.6.0
rootdir: /home/local/lesteve/dev/scikit-learn, inifile: setup.cfg
plugins: timeout-1.2.1, flake8-0.9.1, cov-2.5.1, hypothesis-3.56.0
collected 4822 items / 4821 deselected                                                                                              

sklearn/tests/test_common.py .                                                                                                [100%]

========================================================= warnings summary ==========================================================
sklearn/tests/test_common.py::test_non_meta_estimators[TheilSenRegressor-TheilSenRegressor-check_methods_subset_invariance]
  /home/local/lesteve/dev/scikit-learn/sklearn/linear_model/theil_sen.py:128: ConvergenceWarning: Maximum number of iterations 5 reached in spatial median for TheilSen regressor.
    "".format(max_iter=max_iter), ConvergenceWarning)

-- Docs: http://doc.pytest.org/en/latest/warnings.html
======================================= 1 passed, 4821 deselected, 1 warnings in 1.30 seconds =======================================

print warnings.filters (use -s to show stdout in pytest) and try to figure out why warnings.simplefilter("ignore", ConvergenceWarning) here does not seem to have any effect.

@ShangwuYao (Contributor)

Can I just comment out the clean_warning_registry() in ignore_warnings then? That seems to solve all the problems.

@jnothman (Member) commented May 30, 2018 via email

@rth (Member) commented May 30, 2018

After more investigation, the issue with using ignore_warnings as a wrapper for pytest.warns, or replacing it altogether, is that sklearn/utils/estimator_checks.py uses ignore_warnings and we don't want to make it depend on pytest. Unless one uses a decorator that optionally ignores warnings only if pytest is installed, which is also not ideal.

@ShangwuYao (Contributor) commented May 30, 2018

That really helped me understand, thanks a lot! @jnothman
I have figured out the reason: each check in estimator_checks.py uses the ignore_warnings decorator, which calls clean_warning_registry() and resets the filters set by its callers. By the time the check actually starts running, it only has the filters defined in its own decorator. For example:

def test_non_meta_estimators(name, Estimator, check):
    with ignore_warnings(category=(DeprecationWarning, ConvergenceWarning,
                           UserWarning, FutureWarning)):   # this will be reset by check's ignore_warnings
        # Common tests for non-meta estimators
        estimator = Estimator()
        set_checking_parameters(estimator)
        check(name, estimator)

# this will reset ignore_warnings, leaving only DeprecationWarning and FutureWarning filters
@ignore_warnings(category=(DeprecationWarning, FutureWarning))
def check_dont_overwrite_parameters(name, estimator_orig):

I have to say that I don't see why it has to overwrite the filters passed to it; to me it makes more sense to add more filters (as defined in the check's decorator) instead of resetting them.

I could implement the additive-filters approach, but if you are worried about affecting other code, I could also just remove all the ignore_warnings decorators from the checks and only keep the decorator in test_common.py.
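For reference, the additive variant could look roughly like this (a hypothetical sketch, not the current implementation): keep the enclosing filters, prepend the new one inside a catch_warnings scope, and skip clean_warning_registry entirely:

import warnings
from functools import wraps

def additive_ignore_warnings(category=Warning):
    """Hypothetical decorator: add a filter instead of resetting them all."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            with warnings.catch_warnings():
                # simplefilter prepends to warnings.filters, and
                # catch_warnings restores the caller's filters on exit,
                # so an outer ignore_warnings keeps working.
                warnings.simplefilter("ignore", category)
                return fn(*args, **kwargs)
        return wrapper
    return decorator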

@ShangwuYao (Contributor)

Could someone confirm this? Appreciate it


@jnothman (Member) commented Jun 2, 2018 via email

@ShangwuYao (Contributor)

I am pretty sure I got it right in my previous post.
@jnothman, I don't quite understand what you mean by "affects filters that only display the first of duplicate warnings", but I think the clean_warning_registry in ignore_warnings resets the warning filters (which is the cause of the problem):
When ignore_warnings is used as a decorator or context manager, it calls clean_warning_registry(), and clean_warning_registry() calls warnings.resetwarnings(), which resets all the warning filters:

def resetwarnings():
    """Clear the list of warning filters, so that no filters are active."""
    filters[:] = []
    _filters_mutated()

I have tried adding catch_warnings to the context manager as you suggested, but it doesn't work. I then looked into the source code and found that it was dealing with warning filters, but since the warning filters are reset in the low-level check functions, it doesn't matter whether catch_warnings is added or not.
And removing ignore_warnings from all the check functions in estimator_checks.py indeed fixed the problem and cleared the warnings, so that works. But it isn't a fix, only a way to avoid dealing with the problem. Could you take a look at the two methods I suggested in my previous post and tell me what kind of fix you have in mind? I would very much appreciate it.

@jnothman (Member) commented Jun 4, 2018

Yes, I see. clean_warning_registry does indeed affect filters that only display the first of duplicate warnings (i.e. simplefilter('once')). But it also, as you say, removes all filters with resetwarnings.

But this only applies to the current catch_warnings scope:

In [9]: warnings.resetwarnings()

In [10]: warnings.filters
Out[10]: []

In [11]: warnings.simplefilter('ignore')

In [12]: with warnings.catch_warnings() as c:
    ...:     warnings.resetwarnings()
    ...:     print(warnings.filters)
    ...:
[]

In [13]: print(warnings.filters)
[('ignore', None, <class 'Warning'>, None, 0)]

@jnothman (Member) commented Jun 4, 2018

This means that it does indeed matter whether the ignore_warnings context manager operates within a catch_warnings context, because when ignore_warnings is __enter__ed, its simplefilter('ignore', self.category) will affect non-local scope.
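Concretely, a fixed context-manager form could delegate to catch_warnings so the filter change stays local (a sketch under that assumption, not the current implementation):

import warnings

class scoped_ignore_warnings(object):
    """Hypothetical sketch: ignore `category` only inside the with block."""

    def __init__(self, category=Warning):
        self.category = category
        self._catcher = warnings.catch_warnings()

    def __enter__(self):
        # Save the caller's filters so __exit__ can restore them.
        self._catcher.__enter__()
        warnings.simplefilter("ignore", self.category)
        return self

    def __exit__(self, *exc_info):
        return self._catcher.__exit__(*exc_info)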

@jnothman (Member) commented Jun 4, 2018

I have not understood your proposals @ShangwuYao. A PR with an example would help clarify.

I think we could remove resetwarnings from clean_warning_registry and move it into ignore_warnings only (not into the assert_warns variants)...

Another solution would be to backport pytest.warns for use in estimator checks. It is fairly self-sufficient:

from __future__ import absolute_import, division, print_function

import inspect
import re
import warnings

def warns(expected_warning, *args, **kwargs):
    """Assert that code raises a particular class of warning.

    Specifically, the parameter ``expected_warning`` can be a warning class or a
    sequence of warning classes, and the code inside the ``with`` block must
    issue a warning of that class or classes.

    This helper produces a list of :class:`warnings.WarningMessage` objects,
    one for each warning raised.

    This function can be used as a context manager, or any of the other ways
    ``pytest.raises`` can be used::

        >>> with warns(RuntimeWarning):
        ...    warnings.warn("my warning", RuntimeWarning)

    In the context manager form you may use the keyword argument ``match`` to assert
    that the exception matches a text or regex::

        >>> with warns(UserWarning, match='must be 0 or None'):
        ...     warnings.warn("value must be 0 or None", UserWarning)

        >>> with warns(UserWarning, match=r'must be \d+$'):
        ...     warnings.warn("value must be 42", UserWarning)

        >>> with warns(UserWarning, match=r'must be \d+$'):
        ...     warnings.warn("this is not here", UserWarning)
        Traceback (most recent call last):
          ...
        Failed: DID NOT WARN. No warnings of type ...UserWarning... was emitted...

    """
    match_expr = None
    if not args:
        if "match" in kwargs:
            match_expr = kwargs.pop("match")
        return WarningsChecker(expected_warning, match_expr=match_expr)
    else:
        func = args[0]
        with WarningsChecker(expected_warning, match_expr=match_expr):
            return func(*args[1:], **kwargs)



class WarningsRecorder(warnings.catch_warnings):
    """A context manager to record raised warnings.

    Adapted from `warnings.catch_warnings`.
    """

    def __init__(self):
        super(WarningsRecorder, self).__init__(record=True)
        self._entered = False
        self._list = []

    @property
    def list(self):
        """The list of recorded warnings."""
        return self._list

    def __getitem__(self, i):
        """Get a recorded warning by index."""
        return self._list[i]

    def __iter__(self):
        """Iterate through the recorded warnings."""
        return iter(self._list)

    def __len__(self):
        """The number of recorded warnings."""
        return len(self._list)

    def pop(self, cls=Warning):
        """Pop the first recorded warning, raise exception if not exists."""
        for i, w in enumerate(self._list):
            if issubclass(w.category, cls):
                return self._list.pop(i)
        __tracebackhide__ = True
        raise AssertionError("%r not found in warning list" % cls)


    def clear(self):
        """Clear the list of recorded warnings."""
        self._list[:] = []


    def __enter__(self):
        if self._entered:
            __tracebackhide__ = True
            raise RuntimeError("Cannot enter %r twice" % self)
        self._list = super(WarningsRecorder, self).__enter__()
        warnings.simplefilter("always")
        return self

    def __exit__(self, *exc_info):
        if not self._entered:
            __tracebackhide__ = True
            raise RuntimeError("Cannot exit %r without entering first" % self)
        super(WarningsRecorder, self).__exit__(*exc_info)


class WarningsChecker(WarningsRecorder):

    def __init__(self, expected_warning=None, match_expr=None):
        super(WarningsChecker, self).__init__()

        msg = (
            "exceptions must be old-style classes or " "derived from Warning, not %s"
        )
        if isinstance(expected_warning, tuple):
            for exc in expected_warning:
                if not inspect.isclass(exc):
                    raise TypeError(msg % type(exc))
        elif inspect.isclass(expected_warning):
            expected_warning = (expected_warning,)
        elif expected_warning is not None:
            raise TypeError(msg % type(expected_warning))

        self.expected_warning = expected_warning
        self.match_expr = match_expr

    def __exit__(self, *exc_info):
        super(WarningsChecker, self).__exit__(*exc_info)

        # only check if we're not currently handling an exception
        if all(a is None for a in exc_info):
            if self.expected_warning is not None:
                if not any(issubclass(r.category, self.expected_warning) for r in self):
                    __tracebackhide__ = True
                    raise AssertionError(
                        "DID NOT WARN. No warnings of type {} was emitted. "
                        "The list of emitted warnings is: {}.".format(
                            self.expected_warning, [each.message for each in self]
                        )
                    )
                elif self.match_expr is not None:
                    for r in self:
                        if issubclass(r.category, self.expected_warning):
                            if re.compile(self.match_expr).search(str(r.message)):
                                break
                    else:
                        raise AssertionError(
                            "DID NOT WARN. No warnings of type {} matching"
                            " ('{}') was emitted. The list of emitted warnings"
                            " is: {}.".format(
                                self.expected_warning,
                                self.match_expr,
                                [each.message for each in self],
                            )
                        )

(under a compatible MIT licence: https://docs.pytest.org/en/latest/license.html)
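With such a backport in sklearn.utils, estimator checks could assert on warnings without depending on pytest; a usage sketch with the warns defined above (the noisy function and message are made up):

import warnings

def noisy_fit():
    # Stand-in for an estimator method that emits a warning.
    warnings.warn("Objective did not converge", UserWarning)

# Context-manager form, mirroring pytest.warns:
with warns(UserWarning, match="did not converge"):
    noisy_fit()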

@ShangwuYao (Contributor)

Thank you so much for the clarification @jnothman, I absolutely agree with what you said. Not using catch_warnings is a bug, and we should definitely fix it. But I don't think that is the cause of the warnings not being ignored here. The real reason is that ignore_warnings is used at different layers, and the inner ignore_warnings doesn't cover the warnings we want to ignore at the layer where the warning is actually raised. For example:

In [38]: print(warnings.filters)
[]

In [39]: warnings.simplefilter('ignore', UserWarning)

In [40]: print(warnings.filters)
[('ignore', None, <class 'UserWarning'>, None, 0)]

In [41]: with ignore_warnings(category=(DeprecationWarning)):  #[a test in test_common.py]
    ...:     print(warnings.filters)
    ...:     with ignore_warnings(category=(UserWarning)): #[a check in estimator_checks.py]
    ...:         print(warnings.filters)
    ...:         warnings.warn("yo", DeprecationWarning)
    ...:
[('ignore', None, <class 'DeprecationWarning'>, None, 0)]
[('ignore', None, <class 'UserWarning'>, None, 0)]
/anaconda/bin/ipython:5: DeprecationWarning: yo


In [42]: print(warnings.filters)
[]

Tests in test_common.py call checks in estimator_checks.py, and both the tests and the checks use ignore_warnings. In this example the test wants to ignore DeprecationWarning but the check wants to ignore UserWarning, so even though we used ignore_warnings in test_common.py, the DeprecationWarning is not ignored.

@ShangwuYao (Contributor) commented Jun 4, 2018

BTW, I think neither the decorator form nor the context-manager form of ignore_warnings is working correctly:

import warnings

from sklearn.utils.testing import ignore_warnings


def controlled_warning_context_manager():
    with ignore_warnings():
        warnings.warn('yo', UserWarning)
        print(warnings.filters)

@ignore_warnings
def controlled_warning_decorator():
    warnings.warn('yo', UserWarning)
    print(warnings.filters)

context manager:

warnings.resetwarnings()
warnings.simplefilter('ignore', UserWarning)
print(warnings.filters)
controlled_warning_context_manager()
print(warnings.filters)

output:

[('ignore', None, <class 'UserWarning'>, None, 0)]
[('ignore', None, <class 'Warning'>, None, 0)]
[]          # this should be [('ignore', None, <class 'UserWarning'>, None, 0)]

decorator:

warnings.resetwarnings()
warnings.simplefilter('ignore', UserWarning)
print(warnings.filters)
controlled_warning_decorator()
print(warnings.filters)

output:

[('ignore', None, <class 'UserWarning'>, None, 0)]
[('ignore', None, <class 'Warning'>, None, 0)]
[]          # this should be [('ignore', None, <class 'UserWarning'>, None, 0)]

But I am not too sure about this; correct me if I'm wrong.

@jnothman (Member) commented Jun 5, 2018 via email

@ShangwuYao (Contributor) commented Jun 5, 2018

That's true, but the non-local scope is affected (I mean the third output is wrong, not the second). In both examples I set a filter for UserWarning, but after the call to controlled_warning_*, the non-local filter got reset.

(And there is another post above that one, could you please take a look as well? Thanks!)

@lesteve (Member) commented Jun 5, 2018

I looked at this one in a bit more detail and I am in favour of never calling warnings.resetwarnings in ignore_warnings, neither when using ignore_warnings to wrap a function nor when using it as a context manager. Here is a snippet to illustrate my point, which is similar to how sklearn/tests/test_common.py is organized.

import warnings

from sklearn.utils.testing import ignore_warnings


def warns():
    print('inside warns:', warnings.filters)
    warnings.warn('some warning', UserWarning)


def check():
    with ignore_warnings(category=DeprecationWarning):
        warns()


def test():
    with ignore_warnings(category=UserWarning):
        check()

test()

Output:

inside warns: [('ignore', None, <class 'DeprecationWarning'>, None, 0)]
/home/local/lesteve/miniconda3/bin/ipython:8: UserWarning: some warning

In other words, the innermost ignore_warnings wins and completely overrides all the others, which I find unexpected. This is annoying because sometimes you want to ignore warnings in the test (in sklearn/tests/test_common.py), sometimes in the check (in sklearn/utils/estimator_checks.py), and you would like both levels to cooperate.

I completely understand the need for the __warningregistry__ hack, but I really do not get the need for warnings.resetwarnings. I found that numpy and scikit-image have their own ways (see the numpy and scikit-image approaches) of cleaning __warningregistry__ but do not seem to resort to warnings.resetwarnings.
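i.e. something in the spirit of the following sketch (a hypothetical variant assuming we only want the registry cleared, leaving warnings.filters alone):

import sys

def clean_warning_registry():
    """Hypothetical variant: clear __warningregistry__ in every loaded
    module without calling warnings.resetwarnings()."""
    for mod in list(sys.modules.values()):
        registry = getattr(mod, "__warningregistry__", None)
        if registry is not None:
            registry.clear()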

If I comment out warnings.resetwarnings, all the tests pass but one:

    def test_warn(self):
        def f():
            warnings.warn("yo")
            return 3
    
        # Test that assert_warns is not impacted by externally set
        # filters and is reset internally.
        # This is because `clean_warning_registry()` is called internally by
        # assert_warns and clears all previous filters.
        warnings.simplefilter("ignore", UserWarning)
        assert_equal(assert_warns(UserWarning, f), 3)
    
        # Test that the warning registry is empty after assert_warns
        assert_equal(sys.modules['warnings'].filters, [])  # Note: test fails here

I really don't think that affecting warnings.filters globally is a good side effect of assert_warns; we should rather check that __warningregistry__ has been cleaned as expected.

@jnothman (Member) commented Jun 5, 2018 via email
