CI: test_requirements is not used consistently · Issue #24655 · numpy/numpy · GitHub

CI: test_requirements is not used consistently #24655


Closed
bsipocz opened this issue Sep 6, 2023 · 14 comments
Labels
33 - Question Question about NumPy usage or development component: CI

Comments

@bsipocz (Member) commented Sep 6, 2023

Some of the jobs manually install test dependencies rather than using the requirements file (while others install using the file). This introduces the potential of missing out on newly added dependencies.

Is this intentional? If not, I'll go ahead and open a PR that uses the requirements file instead.
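For illustration, the two install patterns being contrasted look roughly like this in a CI step (a hypothetical sketch, not taken from the actual NumPy workflow files):

```shell
# Pattern A: ad-hoc install; newly added test deps can silently be missed
pip install pytest hypothesis pytest-xdist

# Pattern B: install from the shared requirements file, so every job
# automatically picks up additions to the canonical list
pip install -r test_requirements.txt
```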

@bsipocz bsipocz added component: CI 33 - Question Question about NumPy usage or development labels Sep 6, 2023
@mattip (Member) commented Sep 6, 2023

There is not a good way to tell pyodide to ignore ninja, so it has to be installed for testing outside the requirements file. What else is not installed via the requirements file?

@bsipocz (Member, Author) commented Sep 6, 2023

Oh, that one is fair enough; I was mostly referring to the handling of pytest plugins. E.g. in #23812 I'm adding pytest-doctestplus, and while it was added to the requirements file, I noticed that's not enough since not all the jobs use the file.

pyodide itself is not listed in test_requirements.txt, and neither is ninja, so could it be that using that file wouldn't interfere with the issue you mention?

@rgommers (Member) commented Sep 6, 2023

Doctesting should only run in a single job though, so how can it be not enough?

@bsipocz (Member, Author) commented Sep 6, 2023

Doctesting should only run in a single job though

Should it, why?

In either case, this is a bit of an inconsistency, independent of doctesting: I don't think there should be two different ways that e.g. pytest-xdist is installed in CI (to take an existing example from the file).

@rgommers (Member) commented Sep 6, 2023

Should it, why?

It's there to check the docs, which needs to be done exactly once. It's not platform or Python version dependent. And if it fails, we want it to fail in a job dedicated to docs/refguide-check.

In either case, this is a bit of an inconsistency, independent of doctesting: I don't think there should be two different ways that e.g. pytest-xdist is installed in CI (to take an existing example from the file).

We have other jobs that explicitly avoid test_requirements.txt because it pulls in too much stuff. We only have two required test dependencies: pytest and hypothesis. Everything else is an optional dependency.

@bsipocz (Member, Author) commented Sep 6, 2023

It's there to check the docs

Also to check that the examples in the docs are indeed not version or platform dependent. Why should the examples in the documentation be treated differently from the other unit tests?

We only have two required test dependencies: pytest and hypothesis. Everything else is an optional dependency.

Yet I don't see any jobs that use only those two; pytest-xdist always seems to be there, too.
Anyway, if this issue is bikeshedding, please close it. I only raised it because the intents are not clear and the installed dependencies felt a bit ad hoc, especially since there is a requirements file available. If it's a pick-and-choose install for each job, then maybe not having the file at all would make the situation a bit clearer.

@charris (Member) commented Sep 6, 2023

Anyway, if this issue is bikeshedding, please close it.

I like the idea myself.

@rgommers (Member) commented Sep 6, 2023

Why should the examples in the documentation be treated differently from the other unit tests?

They are not unit tests. We've been over that for many years again and again. Doctests are code examples, and should not be treated as tests. The only purpose of "doc testing" is to verify that all the examples run and the repr's are reasonable.

I only raised it because the intents are not clear and the installed dependencies felt a bit ad hoc, especially since there is a requirements file available. If it's a pick-and-choose install for each job, then maybe not having the file at all would make the situation a bit clearer.

I'll note that we are still in the middle of a major CI overhaul. First we need to finish making it work with Meson, then drop all setup.py files, then drop Azure, then make everything as consistent and concise as possible. Hopefully it'll be easier to navigate after all that (it's already a lot better than 2 months ago).

If it's a pick-and-choose install for each job, then maybe not having the file at all would make the situation a bit clearer.

I hope that we can get there indeed. Requirements files are not ideal, but there's no better alternative yet (installing directly from pyproject.toml isn't (easily) possible).

@bsipocz (Member, Author) commented Sep 6, 2023

The only purpose of "doc testing" is to verify that all the examples run and the repr's are reasonable.

I think this is where the opinions differ. Ideally, examples in the narrative docs should indeed be platform independent, but that's not always the case. How do you propose to ensure that without actually running them?

So far I was assuming overall support for enabling more comprehensive doctesting, as the status quo apparently leaves a few gaps here and there.

@rgommers (Member) commented Sep 6, 2023

Ideally, examples in the narrative docs should indeed be platform independent, but that's not always the case. How do you propose to ensure that without actually running them?

That's both unlikely to happen and very low-impact if it does happen for some function at some point. If so, opening an issue and then addressing it should be fine. This is not the kind of thing that should be tested in CI on every PR.

So far I was assuming overall support for enabling more comprehensive doctesting

I am certainly not in favor of that. Imho it's both too slow and too fragile, for little benefit.

The questions we've had for a long time around doctesting are around maintainability of the tooling and whether that can be shared between the various refguide checkers and pytest plugins. Some unification seemed in order. Running more doctests or treating them as tests has never been a goal, at least as far as I am aware.

@charris (Member) commented Sep 6, 2023

You could break the requirements file down into more specific groups, i.e., pyodide_requirements or some such. The requirements files provide an easy way to keep versions updated or to initialize a new workspace/Python version.
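A split along those lines might look like the following (file names and contents are purely illustrative, not an actual proposal from the thread):

```text
# test_requirements.txt -- the required test dependencies only
pytest
hypothesis

# doctest_requirements.txt -- optional extras for the doctesting job
pytest-doctestplus
```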

@bsipocz (Member, Author) commented Sep 6, 2023

around maintainability of the tooling

This would be part of it. But enabling it and letting it run is in fact much easier than hacking it around to run on one job.
Btw, this approach works well in astropy land. Sure, there are some annoyances, but some solutions are in the works (e.g. auto-generating the desired outputs, etc.).
I can't comment on speed just yet: since the build system changed, fewer standard tools work out of the box with a local build (e.g. a direct pytest used to work with an editable install; that's not the case any more).

@rgommers (Member) commented Sep 7, 2023

e.g. a direct pytest used to work with an editable install; that's not the case any more

This should work fine:

pip install -e . --no-build-isolation
pytest numpy

A plain pytest gives you the infamous _pytest.pathlib.ImportPathMismatchError because it confuses the two locations. I guess that is what you meant? You'll see the exact same thing with python setup.py develop.

But enabling it and letting it run is in fact way easier than just hacking it around to run on one job.

It's trivial to do this in a single job. It's literally two lines right now:

python tools/refguide_check.py --rst
python tools/refguide_check.py --doctests

And if it fails, we get a single CI job failing, rather than a ton of them.

To change the above to the external pytest plugin, it's a matter of installing that plugin with pip and then changing the invocation to whatever is needed there to run the doctests.
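Sketched concretely (hedged: the flag names below are pytest-doctestplus options and the paths are assumptions; verify both against the plugin's docs and the repo layout before use):

```shell
# install the external doctest plugin
pip install pytest-doctestplus

# replace the two refguide_check invocations with a plugin-driven run
# that collects doctests from docstrings and .rst files
pytest --doctest-plus --doctest-rst numpy/ doc/source/
```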

@rgommers (Member) commented
I'm fixing the issue with ninja in gh-25012, after that it's included as part of test_requirements.txt. The rest of this issue isn't actionable/accepted, so I'll close it here. Thanks all.

4 participants