Introduce compile_pip_requirements rule #373
Conversation
Thanks for upstreaming.

To give others some added context, we at Canva also use pip-tools to 'compile' our top-level deps into a locked transitive closure, but we do this with a less elegant bash script called `regenerate_py_deps.sh`. The script updates a `.txt` file that then becomes the input file to `pip_install`.

Over the year we've used pip-tools for this purpose, we've found it practical and reliable, though it is annoying to need a third-party tool to do something so essential.
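For readers who want to see what that workflow looks like with the new macro, here is a minimal sketch. The target name, file names, and the `pip_install` wiring are assumptions pieced together from this thread (the macro is loaded from `@rules_python//python/pip_install:requirements.bzl`, as a diff later in the conversation shows), not a verbatim copy of the rule's documentation.

```starlark
# BUILD (sketch): keep requirements.txt locked against requirements.in.
load("@rules_python//python/pip_install:requirements.bzl", "compile_pip_requirements")

compile_pip_requirements(
    name = "requirements",
    # Assumed defaults: requirements.in as input, requirements.txt as the
    # pinned output kept in the same package.
)
```

The locked file is then what the pip repository rule consumes, mirroring the `regenerate_py_deps.sh` flow described above:

```starlark
# WORKSPACE (sketch)
load("@rules_python//python:pip.bzl", "pip_install")

pip_install(
    name = "py_deps",
    requirements = "//:requirements.txt",
)
```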
pip==9.0.3
setuptools==44.0.0
wheel==0.30.0a0
We'd be able to get rid of this requirements file once `piptool` is fully removed from the rules. Would we then remove the macro usage here, or is there a reason to keep it?
Sure, if the requirements file goes away, then so should the rule that keeps it updated.
Force-pushed d01ae74 to 74f80f0
Ah, now the CI failure is because it's running Python 2, and one of the requirements varies with Python version.
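To illustrate what "varies with Python version" means here (the package name and pins below are hypothetical, not the actual entries in this PR): pip-compile resolves against the interpreter it runs under, so the same `requirements.in` can lock to different pins on Python 2 and Python 3.

```
# requirements.in (hypothetical)
somepkg

# requirements.txt compiled under Python 3 (hypothetical pin)
somepkg==2.0.0

# requirements.txt compiled under Python 2, where only an older release
# still supports the interpreter (hypothetical pin)
somepkg==1.9.0
```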
Force-pushed 9f192e0 to 6bfa3c0
Force-pushed d6984b1 to 7175da7
Hey, I tried to be sneaky and try this out. Did I do it wrong?
diff --git a/WORKSPACE b/WORKSPACE
index 06724b178..15306f4af 100644
--- a/WORKSPACE
+++ b/WORKSPACE
@@ -23,10 +23,10 @@ register_toolchains("//python/interpreter:bfe_py_toolchain")
### Load rules_python first so our version wins.
http_archive(
name = "rules_python",
- sha256 = "95ee649313caeb410b438b230f632222fb5d2053e801fe4ae0572eb1d71e95b8",
- strip_prefix = "rules_python-c8c79aae9aa1b61d199ad03d5fe06338febd0774",
+ sha256 = "72800a67f1aa25d13dfeb8269e1d92c2dfb044792893605238db90b6a0c39a3c",
+ strip_prefix = "rules_python-7175da78d6d032094ea69ab418bc7f8e6f15573a",
# equivalent SHA to that of 0.1.0 release, except the archive has experimental stuff like wheel.bzl
- url = "https://github.com/bazelbuild/rules_python/archive/c8c79aae9aa1b61d199ad03d5fe06338febd0774.tar.gz",
+ url = "https://github.com/bazelbuild/rules_python/archive/7175da78d6d032094ea69ab418bc7f8e6f15573a.tar.gz",
)
load("@rules_python//python:repositories.bzl", "py_repositories")
diff --git a/python/BUILD b/python/BUILD
index ffd0fb0cd..020ee0cc5 100644
--- a/python/BUILD
+++ b/python/BUILD
@@ -1 +1,3 @@
package(default_visibility = ["//visibility:public"])
+load("@rules_python//python/pip_install:requirements.bzl", "compile_pip_requirements")
+compile_pip_requirements()
I got this error:
FWIW, we are using a custom Python interpreter (3.7.9) and our Python runtime doesn't allow Python 2 at all. Also, sorry for jumping in randomly... I don't want to distract too badly from the point of the PR, but if this is a bug (missing dep?) it might be helpful.
I've now test-driven this myself and run into the same error as @dhalperi. I haven't yet investigated the issue deeply, but I can reproduce the same import problem in my workspace's Nix shell.

Unrelated, but it was a bit finicky figuring out whether to create the […], so at some point bootstrap instructions would be good to tell users how to smoothly onboard into this.
@dhalperi @thundergolfer I'm still just experimenting atm, but I believe you'd need to add […].
Force-pushed ad742e2 to fc2f81d
The release notes for https://github.com/jazzband/pip-tools/releases/tag/5.5.0 mention […], so I suspect that's even better than adding our own dependency on six.
Force-pushed f9804c9 to f73e99c
}

# cheap way to detect the bazel version
_bazel_version_4_or_greater = "propeller_optimize" in dir(native)
Sneaky 😄
Got a link to why this works? I can't see `propeller_optimize` in the 4.0 changelog, but I do see it in the docs for C++.
No link - I just checked `dir(native)` under both 3.3.1 and 4.0.0 and got lucky that there was one new symbol introduced. I didn't spend much time searching for more obvious ways to predict whether `py_binary` has an `env` attribute, but `dir(py_binary)` comes back `[]`, so we have to look somewhere.
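To make the trick concrete, here is a sketch of how such a check can gate the attribute. Only the `dir(native)` probe is taken from this PR; the wrapper macro and its name are hypothetical.

```starlark
# Bazel 4.0 happens to introduce the native propeller_optimize rule, so its
# presence in dir(native) is a cheap proxy for "this Bazel also supports the
# env attribute on py_binary/py_test".
_bazel_version_4_or_greater = "propeller_optimize" in dir(native)

def _py_test_maybe_env(name, env = None, **kwargs):
    # Hypothetical wrapper: only forward env where the attribute exists, so
    # the macro still loads and analyzes cleanly on Bazel 3.x.
    if _bazel_version_4_or_greater and env:
        kwargs["env"] = env
    native.py_test(name = name, **kwargs)
```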
@@ -0,0 +1,10 @@
pip==9.0.3
pip 9 is kinda old, no? The current release is https://pypi.org/project/pip/21.0.1/
Maybe, but this is change-one-thing-at-a-time ™️ - we're already depending on this version in the existing requirements.txt file.
] + extra_args

deps = [
    requirement("click"),
`click` seems to be a transitive dep? Although I don't see it in the locked requirements.
Ah, there are two different requirements lists. The one in requirements.in/requirements.txt is just being refactored to use the new rule, to prove that it works. The rule itself gets bootstrapped by the list in repositories.bzl, which is also how users get the dependencies; `click` is in there.
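For anyone tripped up by the same thing, a rough sketch of that bootstrap layer. The real `repositories.bzl` fetches wheels with pinned URLs and checksums; the list and the `requirement` helper below are simplified assumptions, not the actual file contents.

```starlark
# pip_install/repositories.bzl (sketch): packages the rule itself needs,
# fetched as external repositories before any user requirements exist.
_RULE_DEPS = ["click", "pip", "pip_tools", "setuptools", "wheel"]

def requirement(pkg):
    # Hypothetical helper mapping a package name to its fetched wheel's label,
    # which is how deps = [requirement("click")] resolves in the BUILD file.
    return "@pypi__{}//:lib".format(pkg)
```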
Force-pushed 70dd3bf to 01778f8
# cli will exit(0) on success
try:
    print("Checking " + requirements_txt)
    cli()
Apologies if I am misunderstanding, but IIUC this is going to call out to PyPI to check if there are newer versions of packages, making the test non-cacheable. One way around this would be to add the `external` tag so that the test results are never cached.

We've taken a somewhat different approach to the problem: we make a hash of the `requirements.in` file and prepend it to the `requirements.txt` file. Then there's a test checking that the hash is still correct. This is a weaker test: it doesn't tell you when you need to update, it only makes sure that people remember to update `requirements.txt` whenever they edit `requirements.in`, but it is hermetic.

Anyway, some comments from the peanut gallery...
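For reference, the hash-header scheme described above can be implemented in a few lines. The file paths, header format, and script below are made up for illustration; they are not the commenter's actual tooling.

```python
# Sketch: fail if requirements.txt was not regenerated after requirements.in changed.
import hashlib
import sys


def sha256_of(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def check(requirements_in, requirements_txt):
    # The compile step is assumed to prepend a header line such as
    #   # requirements.in sha256: <hex digest>
    # to requirements.txt; this check just recomputes and compares it. It is
    # hermetic (no network), but only catches forgotten regeneration.
    with open(requirements_txt) as f:
        first_line = f.readline().strip()
    expected = "# requirements.in sha256: " + sha256_of(requirements_in)
    if first_line != expected:
        sys.exit(requirements_txt + " is stale; re-run the compile step.")


if __name__ == "__main__":
    check("requirements.in", "requirements.txt")
```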
That's a good question, and it was my first reaction as well. However, pip-compile has a separate flag that expresses the intent of "give me newer versions":
https://github.com/jazzband/pip-tools#updating-requirements
And since it reads both `requirements.in` and `requirements.txt`, it can use the latter to pin its own "re-locking" and give you the same versions you already have.

Also, we've been operating this rule for a few months at one client and haven't seen a case of CI failing because some package cut a release and the pip-compile test suddenly thinks the lockfile is out of date. So I'm pretty confident we are good.
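Concretely, the distinction is between pip-compile's default re-lock and its explicit upgrade mode. Below is a sketch driving piptools' click entry point (the same `cli` the rule's test calls); the flag names are from the pip-tools README linked above, and the file names are illustrative.

```python
# Sketch: pip-compile's two modes, invoked programmatically.
from piptools.scripts.compile import cli

# Default mode: reads the existing requirements.txt alongside requirements.in
# and keeps current pins, so a new upstream release alone does not make the
# lockfile look out of date.
cli(["requirements.in", "--output-file", "requirements.txt"], standalone_mode=False)

# Opt-in upgrade mode: actually asks the index for newer versions.
cli(["--upgrade", "requirements.in", "--output-file", "requirements.txt"], standalone_mode=False)
```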
Ah ha, thanks for pointing that out; I played around with it locally and it indeed did not work like I thought it did.
Force-pushed f95d3a5 to c0d021a
Force-pushed c0d021a to 4762e62
This uses pip-tools to compile a requirements.in file to a requirements.txt file,
allowing transitive dependency versions to be pinned so that builds are reproducible.
Fixes #176