The wheel builds are currently done by Azure Pipelines. Options that may be used for future builds are Travis CI and Appveyor. Note that Travis CI and Appveyor are not currently triggered; if you want to enable them, you will need to do so in the repository settings under Webhooks.
Build process pages
- Azure Pipelines at https://dev.azure.com/numpy/numpy/_build?definitionId=8&_a=summary&view=runs
- Travis CI at https://travis-ci.org/MacPython/numpy-wheels
- Appveyor at https://ci.appveyor.com/project/matthew-brett/numpy-wheels
- The driving GitHub repository is https://github.com/MacPython/numpy-wheels
Uploaded file locations
- Release builds at https://anaconda.org/multibuild-wheels-staging/numpy/files
- Nightly builds at https://anaconda.org/scipy-wheels-nightly/numpy/files
The wheel-building repository:
- checks out either a known version of NumPy or master's HEAD
- downloads OpenBLAS using numpy/tools/openblas_support.py
- builds a numpy wheel, linking against the downloaded library and using the appropriate additional variables from env_vars.sh or env_vars32.sh
- processes the wheel using delocate (OSX) or auditwheel repair (manylinux); delocate and auditwheel copy the required dynamic libraries into the wheel and relink the extension modules against the copied libraries
- uploads the wheel to the appropriate release or nightly Anaconda repository
The resulting wheels are self-contained and do not need any external dynamic libraries apart from those provided as standard by OSX, or, on Linux, those permitted by the manylinux standard.
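The repair step is normally handled by the build scripts, but done by hand it looks roughly like the sketch below (illustrative only: the wheel filenames and output directories are made up, and the CI scripts may pass different options):

$ # macOS: copy the needed dylibs into the wheel and relink the extension modules
$ delocate-wheel -w fixed_wheels -v numpy-1.19.0-cp38-cp38-macosx_10_9_x86_64.whl
$ # Linux: same idea, producing a manylinux-tagged wheel
$ auditwheel repair -w wheelhouse numpy-1.19.0-cp38-cp38-linux_x86_64.whl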
You will likely want to edit the azure-pipelines.yml and .travis.yml files to specify the BUILD_COMMIT before triggering a build - see below.
You will need write permission to the GitHub repository to trigger new builds. Contact us on the mailing list if you need this.
You can trigger a build by either of two methods:
- making a commit to the numpy-wheels repository (e.g. with git commit --allow-empty, as in the example below) followed by a push of the desired branch
- merging a pull request to the master branch
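For the first method, the trigger can be an empty commit pushed to the branch you want built; a minimal sketch (the remote name upstream is an assumption, substitute whatever your checkout uses):

$ git commit --allow-empty -m "Trigger wheel build"
$ git push upstream master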
PRs merged to this repository from a fork, and commits pushed directly to this repository, will build the commit specified by BUILD_COMMIT at the top of the azure-pipelines.yml file; the resulting wheels will be uploaded to https://anaconda.org/multibuild-wheels-staging/numpy. The NIGHTLY_BUILD_COMMIT is built once a week (sorry for the misnomer) and uploaded to https://anaconda.org/scipy-wheels-nightly/.
The value of BUILD_COMMIT can be anything that names a commit: a branch name, a tag name, or a commit hash.
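Before pushing, it can be worth checking that whatever you plan to put in BUILD_COMMIT actually resolves to a commit; the tag and branch names below are only examples:

$ git rev-parse --verify v1.19.0^{commit}
$ git rev-parse --verify maintenance/1.19.x^{commit}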
When the wheels have been built and uploaded to the staging repository, you can download them using the download-wheels.py script and then upload them to PyPI using twine. Things are done this way so that we can generate hashes and the README files needed for a release before putting the wheels up on PyPI. The download-wheels.py script is run as follows:
$ python3 tools/download-wheels.py 1.19.0 -w <path_to_wheelhouse>
where 1.19.0 is the release version. The wheelhouse argument is optional and defaults to ./release/installers.
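Once the hashes and README have been generated, the final upload might look something like the following sketch (assuming the default wheelhouse location above and that your PyPI credentials are already configured for twine):

$ python3 -m twine upload release/installers/*.whl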
You will need beautifulsoup4 and urllib3 installed in order to run download-wheels.py, and permissions in order to upload to PyPI.
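If those packages are missing, installing them (together with twine) might look like this; pip in the Python environment you will use is assumed:

$ python3 -m pip install beautifulsoup4 urllib3 twine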