BLD Migrate away from distutils and only use setuptools (#24563) · andportnoy/scikit-learn@6258ce6 · GitHub

Commit 6258ce6

thomasjpfan, jjerphan, ogrisel, and glemaitre authored and committed
BLD Migrate away from distutils and only use setuptools (scikit-learn#24563)
Co-authored-by: Julien Jerphanion <git@jjerphan.xyz>
Co-authored-by: Olivier Grisel <olivier.grisel@ensta.org>
Co-authored-by: Guillaume Lemaitre <g.lemaitre58@gmail.com>
1 parent f09f1e6 commit 6258ce6

File tree

37 files changed: +419, -1446 lines

MANIFEST.in

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 include *.rst
 recursive-include doc *
 recursive-include examples *
-recursive-include sklearn *.c *.h *.pyx *.pxd *.pxi *.tp
+recursive-include sklearn *.c *.cpp *.h *.pyx *.pxd *.pxi *.tp
 recursive-include sklearn/datasets *.csv *.csv.gz *.rst *.jpg *.txt *.arff.gz *.json.gz
 include COPYING
 include README.rst

azure-pipelines.yml

Lines changed: 0 additions & 22 deletions
@@ -106,27 +106,6 @@ jobs:
     LOCK_FILE: './build_tools/azure/python_nogil_lock.txt'
     COVERAGE: 'false'

-# Check compilation with intel C++ compiler (ICC)
-- template: build_tools/azure/posix.yml
-  parameters:
-    name: Linux_Nightly_ICC
-    vmImage: ubuntu-20.04
-    dependsOn: [git_commit, linting]
-    condition: |
-      and(
-        succeeded(),
-        not(contains(dependencies['git_commit']['outputs']['commit.message'], '[ci skip]')),
-        or(eq(variables['Build.Reason'], 'Schedule'),
-           contains(dependencies['git_commit']['outputs']['commit.message'], '[icc-build]')
-        )
-      )
-    matrix:
-      pylatest_conda_forge_mkl:
-        DISTRIB: 'conda'
-        LOCK_FILE: 'build_tools/azure/pylatest_conda_forge_mkl_no_coverage_linux-64_conda.lock'
-        COVERAGE: 'false'
-        BUILD_WITH_ICC: 'true'
-
 - template: build_tools/azure/posix-docker.yml
   parameters:
     name: Linux_Nightly_PyPy

@@ -182,7 +161,6 @@ jobs:
       DISTRIB: 'conda'
       LOCK_FILE: './build_tools/azure/py38_conda_forge_openblas_ubuntu_2204_linux-64_conda.lock'
       COVERAGE: 'false'
-      BUILD_WITH_ICC: 'false'
       SKLEARN_TESTS_GLOBAL_RANDOM_SEED: '0' # non-default seed

 - template: build_tools/azure/posix.yml

build_tools/azure/install.sh

Lines changed: 0 additions & 16 deletions
@@ -59,15 +59,6 @@ pre_python_environment_install() {
     export PYTHON_NOGIL_PATH="${PYTHON_NOGIL_CLONE_PATH}/python"
     cd $OLDPWD

-    elif [[ "$BUILD_WITH_ICC" == "true" ]]; then
-        wget https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
-        sudo apt-key add GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
-        rm GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
-        sudo add-apt-repository "deb https://apt.repos.intel.com/oneapi all main"
-        sudo apt-get update
-        sudo apt-get install intel-oneapi-compiler-dpcpp-cpp-and-cpp-classic
-        source /opt/intel/oneapi/setvars.sh
-
     fi
 }

@@ -122,13 +113,6 @@ scikit_learn_install() {
         export LDFLAGS="$LDFLAGS -Wl,--sysroot=/"
     fi

-    if [[ "$BUILD_WITH_ICC" == "true" ]]; then
-        # The "build_clib" command is implicitly used to build "libsvm-skl".
-        # To compile with a different compiler, we also need to specify the
-        # compiler for this command
-        python setup.py build_ext --compiler=intelem -i build_clib --compiler=intelem
-    fi
-
     # TODO use a specific variable for this rather than using a particular build ...
     if [[ "$DISTRIB" == "conda-pip-latest" ]]; then
         # Check that pip can automatically build scikit-learn with the build

build_tools/azure/test_docs.sh

Lines changed: 0 additions & 4 deletions
@@ -8,8 +8,4 @@ elif [[ "$DISTRIB" == "ubuntu" || "$DISTRIB" == "pip-nogil" ]]; then
     source $VIRTUALENV/bin/activate
 fi

-if [[ "$BUILD_WITH_ICC" == "true" ]]; then
-    source /opt/intel/oneapi/setvars.sh
-fi
-
 make test-doc

build_tools/azure/test_script.sh

Lines changed: 0 additions & 7 deletions
@@ -11,10 +11,6 @@ elif [[ "$DISTRIB" == "ubuntu" || "$DISTRIB" == "debian-32" || "$DISTRIB" == "pi
     source $VIRTUALENV/bin/activate
 fi

-if [[ "$BUILD_WITH_ICC" == "true" ]]; then
-    source /opt/intel/oneapi/setvars.sh
-fi
-
 if [[ "$BUILD_REASON" == "Schedule" ]]; then
     # Enable global random seed randomization to discover seed-sensitive tests
     # only on nightly builds.

@@ -62,9 +58,6 @@ if [[ -n "$CHECK_WARNINGS" ]]; then
     # removes its usage
     TEST_CMD="$TEST_CMD -Wignore:tostring:DeprecationWarning"

-    # Python 3.10 deprecates distutils, which is imported by numpy internally
-    TEST_CMD="$TEST_CMD -Wignore:The\ distutils:DeprecationWarning"
-
     # Ignore distutils deprecation warning, used by joblib internally
     TEST_CMD="$TEST_CMD -Wignore:distutils\ Version\ classes\ are\ deprecated:DeprecationWarning"
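For reference (not part of this commit), a minimal Python sketch of what the kept -W filter corresponds to in-process; the message field of a -W option is matched against the start of the warning text, roughly like the message argument of warnings.filterwarnings:

    # Sketch: in-process analogue of the remaining
    # "-Wignore:distutils\ Version\ classes\ are\ deprecated:DeprecationWarning"
    import warnings

    warnings.filterwarnings(
        "ignore",
        message="distutils Version classes are deprecated",
        category=DeprecationWarning,
    )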

build_tools/circle/list_versions.py

Lines changed: 4 additions & 4 deletions
@@ -5,7 +5,7 @@
 import re
 import sys

-from distutils.version import LooseVersion
+from sklearn.utils.fixes import parse_version
 from urllib.request import urlopen


@@ -37,8 +37,8 @@ def get_file_extension(version):
     # The 'dev' branch should be explicitly handled
     return "zip"

-    current_version = LooseVersion(version)
-    min_zip_version = LooseVersion("0.24")
+    current_version = parse_version(version)
+    min_zip_version = parse_version("0.24")

     return "zip" if current_version >= min_zip_version else "pdf"

@@ -94,7 +94,7 @@ def get_file_size(version):
 # Output in order: dev, stable, decreasing other version
 seen = set()
 for name in NAMED_DIRS + sorted(
-    (k for k in dirs if k[:1].isdigit()), key=LooseVersion, reverse=True
+    (k for k in dirs if k[:1].isdigit()), key=parse_version, reverse=True
 ):
     version_num, file_size = dirs[name]
     if version_num in seen:
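As context (not part of the diff), a small sketch of how parse_version from sklearn.utils.fixes, which behaves like packaging.version.parse, covers both uses of LooseVersion in this script, the >= comparison and the sort key:

    # Sketch: parse_version gives PEP 440 version ordering, so it can serve
    # both as a comparison helper and as a sort key.
    from sklearn.utils.fixes import parse_version

    assert parse_version("1.2") >= parse_version("0.24")

    dirs = ["0.22", "1.0", "0.24", "1.1.2"]
    print(sorted(dirs, key=parse_version, reverse=True))
    # ['1.1.2', '1.0', '0.24', '0.22']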

build_tools/github/build_wheels.sh

Lines changed: 0 additions & 5 deletions
@@ -31,11 +31,6 @@ if [[ "$RUNNER_OS" == "macOS" ]]; then
     export CFLAGS="$CFLAGS -I$PREFIX/include"
     export CXXFLAGS="$CXXFLAGS -I$PREFIX/include"
     export LDFLAGS="$LDFLAGS -Wl,-rpath,$PREFIX/lib -L$PREFIX/lib -lomp"
-    # Disable the use of setuptools's vendored copy distutils when invoking setuptools
-    # See: https://setuptools.pypa.io/en/latest/deprecated/distutils-legacy.html
-    # TODO: remove the definition of this environment variable when no
-    # reference to distutils exist in the code-base for building scikit-learn.
-    export SETUPTOOLS_USE_DISTUTILS=stdlib
 fi

 # The version of the built dependencies are specified
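As background (not part of the diff), with the SETUPTOOLS_USE_DISTUTILS=stdlib override removed, modern setuptools defaults to "local", which resolves distutils to the copy vendored inside setuptools. A minimal Python sketch for checking which implementation is active:

    # Sketch: importing setuptools first installs its distutils shim, so the
    # printed path shows which implementation a build would pick up.
    import setuptools  # noqa: F401  (import order matters for the shim)
    import distutils

    print(distutils.__file__)
    # With SETUPTOOLS_USE_DISTUTILS=local (the default), this typically points
    # into setuptools/_distutils/; with stdlib it points into the standard library.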

doc/computing/computational_performance.rst

Lines changed: 2 additions & 3 deletions
@@ -278,10 +278,9 @@ BLAS implementation and lead to orders of magnitude speedup over a
 non-optimized BLAS.

 You can display the BLAS / LAPACK implementation used by your NumPy / SciPy /
-scikit-learn install with the following commands::
+scikit-learn install with the following command::

-    from numpy import show_config
-    show_config()
+    python -c "import sklearn; sklearn.show_versions()"

 Optimized BLAS / LAPACK implementations include:
 - Atlas (need hardware specific tuning by rebuilding on the target machine)
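Relatedly (not part of the diff), the BLAS in use can also be inspected directly with threadpoolctl, which scikit-learn already depends on; a minimal sketch:

    # Sketch: each entry describes a loaded threadpool-backed library, including
    # the BLAS implementation (e.g. "openblas" or "mkl") and its version.
    from threadpoolctl import threadpool_info

    for lib in threadpool_info():
        print(lib["internal_api"], lib.get("version"))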

doc/developers/advanced_installation.rst

Lines changed: 0 additions & 69 deletions
@@ -461,75 +461,6 @@ the base system and these steps will not be necessary.
 .. _conda environment: https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html
 .. _Miniforge3: https://github.com/conda-forge/miniforge#miniforge3

-Alternative compilers
-=====================
-
-The command:
-
-.. prompt:: bash $
-
-    pip install --verbose --editable .
-
-will build scikit-learn using your default C/C++ compiler. If you want to build
-scikit-learn with another compiler handled by ``distutils`` or by
-``numpy.distutils``, use the following command:
-
-.. prompt:: bash $
-
-    python setup.py build_ext --compiler=<compiler> -i build_clib --compiler=<compiler>
-
-To see the list of available compilers run:
-
-.. prompt:: bash $
-
-    python setup.py build_ext --help-compiler
-
-If your compiler is not listed here, you can specify it via the ``CC`` and
-``LDSHARED`` environment variables (does not work on windows):
-
-.. prompt:: bash $
-
-    CC=<compiler> LDSHARED="<compiler> -shared" python setup.py build_ext -i
-
-Building with Intel C Compiler (ICC) using oneAPI on Linux
-----------------------------------------------------------
-
-Intel provides access to all of its oneAPI toolkits and packages through a
-public APT repository. First you need to get and install the public key of this
-repository:
-
-.. prompt:: bash $
-
-    wget https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
-    sudo apt-key add GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
-    rm GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
-
-Then, add the oneAPI repository to your APT repositories:
-
-.. prompt:: bash $
-
-    sudo add-apt-repository "deb https://apt.repos.intel.com/oneapi all main"
-    sudo apt-get update
-
-Install ICC, packaged under the name
-``intel-oneapi-compiler-dpcpp-cpp-and-cpp-classic``:
-
-.. prompt:: bash $
-
-    sudo apt-get install intel-oneapi-compiler-dpcpp-cpp-and-cpp-classic
-
-Before using ICC, you need to set up environment variables:
-
-.. prompt:: bash $
-
-    source /opt/intel/oneapi/setvars.sh
-
-Finally, you can build scikit-learn. For example on Linux x86_64:
-
-.. prompt:: bash $
-
-    python setup.py build_ext --compiler=intelem -i build_clib --compiler=intelem
-
 Parallel builds
 ===============

doc/developers/contributing.rst

Lines changed: 0 additions & 1 deletion
@@ -549,7 +549,6 @@ message, the following actions are taken.
 [lint skip] Azure pipeline skips linting
 [scipy-dev] Build & test with our dependencies (numpy, scipy, etc ...) development builds
 [nogil] Build & test with the nogil experimental branches of CPython, Cython, NumPy, SciPy...
-[icc-build] Build & test with the Intel C compiler (ICC)
 [pypy] Build & test with PyPy
 [float32] Run float32 tests by setting `SKLEARN_RUN_FLOAT32_TESTS=1`. See :ref:`environment_variable` for more details
 [doc skip] Docs are not built

0 commit comments
