Commit cd60a35

MNT Use cibuildwheel to build the wheels with [cd build] (#17921)
1 parent 9a0bc63 commit cd60a35

File tree

8 files changed (+267 −0)

.github/workflows/wheels.yml

125 additions & 0 deletions
@@ -0,0 +1,125 @@
# Workflow to build and test wheels
name: Wheel builder

on:
  schedule:
    # Nightly build at 3:42 A.M.
    - cron: "42 3 */1 * *"
  push:
    branches:
      # Release branches
      - "[0-9]+.[0-9]+.X"
  pull_request:
    branches:
      - master
      - "[0-9]+.[0-9]+.X"

env:
  SCIKIT_LEARN_VERSION: 0.24.dev0

jobs:
  # Check whether to build the wheels and the source tarball
  check_build_trigger:
    name: Check build trigger
    runs-on: ubuntu-latest
    outputs:
      build: ${{ steps.check_build_trigger.outputs.build }}

    steps:
      - name: Checkout scikit-learn
        uses: actions/checkout@v1

      - id: check_build_trigger
        name: Check build trigger
        run: bash build_tools/github/check_build_trigger.sh

  # Build the wheels for Linux, Windows and macOS for Python 3.6 and newer
  build_wheels:
    name: Build wheels on ${{ matrix.os }} for Python ${{ matrix.python }}
    runs-on: ${{ matrix.os }}
    needs: check_build_trigger
    if: needs.check_build_trigger.outputs.build

    strategy:
      # Ensure that a wheel builder finishes even if another fails
      fail-fast: false
      matrix:
        os: [windows-latest, ubuntu-latest, macos-latest]
        python: [36, 37, 38, 39]

    steps:
      - name: Checkout scikit-learn
        uses: actions/checkout@v1

      - name: Setup Python
        uses: actions/setup-python@v2

      - name: Build and test wheels
        env:
          # Set the directory where the wheel is unpacked
          CIBW_ENVIRONMENT: "WHEEL_DIRNAME=scikit_learn-$SCIKIT_LEARN_VERSION"
          CIBW_BUILD: cp${{ matrix.python }}-*
          CIBW_TEST_REQUIRES: pytest pandas threadpoolctl
          # Test that there are no links to system libraries
          CIBW_TEST_COMMAND: pytest --pyargs sklearn &&
                             python -m threadpoolctl -i sklearn
          # By default, the Windows wheels are not repaired.
          # In this case, we need to vendor the vcomp140.dll
          CIBW_REPAIR_WHEEL_COMMAND_WINDOWS: wheel unpack {wheel} &&
                                             python build_tools/github/vendor_vcomp140.py %WHEEL_DIRNAME% &&
                                             wheel pack %WHEEL_DIRNAME% -d {dest_dir} &&
                                             rmdir /s /q %WHEEL_DIRNAME%

        run: bash build_tools/github/build_wheels.sh

      - name: Store artifacts
        uses: actions/upload-artifact@v2
        with:
          path: wheelhouse/*.whl

  # Build the source distribution under Linux
  build_sdist:
    name: Source distribution
    runs-on: ubuntu-latest
    needs: check_build_trigger
    if: needs.check_build_trigger.outputs.build

    steps:
      - name: Checkout scikit-learn
        uses: actions/checkout@v1

      - name: Setup Python
        uses: actions/setup-python@v2

      - name: Build and test source distribution
        run: bash build_tools/github/build_source.sh

      - name: Store artifacts
        uses: actions/upload-artifact@v2
        with:
          path: dist/*.tar.gz

  # Upload the wheels and the source distribution
  upload_anaconda:
    name: Upload to Anaconda
    runs-on: ubuntu-latest
    needs: [build_wheels, build_sdist]
    # The artifacts are not uploaded on PRs
    if: github.event_name != 'pull_request'

    steps:
      - name: Download artifacts
        uses: actions/download-artifact@v2
        with:
          path: dist

      - name: Setup Python
        uses: actions/setup-python@v2

      - name: Upload artifacts
        env:
          # Secret variables need to be mapped to environment variables explicitly
          SCIKIT_LEARN_NIGHTLY_UPLOAD_TOKEN: ${{ secrets.SCIKIT_LEARN_NIGHTLY_UPLOAD_TOKEN }}
          SCIKIT_LEARN_STAGING_UPLOAD_TOKEN: ${{ secrets.SCIKIT_LEARN_STAGING_UPLOAD_TOKEN }}
        # Force a replacement if the remote file already exists
        run: bash build_tools/github/upload_anaconda.sh
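The CIBW_* variables above are how the workflow configures cibuildwheel: the job matrix selects one CPython version per runner through CIBW_BUILD, and the test requirements and test command are injected into each built wheel's test environment. As a rough local sketch (not part of the commit, and assuming cibuildwheel is installed and, for the Linux images, Docker is available), the same selection can be reproduced on a developer machine:

    # Build and test only the CPython 3.8 wheels for the current platform,
    # mirroring the CIBW_BUILD selector used in the workflow above.
    export CIBW_BUILD="cp38-*"
    export CIBW_TEST_REQUIRES="pytest pandas threadpoolctl"
    export CIBW_TEST_COMMAND="pytest --pyargs sklearn && python -m threadpoolctl -i sklearn"
    python -m pip install cibuildwheel
    python -m cibuildwheel --output-dir wheelhouse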

build_tools/github/build_source.sh

17 additions & 0 deletions
@@ -0,0 +1,17 @@
#!/bin/bash

set -e
set -x

python -m pip install numpy scipy cython
python -m pip install twine
python -m pip install pytest pandas

python setup.py sdist
python -m pip install dist/*.tar.gz
python setup.py build_ext -i

pytest --pyargs sklearn

# Check whether the source distribution will render correctly
twine check dist/*.tar.gz
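The script tests the installed source distribution rather than the checkout: the sdist is built, installed, and the test suite is run from the installed package. For a quick local check along the same lines (illustrative only, not part of the commit; the /tmp path is an arbitrary placeholder), the tarball can also be installed into a throwaway virtual environment:

    # Install the freshly built sdist into a scratch environment and import it.
    python -m venv /tmp/sklearn-sdist-check
    /tmp/sklearn-sdist-check/bin/pip install dist/*.tar.gz
    /tmp/sklearn-sdist-check/bin/python -c "import sklearn; print(sklearn.__version__)"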

build_tools/github/build_wheels.sh

22 additions & 0 deletions
@@ -0,0 +1,22 @@
#!/bin/bash

set -e
set -x

# OpenMP is not present on macOS by default
if [ "$RUNNER_OS" == "macOS" ]; then
    brew install libomp
    export CC=/usr/bin/clang
    export CXX=/usr/bin/clang++
    export CPPFLAGS="$CPPFLAGS -Xpreprocessor -fopenmp"
    export CFLAGS="$CFLAGS -I/usr/local/opt/libomp/include"
    export CXXFLAGS="$CXXFLAGS -I/usr/local/opt/libomp/include"
    export LDFLAGS="$LDFLAGS -Wl,-rpath,/usr/local/opt/libomp/lib -L/usr/local/opt/libomp/lib -lomp"
fi

# The versions of the built dependencies are specified
# in the pyproject.toml file, while the tests are run
# against the most recent versions of the dependencies

python -m pip install cibuildwheel
python -m cibuildwheel --output-dir wheelhouse
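On macOS the script compiles against Homebrew's libomp and records an rpath so the compiled extensions can locate it at run time. One rough way to check the linkage after a build (illustrative only, not part of the commit; the install path and grep pattern are assumptions) is to inspect the extensions of an installed wheel with otool:

    # macOS only: list the shared-library dependencies of the compiled
    # extensions in an installed scikit-learn and look for libomp.
    pip install wheelhouse/*.whl
    SKLEARN_DIR=$(python -c "import sklearn, os; print(os.path.dirname(sklearn.__file__))")
    find "$SKLEARN_DIR" -name "*.so" -exec otool -L {} \; | grep -i libomp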

build_tools/github/check_build_trigger.sh
13 additions & 0 deletions
@@ -0,0 +1,13 @@
#!/bin/bash

set -e
set -x

COMMIT_MSG=$(git log --no-merges -1 --oneline)

# The commit marker "[cd build]" will trigger the build when required
if [[ "$GITHUB_EVENT_NAME" == push ||
      "$GITHUB_EVENT_NAME" == schedule ||
      "$COMMIT_MSG" =~ \[cd\ build\] ]]; then
    echo "::set-output name=build::true"
fi
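The script talks to the rest of the workflow through the ::set-output command: when it prints ::set-output name=build::true, the value is exposed as needs.check_build_trigger.outputs.build and gates the build jobs. A quick local sanity check (illustrative only, not part of the commit) is to fake the event name that GitHub Actions would set:

    # Simulate the events that should always trigger a build.
    GITHUB_EVENT_NAME=schedule bash build_tools/github/check_build_trigger.sh
    # expected output: ::set-output name=build::true

    # On pull requests, nothing is printed unless the last commit message
    # contains the "[cd build]" marker.
    GITHUB_EVENT_NAME=pull_request bash build_tools/github/check_build_trigger.sh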

build_tools/github/upload_anaconda.sh

18 additions & 0 deletions
@@ -0,0 +1,18 @@
#!/bin/bash

set -e
set -x

if [ "$GITHUB_EVENT_NAME" == "schedule" ]; then
    ANACONDA_ORG="scipy-wheels-nightly"
    ANACONDA_TOKEN="$SCIKIT_LEARN_NIGHTLY_UPLOAD_TOKEN"
else
    ANACONDA_ORG="scikit-learn-wheels-staging"
    ANACONDA_TOKEN="$SCIKIT_LEARN_STAGING_UPLOAD_TOKEN"
fi

conda install -q -y anaconda-client

# Force a replacement if the remote file already exists
anaconda -t $ANACONDA_TOKEN upload --force -u $ANACONDA_ORG dist/*
echo "Index: https://pypi.anaconda.org/$ANACONDA_ORG/simple"

build_tools/github/vendor_vcomp140.py

70 additions & 0 deletions
@@ -0,0 +1,70 @@
"""Embed vcomp140.dll after generating the scikit-learn Windows wheel."""


import os
import os.path as op
import shutil
import sys
import textwrap


TARGET_FOLDER = op.join("sklearn", ".libs")
DISTRIBUTOR_INIT = op.join("sklearn", "_distributor_init.py")
VCOMP140_SRC_PATH = "C:\\Windows\System32\\vcomp140.dll"  # noqa


def make_distributor_init(distributor_init, dll_filename):
    """Create a _distributor_init.py file for the vcomp140.dll.

    This file is imported first when importing the
    sklearn package so as to pre-load the vendored
    vcomp140.dll.
    """
    with open(distributor_init, "wt") as f:
        f.write(textwrap.dedent("""
            '''Helper to preload vcomp140.dll to prevent "not found" errors.

            Once the vcomp140.dll is preloaded, the namespace is made
            available to any subsequent vcomp140.dll. This is created
            as part of the scripts that build the wheel.
            '''

            import os
            import os.path as op
            from ctypes import WinDLL


            if os.name == "nt":
                # Load the vcomp140.dll in sklearn/.libs by convention
                dll_path = op.join(op.dirname(__file__), ".libs", "{0}")
                WinDLL(op.abspath(dll_path))
            """.format(dll_filename)))


def main(wheel_dirname):
    """Embed the vcomp140.dll in the wheel."""
    if not op.exists(VCOMP140_SRC_PATH):
        raise ValueError(f"Could not find {VCOMP140_SRC_PATH}.")

    if not op.isdir(wheel_dirname):
        raise RuntimeError(f"Could not find {wheel_dirname} file.")

    dll_filename = op.basename(VCOMP140_SRC_PATH)
    target_folder = op.join(wheel_dirname, TARGET_FOLDER)
    distributor_init = op.join(wheel_dirname, DISTRIBUTOR_INIT)

    # Create the "sklearn/.libs" subfolder
    if not op.exists(target_folder):
        os.mkdir(target_folder)

    print(f"Copying {VCOMP140_SRC_PATH} to {target_folder}.")
    shutil.copy2(VCOMP140_SRC_PATH, target_folder)

    # Generate the _distributor_init file in the source tree
    print("Generating the '_distributor_init.py' file.")
    make_distributor_init(distributor_init, dll_filename)


if __name__ == "__main__":
    _, wheel_file = sys.argv
    main(wheel_file)
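The workflow invokes this script from CIBW_REPAIR_WHEEL_COMMAND_WINDOWS, between wheel unpack and wheel pack, so that vcomp140.dll and the generated _distributor_init.py end up inside the repacked wheel. A hedged sketch of that sequence for a single wheel (the wheel filename is a placeholder; assumes a Windows machine with the wheel package installed):

    # Unpack the wheel, vendor the DLL, then repack it in the current directory.
    wheel unpack scikit_learn-0.24.dev0-cp38-cp38-win_amd64.whl
    python build_tools/github/vendor_vcomp140.py scikit_learn-0.24.dev0
    wheel pack scikit_learn-0.24.dev0 -d .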

doc/developers/contributing.rst

1 addition & 0 deletions
@@ -496,6 +496,7 @@ message, the following actions are taken.
 Commit Message Marker    Action Taken by CI
 ----------------------   -------------------
 [ci skip]                CI is skipped completely
+[cd build]               CD is run (wheels and source distribution are built)
 [lint skip]              Azure pipeline skips linting
 [scipy-dev]              Add a Travis build with our dependencies (numpy, scipy, etc ...) development builds
 [icc-build]              Add a Travis build with the Intel C compiler (ICC)
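For example (illustrative only, not part of the commit), an otherwise empty commit carrying the new marker is enough to request the wheel and sdist builds on a pull request:

    # The "[cd build]" marker in the message makes check_build_trigger.sh
    # emit build=true for this pull request.
    git commit --allow-empty -m "Check the wheel builds [cd build]"
    git push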

sklearn/cross_decomposition/tests/test_pls.py

1 addition & 0 deletions
@@ -382,6 +382,7 @@ def test_copy(Est):
         pls.predict(X.copy(), copy=False))


+@pytest.mark.xfail
 @pytest.mark.parametrize('Est', (CCA, PLSCanonical, PLSRegression, PLSSVD))
 def test_scale_and_stability(Est):
     # scale=True is equivalent to scale=False on centered/scaled data
