docs: modernize some of the pages
Signed-off-by: Henry Schreiner <[email protected]>
henryiii committed Feb 2, 2024
1 parent c598175 commit aa7ace3
Showing 4 changed files with 59 additions and 46 deletions.
42 changes: 13 additions & 29 deletions docs/pages/guides/gha_wheels.md
@@ -13,9 +13,7 @@ custom_title: GitHub Actions for Binary Wheels

Building binary wheels is a bit more involved, but can still be done effectively
with GHA. This document will introduce [cibuildwheel][] for use in your project.
The benefits of cibuildwheel are a larger user base, fast fixes from CI and pip,
works on all major CI vendors (no lock-in), and covers cases we were not able to
cover (like ARM). We will focus on GHA below.
We will focus on GHA below.

## Header

@@ -58,21 +56,11 @@ test-extras = "test"
test-command = "pytest {project}/tests"
# Optional
build-verbosity = 1
# Optional: support Universal2 for Apple Silicon with these two lines:
[tool.cibuildwheel.macos]
archs = ["auto", "universal2"]
test-skip = ["*universal2:arm64"]
```

The `test-extras` will cause the pip install to use `[test]`. The `test-command`
will use pytest to run your tests. You can also set the build verbosity (`-v` in
pip) if you want to. If you support Apple Silicon, you can add the final two
lines, the first of which enables the `universal2` wheel, which has both Intel
and AS architectures in it, and the second explicitly skips testing the AS part
of the wheel, since it can't be tested on Intel. If you use CMake instead of
pure setuptools, you will likely need further customizations for AS
cross-compiling. Only Python 3.8+ supports Apple Silicon.
pip) if you want to.

## Making an SDist

@@ -114,7 +102,7 @@ build_wheels:
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest, windows-latest, macos-latest]
os: [ubuntu-latest, windows-latest, macos-13, macos-14]
steps:
- uses: actions/checkout@v4
@@ -139,7 +127,12 @@ builds nicely into a wheel without strange customizations (if you _really_ need
them, check out [`CIBW_BEFORE_BUILD`][] and [`CIBW_ENVIRONMENT`][]).
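
If you do end up needing them, here is a hedged sketch of both options; the
command and variable values are purely illustrative, not a recommendation:

```yaml
env:
  # Run inside the build environment before each wheel is built (illustrative).
  CIBW_BEFORE_BUILD: "pip install cython"
  # Extra environment variables for the compile step (illustrative).
  CIBW_ENVIRONMENT: "CFLAGS='-O2'"
```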

This lists all three OS's; if you do not support Windows, you can remove that
here.
here. If you would rather make universal2 wheels for macOS, you can remove
either the Intel (`macos-13`) or Apple Silicon (`macos-14`) job and set
`CIBW_ARCHS_MACOS` to `"universal2"`. You can also set `CIBW_TEST_SKIP` to
`"*universal2:arm64"` when building on Intel, to acknowledge that the Apple
Silicon half of the wheel can't be tested there. You can set both options in
the `pyproject.toml` file instead if you prefer.
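
For illustration, a minimal sketch of that single-job route; the job name and
the `pipx run cibuildwheel` invocation are assumptions (your workflow may run
cibuildwheel differently), while the two options are the ones named above:

```yaml
build_macos_universal2:
  runs-on: macos-13
  steps:
    - uses: actions/checkout@v4

    - name: Build universal2 wheels
      run: pipx run cibuildwheel --output-dir wheelhouse
      env:
        CIBW_ARCHS_MACOS: "universal2"
        # The arm64 half of a universal2 wheel can't be tested on an Intel runner.
        CIBW_TEST_SKIP: "*universal2:arm64"
```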

The build step is controlled almost exclusively through environment variables,
which makes it easier (usually) to set up in CI. The main variable needed here is
@@ -148,19 +141,10 @@ usually `CIBW_BUILD` to select the platforms you want to build for - see the
alternative architectures need emulation, so are not shown here (adds one extra
step).
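
As a sketch, narrowing the build with environment variables might look like the
following; the selectors are illustrative, and `CIBW_SKIP` is a related option
not discussed above:

```yaml
env:
  # Only build CPython 3.9-3.12 wheels (illustrative selectors).
  CIBW_BUILD: "cp39-* cp310-* cp311-* cp312-*"
  # For example, skip musllinux builds.
  CIBW_SKIP: "*-musllinux_*"
```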

You can also select different base images (the _default_ is manylinux2010). If
you want manylinux1, just do:

```yaml
env:
CIBW_MANYLINUX_X86_64_IMAGE: manylinux1
CIBW_MANYLINUX_I686_IMAGE: manylinux1
```

You can even put any docker image here, as needed by the project. Note that
manylinux1 was discontinued on Jan 1, 2022, and updates will cease whenever they
break. If you always need a specific image, you can set that in the
`pyproject.toml` file instead.
You can also select different base images (the _default_ is manylinux2014). If
you want a different supported image, set `CIBW_MANYLINUX_X86_64_IMAGE`,
`CIBW_MANYLINUX_I686_IMAGE`, etc. If you always need a specific image, you can
set that in the `pyproject.toml` file instead.
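
For example, a sketch of overriding the x86_64 image; the `manylinux_2_28`
choice is illustrative:

```yaml
env:
  # Use a newer image than the manylinux2014 default (illustrative choice).
  CIBW_MANYLINUX_X86_64_IMAGE: manylinux_2_28
```

The corresponding `pyproject.toml` option, assuming a current cibuildwheel, is
`manylinux-x86_64-image` under `[tool.cibuildwheel]`.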

## Publishing

39 changes: 24 additions & 15 deletions docs/pages/guides/packaging_classic.md
@@ -12,13 +12,13 @@ parent: Topical Guides

The libraries in the scientific Python ecosystem have a variety of different
packaging styles, but this document is intended to outline a recommended style
that new packages should follow, and existing packages should slowly adopt. The
reasoning for each decision is outlined as well.
that existing packages should slowly adopt. The reasoning for each decision is
outlined as well.

There are currently several popular packaging systems. This guide covers
[Setuptools][], which is currently the only system that supports compiled
extensions. If you are not planning on writing C/C++ code, other systems like
[Hatch][] are drastically simpler - most of this page is unneeded for those
There are several popular packaging systems. This guide covers [Setuptools][],
which is the oldest system and supports compiled extensions. If you are not
working on legacy code or are willing to make a larger change, other systems
like [Hatch][] are drastically simpler - most of this page is unneeded for those
systems.

Also see the [Python packaging guide][], especially the [Python packaging
@@ -81,14 +81,10 @@ have dev instructions on how to install requirements needed to run `setup.py`.

You can also use this to select your entire build system; we use setuptools
above but you can also use others, such as [Flit][] or [Poetry][]. This is
possible due to the `build-backend` selection, as described in PEP 517.
Scientific Python packages don't often use these since they usually do not allow
binary packages to be created and a few common developer needs, like editable
installs, look slightly different (a way to include editable installs in PEP 517
is being worked on). Usage of these "[hypermodern][]" packaging tools are
generally not found in scientific Python packages, but not discouraged; all
tools build the same wheels (and they often build setuptools compliant SDists,
as well).
possible due to the `build-backend` selection, as described in PEP 517. Usage of
these "[hypermodern][]" packaging tools is growing in scientific Python
packages. All tools build the same wheels (and they often build setuptools
compliant SDists, as well).
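
For instance, a sketch of what the `build-system` table looks like with Flit
selected instead (the version pin is illustrative):

```toml
[build-system]
requires = ["flit_core>=3.4"]
build-backend = "flit_core.buildapi"
```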

{% rr PP003 %} Note that `"wheel"` is never required; it is injected
automatically by setuptools only when needed.
@@ -97,7 +93,7 @@

You may want to build against NumPy (mostly for Cython packages; pybind11 does
not need to access the NumPy headers). This is the recommendation for scientific
Python packages:
Python packages supporting older versions of NumPy:

```toml
requires = [
@@ -113,6 +109,19 @@ developers that tracks the
Otherwise, you would have to list the earliest version of NumPy that had support
for each Python version here.
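
A hedged sketch of such a build-requirements block (the setuptools pin is
illustrative):

```toml
[build-system]
requires = [
    "setuptools>=42",
    # Metapackage that pins the oldest NumPy supporting each Python version/platform.
    "oldest-supported-numpy",
]
build-backend = "setuptools.build_meta"
```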

{: .note }

> Modern versions of NumPy (1.25+) allow you to target older versions when
> building, which is _highly_ recommended, and this will become required in
> NumPy 2.0. Now you add:
>
> ```cpp
> #define NPY_TARGET_VERSION NPY_1_22_API_VERSION
> ```
>
> (where that number is whatever minimum version you support), then make sure
> you build with NumPy 1.25+ (or 2.0+ when it comes out).

## Versioning (medium/high priority)

Scientific Python packages should use one of the following systems:
20 changes: 20 additions & 0 deletions docs/pages/guides/packaging_compiled.md
@@ -335,6 +335,26 @@ Unlike pure Python, you'll need to build redistributable wheels for each
platform and supported Python version if you want to avoid compilation on the
user's system. See [the CI page on wheels][gha_wheels] for a suggested workflow.

## Special considerations

### NumPy

Modern versions of NumPy (1.25+) allow you to target older versions when
building, which is _highly_ recommended, and this will become required in NumPy
2.0. Now you add:

```cpp
#define NPY_TARGET_VERSION NPY_1_22_API_VERSION
```

(where that number is whatever minimum version you support), then make sure you
build with NumPy 1.25+ (or 2.0+ when it comes out). Before 1.25, you had to pin
the oldest NumPy you supported at build time (the `oldest-supported-numpy`
package is the easiest method). If you support Python < 3.9, you'll have to use
the old method for those versions.

If you use pybind11, you don't need NumPy at build time in the first place.
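
A hedged sketch of build requirements combining both approaches, assuming a
setuptools-based backend (the pins and markers are illustrative):

```toml
[build-system]
requires = [
    "setuptools>=61",
    # NumPy 1.25+ lets you target an older NumPy API via NPY_TARGET_VERSION.
    "numpy>=1.25; python_version >= '3.9'",
    # Older Pythons fall back to pinning the oldest supported NumPy.
    "oldest-supported-numpy; python_version < '3.9'",
]
build-backend = "setuptools.build_meta"
```
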
<!-- prettier-ignore-start -->
[scikit-build-core]: https://scikit-build-core.readthedocs.io
4 changes: 2 additions & 2 deletions src/sp_repo_review/checks/github.py
@@ -109,7 +109,7 @@ def check(workflows: dict[str, Any]) -> bool:

GH104_ERROR_MSG = """
Multiple upload-artifact usages _must_ have unique names to be
compatible with `v4` (which no longer merge artifacts, but instead
compatible with `v4` (which no longer merges artifacts, but instead
errors out). The most general solution is:
```yaml
@@ -130,7 +130,7 @@ class GH104(GitHub):
"Use unique names for upload-artifact"

requires = {"GH100"}
url = mk_url("gha-wheel")
url = mk_url("gha-wheels")

@staticmethod
def check(workflows: dict[str, Any]) -> str:
