Continuous integration

Tip

If a workflow has:

on:
  schedule:
    - cron: "..."

Add a workflow_dispatch event, so that the workflow can also be tested by triggering it manually:

on:
  workflow_dispatch:
  schedule:
    - cron: "..."

Automated tests

Create a .github/workflows/ci.yml file, and use or adapt one of the templates below.

  • Workflows should have a single responsibility: running tests, linting Python, checking translations, deploying, etc. To connect workflows, read Events that trigger workflows and, in particular, Running a workflow based on the conclusion of another workflow.

  • If the project is only used with a specific version of the OS or Python, set runs-on: and python-version: appropriately.

  • If a run: step uses an env: key, put env: before run:, so that the reader is more likely to see the command with its environment.

  • If a run: step is a single line, omit the name: key.

  • Put commands that form logical units in the same run: step. For example:

    - name: Install gettext
      run: |
        sudo apt update
        sudo apt install gettext
    

    Not:

    - run: sudo apt update # WRONG
    - run: sudo apt install gettext # WRONG
    

Reference: Customizing GitHub-hosted runners

Python warnings

The step that runs tests should set either the -W option or the PYTHONWARNINGS environment variable to error.

Tip

The Python documentation describes warning filter specifications as using regular expressions. However, this is only true when using the warnings module. If set using -W or PYTHONWARNINGS, the message and module parts are escaped using re.escape, and the module part is suffixed with a \Z anchor.
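To see the effect of promoting warnings to errors, here is a minimal sketch (not part of the templates) that runs the same snippet with and without -W error:

```python
import subprocess
import sys

# A snippet that emits a warning, run with and without -W error.
code = "import warnings; warnings.warn('deprecated API')"

default = subprocess.run([sys.executable, "-c", code], capture_output=True)
strict = subprocess.run([sys.executable, "-W", "error", "-c", code], capture_output=True)

print(default.returncode)  # 0: the warning is only printed to stderr
print(strict.returncode)   # 1: the warning is raised as an exception
```

The same applies when PYTHONWARNINGS=error is set in the step's env: key instead of passing -W error.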

Code coverage

All the templates below use Coveralls, the preferred code coverage service.

Service containers

If the workflow requires service containers, add the services: key after the steps: key, so that files are easier to compare visually.

Note

Service containers are only available on Ubuntu runners.

Mock APIs

Use the mccutchen/go-httpbin image to mock APIs. For example:

steps:
  # ...
  - env:
      TEST_URL: http://localhost:${{ job.services.httpbin.ports[8080] }}
    run: coverage run --source=MODULENAME -m pytest -W error
services:
  httpbin:
    image: mccutchen/go-httpbin:latest
    ports:
      - 8080/tcp
import os

import requests

TEST_URL = os.getenv("TEST_URL", "http://httpbingo.org")


def test_200():
    assert requests.get(f"{TEST_URL}/status/200").status_code == 200

Note

Services to mock APIs include httpbin, RequestBin, Postman Echo, PostBin, etc.

PostgreSQL

Set the image tag to the version used in production.

services:
  postgres:
    image: postgres:15
    env:
      POSTGRES_PASSWORD: postgres
    options: >-
      --health-cmd pg_isready
      --health-interval 10s
      --health-timeout 5s
      --health-retries 5
    ports:
      - 5432/tcp

This connection string can be used in psql commands or in environment variables to set up the database or configure the application:

postgresql://postgres:postgres@localhost:${{ job.services.postgres.ports[5432] }}/postgres
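If the application reads the expanded string from an environment variable, it can be decomposed with the standard library. A sketch, assuming the runner assigned host port 49153 to 5432/tcp:

```python
from urllib.parse import urlsplit

# Hypothetical expanded connection string; the host port is assigned at runtime.
url = urlsplit("postgresql://postgres:postgres@localhost:49153/postgres")

print(url.username, url.password)  # postgres postgres
print(url.hostname, url.port)      # localhost 49153
print(url.path.lstrip("/"))        # postgres (the database name)
```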

Tip

If you are running out of connections, use the cyberboss/postgres-max-connections image, which is a fork of postgres:latest with max_connections=500.

Reference: Creating PostgreSQL service containers

RabbitMQ

services:
  rabbitmq:
    image: rabbitmq:latest
    options: >-
      --health-cmd "rabbitmqctl node_health_check"
      --health-interval 10s
      --health-timeout 5s
      --health-retries 5
    ports:
      - 5672/tcp

This connection string can be used to configure the application:

amqp://127.0.0.1:${{ job.services.rabbitmq.ports[5672] }}

Elasticsearch

Set the image tag to the version used in production.

services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.10.0
    env:
      discovery.type: single-node
    options: >-
      --health-cmd "curl localhost:9200/_cluster/health"
      --health-interval 10s
      --health-timeout 5s
      --health-retries 5
    ports:
      - 9200/tcp

Templates

Applications

Set the Ubuntu version and Python version to those used in production.

If using Django, use this template, replacing core if needed and adding app directories as comma-separated values after --source:

Note

Remember to add __init__.py files to the management and management/commands directories within app directories. Otherwise, their coverage won’t be calculated.
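A small helper (an illustration, not part of the original guidance) to list management and management/commands directories that are missing __init__.py files:

```python
from pathlib import Path


def missing_init_files(root="."):
    """Yield management/ and management/commands/ directories without __init__.py."""
    for management in Path(root).glob("**/management"):
        for directory in (management, management / "commands"):
            if directory.is_dir() and not (directory / "__init__.py").exists():
                yield directory


if __name__ == "__main__":
    for directory in missing_init_files():
        print(directory)
```

Run it from the repository root; any directory it prints will be excluded from coverage measurement.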

name: CI
on: [push, pull_request]
jobs:
  build:
    if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '{{ cookiecutter.python_version }}'
          cache: pip
          cache-dependency-path: '**/requirements*.txt'
      - run: pip install -r requirements.txt
      # Check requirements.txt contains production requirements.
      - run: ./manage.py --help
      - run: pip install -r requirements_dev.txt
      - name: Run checks and tests
        env:
          PYTHONWARNINGS: error
          DATABASE_URL: postgresql://postgres:postgres@localhost:${{ "{{" }} job.services.postgres.ports[5432] }}/postgres
        shell: bash
        run: |
          ./manage.py migrate
          ./manage.py makemigrations --check --dry-run
          ./manage.py check --fail-level WARNING
          coverage run --source core manage.py test
      - uses: coverallsapp/github-action@v2
    services:
      postgres:
        image: postgres:15
        env:
          POSTGRES_PASSWORD: postgres
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
        ports:
          - 5432/tcp

Otherwise, use this template, replacing APPNAME1:

name: CI
on: [push, pull_request]
jobs:
  build:
    if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.10'
          cache: pip
          cache-dependency-path: '**/requirements*.txt'
      - run: pip install -r requirements_dev.txt
      - run: coverage run --source=APPNAME1 -m pytest -W error
      - uses: coverallsapp/github-action@v2

Packages

If using tox:

name: CI
on: [push, pull_request]
jobs:
  build:
    if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [macos-latest, windows-latest, ubuntu-latest]
        python-version: [3.9, '3.10', '3.11', '3.12', pypy-3.10]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      - run: pip install coverage tox
      - run: tox -- coverage run --source=PACKAGENAME --append -m pytest
      - run: coverage lcov
      - uses: coverallsapp/github-action@v2

Note

Use matrix in GitHub Actions to test multiple Python versions, not tox. This makes it easier to install version-specific dependencies (like libxml2-dev for PyPy), and it makes exclusions more visible (like pypy-3.10 on Windows).

If not using tox, use this template, replacing {{ cookiecutter.package_name }} and removing the Jinja syntax if not using the Cookiecutter template:

name: CI
on: [push, pull_request]
jobs:
  build:
    if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
{%- if cookiecutter.os_independent == "y" %}
    {%- raw %}
    runs-on: ${{ matrix.os }}
    {%- endraw %}
{%- else %}
    runs-on: ubuntu-latest
{%- endif %}
    strategy:
      matrix:
{%- if cookiecutter.os_independent == "y" %}
        os: [macos-latest, windows-latest, ubuntu-latest]
{%- endif %}
        python-version: [3.9, '3.10', '3.11', '3.12'{% if cookiecutter.pypy == "y" %}, pypy-3.10{% endif %}]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          {%- raw %}
          python-version: ${{ matrix.python-version }}
          {%- endraw %}
          cache: pip
          cache-dependency-path: pyproject.toml
      - run: pip install .[test]
      - run: coverage run --source={{ cookiecutter.package_name }} -m pytest -W error
      - uses: coverallsapp/github-action@v2

Test packages on Python versions that aren’t end-of-life, and on the latest version of PyPy. Test on Ubuntu, macOS and Windows (though only Ubuntu if a service container is needed).

If the package has optional support for orjson, to test on PyPy, replace the pytest step with the following steps, replacing PACKAGENAME:

# "orjson does not support PyPy" and fails to install. https://pypi.org/project/orjson/
- if: ${{ matrix.python-version != 'pypy-3.10' }}
  name: Test
  shell: bash
  run: |
    coverage run --source=PACKAGENAME --append -m pytest
    pip install orjson
    coverage run --source=PACKAGENAME --append -m pytest
    pip uninstall -y orjson
- if: ${{ matrix.python-version == 'pypy-3.10' }}
  name: Test
  run: coverage run --source=PACKAGENAME -m pytest

Static files

For example, the Extension Registry mainly contains static files. Tests are used to validate the files.

name: CI
on: [push, pull_request]
jobs:
  build:
    if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
          cache: pip
          cache-dependency-path: '**/requirements*.txt'
      - run: pip install -r requirements_dev.txt
      - run: pytest -W error

Dependabot

Keep GitHub Actions up-to-date with:

.github/dependabot.yml
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "daily"

Reference: Configuration options for dependency updates

Maintenance

Find unexpected workflows:

find . -path '*/workflows/*' ! -path '*/node_modules/*' ! -path '*/vendor/*' ! -name a11y.yml ! -name automerge.yml ! -name ci.yml ! -name deploy.yml ! -name docker.yml ! -name i18n.yml ! -name js.yml ! -name lint.yml ! -name pypi.yml ! -name shell.yml ! -name release.yml ! -name spellcheck.yml

Find ci.yml files without lint.yml files, and vice versa:

find . \( -name lint.yml \) -exec bash -c 'if [[ -z $(find $(echo {} | cut -d/ -f2) -name ci.yml) ]]; then echo {}; fi' \;
find . \( -name ci.yml \) ! -path '*/node_modules/*' -exec bash -c 'if [[ -z $(find $(echo {} | cut -d/ -f2) -name lint.yml) ]]; then echo {}; fi' \;

Find and compare pypi.yml files:

find . -name pypi.yml -exec bash -c 'echo $(shasum {} | cut -d" " -f1) {}' \;

Find repositories for Python packages but without pypi.yml files:

find . -name pyproject.toml ! -path '*/node_modules/*' -exec bash -c 'if grep classifiers {} > /dev/null && [[ -z $(find $(echo {} | cut -d/ -f2) -name pypi.yml) ]]; then echo {}; fi' \;

Find repositories with LC_MESSAGES directories but without i18n.yml files:

find . -name LC_MESSAGES ! -path '*/en/*' -exec bash -c 'if [[ -z $(find $(echo {} | cut -d/ -f2) -name i18n.yml) ]]; then echo {}; fi' \;

Reference

The following prevents GitHub Actions from running a workflow twice when pushing to the branch of a pull request:

if: ${{ github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository }}

Note

A common configuration for GitHub Actions is:

on:
  push:
    branches: [main, master]
  pull_request:
    branches: [main, master]

However, this means the workflow won’t run for a push to a non-PR branch. Some developers only open a PR when ready for review, rather than as soon as they push the branch. In such cases, it’s important for the developer to receive feedback from the workflow.

This also means the workflow won’t run for a pull request whose base branch isn’t a default branch. Sometimes, we create PRs on non-default branches, for example when doing a rewrite, such as the django branch of Kingfisher Process.

To correct for both scenarios, we use on: [push, pull_request], and then use the above condition to avoid duplicate runs.

Note that, in standards repositories, we have many protected branches (like 1.0 and 1.0-dev) that are not “main” or “master”. The above setup avoids accidentally excluding relevant branches.