# Continuous integration
See also: Workflows for linting Python, JavaScript and shell scripts, for releasing packages, and for checking translations.
Tip: If a workflow has:

```yaml
on:
  schedule:
    - cron: "..."
```

add a `workflow_dispatch` event, to be able to test the workflow by triggering it manually:

```yaml
on:
  workflow_dispatch:
  schedule:
    - cron: "..."
```
## Automated tests

Create a `.github/workflows/ci.yml` file, and use or adapt one of the templates below.
Workflows should have a single responsibility: running tests, linting Python, checking translations, deploying, etc. To connect workflows, read Events that trigger workflows and, in particular, Running a workflow based on the conclusion of another workflow.
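For example, a deployment workflow can run after the CI workflow completes. A minimal sketch using the `workflow_run` event (the workflow name and deploy step are illustrative):

```yaml
name: Deploy
on:
  workflow_run:
    workflows: [CI]
    types: [completed]
jobs:
  deploy:
    # Run only if the CI workflow succeeded.
    if: github.event.workflow_run.conclusion == 'success'
    runs-on: ubuntu-latest
    steps:
      - run: echo "deploy here"  # placeholder for real deployment steps
```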
- If the project is only used with a specific version of the OS or Python, set `runs-on:` and `python-version:` appropriately.
- If a `run:` step uses an `env:` key, put `env:` before `run:`, so that the reader is more likely to see the command with its environment.
- If a `run:` step is a single line, omit the `name:` key.
- Put commands that form logical units in the same `run:` step. For example:

  ```yaml
  - name: Install gettext
    run: |
      sudo apt update
      sudo apt install gettext
  ```

  Not:

  ```yaml
  - run: sudo apt update # WRONG
  - run: sudo apt install gettext # WRONG
  ```
Reference: Customizing GitHub-hosted runners
## Warnings

The step that runs tests should set either the `-W` option or the `PYTHONWARNINGS` environment variable to `error`.
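For example, either approach works (a sketch, assuming tests run with pytest):

```yaml
- run: pytest -W error
# or, to apply to every Python process in the step:
- env:
    PYTHONWARNINGS: error
  run: pytest
```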
Tip
The Python documentation describes warning filter specifications as using regular expressions. However, this is only true when using the warnings
module. If set using -W
or PYTHONWARNINGS
, the message and module parts are escaped using re.escape
, and the module part is suffixed with a \Z
anchor.
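In other words, a filter set from the command line or the environment is matched literally. A rough sketch of the equivalent `warnings.filterwarnings` call (the module name is hypothetical):

```python
import re
import warnings

# PYTHONWARNINGS=error::DeprecationWarning:myapp.utils is roughly equivalent to:
warnings.filterwarnings(
    "error",
    category=DeprecationWarning,
    # The module part is escaped and anchored, so it is not a free-form regex.
    module=re.escape("myapp.utils") + r"\Z",
)
```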
## Code coverage

All the templates below use Coveralls, the preferred code coverage service.
Tip: If needed, you can combine coverage results from multiple jobs.
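A minimal sketch using Coveralls parallel builds (the job names are illustrative, and `--finish` assumes coveralls-python 3.x):

```yaml
jobs:
  test:
    # ... checkout, setup and test steps as in the templates below ...
    steps:
      - env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          COVERALLS_PARALLEL: 'true'
        run: coveralls --service=github
  finish:
    needs: test
    runs-on: ubuntu-latest
    steps:
      - run: pip install coveralls
      - env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: coveralls --service=github --finish
```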
## Service containers

If the workflow requires service containers, add the `services:` key after the `steps:` key, so that files are easier to compare visually.
Note: Service containers are only available on Ubuntu runners.
### PostgreSQL

Set the image tag to the version used in production.
```yaml
services:
  postgres:
    image: postgres:13
    env:
      POSTGRES_PASSWORD: postgres
    options: >-
      --health-cmd pg_isready
      --health-interval 10s
      --health-timeout 5s
      --health-retries 5
    ports:
      - 5432/tcp
```
This connection string can be used in `psql` commands or in environment variables to set up the database or configure the application:

```
postgresql://postgres:postgres@localhost:${{ job.services.postgres.ports[5432] }}/postgres
```
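For example, a step can check connectivity before running migrations (an illustrative step; `psql` is preinstalled on GitHub-hosted Ubuntu runners):

```yaml
- name: Check database connectivity
  env:
    DATABASE_URL: postgresql://postgres:postgres@localhost:${{ job.services.postgres.ports[5432] }}/postgres
  run: psql "$DATABASE_URL" -c 'SELECT 1'
```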
Tip: If you are running out of connections, use the `cyberboss/postgres-max-connections` image, which is a fork of `postgres:latest` with `max_connections=500`.
Reference: Creating PostgreSQL service containers
### RabbitMQ

```yaml
services:
  rabbitmq:
    image: rabbitmq:latest
    options: >-
      --health-cmd "rabbitmqctl node_health_check"
      --health-interval 10s
      --health-timeout 5s
      --health-retries 5
    ports:
      - 5672/tcp
```
This connection string can be used:

```
amqp://127.0.0.1:${{ job.services.rabbitmq.ports[5672] }}
```
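For example, a step can verify that the broker is reachable (an illustrative step, assuming `pika` is installed):

```yaml
- name: Check broker connectivity
  env:
    RABBIT_URL: amqp://127.0.0.1:${{ job.services.rabbitmq.ports[5672] }}
  run: python -c "import os, pika; pika.BlockingConnection(pika.URLParameters(os.environ['RABBIT_URL']))"
```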
### Elasticsearch

Set the image tag to the version used in production.

```yaml
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.10.0
    env:
      discovery.type: single-node
    options: >-
      --health-cmd "curl localhost:9200/_cluster/health"
      --health-interval 10s
      --health-timeout 5s
      --health-retries 5
    ports:
      - 9200/tcp
```
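As with the other services, the mapped port is available from the job context (an illustrative step):

```yaml
- name: Check cluster health
  run: curl "localhost:${{ job.services.elasticsearch.ports[9200] }}/_cluster/health"
```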
## Templates

### Applications

Set the Ubuntu version and Python version to those used in production.

If using Django, use this template, replacing `core` if needed and adding app directories as comma-separated values after `--source`:
```yaml
name: CI
on: [push, pull_request]
jobs:
  build:
    if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
    runs-on: ubuntu-20.04
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: '3.10'
          cache: pip
          cache-dependency-path: '**/requirements*.txt'
      - run: pip install -r requirements.txt
      # Check requirements.txt contains production requirements.
      - run: ./manage.py --help
      - run: pip install -r requirements_dev.txt
      - name: Run checks and tests
        env:
          PYTHONWARNINGS: error
          DATABASE_URL: postgresql://postgres:postgres@localhost:${{ job.services.postgres.ports[5432] }}/postgres
        run: |
          ./manage.py migrate
          ./manage.py makemigrations --check --dry-run
          ./manage.py check --fail-level WARNING
          coverage run --source core manage.py test
      - env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: coveralls --service=github
    services:
      postgres:
        image: postgres:13
        env:
          POSTGRES_PASSWORD: postgres
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
        ports:
          - 5432/tcp
```
Note: Remember to add `__init__.py` files to the `management` and `management/commands` directories within app directories. Otherwise, their coverage won't be calculated.
Otherwise, use this template, replacing `APPNAME1`:
```yaml
name: CI
on: [push, pull_request]
jobs:
  build:
    if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: '3.10'
          cache: pip
          cache-dependency-path: '**/requirements*.txt'
      - run: pip install -r requirements_dev.txt
      - run: pytest --cov APPNAME1
      - env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: coveralls --service=github
```
### Packages

If using `tox`:
```yaml
name: CI
on: [push, pull_request]
jobs:
  build:
    if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [macos-latest, windows-latest, ubuntu-latest]
        python-version: [3.7, 3.8, 3.9, '3.10', '3.11', pypy-3.7]
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
      - run: pip install tox
      - env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: tox
```
Note: Do not use `tox` to test multiple Python versions. Use the matrix in GitHub Actions, instead. This makes it easier to install version-specific dependencies (like `libxml2-dev` for PyPy), and it makes exclusions more visible (like pypy-3.7 on Windows).
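For example, an exclusion is declared directly in the matrix:

```yaml
strategy:
  matrix:
    os: [macos-latest, windows-latest, ubuntu-latest]
    python-version: [3.7, 3.8, 3.9, '3.10', '3.11', pypy-3.7]
    exclude:
      - os: windows-latest
        python-version: pypy-3.7
```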
If not using `tox`, use this template, replacing `{{ cookiecutter.package_name }}` and removing the Jinja syntax if not using the Cookiecutter template:
```yaml+jinja
name: CI
on: [push, pull_request]
jobs:
  build:
    if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
{%- if cookiecutter.os_independent == "y" %}
{%- raw %}
    runs-on: ${{ matrix.os }}
{%- endraw %}
{%- else %}
    runs-on: ubuntu-latest
{%- endif %}
    strategy:
      matrix:
{%- if cookiecutter.os_independent == "y" %}
        os: [macos-latest, windows-latest, ubuntu-latest]
{%- endif %}
        python-version: [3.7, 3.8, 3.9, '3.10', '3.11'{% if cookiecutter.pypy == "y" %}, pypy-3.7{% endif %}]
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
{%- raw %}
          python-version: ${{ matrix.python-version }}
{%- endraw %}
          cache: pip
          cache-dependency-path: setup.cfg
      - run: pip install .[test]
      - run: pytest -W error --cov {{ cookiecutter.package_name }}
      - env:
{%- raw %}
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
{%- endraw %}
        run: coveralls --service=github
```
Test packages on Python versions that aren’t end-of-life, and on the latest version of PyPy. Test on Ubuntu, macOS and Windows (though only Ubuntu if a service container is needed).
If the package has optional support for orjson, replace the `pytest` step with the following steps, replacing `PACKAGENAME`, so that the tests run with and without orjson on CPython, and without orjson on PyPy:
# "orjson does not support PyPy" and fails to install. https://pypi.org/project/orjson/
- if: matrix.python-version != 'pypy-3.7'
name: Test
run: |
coverage run --append --source=PACKAGENAME -m pytest
pip install orjson
coverage run --append --source=PACKAGENAME -m pytest
pip uninstall -y orjson
- if: matrix.python-version == 'pypy-3.7'
name: Test
run: pytest --cov PACKAGENAME
### Static files
For example, the Extension Registry mainly contains static files. Tests are used to validate the files.
```yaml
name: CI
on: [push, pull_request]
jobs:
  build:
    if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: '3.10'
          cache: pip
          cache-dependency-path: '**/requirements*.txt'
      - run: pip install -r requirements.txt
      - run: pytest
```
## Dependabot

Keep GitHub Actions up-to-date with a `.github/dependabot.yml` file:
```yaml
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "daily"
```
Reference: Configuration options for dependency updates
## Maintenance

Find unexpected workflows:

```bash
find . -path '*/workflows/*' ! -name ci.yml ! -name lint.yml ! -name mypy.yml ! -name js.yml ! -name shell.yml ! -name spellcheck.yml ! -name i18n.yml ! -name pypi.yml ! -name docker.yml ! -path '*/node_modules/*' ! -path '*/vendor/*'
```
Find `ci.yml` files without `lint.yml` files, and vice versa:

```bash
find . \( -name lint.yml \) -exec bash -c 'if [[ -z $(find $(echo {} | cut -d/ -f2) -name ci.yml) ]]; then echo {}; fi' \;
find . \( -name ci.yml \) -not -path '*/node_modules/*' -exec bash -c 'if [[ -z $(find $(echo {} | cut -d/ -f2) -name lint.yml) ]]; then echo {}; fi' \;
```
Find and compare `lint.yml` files:

```bash
find . -name lint.yml -exec bash -c 'sha=$(shasum {} | cut -d" " -f1); if [[ ! "9773a893d136df0dc82deddedd8af8563969c04a 9222eac95ab63f3c2d983ba3cf4629caea53a72e fc3eff616a7e72f41c96e48214d244c9058dbc83 953ef7f0815d49226fd2d05db8df516fff2e3fdb dfe1c0d1fbdb18bb1e2b3bcfb1f0c10fe6b06bc4" =~ $sha ]]; then echo -e "\n\033[0;32m{}\033[0m"; echo $sha; cat {}; fi' \;
```
Find and compare `js.yml` files:

```bash
find . -name js.yml -exec bash -c 'echo $(tail -r {} | tail +2 | tail -r | shasum - | cut -d" " -f1) {}' \;
```
Find and compare `shell.yml` files:

```bash
find . -name shell.yml -exec bash -c 'echo $(shasum {} | cut -d" " -f1) {}' \;
```
Find repositories with shell scripts but without `shell.yml` files:

```bash
find . \( -path '*/script/*' -o -name '*.sh' \) -not -path '*/node_modules/*' -not -path '*/vendor/*' -exec bash -c 'if [[ -z $(find $(echo {} | cut -d/ -f2) -name shell.yml) ]]; then echo {}; fi' \;
```
Find and compare `pypi.yml` files:

```bash
find . -name pypi.yml -exec bash -c 'echo $(shasum {} | cut -d" " -f1) {}' \;
```
Find repositories for Python packages but without `pypi.yml` files:

```bash
find . -name setup.cfg -not -path '*/node_modules/*' -exec bash -c 'if grep long_description {} > /dev/null && [[ -z $(find $(echo {} | cut -d/ -f2) -name pypi.yml) ]]; then echo {}; fi' \;
```
Find and compare `i18n.yml` files:

```bash
find . -name i18n.yml -exec bash -c 'echo $(shasum {} | cut -d" " -f1) {}' \;
```
Find repositories with `LC_MESSAGES` directories but without `i18n.yml` files:

```bash
find . -name LC_MESSAGES -not -path '*/en/*' -exec bash -c 'if [[ -z $(find $(echo {} | cut -d/ -f2) -name i18n.yml) ]]; then echo {}; fi' \;
```
## Reference

The following prevents GitHub Actions from running a workflow twice when pushing to the branch of a pull request:

```yaml
if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
```
Note: A common configuration for GitHub Actions is:

```yaml
on:
  push:
    branches: [main, master]
  pull_request:
    branches: [main, master]
```
However, this means the workflow won't run for a push to a non-PR branch. Some developers only open a PR when ready for review, rather than as soon as they push the branch. In such cases, it's important for the developer to receive feedback from the workflow.

This also means the workflow won't run for a pull request whose base branch isn't a default branch. Sometimes, we create PRs on non-default branches, for example, when doing a rewrite, as with the `django` branch of Kingfisher Process.

To correct for both scenarios, we use `on: [push, pull_request]`, and then use the above condition to avoid duplicate runs.
Note that, in standards repositories, we have many protected branches (like `1.0` and `1.0-dev`) that are not "main" or "master". The above setup avoids accidentally excluding relevant branches.