
Pixi dependency management#301

Draft
sbrandstaeter wants to merge 90 commits into queens-py:main from sbrandstaeter:pixi-dependency-management

Conversation


@sbrandstaeter sbrandstaeter commented Apr 19, 2026

Summary of big changes:

  • switch to pixi for dependency management (keeping backward compatibility with pip and conda)
  • switch to Python 3.12 and bump versions of dependencies
  • run tests on macOS -> official support?

Minor changes:

  • clean up GitHub Actions
  • switch to pip-licenses to check the licenses of all production dependencies (including transitive dependencies)
  • fix several pylint errors that popped up with the newer version
  • fix some warnings in the documentation

Description and Context:
What and Why?

This PR moves the project to pixi as the main dependency and environment manager.

pixi gives us a modern workflow for mixed Conda/PyPI environments:

  • Conda and PyPI dependencies can be declared together
  • Conda takes precedence where applicable
  • PyPI dependencies are resolved via uv under the hood
  • environment creation and locking are fast and reproducible

At the same time, this setup does not force users to adopt pixi. QUEENS can still be used through standard Python packaging workflows such as pip install -e ., and pure-PyPI workflows using uv remain possible. Even so, I strongly recommend trying pixi: it is easy to use and very fast.

Reminder:

One major reason for this change is that we need to bump the versions of our dependencies.
This is also done in this PR, starting with the switch to Python 3.12.
The new versions also require several changes and adaptations to the tests.
One adaptation that has already been done is the switch to pip-licenses for checking the licenses of QUEENS' dependencies, including transitive dependencies (dependencies of direct dependencies).

What Changed Compared To main

This branch introduces:

  • pixi workspace-based dependency and environment management
  • a pixi.lock-based reproducible environment workflow
  • update to Python 3.12 and bump of many dependency versions
  • CI updates for GitHub and GitLab to build and test against pixi-managed environments
  • remote environment bootstrapping aligned with pixi
  • consistency checks between PEP-style dependency declarations and pixi dependency declarations
  • pre-commit and CI checks to prevent those duplicate dependency definitions from drifting apart
  • lockfile integrity checks in CI so dependency changes and stale lockfiles are surfaced early

Why We Now Have Duplicate Dependency Definitions

A central part of this setup is that dependencies are now defined in two forms:

  1. PEP-compliant package metadata:
  • project.dependencies
  • dependency-groups
  • project.optional-dependencies
  2. pixi dependency definitions:
  • tool.pixi.dependencies
  • tool.pixi.pypi-dependencies
  • tool.pixi.feature.*

This duplication is intentional.

The PEP-style declarations are needed for standard Python packaging metadata and pure-PyPI workflows.
The pixi declarations are needed to drive mixed Conda/PyPI environment resolution efficiently and reproducibly.
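
As a minimal sketch of the two views side by side (package names and versions here are illustrative, not the actual QUEENS dependency set), the duplicated declarations in pyproject.toml might look like:

```toml
[project]
# PEP 621 metadata: used by pip, uv, and other standards-based tooling
dependencies = ["numpy >=1.26"]

[tool.pixi.dependencies]
# Conda packages: resolved from conda channels, take precedence
numpy = ">=1.26"

[tool.pixi.pypi-dependencies]
# PyPI-only packages: resolved via uv under the hood
some-pypi-only-package = "*"
```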

To make sure these two views do not diverge, this PR adds automated integrity checks in:

  • pre-commit
  • GitHub Actions

CI / Workflow Improvements

The pipeline now:

  • uses pixi-managed environments
  • checks dependency declaration integrity
  • checks whether dependency-related pyproject.toml changes would require a lockfile update
  • fails when dependency declarations changed and pixi.lock is stale

This gives us much better guardrails around environment reproducibility.
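
Such a guard can be sketched as a single workflow step (the step name is illustrative, not the PR's actual workflow file; `pixi install --locked` errors out when pyproject.toml and pixi.lock disagree):

```yaml
# Hypothetical GitHub Actions step, not the actual workflow from this PR.
- name: Verify pixi.lock is in sync with pyproject.toml
  run: pixi install --locked --environment queens-dev
```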

What Is Still Missing

One thing not fully supported yet, but straightforward to add, is a pure Conda environment-management workflow for users who want to stay entirely in Conda-style tooling.

The natural path for that is:

  • exporting environments from pixi
  • using those exported environment files for Conda-only workflows

This should be added next.

Possible Next Steps

Future improvements that would fit well on top of this work:

  • support pure Conda environment management via pixi environment export
  • test multiple environment flavors in CI, for example:
    • a locked mixed Conda/PyPI pixi environment
    • a pure PyPI install/test flow
  • potentially test both unlocked developer workflows and locked reproducible workflows explicitly

Short Usage Tutorials

1. Using pixi

Install pixi, then from the repository root:

Do the following once (per environment):

pixi install --environment queens-dev
pixi run -e queens-dev pip install -e ./ --no-deps

Useful environments currently include:

  • queens-base
  • queens-all
  • queens-dev

Then you can, for example, run the test suite:

pixi run -e queens-dev pytest

Open a shell in an environment:

pixi shell -e queens-dev

Refresh the lockfile after dependency changes:

pixi lock

Use the lockfile strictly:

pixi install --locked --environment queens-dev
pixi run -e queens-dev pip install -e ./ --no-deps

2. Pure PyPI Workflows

These remain supported.

2.1 pip install

python -m venv .venv
source .venv/bin/activate
pip install -e .

This is the classic editable-install workflow.

2.2 uv

uv venv
source .venv/bin/activate
uv pip install -e .

This is a good option for a fast pure-PyPI workflow.

3. Conda via Pixi Export

This is the intended future direction for Conda-only usage.

The idea is:

  • define dependencies centrally via pixi
  • export a Conda-compatible environment
  • use that exported file in Conda-only workflows

That support is not the main path yet, but it can be added next.
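
Assuming pixi's conda-environment export is used, the steps could look roughly like the following (a sketch only; the exact subcommand name varies across pixi versions, so check `pixi --help` before relying on it):

```shell
# Sketch only -- subcommand names may differ between pixi versions.
pixi project export conda-environment --environment queens-dev environment.yml
conda env create -f environment.yml --name queens
```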

Open points

  • add support for macOS
  • fix all tests with the new setup
  • remove unnecessary explicit dependencies (as analysed by @gilrrei)
  • introduce and test pure PyPI environment
  • introduce and test "pure" conda environment
  • fix warnings in the pipeline
  • write documentation on how to install QUEENS (up to three cases?)
  • write documentation on how to change (add, remove, upgrade, ...) dependencies
  • extract workspace setup in a dedicated GitHub action (reduce code duplication)
  • extract GitHub action to run QUEENS tests for specific markers (reduce code duplication)
  • fix xml to md report generation (for multiple junit reports)
  • enable testing of tutorials using 4C (see "Tutorials 3 and 4 are currently not included in the tutorial test suite" #302)

Follow-ups

  • handling of remote environments
  • GitLab test pipeline
  • fix coverage report if tests are split (core tests vs 4C tests vs tutorials)
  • track progress of mapping tensorflow-probability (tfp-nightly) between conda and PyPI (ubuntu) -> re-add the default PyPI dependency to the development environment, queens = {path = "./", editable = true} (alternatively: pixi adds a --no-deps feature)
  • publish coverage report
  • publish QUEENS on pypi @sbrandstaeter
  • publish QUEENS on conda-forge @sbrandstaeter
  • automatic updating of dependencies (see e.g. updating lockfile)
  • automatic updating of the pixi version (see notes on "pin your version")
  • automatically bump version of GitHub actions
  • add 4C tests on macOS once "Add cmake preset to compile 4C on Darwin" (4C-multiphysics/4C#1795) is merged

Related Issues and Pull Requests

Interested Parties

Note: More information on the merge request procedure in QUEENS can be found in the Submit a pull request section in the CONTRIBUTING.md file.

Member

@leahaeusel leahaeusel left a comment


Just a few thoughts that I had while browsing through the changes (without having read all of them yet):

Member


If I haven't missed anything, we can delete this action now, right?

pixi-version: v0.67.0
cache: true
cache-write: ${{ github.event_name == 'push' && github.ref_name == 'main' }}
environments: queens-dev
Member


What's the difference between queens-dev and queens-all? I'm just wondering which parts of queens-all we are using neither for the documentation nor for the tests.

Member Author

@sbrandstaeter sbrandstaeter Apr 27, 2026


Very good question.
queens-dev is queens-all plus development dependencies (pytest, isort, etc.).
And queens-all includes all possible production dependencies.

To verify: the environments are defined here

Member


Ok got it, thanks for clarifying 👍 In this case, I would like to bring up the point that it might be helpful to rename queens-dev to queens-all-dev to avoid confusion and make the extension even clearer. However, I do understand that a shorter name is also more convenient.

PYTHON_PACKAGE_MANAGER: ${{steps.environment.outputs.ppm}}
- name: Register Jupyter kernel
run: |
$PYTHON_PACKAGE_MANAGER activate queens
Member


Super nice that we don't have to manually activate the environment in every single shell anymore :)

Comment on lines +37 to +40
environments: >-
queens-base
queens-all
queens-dev
Member


Is it actually necessary to list all of the available environments here?

Member Author


I am still not sure about it.
The idea was that all environments are actually built and thus tested.
If building an environment fails, the pipeline should fail.
Not sure if it works like this though tbh.

For running the tests queens-dev should be enough.

Member


I like the separation of code checks from the testsuite. This allows us to parallelize the two workflows, right?

Member Author


Yes, exactly, they run in parallel.

Comment thread .github/workflows/local_testsuite.yml Outdated
Comment thread .github/workflows/code_quality.yml Outdated
Comment thread doc/README.md
To build the documentation, first set up a QUEENS environment as described in the
[README](../README.md). For documentation work, use the development setup there, which
includes the required documentation and tutorial dependencies.
Next, register the environment as a Jupyter kernel with:
Member


Suggested change
Next, register the environment as a Jupyter kernel with:
Next, register the environment as a Jupyter kernel such that the tutorial notebooks can be run while building the documentation:

@@ -5,6 +5,6 @@ This package contains drivers and utilities realted to the for multiphysics code
## Material random field interface
In order to create random material field in combintation with 4C, we require the package [fourcipp](https://github.com/4C-multiphysics/fourcipp). Therefore install QUEENS via (in the QUEENS main directory):
Member


Suggested change
In order to create random material field in combintation with 4C, we require the package [fourcipp](https://github.com/4C-multiphysics/fourcipp). Therefore install QUEENS via (in the QUEENS main directory):
In order to create random material fields in combination with 4C, we require the package [fourcipp](https://github.com/4C-multiphysics/fourcipp). Therefore, install the base QUEENS environment as described in the [README.md](../../README.md) and install the additional fourcipp dependency via

mamba env update -n queens -f environment.fourc.yml
```
(You can omit the `dev` if you don't require the additional development packages).
For this you need a QUEENS environment, for instructions on how to set it up see the [README.md](../README.md).
Member


Suggested change
For this you need a QUEENS environment, for instructions on how to set it up see the [README.md](../README.md).

@sbrandstaeter sbrandstaeter marked this pull request as draft April 27, 2026 10:04
