Pixi dependency management #301
Conversation
leahaeusel
left a comment
Just a few thoughts that I had while browsing through the changes (without having read all of them yet):
If I haven't missed anything, we can delete this action now, right?
```yaml
pixi-version: v0.67.0
cache: true
cache-write: ${{ github.event_name == 'push' && github.ref_name == 'main' }}
environments: queens-dev
```
What's the difference between queens-dev and queens-all? I'm just wondering which parts of queens-all we are using neither for the documentation nor for the tests.
Very good question.
queens-dev is queens-all plus development dependencies (pytest, isort, etc.).
And queens-all includes all possible production dependencies.
To verify: the environments are defined here
Ok got it, thanks for clarifying 👍 In this case, I would like to bring up the point that it might be helpful to rename queens-dev to queens-all-dev to avoid confusion and make the extension even clearer. However, I do understand that a shorter name is also more convenient.
```yaml
PYTHON_PACKAGE_MANAGER: ${{steps.environment.outputs.ppm}}

- name: Register Jupyter kernel
  run: |
    $PYTHON_PACKAGE_MANAGER activate queens
```
Super nice that we don't have to manually activate the environment in every single shell anymore :)
```yaml
environments: >-
  queens-base
  queens-all
  queens-dev
```
Is it actually necessary to list all of the available environments here?
I am still not sure about it.
The idea was that all environments are actually built and thus tested.
If building an environment fails, the pipeline should fail.
Not sure if it works like this though tbh.
For running the tests queens-dev should be enough.
I like the separation of code checks from the testsuite. This allows us to parallelize the two workflows, right?
Yes, exactly, they run in parallel.
> To build the documentation, first set up a QUEENS environment as described in the [README](../README.md). For documentation work, use the development setup there, which includes the required documentation and tutorial dependencies. Next, register the environment as a Jupyter kernel with:
Suggested change:
```diff
-Next, register the environment as a Jupyter kernel with:
+Next, register the environment as a Jupyter kernel such that the tutorial notebooks can be run while building the documentation:
```
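For reference, kernel registration of this kind is typically done with `ipykernel` (a sketch; the kernel name `queens` and the assumption that `ipykernel` is installed in the environment are mine, not taken from this PR):

```shell
# Inside the activated QUEENS environment (assumes ipykernel is installed there)
python -m ipykernel install --user --name queens --display-name "QUEENS"
```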
```
@@ -5,6 +5,6 @@ This package contains drivers and utilities realted to the for multiphysics code
## Material random field interface
In order to create random material field in combintation with 4C, we require the package [fourcipp](https://github.com/4C-multiphysics/fourcipp). Therefore install QUEENS via (in the QUEENS main directory):
```
Suggested change:
```diff
-In order to create random material field in combintation with 4C, we require the package [fourcipp](https://github.com/4C-multiphysics/fourcipp). Therefore install QUEENS via (in the QUEENS main directory):
+In order to create random material fields in combination with 4C, we require the package [fourcipp](https://github.com/4C-multiphysics/fourcipp). Therefore, install the base QUEENS environment as described in the [README.md](../../README.md) and install the additional fourcipp dependency via
```
```
mamba env update -n queens -f environment.fourc.yml
```
(You can omit the `dev` if you don't require the additional development packages).
For this you need a QUEENS environment, for instructions on how to set it up see the [README.md](../README.md).
Suggested change (remove this line):
```diff
-For this you need a QUEENS environment, for instructions on how to set it up see the [README.md](../README.md).
```
Summary of big changes:

- `pixi` for dependency management (keeping backward compatibility with pip and conda)

Minor changes:

- `pip-licenses` to check licenses of all production dependencies (including transitive dependencies)

Description and Context:
What and Why?
This PR moves the project to pixi as the main dependency and environment manager.
`pixi` gives us a modern workflow for mixed Conda/PyPI environments:

- `uv` under the hood

At the same time, this setup does not force users to adopt `pixi`. QUEENS can still be used through standard Python packaging workflows such as `pip install -e .`, and pure-PyPI workflows using `uv` remain possible. Even so, I strongly recommend trying `pixi`: it is easy to use and very fast.

Reminder:
One major reason for this change is that we should bump the dependencies' versions. This is also done in this PR, starting with the switch to Python 3.12. The new versions will also require several changes and adaptations to the tests.
One adaptation that was already done is the switch to `pip-licenses` for checking the licenses of QUEENS' dependencies, including transitive dependencies (dependencies of direct dependencies).

What Changed Compared To `main`

This branch introduces:

- `pixi` workspace-based dependency and environment management
- a `pixi.lock`-based reproducible environment workflow

Why We Now Have Duplicate Dependency Definitions
A central part of this setup is that dependencies are now defined in two forms:
- PEP-style declarations: `project.dependencies`, `dependency-groups`, `project.optional-dependencies`
- `pixi` dependency definitions: `tool.pixi.dependencies`, `tool.pixi.pypi-dependencies`, `tool.pixi.feature.*`

This duplication is intentional.
The PEP-style declarations are needed for standard Python packaging metadata and pure-PyPI workflows.
The `pixi` declarations are needed to drive mixed Conda/PyPI environment resolution efficiently and reproducibly.

To make sure these two views do not diverge, this PR adds automated integrity checks in:
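As an illustration of the duplication, the two views in `pyproject.toml` might look roughly like this (a sketch; the package names and version pins are placeholders, not QUEENS' actual dependency list):

```toml
[project]
# PEP 621 declarations, used by pip/uv and standard packaging tooling
dependencies = ["numpy>=1.26", "scipy"]  # placeholders

[tool.pixi.dependencies]
# Conda-resolved counterparts of the declarations above
numpy = ">=1.26"
scipy = "*"

[tool.pixi.pypi-dependencies]
# PyPI-only packages resolved via uv, e.g. the project itself as editable
queens = { path = ".", editable = true }
```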
CI / Workflow Improvements
The pipeline now:

- checks whether `pyproject.toml` changes would require a lockfile update
- fails if `pixi.lock` is stale

This gives us much better guardrails around environment reproducibility.
What Is Still Missing
One thing not fully supported yet, but straightforward to add, is a pure Conda environment-management workflow for users who want to stay entirely in Conda-style tooling.
The natural path for that is:
This should be added next.
Possible Next Steps
Future improvements that would fit well on top of this work:
Short Usage Tutorials
1. Using `pixi`

Install pixi, then from the repository root:
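Presumably this is a plain install of the default environment (a sketch using the standard pixi CLI):

```shell
# From the repository root: solve and install the default environment from pixi.lock
pixi install
```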
Do the following once (per environment):
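For a specific environment, the standard pixi CLI call would be along these lines (a sketch; `queens-dev` is one of the environment names this PR defines):

```shell
# One-time per environment: solve and install it
pixi install -e queens-dev
```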
Useful environments currently include:

- `queens-base`
- `queens-all`
- `queens-dev`

Then you can, for example, run the test suite:
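A sketch of running the tests through pixi (assuming `pytest` is available in the `queens-dev` environment; the project may instead define a dedicated pixi task for this):

```shell
# Run pytest inside the queens-dev environment
pixi run -e queens-dev pytest
```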
Open a shell in an environment:
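With the standard pixi CLI this would be (environment name assumed):

```shell
# Drop into a shell with the queens-dev environment activated
pixi shell -e queens-dev
```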
Refresh the lockfile after dependency changes:
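A sketch, assuming a pixi version that provides the `lock` subcommand:

```shell
# Re-resolve and rewrite pixi.lock after editing pyproject.toml
pixi lock
```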
Use the lockfile strictly:
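With the standard pixi CLI this is the `--locked` flag (a sketch):

```shell
# Fail instead of silently re-solving if pixi.lock is out of date
pixi install --locked
```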
2. Pure PyPI Workflows
These remain supported.
2.1 `pip install`

This is the classic editable-install workflow.
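That is, from the repository root (a sketch; the `dev` extra name is an assumption on my part):

```shell
# Editable install of QUEENS into the current Python environment
python -m pip install -e .
# Optionally with development packages, if such an extra is defined:
python -m pip install -e ".[dev]"
```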
2.2 `uv`

This is a good option for a fast pure-PyPI workflow.
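A minimal sketch with the uv CLI (uv picks up the `.venv` it creates automatically):

```shell
# Create a virtual environment and install QUEENS editable via uv
uv venv
uv pip install -e .
```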
3. Conda via Pixi Export
This is the intended future direction for Conda-only usage.
The idea is:
That support is not the main path yet, but it can be added next.
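A possible shape for that export (a sketch; the exact subcommand and flags depend on the pixi version, so treat this as illustrative only):

```shell
# Export a pixi environment to a conda-style environment.yml
pixi project export conda-environment --environment queens environment.yml
# ...which conda/mamba users could then consume as usual:
mamba env create -f environment.yml
```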
Open points
Follow-ups
- `queens = { path = "./", editable = true }` (alternatively: pixi adds a `--no-deps` feature)
Interested Parties