VoLCA is a Life Cycle Assessment engine that turns LCA databases into inspectable, queryable answers — fast.
It loads EcoSpold2, EcoSpold1, SimaPro CSV, and ILCD process databases, builds supply chain dependency trees, computes life cycle inventories using sparse matrix algebra, and applies characterization methods for impact assessment. Everything runs in-memory against your own data.
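For context, the computation is the textbook matrix-based LCI/LCIA formulation (standard LCA notation; VoLCA's internal names may differ):

```latex
g = B\,A^{-1}\,f, \qquad h = Q\,g
```

where A is the technosphere matrix (activity-by-activity exchanges), B the biosphere matrix (elementary flows per activity), f the final-demand vector, g the resulting life cycle inventory, Q the matrix of characterization factors, and h the vector of impact category scores.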
- Browse activities and flows across multiple databases
- Explore supply chain trees, force-directed dependency graphs, and supply chain analysis by sector classification
- Compute life cycle inventories (LCI) and impact scores (LCIA) — single method or batch across a full collection
- Normalize and weight LCIA results with Raw / Normalized / Weighted view toggle; compute a single-score in Pt when normalization-weighting data is available
- Map method characterization factors to database flows with a 4-step cascade (UUID → CAS → name → synonym) and coverage statistics
- Link databases across nomenclatures (e.g., a sector database referencing Agribalyse)
- Upload databases and method collections via the API, without touching config files
- Multiple database formats: EcoSpold2 (.spold), EcoSpold1 (.xml), SimaPro CSV, ILCD process datasets
- Archive support: Load databases directly from .zip, .7z, .gz, or .xz archives — no manual extraction
- Cross-database linking: Resolve supplier references across databases, with configurable dependencies and topological load ordering
- LCIA method collections: Load ILCD method packages (ZIP or directory), SimaPro method CSV exports, or tabular CSV from config
- Normalization and weighting: Batch LCIA computes normalized and weighted scores per category and a single aggregated score (Pt) when NW data is present in the method collection
- Flow mapping engine: 4-step matching cascade (UUID → CAS → name → synonym) with per-strategy coverage statistics
- Activity classifications: ISIC, CPC, and category fields parsed from EcoSpold1/2 and ILCD
- Auto-extracted synonyms: Synonym pairs extracted automatically from loaded databases and method packages, available for toggling and download
- Reference data management: Flow synonyms, compartment mappings, and unit definitions can be configured in TOML, uploaded via the API, or toggled independently
- Fast cache: Per-database cache with automatic schema-based invalidation; large databases (26 533 activities) load in ~50 s cold, ~3.7 s cached
- Optional access control: Single-code login with cookie-based session
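The 4-step mapping cascade can be sketched as follows. This is a simplified illustration only: the record shapes (dicts with uuid, cas, name keys) and the function name are assumptions, not VoLCA's internal API.

```python
def map_factors(cfs, flows, synonyms):
    """Match characterization factors (cfs) to database flows via the
    UUID -> CAS -> name -> synonym cascade, tracking per-strategy stats.
    Hypothetical sketch; not VoLCA's actual implementation."""
    # One lookup index per strategy (condition filters before key access)
    by_uuid = {f["uuid"]: f for f in flows if f.get("uuid")}
    by_cas = {f["cas"]: f for f in flows if f.get("cas")}
    by_name = {f["name"].lower(): f for f in flows}

    stats = {"uuid": 0, "cas": 0, "name": 0, "synonym": 0, "unmatched": 0}
    mapping = {}
    for cf in cfs:
        flow, strategy = by_uuid.get(cf.get("uuid")), "uuid"
        if flow is None and cf.get("cas"):
            flow, strategy = by_cas.get(cf["cas"]), "cas"
        if flow is None:
            flow, strategy = by_name.get(cf["name"].lower()), "name"
        if flow is None:
            # Last resort: try each synonym of the CF name
            for alt in synonyms.get(cf["name"].lower(), []):
                flow = by_name.get(alt.lower())
                if flow is not None:
                    strategy = "synonym"
                    break
        if flow is None:
            stats["unmatched"] += 1
        else:
            stats[strategy] += 1
            mapping[cf["name"]] = (flow, strategy)
    return mapping, stats
```

Earlier strategies win: a CF that matches by UUID never falls through to CAS or name matching, which is what makes the per-strategy coverage statistics meaningful.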
All figures measured on Ecoinvent 3.12 (26 533 activities) on a 4-core machine.
| Phase | Cold start | Hot start |
|---|---|---|
| Startup (read all 26 533 EcoSpold files from disk) | ~50 s | — |
| Hot startup — load from cache | — | ~3.7 s |
| First computation after startup (inventory, impact score)¹ | ~90 ms | ~8.5 s |
| Next computations (inventory, impact score) | ~85 ms | ~85 ms |
| Batch of 200 computations | ~19 s total (10/sec) | ~19 s total (10/sec) |
| Computation of a modified process (upstream process substitution) | ~120 ms | ~110 ms |
¹ The matrix factorisation is deferred to the first computation and cached for the lifetime of the server. All subsequent requests reuse it.
./build.sh
# Start the server
volca --config volca.toml server --port 8080
# Open http://localhost:8080

The CLI is a lightweight HTTP client that connects to a running server (~0.2 s per command). Start the server once, then query it freely:
# Start server (loads databases into memory once)
volca --config volca.toml server --port 8080
# In another terminal — all commands talk to the server via HTTP
volca --config volca.toml activities --name "electricity" --geo "FR"
volca --config volca.toml --db agribalyse tree "12345678-..." --depth 3
volca --config volca.toml --db agribalyse lcia "12345678-..." --method METHOD_UUID

# Launch REPL (auto-starts the server if not running)
volca --config volca.toml repl
# Inside the REPL:
# volca> activities --name "wheat"
# volca> use agribalyse
# volca[agribalyse]> inventory UUID
# volca[agribalyse]> :format table
# volca[agribalyse]> lcia UUID --method METHOD_UUID
# volca[agribalyse]> :help
# volca[agribalyse]> :quit

A TOML config file enables multi-database setups, method collections, and reference data:
[server]
port = 8080
host = "127.0.0.1"
password = "mysecret" # optional — omit to disable auth
[[databases]]
name = "agribalyse-3.2"
displayName = "Agribalyse 3.2"
path = "DBs/AGB32_final.CSV"
description = "SimaPro CSV"
load = true
[[databases]]
name = "my-sector-db"
displayName = "Sector DB"
path = "DBs/sector.CSV"
depends = ["agribalyse-3.2"] # loads agribalyse first, then links
load = true
[[methods]]
name = "EF-3.1"
path = "DBs/EF-v3.1.zip" # ILCD method package (ZIP or directory)
# SimaPro method CSV exports and tabular CSV are also accepted:
# path = "DBs/EF3.1_methods.csv"
[[flow-synonyms]]
name = "Default flow synonyms"
path = "data/flows.csv" # CSV with two columns: name1,name2
active = true
[[compartment-mappings]]
name = "Default compartment mapping"
path = "data/compartments.csv"
active = true
[[units]]
name = "Default units"
path = "data/units.csv"
active = true

The depends field ensures dependency databases load first and their flows are available for cross-database linking. Setting load = true on a database transitively loads all its dependencies.
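Resolving depends amounts to a topological sort over the dependency graph. A minimal sketch (function and variable names are illustrative, not VoLCA's):

```python
def load_order(databases: dict) -> list:
    """Return a load order in which every database comes after its
    dependencies. `databases` maps name -> list of `depends` names."""
    order, seen = [], set()

    def visit(name, stack=()):
        if name in stack:  # depends chain loops back on itself
            raise ValueError(f"dependency cycle through {name}")
        if name in seen:
            return
        for dep in databases.get(name, []):
            visit(dep, stack + (name,))  # dependencies load first
        seen.add(name)
        order.append(name)

    for name in databases:
        visit(name)
    return order
```

For the config above, load_order({"my-sector-db": ["agribalyse-3.2"], "agribalyse-3.2": []}) puts agribalyse-3.2 before my-sector-db, so its flows exist when the sector database links against them.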
All per-database resources are scoped under /api/v1/db/{name}/:
GET /api/v1/db/{name}/activity/{id} Activity details
GET /api/v1/db/{name}/activity/{id}/tree Supply chain tree
GET /api/v1/db/{name}/activity/{id}/inventory Life cycle inventory
GET /api/v1/db/{name}/activity/{id}/graph?cutoff= Supply chain graph
GET /api/v1/db/{name}/activity/{id}/supply-chain Supply chain table
GET /api/v1/db/{name}/activity/{id}/lcia/{methodId} LCIA score (single method)
GET /api/v1/db/{name}/activity/{id}/lcia-batch/{col} LCIA batch (all categories, with optional NW scores and single-score)
GET /api/v1/db/{name}/activities?name=&geo=&product= Search activities
GET /api/v1/db/{name}/flows?q=&lang= Search flows
GET /api/v1/db/{name}/flow/{flowId} Flow details
GET /api/v1/db/{name}/flow/{flowId}/activities Activities using a flow
GET /api/v1/db/{name}/method/{id}/mapping Mapping coverage stats
GET /api/v1/db/{name}/method/{id}/flow-mapping Per-flow mapping detail
GET /api/v1/db List databases and status
POST /api/v1/db/upload Upload a database archive
POST /api/v1/db/{name}/load Load a configured database
POST /api/v1/db/{name}/finalize Finalize cross-DB linking
DELETE /api/v1/db/{name} Delete a database
GET /api/v1/methods List individual methods
GET /api/v1/method/{id} Method details
GET /api/v1/method/{id}/factors Characterization factors
GET /api/v1/method-collections List method collections
POST /api/v1/method-collections/{name}/load Load a method collection
POST /api/v1/method-collections/{name}/unload Unload a method collection
POST /api/v1/method-collections/upload Upload a method package
DELETE /api/v1/method-collections/{name} Delete a method collection
GET /api/v1/flow-synonyms List synonym sets
POST /api/v1/flow-synonyms/{name}/load Activate a synonym set
POST /api/v1/flow-synonyms/{name}/unload Deactivate a synonym set
GET /api/v1/flow-synonyms/{name}/groups Browse synonym groups
GET /api/v1/flow-synonyms/{name}/download Download synonym CSV
GET /api/v1/compartment-mappings List compartment mappings
GET /api/v1/units List unit definitions
POST /api/v1/auth Login (returns session cookie)
The lcia-batch response includes per-category normalizedScore and weightedScore fields (when normalization-weighting data is present in the method collection), plus a singleScore sum in Pt.
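The single-score aggregation can be illustrated from those fields. The field names come from the description above; everything else about the response shape here is an assumption for the sketch:

```python
def single_score(batch: list) -> float:
    """Sum per-category weightedScore values into one score in Pt.
    Categories without NW data (weightedScore absent or null) are skipped.
    Sketch only; the real lcia-batch response may carry more structure."""
    return sum(
        category["weightedScore"]
        for category in batch
        if category.get("weightedScore") is not None
    )
```

A category list like [{"weightedScore": 0.012}, {"weightedScore": 0.003}, {"category": "land use"}] would yield 0.015 Pt, with the uncovered category contributing nothing.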
The full OpenAPI 3.0 specification is served at runtime:
GET /api/v1/openapi.json — machine-readable spec (for code generation, tooling)
GET /api/v1/docs — Swagger UI (interactive browser)
Use these to build your own frontend, generate a typed client, or explore the API interactively.
VoLCA exposes an MCP (Model Context Protocol) endpoint at POST /mcp, making LCA data queryable by AI assistants (Claude, Cursor, etc.) natively.
Configure it in your MCP client:
{
"mcpServers": {
"volca": {
"url": "http://localhost:8080/mcp"
}
}
}

Available tools:
| Tool | Description |
|---|---|
| list_databases | List loaded databases |
| search_activities | Search by name, geography, or product |
| search_flows | Search biosphere flows |
| get_activity | Activity details and exchanges |
| get_tree | Supply chain tree with loop detection |
| get_supply_chain | Flat upstream activity list with quantities |
| get_inventory | LCI biosphere flows (top N by quantity) |
| compute_lcia | LCIA score for an activity and method |
| list_methods | List loaded impact assessment methods |
| get_flow_mapping | CF-to-flow mapping coverage for a method |
Authentication uses the same password as the REST API.
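MCP tool invocations travel as JSON-RPC 2.0 requests. A sketch of the body an MCP client would POST to /mcp for search_activities (the argument names are assumptions based on the tool descriptions above):

```python
import json


def tool_call_request(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request body, the envelope the
    Model Context Protocol uses for tool invocations."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })
```

For example, tool_call_request("search_activities", {"name": "electricity", "geo": "FR"}) produces a body you could POST to http://localhost:8080/mcp — in practice an MCP client library handles this framing for you.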
| Option | Description |
|---|---|
| --config FILE | TOML config file for multi-database setup |
| --url URL | Server URL (default: from config; or set VOLCA_URL) |
| --password PWD | Server password (or set VOLCA_PASSWORD) |
| --db NAME | Database name to query |
| --format FORMAT | Output format: pretty (default), json, table, csv |
| --jsonpath PATH | Field to extract for CSV output (e.g., srResults) |
| --tree-depth N | Maximum tree depth (default: 2) |
| --no-cache | Disable caching (for development) |
# Start server (loads databases — run once)
volca --config volca.toml server --port 8080
# Single HTTP command (connects to running server, ~0.2s)
volca --config volca.toml [--db NAME] COMMAND [OPTIONS]
# Interactive REPL (auto-starts server if not running)
volca --config volca.toml repl

volca activities --name "electricity" --geo "DE" --limit 10
volca activities --product "steel" --limit 10 --offset 20
volca flows --query "carbon dioxide" --limit 5

# Activity details
volca activity "12345678-..."
# Supply chain tree
volca tree "12345678-..." --tree-depth 3
# Life cycle inventory
volca inventory "12345678-..."
# Impact assessment (--method takes a method UUID, not a file path)
volca lcia "12345678-..." --method METHOD_UUID
# Matrix export (Ecoinvent universal format — runs locally, not via HTTP)
volca export-matrices ./output_dir

# Summary: how well does a method match a database?
volca --db agribalyse mapping METHOD_UUID
# See every mapped CF with its match strategy (uuid/cas/name/synonym)
volca --db agribalyse mapping METHOD_UUID --matched
# List CFs that found no DB flow
volca --db agribalyse mapping METHOD_UUID --unmatched
# List DB biosphere flows with no CF
volca --db agribalyse mapping METHOD_UUID --uncharacterized
# Machine-readable output
volca --db agribalyse mapping METHOD_UUID --matched --format json

# List, upload, delete databases
volca database # list (default)
volca database upload mydb.7z --name "My DB" # upload
volca database delete my-db # delete
# List, upload, delete method collections
volca method # list (default)
volca method upload EF-3.1.zip --name "EF 3.1" # upload
volca method delete ef-31 # delete

| Feature | REST API | CLI |
|---|---|---|
| Search | | |
| Search activities | GET /db/{db}/activities?name=&geo=&product= | activities --name --geo --product |
| Search flows | GET /db/{db}/flows?q=&lang= | flows --query --lang |
| Analysis | | |
| Activity details | GET /db/{db}/activity/{id} | activity ID |
| Supply chain tree | GET /db/{db}/activity/{id}/tree | tree ID --depth N |
| Supply chain graph | GET /db/{db}/activity/{id}/graph?cutoff= | — |
| Life cycle inventory | GET /db/{db}/activity/{id}/inventory | inventory ID |
| LCIA (single method) | GET /db/{db}/activity/{id}/lcia/{methodId} | lcia ID --method METHOD_UUID |
| LCIA batch | GET /db/{db}/activity/{id}/lcia-batch/{col} | — |
| Flow details | GET /db/{db}/flow/{flowId} | flow FLOW_ID |
| Flow activities | GET /db/{db}/flow/{flowId}/activities | flow FLOW_ID activities |
| Flow Mapping | | |
| Mapping coverage | GET /db/{db}/method/{id}/mapping | mapping METHOD_UUID |
| Per-flow mapping | GET /db/{db}/method/{id}/flow-mapping | mapping METHOD_UUID --matched |
| Unmatched CFs | included in mapping response | mapping METHOD_UUID --unmatched |
| Uncharacterized flows | — | mapping METHOD_UUID --uncharacterized |
| Database Management | | |
| List databases | GET /db | database |
| Upload database | POST /db/upload | database upload FILE --name NAME |
| Load database | POST /db/{name}/load | — (use config load = true) |
| Delete database | DELETE /db/{name} | database delete NAME |
| Finalize linking | POST /db/{name}/finalize | — |
| Method Management | | |
| List methods | GET /methods | methods |
| Method details | GET /method/{id} | — |
| Method factors | GET /method/{id}/factors | — |
| List collections | GET /method-collections | method |
| Upload collection | POST /method-collections/upload | method upload FILE --name NAME |
| Delete collection | DELETE /method-collections/{name} | method delete NAME |
| Reference Data | | |
| Flow synonyms | GET /flow-synonyms | synonyms |
| Compartment mappings | GET /compartment-mappings | compartment-mappings |
| Units | GET /units | units |
| Matrix Export | | |
| Universal format | — | export-matrices DIR (local only) |
| Debug matrices | — | debug-matrices ID --output FILE (local only) |
| Auth | | |
| Login | POST /auth | — |
All API routes are prefixed with /api/v1/. A dash (—) means the feature is only available in one interface.
Install the MUMPS sparse solver and other dependencies, then build:
# Debian/Ubuntu
sudo apt install build-essential python3 curl zlib1g-dev libmumps-seq-dev
# macOS
brew install gcc python3 curl mumps
# Fedora
sudo dnf install gcc gcc-c++ make python3 curl zlib-devel MUMPS-devel
# Arch Linux
sudo pacman -S base-devel python curl zlib mumps

Install the Haskell toolchain via GHCup, then:
./build.sh # Build
./build.sh --test # Build and run tests

- Install MSYS2 and open the "MSYS2 UCRT64" terminal
- Install dependencies:
pacman -S mingw-w64-ucrt-x86_64-gcc mingw-w64-ucrt-x86_64-mumps make python git
- Install GHCup for the compiler toolchain
- Run:
./build.sh # Same script as Linux/macOS
docker build -f docker/Dockerfile -t volca .
docker run -p 8080:8080 -v /path/to/data:/data volca

./build.sh --test
# Or manually
cabal test --test-show-details=streaming

Tests cover matrix construction (sign convention), inventory calculation (golden values), parsers (EcoSpold1/2, ILCD, SimaPro, classification fields), and matrix export format compliance.
Apache License 2.0 — see LICENSE for details.