Merged
3 changes: 1 addition & 2 deletions README.md
Original file line number Diff line number Diff line change
@@ -45,7 +45,6 @@ salience = tg.salience.SaliencePipeline(
clock = tg.clock.ClockRateModulator(
base_dilation_factor=config.clock.base_dilation_factor,
min_clock_rate=config.clock.min_clock_rate,
salience_mode=config.clock.salience_mode,
)
cooldown = ComputeCooldownPolicy(cooldown_tau=config.policies.cooldown_tau)

@@ -63,7 +62,7 @@ packet = tg.telemetry.ChronometricVector(
memory_strength=0.0,
).to_packet()

validate_packet_schema(packet, salience_mode=config.clock.salience_mode)
validate_packet_schema(packet)

if cooldown.allows_compute(elapsed_tau=clock.tau):
... # downstream work
8 changes: 1 addition & 7 deletions anomaly_poc.py
@@ -74,8 +74,6 @@ def run_poc(
clock = tg.clock.ClockRateModulator(
base_dilation_factor=cfg.clock.base_dilation_factor,
min_clock_rate=cfg.clock.min_clock_rate,
salience_mode=cfg.clock.salience_mode,
legacy_density_scale=cfg.clock.legacy_density_scale,
)

decay = DecayEngine(
@@ -125,11 +123,7 @@ def run_poc(
provenance_hash=compute_provenance_hash(s.provenance) if strict_replay_mode else None,
).to_packet()

validate_packet_schema(
packet,
salience_mode=cfg.clock.salience_mode,
require_provenance_hash=strict_replay_mode,
)
validate_packet_schema(packet, require_provenance_hash=strict_replay_mode)
packet["EVENT_KIND"] = event.kind
packet["ENCODED"] = bool(encoded)
packet["COMPUTE_ALLOWED"] = bool(compute_allowed)
2 changes: 0 additions & 2 deletions calibration_harness.py
@@ -48,8 +48,6 @@ def run_calibration(config_path: str = "tg.yaml"):
clock = ClockRateModulator(
base_dilation_factor=config.clock.base_dilation_factor,
min_clock_rate=config.clock.min_clock_rate,
salience_mode=config.clock.salience_mode,
legacy_density_scale=config.clock.legacy_density_scale,
)
decay = DecayEngine(
half_life=config.memory.half_life,
151 changes: 57 additions & 94 deletions docs/usage.md
@@ -1,48 +1,28 @@
# How to Read the Logs
The Temporal Gradient outputs **Internal State Telemetry** rather than conventional debug lines. The goal is to show how internal time (τ) and memory retention respond to salience.
Canonical module references and compatibility shims are listed in [`docs/CANONICAL_SURFACES.md`](docs/CANONICAL_SURFACES.md).
# Usage

Temporal Gradient outputs **internal state telemetry** — packets that show
how internal time (τ) and memory retention respond to salience. This guide
covers the packet contract, configuration knobs, and the salience
components shipped by default.

## 0. Telemetry contract (canonical vs extended)
The telemetry packet is versioned and split into **required** vs **optional** keys.
For canonical symbol/module ownership (including telemetry validators and compatibility aliases), see [`docs/CANONICAL_SURFACES.md`](docs/CANONICAL_SURFACES.md).
See [`architecture.md`](architecture.md) for the data-flow diagram and
public API surface.

### Packet API contract
- `to_packet()` => returns the packet as a `dict` mapping for schema checks and in-memory processing.
- `to_packet_json()` => returns serialized JSON `str` for transport/logging boundaries.
- Avoid `json.loads(to_packet())`; parse only the JSON text returned by `to_packet_json()`.
## Telemetry packet contract

**Required keys (canonical schema):**
- `SCHEMA_VERSION`
- `WALL_T`
- `TAU`
- `SALIENCE`
- `CLOCK_RATE`
- `MEMORY_S`
- `DEPTH`
- `to_packet()` returns a `dict` for schema checks and in-memory processing.
- `to_packet_json()` returns a JSON string for transport or logging.

**Optional keys (extended telemetry):**
- `H`
- `V`
**Required keys:** `SCHEMA_VERSION`, `WALL_T`, `TAU`, `SALIENCE`,
`CLOCK_RATE`, `MEMORY_S`, `DEPTH`.

`CLOCK_RATE` and `MEMORY_S` are always present in canonical packets. If unset at construction time, serialization currently emits numeric fallbacks (`0.0`), not `null`.
**Optional keys:** `H`, `V`, `entropy_cost`, `PROVENANCE_HASH`.

`SCHEMA_VERSION` policy is strict: canonical serialization must emit exactly `"1.0"`. For migration input only, validators accept legacy `"1"` and normalize it to `"1.0"` on canonical re-serialization.
`SCHEMA_VERSION` must be exactly `"1.0"`. `CLOCK_RATE` and `MEMORY_S` fall
back to `0.0` when unset at construction time.

CLI tables should print **only canonical columns** by default (`WALL_T`, `TAU`, `SALIENCE`, `CLOCK_RATE`, `MEMORY_S`, `DEPTH`). Extended fields like `H` and `V` are intended for verbose/debug output, not the base schema. Demo scripts may also include an `INPUT` column for readability; it is not part of the canonical packet schema.
### Example packet

### Legacy mode compatibility (`legacy_density`)
`legacy_density` is a migration-only mode. It is not the canonical telemetry contract.

Behavior summary:
- Salience is derived from entropy density and clamped into `[0,1]`.
- Canonical telemetry schema strictness is intentionally bypassed.

Compatibility entry points referenced in this section are cataloged in [`docs/CANONICAL_SURFACES.md`](docs/CANONICAL_SURFACES.md).

Packet-shape examples:

Canonical packet (preferred):
```json
{
"SCHEMA_VERSION": "1.0",
@@ -57,19 +37,7 @@ Canonical packet (preferred):
}
```

Legacy compatibility packet (allowed only in `legacy_density` mode):
```json
{
"WALL_T": 1.0,
"TAU": 0.15,
"SALIENCE": 0.9,
"DEPTH": 0
}
```

See `docs/CANONICAL_VS_LEGACY.md` for the full migration guidance and deprecation horizon.

## Minimal packet round-trip example (canonical)
### Round-trip

```python
import temporal_gradient as tg
@@ -84,66 +52,61 @@ packet = tg.telemetry.ChronometricVector(
memory_strength=0.2,
).to_packet()

validate_packet_schema(packet, salience_mode="canonical")
print(packet["SCHEMA_VERSION"], packet["SALIENCE"])
validate_packet_schema(packet)
```

This prints canonical packet fields (`SCHEMA_VERSION` and `SALIENCE`) from the in-memory `dict` returned by `to_packet()`.


## 1. The clock-rate table
This table shows how the internal clock-rate is reparameterized by salience load (surprise × value).
## Reading clock-rate output

```text
WALL_T | TAU | INPUT | SALIENCE | CLOCK_RATE (dτ/dt)
============================================================================================
1.0 | 0.15 | "CRITICAL: SECURITY BREACH..." | 0.9 | 0.15x
2.0 | 1.15 | "Checking local weather..." | 0.4 | 1.00x
WALL_T | TAU | INPUT | SALIENCE | CLOCK_RATE
==============================================================================
1.0 | 0.15 | "CRITICAL: SECURITY BREACH..." | 0.9 | 0.15
2.0 | 1.15 | "Checking local weather..." | 0.4 | 1.00
```

Key metrics:
- **WALL_T:** External time elapsed (seconds).
- **TAU:** Internal time accumulator after clock-rate reparameterization.
- **SALIENCE:** Surprise×value score from the valuator.
- **CLOCK_RATE (dτ/dt):** Internal clock multiplier after salience modulation.

Interpretation:
- A CLOCK RATE below 1.0x means the internal clock slowed to process a high-load event (reduced internal clock rate), not “bullet time.”
- 1.0x indicates the baseline clock rate.
- **WALL_T** — external time elapsed (seconds).
- **TAU** — internal time accumulator after clock-rate reparameterization.
- **SALIENCE** — Ψ = H × V from the salience pipeline.
- **CLOCK_RATE** — dτ/dt. Below 1.0 means the internal clock slowed to
process a high-load event. 1.0 is the baseline.
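
As a hedged illustration of the shape of this mapping (the formula inside `ClockRateModulator` is not shown in this diff; the linear form, parameter meanings, and default values below are all assumptions):

```python
def clock_rate(salience: float,
               base_dilation_factor: float = 1.0,
               min_clock_rate: float = 0.1) -> float:
    """Return an illustrative dtau/dt for a salience score in [0, 1].

    Higher salience dilates internal time (rate drops toward the floor);
    zero salience leaves the clock at the 1.0 baseline.
    """
    return max(min_clock_rate, 1.0 - base_dilation_factor * salience)

# High-salience events pin the rate near min_clock_rate; routine events
# tick at (or near) baseline, matching the table above qualitatively.
```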

## 2. The entropy sweep (memory audit)
At the end of the simulation, the decay engine reports which memories stayed above the retention threshold.
## Memory audit (post-simulation)

```
>>> MEMORY AUDIT (Post-Simulation)
[ALIVE] Strength: 1.42 | Content: "My name is Sentinel."
[PRUNED ] Content: "Rain. Water. Liquid."
[ALIVE] Strength: 1.42 | Content: "My name is Sentinel."
[PRUNED] Content: "Rain. Water. Liquid."
```

- **ALIVE:** The memory stayed above the pruning threshold because it started with high salience or was reconsolidated.
- **PRUNED:** The memory decayed below the threshold and was pruned.
- **ALIVE** — stayed above the pruning threshold.
- **PRUNED** — decayed below the threshold and was removed by the entropy sweep.
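
A minimal sketch of how such an audit could work, assuming half-life exponential decay on internal time (the real `DecayEngine` API is not shown in this diff; the function and parameter names below are illustrative):

```python
def decayed_strength(initial: float, elapsed_tau: float, half_life: float) -> float:
    """Exponential decay on internal time: strength halves every half_life tau."""
    return initial * 0.5 ** (elapsed_tau / half_life)

def audit(memories, elapsed_tau: float, half_life: float, threshold: float = 0.1):
    """Label each (content, strength) pair ALIVE or PRUNED, as in the table above."""
    return [
        (content,
         "ALIVE" if decayed_strength(s, elapsed_tau, half_life) > threshold
         else "PRUNED")
        for content, s in memories
    ]
```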

## 3. Configuration hints
Adjust these parameters in `simulation_run.py` and the supporting modules to shape the simulation (canonical policy surface: `temporal_gradient.policies.compute_cooldown`):
- `base_dilation_factor` and `min_clock_rate` in `ClockRateModulator` to control the clock-rate floor and sensitivity to salience load.
- `half_life` in `DecayEngine` to control decay speed.
- Reconsolidation cooldowns and diminishing returns in `EntropicMemory.reconsolidate` to avoid runaway reinforcement.
## Configuration knobs

## 4. Salience components (H/V)
The salience pipeline currently decomposes **H (novelty)** and **V (value)** into two concrete components with constrained inputs and normalized outputs.
- `clock.base_dilation_factor`, `clock.min_clock_rate` — clock-rate floor
and sensitivity to salience load.
- `memory.half_life`, `memory.decay_lambda` — decay speed.
- `memory.s_max`, `memory.initial_strength_max` — reconsolidation bounds.
- `policies.cooldown_tau` — compute eligibility window on internal time.
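
The README snippet in this PR calls `cooldown.allows_compute(elapsed_tau=clock.tau)`. A minimal sketch of a policy with those semantics (the shipped `ComputeCooldownPolicy` implementation is not shown here; latching the last allowed compute time is an assumption):

```python
class ComputeCooldownPolicy:
    """Gate compute on internal time: allow at most one compute per cooldown_tau."""

    def __init__(self, cooldown_tau: float):
        self.cooldown_tau = cooldown_tau
        self._last_tau = None  # tau of the last allowed compute, if any

    def allows_compute(self, elapsed_tau: float) -> bool:
        # Allow if we have never computed, or enough internal time has passed.
        if self._last_tau is None or elapsed_tau - self._last_tau >= self.cooldown_tau:
            self._last_tau = elapsed_tau
            return True
        return False
```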

## Salience components

### RollingJaccardNovelty (H)
- **Operational definition:** Compute tokens from the incoming text, compare against a rolling history window, and take `1 - max_jaccard_similarity` across the window. The history stores the most recent **5** token sets (default `window_size=5`).
- **Tokenization:** Lowercase and split on the regex pattern `[a-z0-9']+` to form a unique token set.
- **Allowed inputs:** Current message text **plus** the internal rolling history (no external context).
- **Normalization:** Jaccard similarity is in `[0,1]`, so novelty output is normalized to `[0,1]`.
- **Swap-friendly note:** This component can be replaced with an embedding-based novelty scorer, provided the output stays normalized to `[0,1]` and only uses the current text plus an internal memory of prior text.

Tokenize the incoming text, compare against a rolling history window of
recent token sets, and return `1 - max_jaccard_similarity`. Default
`window_size=5`. Tokenization: lowercase the text and take the unique
matches of `[a-z0-9']+`.

Output is in `[0, 1]`. Swap-friendly: any novelty scorer that takes the
current text plus internal history and returns a normalized score works.
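
A sketch of the scorer described above, following the stated tokenization, window size, and `1 - max_jaccard` rule (the shipped `RollingJaccardNovelty` class may differ in details):

```python
import re
from collections import deque

class RollingJaccardNovelty:
    """Novelty H in [0, 1]: 1 - max Jaccard similarity vs. recent token sets."""

    def __init__(self, window_size: int = 5):
        self.history = deque(maxlen=window_size)

    @staticmethod
    def _tokens(text: str) -> frozenset:
        # Lowercase, then take unique matches of [a-z0-9']+.
        return frozenset(re.findall(r"[a-z0-9']+", text.lower()))

    def score(self, text: str) -> float:
        tokens = self._tokens(text)
        max_sim = 0.0
        for past in self.history:
            union = tokens | past
            if union:
                max_sim = max(max_sim, len(tokens & past) / len(union))
        self.history.append(tokens)
        return 1.0 - max_sim
```

With an empty history the first message scores 1.0 (maximally novel); an exact repeat scores 0.0.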

### KeywordImperativeValue (V)
- **Operational definition:** Count keyword hits in the current text, then compute `min(max_value, base_value + hit_value * hits)`.
- **Default keyword list:** `["must", "never", "critical", "always", "don't", "stop", "urgent"]`.
- **Default weights:** `base_value=0.1`, `hit_value=0.2`, `max_value=1.0`.
- **Allowed inputs:** Current message text **only** (no access to memory or external context).
- **Normalization:** Value output is clamped to `[0,1]` via `max_value`.

Both components are expected to output normalized scores in `[0,1]`. The combined salience `psi = H × V` inherits the same normalization range.
Count keyword hits in the current text, return
`min(max_value, base_value + hit_value * hits)`.

- Default keywords: `["must", "never", "critical", "always", "don't",
"stop", "urgent"]`.
- Default weights: `base_value=0.1`, `hit_value=0.2`, `max_value=1.0`.

Output is clamped to `[0, 1]`. The combined salience `Ψ = H × V`
inherits the same range.
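
A sketch of the value scorer and the combined salience using the defaults listed above (whether "hits" means substring or whole-token matches is not specified in this diff; substring matching below is an assumption):

```python
DEFAULT_KEYWORDS = ["must", "never", "critical", "always", "don't", "stop", "urgent"]

def keyword_value(text: str, keywords=DEFAULT_KEYWORDS,
                  base_value: float = 0.1, hit_value: float = 0.2,
                  max_value: float = 1.0) -> float:
    """Value V in [0, 1] from keyword hits in the current text only."""
    lowered = text.lower()
    hits = sum(1 for kw in keywords if kw in lowered)  # substring match (assumption)
    return min(max_value, base_value + hit_value * hits)

def salience(h: float, v: float) -> float:
    # Psi = H * V; both inputs are in [0, 1], so Psi stays in [0, 1].
    return h * v
```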
2 changes: 0 additions & 2 deletions sanity_harness.py
@@ -29,8 +29,6 @@ def run_harness(
clock = ClockRateModulator(
base_dilation_factor=active_config.clock.base_dilation_factor,
min_clock_rate=active_config.clock.min_clock_rate,
salience_mode=active_config.clock.salience_mode,
legacy_density_scale=active_config.clock.legacy_density_scale,
)
decay = DecayEngine(
half_life=active_config.memory.half_life,