Hypotheses validated by MVO

H1 — Portability

One artifact runs across laptop / edge / cloud without code changes.

experiment: 2 workloads × 3 environments
criterion: ≥ 80% without code changes
validated by: Layer 6
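As a sketch, the H1 criterion reduces to a pass rate over the 2 × 3 workload/environment matrix. The workload and environment names below are illustrative placeholders, not part of the spec:

```python
# Sketch: evaluate the H1 pass-rate criterion over a workload x environment matrix.
# Workload and environment names are illustrative placeholders.

WORKLOADS = ["event-transform", "disconnected-collector"]
ENVIRONMENTS = ["laptop", "edge", "cloud"]

def h1_pass_rate(results: dict) -> float:
    """results maps (workload, env) -> True if the artifact ran without code changes."""
    cells = [(w, e) for w in WORKLOADS for e in ENVIRONMENTS]
    passed = sum(1 for cell in cells if results.get(cell, False))
    return passed / len(cells)

# Example: one environment needed a code change for one workload (5 of 6 cells pass).
results = {(w, e): True for w in WORKLOADS for e in ENVIRONMENTS}
results[("disconnected-collector", "edge")] = False

rate = h1_pass_rate(results)
print(f"pass rate: {rate:.0%}, H1 {'validated' if rate >= 0.80 else 'not validated'}")
```

With 5 of 6 cells passing the rate is about 83%, which clears the ≥ 80% bar.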

H2 — Simplicity

VYR reduces the operational surface compared to an equivalent baseline stack.

experiment: VYR vs. K8s + job runner
criterion: ≥ 50% fewer mandatory components
validated by: Layer 7
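The H2 criterion is a straight component-count comparison. A minimal sketch, where both component lists are hypothetical placeholders rather than real inventories of either stack:

```python
# Sketch: H2 criterion as a mandatory-component count comparison.
# Both component lists are hypothetical placeholders, not real inventories.

baseline_components = ["kube-apiserver", "etcd", "scheduler", "controller-manager",
                       "kubelet", "kube-proxy", "container-runtime", "job-runner"]
vyr_components = ["vyr-node", "vyr-runtime", "oci-fallback"]  # hypothetical names

reduction = 1 - len(vyr_components) / len(baseline_components)
print(f"{reduction:.0%} fewer mandatory components; "
      f"H2 {'met' if reduction >= 0.5 else 'not met'}")
```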

Deferred hypotheses (post-MVO)

H3 — Partition resilience: the system continues useful work during a WAN split. Requires: multi-node fabric. Phase: 6–12 mo.
H4 — State continuity: state eventually converges without manual repair. Requires: CRDT implementation. Phase: 6–12 mo.
H5 — Decision quality: KYX improves outcomes vs. a control group. Requires: KYX causal engine + history. Phase: 18–24 mo.
H6 — Safety: automation does not cause cascading degradation. Requires: KYX simulator + staged rollout. Phase: 18–24 mo.
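H4's convergence claim is the standard CRDT merge property. A minimal sketch using a grow-only counter (G-Counter), assuming per-node increment maps; this is illustrative only, not the planned implementation:

```python
# Sketch: G-Counter CRDT. Two replicas diverge during a partition,
# then converge by merging, with no manual repair step.

def merge(a: dict, b: dict) -> dict:
    """Element-wise max of per-node counts; commutative, associative, idempotent."""
    return {node: max(a.get(node, 0), b.get(node, 0)) for node in a.keys() | b.keys()}

def value(counter: dict) -> int:
    """Observed counter value is the sum over all per-node counts."""
    return sum(counter.values())

# Replicas increment independently while partitioned...
replica_a = {"node-a": 3}
replica_b = {"node-b": 5}

# ...and reach the same state after exchanging merges, in either order.
assert merge(replica_a, replica_b) == merge(replica_b, replica_a)
print(value(merge(replica_a, replica_b)))  # combined count across both replicas
```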

MVO KPI targets

Cold start P50: <100 ms (pre-prod target)
Cold start P99: <400 ms (pre-prod target)
Bootstrap time: <30 min (single node)
OCI fallback: >90% success rate
Trace coverage: >80% of critical paths
Portability (H1): ≥80% of workloads without changes

Benchmark workload corpus (MVO subset)

Two workloads are selected for MVO. The full corpus of six classes is validated across later phases.

event transform / stream enrichment. Class: stateless transform. Demonstrates: portability, cold start, minimal dependencies. MVO: ✓ included.
disconnected collector with delayed sync. Class: offline-tolerant ephemeral state. Demonstrates: offline tolerance, sync on reconnect. MVO: ✓ included.
local inference / edge scoring. Class: compute-bound. Demonstrates: Wasm performance, resource limits. MVO: deferred.
multi-step automation / agent workflow runner. Class: stateful workflow. Demonstrates: state continuity, KYX integration. MVO: deferred.
replicated control object / config distribution. Class: replicated state. Demonstrates: eventual consistency, CRDT convergence. MVO: deferred.
failure drill / arbitrated recovery scenario. Class: fault injection. Demonstrates: partition resilience, H3/H6. MVO: deferred.
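For context, the first included workload class (stateless event transform with stream enrichment) has roughly this shape as code; the event schema and lookup table are hypothetical:

```python
# Sketch: stateless event transform with enrichment, the shape of the first
# MVO workload class. Event schema and lookup table are hypothetical.

GEO_LOOKUP = {"10.0.0.1": "eu-west", "10.0.0.2": "us-east"}  # static enrichment data

def enrich(event: dict) -> dict:
    """Pure, stateless transform: normalize a field and attach a region tag."""
    out = dict(event)
    out["user"] = event.get("user", "").strip().lower()
    out["region"] = GEO_LOOKUP.get(event.get("src_ip"), "unknown")
    return out

print(enrich({"user": "  Alice ", "src_ip": "10.0.0.1"}))
```

Because the function holds no state between events, the same artifact can run anywhere the runtime exists, which is exactly what makes this class a good H1 probe.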